The intrinsic nature of tic disorders, characterized by symptom variability and fluctuation, poses challenges for clinical evaluation. Tic assessment currently relies predominantly on subjective questionnaires administered periodically during clinical visits, and thus lacks continuous quantitative evaluation. This study aims to establish an automatic, objective measure of tic expression in natural behavioral settings. A custom-developed smartphone application was used to record selfie videos of children and adolescents with tic disorders who exhibit facial motor tics. Facial landmarks were used to extract tic-related features from video segments, which were then passed through a tandem of custom deep neural networks that learn spatial and temporal properties for tic classification. The model achieved a mean accuracy of 95% when trained on data pooled across all subjects, and consistently exceeded 90% accuracy under leave-one-session-out and leave-one-subject-out cross-validation schemes. This automatic tic identification measure may provide clinicians with a valuable tool for diagnosis, patient follow-up, and evaluation of treatment efficacy. Combining this measure with standard smartphone technology has the potential to enable large-scale clinical studies, thereby expediting the development and testing of novel interventions.
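
The leave-one-subject-out evaluation scheme mentioned above can be sketched with scikit-learn's `LeaveOneGroupOut` splitter; this is a minimal illustration on synthetic data (the feature dimensions, labels, and subject counts are hypothetical, not taken from the study), showing how each fold withholds one subject's segments entirely from training:

```python
import numpy as np
from sklearn.model_selection import LeaveOneGroupOut

# Synthetic stand-in for landmark-derived features: 60 video segments,
# 8 features each, from 6 hypothetical subjects (10 segments per subject).
rng = np.random.default_rng(0)
X = rng.normal(size=(60, 8))
y = rng.integers(0, 2, size=60)          # 1 = tic, 0 = no tic (synthetic labels)
subjects = np.repeat(np.arange(6), 10)   # subject ID for each segment

# Leave-one-subject-out: one fold per subject, grouped by subject ID.
logo = LeaveOneGroupOut()
folds = list(logo.split(X, y, groups=subjects))
assert len(folds) == 6  # one fold per held-out subject

for train_idx, test_idx in folds:
    held_out = np.unique(subjects[test_idx])
    # The held-out subject's segments never appear in the training split.
    assert np.intersect1d(subjects[train_idx], held_out).size == 0
```

A leave-one-session-out variant would use the same splitter with session IDs as the `groups` argument instead of subject IDs.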