Do the Eyes Tell the Story? AI, Eye Movement, and Autism Screening
Tuesday, November 25, 2025
For over a hundred years, scientists have tracked eye movements to learn about the cognitive processes behind the human gaze. Today, practitioners use tablet computers and tech-enabled glasses to track these telling eye movements, and they can use artificial intelligence to identify subtle differences in gaze patterns linked to autism. These methods work in children as young as 16 months old—years earlier than the average age of autism identification, which is around 5 years old. Researchers hope that the speed and ease of this technology will lead to earlier autism identification. That, in turn, could lead to earlier evaluations, earlier interventions, and better outcomes for autistic children.
How Do Eye-Tracking Devices Work?
Eye-movement tracking devices emit a beam of light the human eye cannot detect. When the light reflects off the cornea, a camera captures the precise location of the glint. By measuring the distance from the glint to the pupil, the device can determine exactly where someone is looking on a screen while a video or still image is shown. Data is captured rapidly, sometimes at more than 100 images per second.
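The glint-to-pupil geometry described above is typically turned into on-screen gaze coordinates through a calibration step. Here is a minimal sketch, assuming a simple affine calibration fitted by least squares; all function names and numbers are illustrative, not taken from any real device:

```python
# Hypothetical sketch of pupil/corneal-reflection gaze mapping. Calibration
# fits an affine map from the pupil-to-glint vector (camera pixels) to
# screen coordinates; the vectors and targets below are made-up examples.
import numpy as np

def fit_calibration(pupil_glint_vectors, screen_points):
    """Least-squares fit of an affine map: screen = [vx, vy, 1] @ coeffs."""
    v = np.asarray(pupil_glint_vectors, dtype=float)
    features = np.column_stack([v, np.ones(len(v))])  # add bias column
    coeffs, *_ = np.linalg.lstsq(
        features, np.asarray(screen_points, dtype=float), rcond=None
    )
    return coeffs  # shape (3, 2): maps a vector to an (x, y) screen point

def estimate_gaze(coeffs, pupil_glint_vector):
    vx, vy = pupil_glint_vector
    return np.array([vx, vy, 1.0]) @ coeffs

# Five simulated calibration targets and their pupil-glint offsets:
vectors = [(0, 0), (10, 0), (0, 8), (10, 8), (5, 4)]
targets = [(960, 540), (1760, 540), (960, 1180), (1760, 1180), (1360, 860)]
coeffs = fit_calibration(vectors, targets)
print(estimate_gaze(coeffs, (5, 4)))  # ≈ [1360, 860], the screen center target
```

Real trackers use richer models (per-eye polynomials, head-pose compensation), but the core idea is the same: a measured glint-to-pupil offset, mapped through calibration, yields a point of gaze.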
Eye-movement trackers gather data on one or more of these movement types:
- Saccades, which are short, quick shifts from one spot to another
- Fixations, or places where the eye stops moving
- Scan path, which is a succession of movements and fixations
- Blink rate, or how often the eyelids open and shut
By analyzing these eye movements, researchers can determine what holds someone’s attention, and for how long (Jeyarani & Senthilkumar, 2023). Studies have shown that attentional patterns often differ in autistic individuals. For example, in a 2023 study, a group of autistic children and a group of neurotypical children watched the same videos. The videos showed human beings and cartoon characters dancing, clapping, nodding, and performing other activities. Most autistic children spent more time looking at cartoon characters, while most neurotypical children spent more time looking at human faces and bodies (Meng et al., 2023).
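Saccades and fixations like those listed above are recovered from raw gaze samples by classification algorithms. One common approach is a velocity threshold: intervals where gaze moves faster than a cutoff are labeled saccades, the rest fixations. A minimal sketch, with an assumed sampling rate and threshold rather than values from any specific device:

```python
# Illustrative velocity-threshold (I-VT) classifier: samples moving faster
# than a cutoff are labeled "saccade", the rest "fixation". The 120 Hz rate
# and 50 deg/s threshold are assumptions for this sketch.
def classify_ivt(xs, ys, hz=120, threshold=50.0):
    """Label each inter-sample interval as 'saccade' or 'fixation'.

    xs, ys: gaze coordinates in degrees of visual angle.
    hz: sampling rate; threshold: velocity cutoff in degrees/second.
    """
    labels = []
    for i in range(1, len(xs)):
        dx, dy = xs[i] - xs[i - 1], ys[i] - ys[i - 1]
        velocity = (dx * dx + dy * dy) ** 0.5 * hz  # degrees per second
        labels.append("saccade" if velocity > threshold else "fixation")
    return labels

# A steady fixation followed by one large jump to a new location:
print(classify_ivt([0.0, 0.05, 0.1, 5.0, 5.02], [0.0] * 5))
# → ['fixation', 'fixation', 'saccade', 'fixation']
```

Runs of consecutive "fixation" labels are then merged into single fixations, from which scan paths, saccadic amplitudes, and durations can be derived.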
At a finer level, studies have shown that the distance the eye travels between points of fixation (saccadic amplitude) and the time it takes to move from one fixation to the next (saccadic duration) are both often atypical in autistic people. These precise differences may allow clinicians to identify autistic profiles in gaze patterns (Setu, 2025).
How Are Gaze Patterns Linked to Autism?
Eye gaze patterns have been linked to core characteristics of autism such as intense preferred interests, repetitive behaviors, and differences in social behavior. Here’s a quick look at what research tells us.
Eye Gaze and Social Attention
Numerous eye-tracking studies have shown that autistic individuals tend to pay less attention to social elements in videos and more attention to nonsocial elements (Nayar et al., 2022). Similarly, some eye-tracking studies have concluded that the more time autistic people spend looking at faces, the better their social functioning tends to be. It’s important to point out that many of these studies included mostly male, mostly white participants, and that some studies did not find a strong connection between face-looking and social ability (Riddiford et al., 2022).
Eye Gaze and Repetitive Movement
In another study, a researcher explored whether potentially autistic children would pay more attention to repetitive, predictable movements than to random movements. The study found that potentially autistic children had a “preference for predictable movements over random movements, particularly during the second half of the stimulus presentation” (Omori, 2025).
An earlier study reached a similar conclusion. Shown videos of random movements and repetitive movements, most autistic children preferred to look at the repetitive movements, and that held true whether the moving object was biological or geometric. One important note: Researchers urged caution “when using visual preferences to different types of movements to screen children with ASD due to the high heterogeneity” among autistic children (Li et al., 2021).
Eye Gaze and Special Interests
Fewer studies have focused on what eye gaze patterns tell us about another core trait of autism: intense preferred interests. A 2023 study used eye-movement data to measure toddlers’ interest in different groups of objects (Sun et al., 2023). The study involved images of 24 objects, half of which were “non-social objects such as means of transportation and electrical appliances, which are likely to appeal to individuals with ‘autistic’ interests.” The other half of the images were considered neutral interest objects such as hats and balloons.
Children viewed the objects, randomly arranged, for a period of 10 seconds each. Some children’s eye movements showed much more interest in the nonsocial objects. Those children were found to have higher total scores on the Autism Diagnostic Observation Schedule, Second Edition (ADOS®-2), an assessment the researchers said is “well recognized as a gold standard diagnostic tool for ASD.” These children also had higher scores on the ADOS-2 subscale that measures restricted and repetitive behaviors (Sun et al., 2023).
This study was not large enough to reliably evaluate the role gender may have played in the results. Researchers noted, “…restricted interests in female patients with ASD are less frequent and more difficult to observe compared with male subjects” (Sun et al., 2023).
Learn more about special interests in autistic girls and women.
What’s the Role of Machine Learning in Identifying Autism?
Machine learning is a kind of artificial intelligence. It enables systems to learn from large sets of data, finding subtle patterns and “learning” as it gains information and experience. Machine learning models have been used to visualize eye-movement data from many people and identify gaze patterns linked to autism (Al-Adhaileh et al., 2025).
In recent studies, researchers have trained several advanced deep learning models with publicly available eye-movement datasets. In one study, eye-movement data from 547 individuals was converted into visual images, which were then cleaned and converted into pixel values. These values are compatible with deep learning algorithms trained to “recognize” the characteristics of an image.
Researchers trained the models on one third of the data, validated the results on another third, and tested generalization on the remaining third. Some of the models improved the accuracy of autism identification significantly (Alsharif et al., 2024). Future studies with larger samples may enable AI models to identify autism in people whose traits are more nuanced or present differently.
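The three-way split described above can be sketched as follows; the dataset here is a placeholder list standing in for the 547 participants’ image-converted records, and the shuffling and split proportions are the standard pattern rather than the study’s exact code:

```python
# Sketch of a train / validation / test split in thirds, as described in
# the study: shuffle once with a fixed seed, then slice. The "samples" are
# placeholder integers standing in for per-participant image data.
import random

def three_way_split(samples, seed=0):
    shuffled = samples[:]                     # copy so the input is untouched
    random.Random(seed).shuffle(shuffled)     # reproducible shuffle
    n = len(shuffled)
    a, b = n // 3, 2 * n // 3
    return shuffled[:a], shuffled[a:b], shuffled[b:]

train, val, test = three_way_split(list(range(547)))  # 547 participants
print(len(train), len(val), len(test))  # → 182 182 183
```

Keeping the test third completely unseen during training and validation is what allows the reported accuracy to be read as generalization rather than memorization.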
Can Eye-Tracking Devices Replace Comprehensive Autism Evaluations?
No. While eye-movement tracking devices may become an important tool for clinicians and practitioners, they are one tool among many. Comprehensive autism evaluations conducted by multidisciplinary teams are necessary to understand which interventions and supports will build on a child’s unique strengths and meet their individual needs. Read more about whole child evaluation here.
What Do These Breakthroughs Mean for Practitioners?
Researchers say eye-movement tracking systems could benefit practitioners and families in several ways:
- Eye-tracking systems have been effective in children as young as 16 months. Earlier identification may mean children can access services and interventions during peak periods of neuroplasticity.
- In areas where waiting lists are long and practitioners are scarce or overburdened, this technology may help patients access early intervention services sooner.
- The technology may reduce the amount of time it takes to screen for autism and thereby lighten some of the workload for practitioners.
- Many experts consider eye movements to be objective biomarkers of autism.
- Practitioners may see an increase in the number of referrals for comprehensive autism evaluations for younger children. When interventions can begin at an earlier age, outcomes for children are often better (Al-Adhaileh et al., 2025).
Potential Risks
Some risks and downsides do exist. One potential downside is the cost of the equipment, especially for solo clinical practitioners or those in school-based settings where budgets are often constrained.
The use of artificial intelligence presents its own risks, including the potential for bias in training the models, and privacy and security concerns whenever data is processed over the internet. Some experts have also wondered whether increased reliance on technology for clinical decision-making could cause clinical reasoning and assessment skills to degrade over time.
Complexities
Perhaps the most common concern is the possibility of misdiagnosis or missed diagnoses in complex cases. Some of the conditions that often co-occur with autism can also affect gaze patterns. Autism and developmental language disorder (DLD) both feature atypical eye movements, for example. Researchers say eye-movement tracking technologies are better at differentiating typically developing individuals from those who have either condition. The results are less reliable when it comes to distinguishing autistic gaze patterns from DLD patterns. For that reason, researchers say the technology “is more effective as a screening tool than for differential diagnosis” (Antolí et al., 2025).
More research also needs to be done to understand the way factors like gender may affect eye-movement patterns in autistic youth. Most eye-movement tracking research has been conducted on male participants, yet some studies suggest there are subtle differences in eye-movement patterns in girls and boys with characteristics of autism (Zhang et al., 2025).
All of which brings us to the need for contextual human interpretation.
How Do We Build an Integrated Approach to Autism Evaluation?
Professional guidance is clear: no diagnosis should be based on a single instrument, whether that instrument is an AI-based tool or a validated assessment administered by a clinical professional. This is particularly true of autism, where even core traits can present in highly individual ways, and where co-occurring disorders, mental health conditions, and trauma histories can all affect eye-movement patterns in similar ways (Unruh et al., 2020; Lazarov et al., 2021).
Whole Child Approach
After a positive screening for autism, a comprehensive evaluation, conducted by a multidisciplinary team equipped with a range of diagnostic tools, can fill in the gaps, differentiate co-occurring conditions, and complete the picture of a child’s autistic traits, strengths, interests, and needs.
Depending on the child in question, you may want to include assessments that evaluate:
- autistic traits
- adaptive behavior
- learning disabilities
- language skills
- auditory processing
- sensory processing
- executive function
- mental health
Each of those assessment areas can be affected by autism, and autism can also affect the symptoms of those conditions. To understand daily functioning and design effective supports, you may also want to review school records and interview those closest to the child.
Learn more about how to build cultural competence in assessment.
In an article published in Brain Sciences, authors encouraged practitioners to consider technology not just as a potential diagnostic tool, but as a context in which people increasingly live their lives. Communicating through social media, extending real-world relationships into online environments, gaming, and interacting with conversational AI are all settings in which adaptive behavior, communication skills, and autistic traits are on display. That makes them rich environments to explore in an autism evaluation (Shen & Yu, 2025).
Key Messages
Eye-movement tracking devices, augmented with artificial intelligence, may lead to earlier intervention and better outcomes for young people. Good outcomes will be more likely when practitioners integrate tech with validated assessments, sound and sensitive clinical judgment, and evidence-based practice.
Autism has inspired innovation for decades. As researchers develop technologies to aid in faster, more accurate autism evaluations—and as practitioners discover their benefits and limits in real life settings—the importance of balancing innovation, ethics, and clinical reasoning will become easier to see.

ProLearn® Live Webinar
Artificial Intelligence in Psychological Practice: Ethical Integration and Applications
Discover how to ethically and effectively integrate AI into psychological practice—boosting efficiency, enhancing services, and navigating risks with confidence in a rapidly evolving field.
Register Today >

WPS Shop
Autism Diagnostic Observation Schedule, Second Edition (ADOS®-2)
A semi-structured, standardized assessment of communication, social interaction, play, and restricted and repetitive behaviors in individuals referred for possible autism.
Shop Now >

WPS Shop
Autism Diagnostic Interview–Revised (ADI®-R)
A structured, caregiver-based interview designed to assess social interaction, communication, and repetitive behaviors in individuals suspected of having autism spectrum disorder in order to support diagnosis and guide treatment planning.
Shop Now >
FAQs:
1. How reliable are AI-based eye-tracking tools for identifying autism?
Reported sensitivity and specificity are generally high, though they vary from system to system. It may be best to view AI-based eye-tracking tools as screeners rather than diagnostic instruments.
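Sensitivity is the share of autistic children a screener correctly flags; specificity is the share of non-autistic children it correctly clears. A small illustration with made-up counts, to show how the two figures are computed:

```python
# Sensitivity and specificity from screening counts.
# All numbers below are hypothetical, for illustration only.
def sensitivity(tp, fn):
    """True positives caught among all actual positives."""
    return tp / (tp + fn)

def specificity(tn, fp):
    """True negatives among all actual negatives."""
    return tn / (tn + fp)

# Hypothetical screen: 90 of 100 autistic children flagged,
# 160 of 200 non-autistic children correctly cleared.
print(sensitivity(tp=90, fn=10))   # → 0.9
print(specificity(tn=160, fp=40))  # → 0.8
```

Note that even with high values, a screener applied to a population where autism is uncommon will still produce false positives, which is one more reason screening results should feed into a comprehensive evaluation rather than stand alone.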
2. Can AI eye-tracking tools replace trusted autism assessments like the ADOS-2 or ADI-R?
No. AI-based eye-tracking systems can streamline referrals and identify autism earlier, but validated assessments are essential for differential diagnosis and effective, individualized intervention plans.
3. How can clinicians integrate AI screening results into team discussions?
View the data from eye-movement tracking systems as one source of information in a comprehensive evaluation. It’s important to gather information from caregivers, families, teachers, and health professionals. Team discussions can bring together assessment scores, caregiver expertise, and observations so all the child’s needs are addressed.
4. What training do you need to interpret AI-generated autism screening results?
Data from eye-movement tools can usually be gathered by anyone who has been trained to use the device in a clinical setting. Interpreting the data is a different matter. Interpreting reports and results should be carried out by a qualified clinical or educational professional trained to identify autism.
Research and Resources:
Al-Adhaileh, M. H., Alsubari, S. N. M., Al-Nefaie, A. H., Ahmad, S., & Alhamadi, A. A. (2025). Diagnosing autism spectrum disorder based on eye tracking technology using deep learning models. Frontiers in Medicine, 12, 1690177. https://doi.org/10.3389/fmed.2025.1690177
Alsharif, N., Al-Adhaileh, M. H., Al-Yaari, M., Farhah, N., & Khan, Z. I. (2024). Utilizing deep learning models in an intelligent eye-tracking system for autism spectrum disorder diagnosis. Frontiers in Medicine, 11, 1436646. https://doi.org/10.3389/fmed.2024.1436646
Antolí, A., Rodriguez-Lozano, F. J., Juan Cañas, J., Vacas, J., Cuadrado, F., Sánchez-Raya, A., Pérez-Dueñas, C., & Gámez-Granados, J. C. (2025). Using explainable machine learning and eye-tracking for diagnosing autism spectrum and developmental language disorders in social attention tasks. Frontiers in Neuroscience, 19, 1558621. https://doi.org/10.3389/fnins.2025.1558621
Jeyarani, R. A., & Senthilkumar, R. (2023). Eye tracking biomarkers for autism spectrum disorder detection using machine learning and deep learning techniques: Review. Research in Autism Spectrum Disorders, 10, 102228. https://doi.org/10.1016/j.rasd.2023.102228
Lazarov, A., Suarez-Jimenez, B., Zhu, X., Pine, D. S., Bar-Haim, Y., & Neria, Y. (2021). Attention allocation in posttraumatic stress disorder: an eye-tracking study. Psychological Medicine, 1–10. Advance online publication. https://doi.org/10.1017/S0033291721000581
Li, T., Li, Y., Hu, Y., Wang, Y., Lam, C. M., Ni, W., Wang, X., & Yi, L. (2021). Heterogeneity of visual preferences for biological and repetitive movements in children with autism spectrum disorder. Autism Research, 14(1), 102–111. https://doi.org/10.1002/aur.2366
Meng, F., Li, F., Wu, S., Yang, T., Xiao, Z., Zhang, Y., Liu, Z., Lu, J., & Luo, X. (2023). Machine learning-based early diagnosis of autism according to eye movements of real and artificial faces scanning. Frontiers in Neuroscience, 17, 1170951. https://doi.org/10.3389/fnins.2023.1170951
Nayar, K., Shic, F., Winston, M., & Losh, M. (2022). A constellation of eye-tracking measures reveals social attention differences in ASD and the broad autism phenotype. Molecular Autism, 13(1), 18. https://doi.org/10.1186/s13229-022-00490-w
Omori M. (2025). Increased observation of predictable visual stimuli in children with potential autism spectrum disorder. Scientific Reports, 15(1), 4572. https://doi.org/10.1038/s41598-025-89171-1
Riddiford, J. A., Enticott, P. G., Lavale, A., & Gurvich, C. (2022). Gaze and social functioning associations in autism spectrum disorder: A systematic review and meta-analysis. Autism Research, 15(8), 1380–1446. https://doi.org/10.1002/aur.2729
Setu, D. M. (2025). An analytics-driven model for identifying autism spectrum disorder using eye tracking. Health Analytics, 8, 100409. https://doi.org/10.1016/j.health.2025.100409
Shen, Z., & Yu, C. L. (2025). How technology advances research and practice in autism spectrum disorder: A narrative review on early detection, subtype stratification, and intervention. Brain Sciences, 15(8), 890. https://doi.org/10.3390/brainsci15080890
Sun, B., Wang, B., Wei, Z., Feng, Z., Wu, Z. L., Yassin, W., Stone, W. S., Lin, Y., & Kong, X. J. (2023). Identification of diagnostic markers for ASD: a restrictive interest analysis based on EEG combined with eye tracking. Frontiers in Neuroscience, 17, 1236637. https://doi.org/10.3389/fnins.2023.1236637
Unruh, K. E., Bodfish, J. W., & Gotham, K. O. (2020). Adults with autism and adults with depression show similar attentional biases to social-affective images. Journal of Autism and Developmental Disorders, 50(7), 2336–2347. https://doi.org/10.1007/s10803-018-3627-5
Zhang, L., Guan, X., Xue, H., Liu, X., Zhang, B., Liu, S., & Ming, D. (2025). Sex-specific patterns in social visual attention among individuals with autistic traits. BMC Psychiatry, 25(1), 440. https://doi.org/10.1186/s12888-025-06896-z