Andrews, T. J. & Coppola, D. M. Idiosyncratic characteristics of saccadic eye movements when viewing different visual environments. Vision Res. 39, 2947–2953 (1999).
Henderson, J. M. Human gaze control during real-world scene perception. Trends Cogn. Sci. 7, 498–504 (2003).
Otero-Millan, J., Macknik, S. L., Langston, R. E. & Martinez-Conde, S. An oculomotor continuum from exploration to fixation. Proc. Natl Acad. Sci. USA 110, 6175–6180 (2013).
Wolfe, J. M., Alvarez, G. A., Rosenholtz, R., Kuzmova, Y. I. & Sherman, A. M. Visual search for arbitrary objects in real scenes. Atten. Percept. Psychophys. 73, 1650–1671 (2011).
Constantino, J. N. et al. Infant viewing of social scenes is under genetic control and is atypical in autism. Nature 547, 340–344 (2017).
Hoffman, J. E. in Attention (ed. Pashler, H.) Ch. 3 (Psychology Press, 1998).
Rahal, R.-M. & Fiedler, S. Understanding cognitive and affective mechanisms in social psychology through eye-tracking. J. Exp. Soc. Psychol. 85, 103842 (2019).
Orquin, J. L. & Mueller Loose, S. Attention and choice: a review on eye movements in decision making. Acta Psychol. 144, 190–206 (2013).
Hannula, D. E. et al. Worth a glance: using eye movements to investigate the cognitive neuroscience of memory. Front. Hum. Neurosci. 4, 166 (2010).
Coiner, B. et al. Functional neuroanatomy of the human eye movement network: a review and atlas. Brain Struct. Funct. 224, 2603–2617 (2019).
Desimone, R. & Duncan, J. Neural mechanisms of selective visual attention. Annu. Rev. Neurosci. 18, 193–222 (1995).
Kim, N. Y. & Kastner, S. A biased competition theory for the developmental cognitive neuroscience of visuo-spatial attention. Curr. Opin. Psychol. 29, 219–228 (2019).
Itti, L. & Koch, C. Computational modelling of visual attention. Nat. Rev. Neurosci. 2, 194–203 (2001).
Wang, S. et al. Atypical visual saliency in autism spectrum disorder quantified through model-based eye tracking. Neuron 88, 604–616 (2015).
Haskins, A. J. et al. Reduced social attention in autism is magnified by perceptual load in naturalistic environments. Autism Res. 15, 2310–2323 (2022).
Nayar, K., Shic, F., Winston, M. & Losh, M. A constellation of eye-tracking measures reveals social attention differences in ASD and the broad autism phenotype. Mol. Autism 13, 18 (2022).
Yiend, J. & Mathews, A. Anxiety and attention to threatening pictures. Q. J. Exp. Psychol. A 54, 665–681 (2001).
MacLeod, C. & Mathews, A. Anxiety and the allocation of attention to threat. Q. J. Exp. Psychol. A 40, 653–670 (1988).
Türkan, B. N., Amado, S., Ercan, E. S. & Perçinel, I. Comparison of change detection performance and visual search patterns among children with/without ADHD: evidence from eye movements. Res. Dev. Disabil. 49–50, 205–215 (2016).
Caldani, S. et al. The effect of dual task on attentional performance in children with ADHD. Front. Integr. Neurosci. 12, 67 (2019).
Bischof, W. F., Anderson, N. C. & Kingstone, A. in Eye Movement Research (eds Klein, C. & Ettinger, U.) 407–448 (Springer, 2019); https://doi.org/10.1007/978-3-030-20085-5_10
Oberwelland, E. et al. Look into my eyes: investigating joint attention using interactive eye-tracking and fMRI in a developmental sample. NeuroImage 130, 248–260 (2016).
Griffin, J. W. et al. Spatiotemporal eye movement dynamics reveal altered face prioritization in early visual processing among autistic children. Biol. Psychiatry Cogn. Neurosci. Neuroimaging 10, 45–57 (2025).
Veale, R., Hafed, Z. M. & Yoshida, M. How is visual salience computed in the brain? Insights from behaviour, neurobiology and modelling. Phil. Trans. R. Soc. B 372, 20160113 (2017).
Koch, C. & Ullman, S. in Matters of Intelligence: Conceptual Structures in Cognitive Neuroscience (ed. Vaina, L. M.) 115–141 (Springer, 1987); https://doi.org/10.1007/978-94-009-3833-5_5
Treisman, A. M. & Gelade, G. A feature-integration theory of attention. Cogn. Psychol. 12, 97–136 (1980).
Itti, L., Koch, C. & Niebur, E. A model of saliency-based visual attention for rapid scene analysis. IEEE Trans. Pattern Anal. Mach. Intell. 20, 1254–1259 (1998).
Judd, T., Ehinger, K., Durand, F. & Torralba, A. Learning to predict where humans look. In Proc. 2009 IEEE 12th International Conference on Computer Vision 2106–2113 (IEEE, 2009); https://doi.org/10.1109/ICCV.2009.5459462
Harel, J., Koch, C. & Perona, P. Graph-based visual saliency. In Proc. 19th International Conference on Neural Information Processing Systems (NIPS’06) 545–552 (MIT Press, 2006).
Hou, X. & Zhang, L. Saliency detection: a spectral residual approach. In Proc. 2007 IEEE Conference on Computer Vision and Pattern Recognition 1–8 (IEEE, 2007); https://doi.org/10.1109/CVPR.2007.383267
Zhang, L., Tong, M. H., Marks, T. K., Shan, H. & Cottrell, G. W. SUN: a Bayesian framework for saliency using natural statistics. J. Vis. 8, 32 (2008).
Linardos, A., Kümmerer, M., Press, O. & Bethge, M. DeepGaze IIE: calibrated prediction in and out-of-domain for state-of-the-art saliency modeling. In Proc. 2021 IEEE/CVF International Conference on Computer Vision (ICCV) 12899–12908 (IEEE, 2021); https://doi.org/10.1109/ICCV48922.2021.01268
Huang, X., Shen, C., Boix, X. & Zhao, Q. SALICON: reducing the semantic gap in saliency prediction by adapting deep neural networks. In Proc. 2015 IEEE International Conference on Computer Vision (ICCV) 262–270 (IEEE, 2015); https://doi.org/10.1109/ICCV.2015.38
Kruthiventi, S. S. S., Ayush, K. & Babu, R. V. DeepFix: a fully convolutional neural network for predicting human eye fixations. IEEE Trans. Image Process. 26, 4446–4456 (2017).
Kümmerer, M. et al. MIT/Tübingen Saliency Benchmark https://saliency.tuebingen.ai/
Jain, S. et al. ViNet: pushing the limits of visual modality for audio-visual saliency prediction. In Proc. 2021 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS) 3520–3527 (IEEE, 2021); https://doi.org/10.1109/IROS51168.2021.9635989
Tavakoli, H. R., Borji, A., Rahtu, E. & Kannala, J. DAVE: a Deep Audio-Visual Embedding for dynamic saliency prediction. Preprint at https://doi.org/10.48550/arXiv.1905.10693 (2020).
Kümmerer, M., Bethge, M. & Wallis, T. S. A. DeepGaze III: modeling free-viewing human scanpaths with deep learning. J. Vis. 22, 7 (2022).
Roth, N., Rolfs, M., Hellwich, O. & Obermayer, K. Objects guide human gaze behavior in dynamic real-world scenes. PLoS Comput. Biol. 19, e1011512 (2023).
Mengers, V., Roth, N., Brock, O., Obermayer, K. & Rolfs, M. A robotics-inspired scanpath model reveals the importance of uncertainty and semantic object cues for gaze guidance in dynamic scenes. J. Vis. 25, 6 (2025).
Zanca, D., Melacci, S. & Gori, M. Gravitational laws of focus of attention. IEEE Trans. Pattern Anal. Mach. Intell. 42, 2983–2995 (2020).
Kümmerer, M. & Bethge, M. State-of-the-art in human scanpath prediction. Preprint at https://doi.org/10.48550/arXiv.2102.12239 (2021).
Judd, T., Durand, F. & Torralba, A. A Benchmark of Computational Models of Saliency to Predict Human Fixations. Report no. MIT-CSAIL-TR-2012-001 (MIT, 2012); https://dspace.mit.edu/handle/1721.1/68590
Borji, A. & Itti, L. CAT2000: a large scale fixation dataset for boosting saliency research. Preprint at https://doi.org/10.48550/arXiv.1505.03581 (2015).
Strauch, C. et al. Saliency models perform best for women’s and young adults’ fixations. Commun. Psychol. 1, 34 (2023).
Adámek, P. et al. The gaze of schizophrenia patients captured by bottom-up saliency. Schizophrenia 10, 21 (2024).
Bast, N. et al. Sensory salience processing moderates attenuated gazes on faces in autism spectrum disorder: a case-control study. Mol. Autism 14, 5 (2023).
Dziemian, S. et al. Saliency models reveal reduced top-down attention in attention-deficit/hyperactivity disorder: a naturalistic eye-tracking study. JAACAP Open 3, 192–204 (2024).
de Haas, B., Iakovidis, A. L., Schwarzkopf, D. S. & Gegenfurtner, K. R. Individual differences in visual salience vary along semantic dimensions. Proc. Natl Acad. Sci. USA 116, 11687–11692 (2019).
Broda, M. D. & de Haas, B. Individual differences in human gaze behavior generalize from faces to objects. Proc. Natl Acad. Sci. USA 121, e2322149121 (2024).
Falck-Ytter, T. The breakdown of social looking. Neurosci. Biobehav. Rev. 161, 105689 (2024).
Broda, M. D. & de Haas, B. Individual differences in looking at persons in scenes. J. Vis. 22, 9 (2022).
Wegner-Clemens, K., Rennig, J., Magnotti, J. F. & Beauchamp, M. S. Using principal component analysis to characterize eye movement fixation patterns during face viewing. J. Vis. 19, 2 (2019).
Liu, W., Li, M. & Yi, L. Identifying children with autism spectrum disorder based on their face processing abnormality: a machine learning framework. Autism Res. 9, 888–898 (2016).
Hessels, R. S., Kemner, C., van den Boomen, C. & Hooge, I. T. C. The area-of-interest problem in eyetracking research: a noise-robust solution for face and sparse stimuli. Behav. Res. Methods 48, 1694–1712 (2016).
Masulli, P. et al. Data-driven analysis of gaze patterns in face perception: methodological and clinical contributions. Cortex 147, 9–23 (2022).
Klötzl, D. et al. NMF-based analysis of mobile eye-tracking data. In Proc. 2024 Symposium on Eye Tracking Research and Applications (ETRA ’24) Article 76, 1–9 (ACM, 2024); https://doi.org/10.1145/3649902.3653518
Hsiao, J. H., Lan, H., Zheng, Y. & Chan, A. B. Eye movement analysis with hidden Markov models (EMHMM) with co-clustering. Behav. Res. Methods 53, 2473–2486 (2021).
Hsiao, J. H., An, J., Hui, V. K. S., Zheng, Y. & Chan, A. B. Understanding the role of eye movement consistency in face recognition and autism through integrating deep neural networks and hidden Markov models. npj Sci. Learn. 7, 28 (2022).
Hsiao, J. H. Understanding human cognition through computational modeling. Top. Cogn. Sci. 16, 349–376 (2024).
Chuk, T., Chan, A. B. & Hsiao, J. H. Understanding eye movements in face recognition using hidden Markov models. J. Vis. 14, 8 (2014).
Coviello, E., Chan, A. B. & Lanckriet, G. R. G. Clustering hidden Markov models with variational HEM. J. Mach. Learn. Res. 15, 697–747 (2014).
Chan, C. Y. H., Chan, A. B., Lee, T. M. C. & Hsiao, J. H. Eye-movement patterns in face recognition are associated with cognitive decline in older adults. Psychon. Bull. Rev. 25, 2200–2207 (2018).
Karvelis, P., Paulus, M. P. & Diaconescu, A. O. Individual differences in computational psychiatry: a review of current challenges. Neurosci. Biobehav. Rev. 148, 105137 (2023).
Hauser, T. U., Skvortsova, V., De Choudhury, M. & Koutsouleris, N. The promise of a model-based psychiatry: building computational models of mental ill health. Lancet Digit. Health 4, e816–e828 (2022).
Washington, P. et al. Data-driven diagnostics and the potential of mobile artificial intelligence for digital therapeutic phenotyping in computational psychiatry. Biol. Psychiatry Cogn. Neurosci. Neuroimaging 5, 759–769 (2020).
Jones, W. et al. Eye-tracking-based measurement of social visual engagement compared with expert clinical diagnosis of autism. JAMA 330, 854–865 (2023).
Jones, W. et al. Development and replication of objective measurements of social visual engagement to aid in early diagnosis and assessment of autism. JAMA Netw. Open 6, e2330145 (2023).
Perochon, S. et al. Early detection of autism using digital behavioral phenotyping. Nat. Med. 29, 2489–2497 (2023).
Nazari, S. et al. Large-scale examination of early-age sex differences in neurotypical toddlers and those with autism spectrum disorder or other developmental conditions. Nat. Hum. Behav. 9, 1697–1709 (2025).
O’Driscoll, G. A. & Callahan, B. L. Smooth pursuit in schizophrenia: a meta-analytic review of research since 1993. Brain Cogn. 68, 359–370 (2008).
Athanasopoulos, F., Saprikis, O.-V., Margeli, M., Klein, C. & Smyrnis, N. Towards clinically relevant oculomotor biomarkers in early schizophrenia. Front. Behav. Neurosci. 15, 688683 (2021).
Shishido, E. et al. Application of eye trackers for understanding mental disorders: cases for schizophrenia and autism spectrum disorder. Neuropsychopharmacol. Rep. 39, 72–77 (2019).
Alcañiz, M. et al. Eye gaze as a biomarker in the recognition of autism spectrum disorder using virtual reality and machine learning: a proof of concept for diagnosis. Autism Res. 15, 131–145 (2022).
Benson, P. J. et al. Simple viewing tests can detect eye movement abnormalities that distinguish schizophrenia cases from controls with exceptional accuracy. Biol. Psychiatry 72, 716–724 (2012).
Lee, D. Y. et al. Use of eye tracking to improve the identification of attention-deficit/hyperactivity disorder in children. Sci. Rep. 13, 14469 (2023).
Li, J. et al. Classifying ASD children with LSTM based on raw videos. Neurocomputing 390, 226–238 (2020).
Zheng, Z. et al. Diagnosing and tracking depression based on eye movement in response to virtual reality. Front. Psychiatry 15, 1280935 (2024).
Kim, M. et al. Development of an eye-tracking system based on a deep learning model to assess executive function in patients with mental illnesses. Sci. Rep. 14, 18186 (2024).
Kacur, J., Polec, J., Smolejova, E. & Heretik, A. An analysis of eye-tracking features and modelling methods for free-viewed standard stimulus: application for schizophrenia detection. IEEE J. Biomed. Health Inform. 24, 3055–3065 (2020).
Wei, Q., Cao, H., Shi, Y., Xu, X. & Li, T. Machine learning based on eye-tracking data to identify Autism Spectrum Disorder: a systematic review and meta-analysis. J. Biomed. Inform. 137, 104254 (2023).
Mendez-Encinas, D., Sujar, A., Bayona, S. & Delgado-Gomez, D. Attention and impulsivity assessment using virtual reality games. Sci. Rep. 13, 13689 (2023).
Wiebe, A. et al. Virtual reality-assisted prediction of adult ADHD based on eye tracking, EEG, actigraphy and behavioral indices: a machine learning analysis of independent training and test samples. Transl. Psychiatry 14, 508 (2024).
Revers, M. C. et al. Classification of autism spectrum disorder severity using eye tracking data based on visual attention model. In Proc. 2021 IEEE 34th International Symposium on Computer-Based Medical Systems (CBMS) 142–147 (IEEE, 2021); https://doi.org/10.1109/CBMS52027.2021.00062
Tseng, P.-H. et al. High-throughput classification of clinical populations from natural viewing eye movements. J. Neurol. 260, 275–284 (2013).
Coutrot, A., Hsiao, J. H. & Chan, A. B. Scanpath modeling and classification with hidden Markov models. Behav. Res. Methods 50, 362–379 (2018).
Lundberg, S. M. & Lee, S.-I. A unified approach to interpreting model predictions. In Proc. 31st International Conference on Neural Information Processing Systems 4768–4777 (Curran Associates Inc., 2017).
Ribeiro, M. T., Singh, S. & Guestrin, C. ‘Why should I trust you?’: explaining the predictions of any classifier. In Proc. 22nd ACM SIGKDD International Conference on Knowledge Discovery and Data Mining 1135–1144 (ACM, 2016); https://doi.org/10.1145/2939672.2939778
Fisher, R. A. The use of multiple measurements in taxonomic problems. Ann. Eugenics 7, 179–188 (1936).
Shannon, C. E. A mathematical theory of communication. Bell Syst. Tech. J. 27, 379–423 (1948).
Guyon, I., Weston, J., Barnhill, S. & Vapnik, V. Gene selection for cancer classification using support vector machines. Mach. Learn. 46, 389–422 (2002).
Peng, H., Long, F. & Ding, C. Feature selection based on mutual information criteria of max-dependency, max-relevance, and min-redundancy. IEEE Trans. Pattern Anal. Mach. Intell. 27, 1226–1238 (2005).
Kanhirakadavath, M. R. & Chandran, M. S. M. Investigation of eye-tracking scan path as a biomarker for autism screening using machine learning algorithms. Diagnostics (Basel) 12, 518 (2022).
Jaradat, A. S., Wedyan, M., Alomari, S. & Barhoush, M. M. Using machine learning to diagnose autism based on eye tracking technology. Diagnostics (Basel) 15, 66 (2025).
Deng, S. et al. in Machine Learning and Knowledge Discovery in Databases Vol. 13718 (eds Amini, M.-R. et al.) 403–418 (Springer Nature, 2023).
Song, Y. et al. EMS: a large-scale eye movement dataset, benchmark, and new model for schizophrenia recognition. IEEE Trans. Neural Netw. Learn. Syst. https://doi.org/10.1109/TNNLS.2024.3441928 (2024).
Dalrymple, K. A., Jiang, M., Zhao, Q. & Elison, J. T. Machine learning accurately classifies age of toddlers based on eye tracking. Sci. Rep. 9, 6255 (2019).
Jiang, M. & Zhao, Q. Learning visual attention to identify people with autism spectrum disorder. In Proc. 2017 IEEE International Conference on Computer Vision (ICCV) 3287–3296 (IEEE, 2017); https://doi.org/10.1109/ICCV.2017.354
Ahmed, Z. A. T. et al. Applying eye tracking with deep learning techniques for early-stage detection of autism spectrum disorders. Data 8, 168 (2023).
Tao, Y. & Shyu, M.-L. SP-ASDNet: CNN-LSTM based ASD classification model using observer ScanPaths. In Proc. 2019 IEEE International Conference on Multimedia & Expo Workshops (ICMEW) 641–646 (IEEE, 2019); https://doi.org/10.1109/ICMEW.2019.00124
Cheekaty, S. & Muneeswari, G. Enhanced multilevel autism classification for children using eye-tracking and hybrid CNN-RNN deep learning models. Neural Comput. Appl. 37, 27631–27654 (2024).
Elmadjian, C., Gonzales, C., da Costa, R. L. & Morimoto, C. H. Online eye-movement classification with temporal convolutional networks. Behav. Res. Methods 55, 3602–3620 (2023).
Wang, Q. et al. Interactive eye tracking for gaze strategy modification. In Proc. 14th International Conference on Interaction Design and Children 247–250 (ACM, 2015); https://doi.org/10.1145/2771839.2771888
Keles, U. et al. Atypical gaze patterns in autistic adults are heterogeneous across but reliable within individuals. Mol. Autism 13, 39 (2022).
Campbell, D. J., Shic, F., Macari, S. & Chawarska, K. Gaze response to dyadic bids at 2 years related to outcomes at 3 years in autism spectrum disorders: a subtyping analysis. J. Autism Dev. Disord. 44, 431–442 (2014).
Zangrossi, A., Cona, G., Celli, M., Zorzi, M. & Corbetta, M. Visual exploration dynamics are low-dimensional and driven by intrinsic factors. Commun. Biol. 4, 1100 (2021).
Elbattah, M., Carette, R., Dequen, G., Guérin, J.-L. & Cilia, F. Learning clusters in autism spectrum disorder: image-based clustering of eye-tracking scanpaths with deep autoencoder. In Proc. 2019 41st Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC) 1417–1420 (IEEE, 2019); https://doi.org/10.1109/EMBC.2019.8856904
Chen, X., Jiang, M. & Zhao, Q. Beyond average: individualized visual scanpath prediction. In Proc. 2024 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR) 25420–25431 (IEEE, 2024); https://doi.org/10.1109/CVPR52733.2024.02402
Dorr, M., Martinetz, T., Gegenfurtner, K. R. & Barth, E. Variability of eye movements when viewing dynamic natural scenes. J. Vis. 10, 28 (2010).
Shepherd, S. V., Steckenfinger, S. A., Hasson, U. & Ghazanfar, A. A. Human-monkey gaze correlations reveal convergent and divergent patterns of movie viewing. Curr. Biol. 20, 649–656 (2010).
Kennedy, D. P. et al. Genetic influence on eye movements to complex scenes at short timescales. Curr. Biol. 27, 3554–3560.e3 (2017).
Avni, I. et al. Children with autism observe social interactions in an idiosyncratic manner. Autism Res. 13, 935–946 (2020).
Madsen, J., Júlio, S. U., Gucik, J., Steinberg, R. & Parra, L. C. Synchronized eye movements predict test scores in online video education. Proc. Natl Acad. Sci. USA 118, e2016980118 (2021).
Gu, C., Peng, Y., Nastase, S. A., Mayer, R. E. & Li, P. Onscreen presence of instructors in video lectures affects learners’ neural synchrony and visual attention during multimedia learning. Proc. Natl Acad. Sci. USA 121, e2309054121 (2024).
Hou, W., Cheng, R., Zhao, Z., Liao, H. & Li, J. Atypical and variable attention patterns reveal reduced contextual priors in children with autism spectrum disorder. Autism Res. 17, 1572–1585 (2024).
Hedger, N. & Chakrabarti, B. Autistic differences in the temporal dynamics of social attention. Autism 25, 1615–1626 (2021).
Cristino, F., Mathôt, S., Theeuwes, J. & Gilchrist, I. D. ScanMatch: a novel method for comparing fixation sequences. Behav. Res. Methods 42, 692–700 (2010).
Dewhurst, R. et al. It depends on how you look at it: scanpath comparison in multiple dimensions with MultiMatch, a vector-based approach. Behav. Res. Methods 44, 1079–1100 (2012).
Kriegeskorte, N., Mur, M. & Bandettini, P. A. Representational similarity analysis – connecting the branches of systems neuroscience. Front. Syst. Neurosci. 2, 4 (2008).
Nakano, T. et al. Atypical gaze patterns in children and adults with autism spectrum disorders dissociated from developmental changes in gaze behaviour. Proc. R. Soc. B 277, 2935–2943 (2010).
Segal, A. et al. Embracing variability in the search for biological mechanisms of psychiatric illness. Trends Cogn. Sci. 29, 85–99 (2025).
Wolfers, T. et al. Mapping the heterogeneous phenotype of schizophrenia and bipolar disorder using normative models. JAMA Psychiatry 75, 1146–1155 (2018).
Ge, R. et al. Normative modelling of brain morphometry across the lifespan with CentileBrain: algorithm benchmarking and model optimisation. Lancet Digit. Health 6, e211–e221 (2024).
Ma, C. et al. Multimodal fusion with LLMs for engagement prediction in natural conversation. In Companion Proc. 27th International Conference on Multimodal Interaction (ICMI Companion ’25) 244–259 (ACM, 2025); https://doi.org/10.1145/3747327.3764904
Wu, M. et al. Hypergraph multi-modal large language model: exploiting EEG and eye-tracking modalities to evaluate heterogeneous responses for video understanding. In Proc. 32nd ACM International Conference on Multimedia 7316–7325 (ACM, 2024); https://doi.org/10.1145/3664647.3680810
Papagiannopoulou, E. A., Chitty, K. M., Hermens, D. F., Hickie, I. B. & Lagopoulos, J. A systematic review and meta-analysis of eye-tracking studies in children with autism spectrum disorders. Soc. Neurosci. 9, 610–632 (2014).
Suslow, T., Hußlack, A., Kersting, A. & Bodenschatz, C. M. Attentional biases to emotional information in clinical depression: a systematic and meta-analytic review of eye tracking findings. J. Affect. Disord. 274, 632–642 (2020).
Dunn, M. J. et al. Minimal reporting guideline for research involving eye tracking (2023 edition). Behav. Res. Methods 56, 4351–4357 (2024).
Wilson, R. C. & Collins, A. G. Ten simple rules for the computational modeling of behavioral data. eLife 8, e49547 (2019).
Linka, M., Karimpur, H. & de Haas, B. Protracted development of gaze behaviour. Nat. Hum. Behav. 9, 1887–1897 (2025).
Strauch, C., Hoogerbrugge, A. J. & Ten Brink, A. F. Gaze data of 4,243 participants shows link between leftward and superior attention biases and age. Exp. Brain Res. 242, 1327–1337 (2024).
Kleberg, J. L. et al. Autistic traits and symptoms of social anxiety are differentially related to attention to others’ eyes in social anxiety disorder. J. Autism Dev. Disord. 47, 3814–3821 (2017).
Mansell, W., Clark, D. M., Ehlers, A. & Chen, Y.-P. Social anxiety and attention away from emotional faces. Cogn. Emot. 13, 673–690 (1999).
Armstrong, T. & Olatunji, B. O. Eye tracking of attention in the affective disorders: a meta-analytic review and synthesis. Clin. Psychol. Rev. 32, 704–723 (2012).
Fusar-Poli, P. et al. Transdiagnostic psychiatry: a systematic review. World Psychiatry 18, 192–207 (2019).
Xu, H. & Shuttleworth, K. M. J. Medical artificial intelligence and the black box problem: a view based on the ethical principle of ‘do no harm’. Intell. Med. 4, 52–57 (2024).
Reddy, S. Explainability and artificial intelligence in medicine. Lancet Digit. Health 4, e214–e215 (2022).
Hasson, U., Nir, Y., Levy, I., Fuhrmann, G. & Malach, R. Intersubject synchronization of cortical activity during natural vision. Science 303, 1634–1640 (2004).
Franchak, J. M., Heeger, D. J., Hasson, U. & Adolph, K. E. Free viewing gaze behavior in infants and adults. Infancy 21, 262–287 (2016).
Byrge, L., Dubois, J., Tyszka, J. M., Adolphs, R. & Kennedy, D. P. Idiosyncratic brain activation patterns are associated with poor social comprehension in autism. J. Neurosci. 35, 5837–5850 (2015).
Hasson, U. et al. Shared and idiosyncratic cortical activation patterns in autism revealed under continuous real-life viewing conditions. Autism Res. 2, 220–231 (2009).
Dinstein, I. et al. Unreliable evoked responses in autism. Neuron 75, 981–991 (2012).
Wu, Q. et al. 156. Modeling eye gaze to videos using dynamic trajectory variability analysis. Biol. Psychiatry 93, S157 (2023).
Kiat, J. E. et al. Linking patterns of infant eye movements to a neural network model of the ventral stream using representational similarity analysis. Dev. Sci. 25, e13155 (2022).
Borovska, P. & de Haas, B. Individual gaze shapes diverging neural representations. Proc. Natl Acad. Sci. USA 121, e2405602121 (2024).
Kollenda, D., Reher, A.-S. & de Haas, B. Individual gaze predicts individual scene descriptions. Sci. Rep. 15, 9443 (2025).