Since listeners usually look at the speaker's face, gestural information must be taken up through peripheral visual perception. The literature suggests that listeners fixate gestures directly only under certain circumstances: (1) when the gesture is articulated in peripheral space; (2) when the speech channel is insufficient for comprehension; and (3) when the speakers themselves indicate that the gesture merits attention. The research reported here employs eye-tracking techniques to study the perception of gestures in face-to-face interaction. The improved control over the listener's visual channel allows us to test the validity of these claims. We present preliminary findings substantiating claims 1 and 3, and relate them to theoretical proposals in the literature and to the question of how visual and cognitive attention are related.
Cited by 66 other publications.