Does seeing gesture lighten or increase the load?
Effects of processing gesture on verbal and visuospatial cognitive load
Published online: 26 June 2019
https://doi.org/10.1075/gest.17017.hos
Abstract
We examined the cognitive resources
involved in processing speech with gesture compared to the same speech without
gesture across four studies using a dual-task paradigm. Participants viewed videos of a woman describing
spatial arrays either with gesture or without. They then attempted to choose the
target array from among four choices. Participants’ cognitive load during this comprehension task was indexed by how well they could remember the location and identity of digits in a secondary task. We found that addressees experience additional visuospatial load when processing gestures compared to speech alone, and that this load arises primarily when addressees attempt to use their memory of the descriptions with gesture to choose the target array. However,
this cost only occurs when gestures about horizontal spatial relations (i.e.,
left and right) are produced from the speaker’s egocentric perspective.
Article outline
- Study 1
- Method
- Participants
- Materials and measures
- Primary comprehension task
- Secondary memory task
- Procedure
- Results and discussion
- Study 2
- Method
- Participants
- Procedure
- Results and discussion
- Study 3
- Method
- Participants
- Procedure
- Results and discussion
- Study 4
- Method
- Participants
- Procedure
- Results and discussion
- General discussion
- Conclusion
