Article published in: Gesture
Vol. 21:2/3 (2022), pp. 296–319
Iconic gestures serve as primes for both auditory and visual word forms
Published online: 24 August 2023
https://doi.org/10.1075/gest.20019.san
Abstract
Previous studies using cross-modal semantic priming have found that iconic gestures prime target words that are semantically related to the gestures. In the present study, two analogous experiments examined this priming effect while presenting primes and targets in high synchrony. In Experiment 1, participants performed an auditory primed lexical decision task in which target words (e.g., “push”) had to be discriminated from pseudowords, primed by overlapping iconic gestures that were either semantically related (e.g., moving both hands forward) or unrelated to the words. Experiment 2 was similar, but both gestures and words were presented visually. The grammatical category of the words (noun vs. verb) was also manipulated. In both experiments, words related to the gestures were recognized faster and with fewer errors than unrelated words, and this effect was similar for both grammatical categories.
Keywords: iconic gesture, semantic priming, word processing, cross-modal priming
Article outline
- Experiment 1
  - Method
    - Participants
    - Materials and design
    - Procedure
  - Results and discussion
- Experiment 2
  - Method
    - Participants
    - Materials and design
    - Procedure
  - Results and discussion
- General discussion
- References
References
Aguasvivas, J. A., Carreiras, M., Brysbaert, M., Mandera, P., Keuleers, E., & Duñabeitia, J. A. (2018). SPALEX: A Spanish lexical decision database from a massive online data collection. Frontiers in Psychology, 9, 2156.
Alibali, M. W. (2005). Gesture in spatial cognition: Expressing, communicating, and thinking about spatial information. Spatial Cognition and Computation, 5, 307–331.
Alibali, M. W., Heath, D. C., & Myers, H. J. (2001). Effects of visibility between speaker and listener on gesture production: Some gestures are meant to be seen. Journal of Memory and Language, 44 (2), 169–188.
Anderson, J. E., & Holcomb, P. J. (1995). Auditory and visual semantic priming using different stimulus onset asynchronies: An event-related brain potential study. Psychophysiology, 32 (2), 177–190.
Baayen, R. H., Davidson, D. J., & Bates, D. M. (2008). Mixed-effects modeling with crossed random effects for subjects and items. Journal of Memory and Language, 59, 390–412.
Barr, D. J., Levy, R., Scheepers, C., & Tily, H. J. (2013). Random effects structure for confirmatory hypothesis testing: Keep it maximal. Journal of Memory and Language, 68 (3), 255–278.
Barsalou, L. W., Santos, A., Simmons, W. K., & Wilson, C. D. (2008). Language and simulation in conceptual processing. In M. de Vega, A. Glenberg, & A. Graesser. (Eds.), Symbols and embodiment: Debates on meaning and cognition (pp. 245–283). Oxford: Oxford University Press.
Bates, D. & Maechler, M. (2009). lme4: Linear mixed-effects models using S4 classes. R package version 0.999375-27.
Beattie, G. & Shovelton, H. (2002). An experimental investigation of some properties of individual iconic gestures that mediate their communicative power. British Journal of Psychology, 93, 179–192.
Bernardis, P., Salillas, E., & Caramelli, N. (2008). Behavioural and neuropsychological evidence of semantic interaction between iconic gestures and words. Cognitive Neuropsychology, 25, 1114–1128.
De Marco, D., De Stefani, E., & Gentilucci, M. (2015). Gesture and word analysis: the same or different processes? NeuroImage, 117, 375–385.
Driskell, J. E. & Radtke, P. H. (2003). The effect of gesture on speech production and comprehension. Human Factors, 45 (3), 445–454.
Forster, K. I. (1981). Priming and the effects of sentence and lexical contexts on naming time: Evidence for autonomous lexical processing. The Quarterly Journal of Experimental Psychology, 33 (4), 465–495.
Gentner, D. (1982). Why nouns are learned before verbs: Linguistic relativity versus natural partitioning. Center for the Study of Reading Technical Report, No. 257.
Goh, W. D., Suárez, L., Yap, M. J., & Tan, S. H. (2009). Distributional analyses in auditory lexical decision: Neighborhood density and word-frequency effects. Psychonomic Bulletin & Review, 16 (5), 882–887.
Goldin-Meadow, S. & Wagner, S. M. (2005). How our hands help us learn. Trends in Cognitive Sciences, 9 (5), 234–241.
Goodwin, M. H. & Goodwin, C. (1986). Gesture and coparticipation in the activity of searching for a word. Semiotica, 62, 51–76.
Habets, B., Kita, S., Shao, Z., Özyürek, A., & Hagoort, P. (2011). The role of synchrony and ambiguity in speech–gesture integration during comprehension. Journal of Cognitive Neuroscience, 23 (8), 1845–1854.
Hernández-Cabrera, J. A. (2011). ULLRToolbox. [URL]
Holcomb, P. J. & Anderson, J. E. (1993). Cross-modal semantic priming: A time-course analysis using event-related brain potentials. Language and Cognitive Processes, 8 (4), 379–411.
Holcomb, P. J. & Neville, H. J. (1990). Auditory and visual semantic priming in lexical decision: A comparison using event-related brain potentials. Language and Cognitive Processes, 5 (4), 281–312.
Hostetter, A. B. (2011). When do gestures communicate? A meta-analysis. Psychological Bulletin, 137 (2), 297–315.
Hubbard, A. L., Wilson, S. M., Callan, D. E., & Dapretto, M. (2009). Giving speech a hand: Gesture modulates activity in auditory cortex during speech perception. Human Brain Mapping, 30 (3), 1028–1037.
Jonides, J. & Mack, R. (1984). On the cost and benefit of cost and benefit. Psychological Bulletin, 96, 29–44.
Kelly, S. D., Creigh, P., & Bartolotti, J. (2010). Integrating speech and iconic gestures in a Stroop-like task: evidence for automatic processing. Journal of Cognitive Neuroscience, 22 (4), 683–694.
Kelly, S. D., Manning, S. M., & Rodak, S. (2008). Gesture gives a hand to language and learning: Perspectives from cognitive neuroscience, developmental psychology and education. Language and Linguistics Compass, 2 (4), 569–588.
Kelly, S. D., Özyürek, A., & Maris, E. (2010). Two sides of the same coin: Speech and gesture mutually interact to enhance comprehension. Psychological Science, 21 (2), 260–267.
Kelly, S., Healey, M., Özyürek, A., & Holler, J. (2015). The processing of speech, gesture, and action during language comprehension. Psychonomic Bulletin & Review, 22 (2), 517–523.
Kendon, A. (1980). Gesticulation and speech: Two aspects of the process of utterance. In M. R. Key. (Ed.), The relationship of verbal and nonverbal communication (pp. 207–227). The Hague, the Netherlands: Mouton.
Kendon, A. (1994). Do gestures communicate? A review. Research on Language and Social Interaction, 27 (3), 175–200.
Kita, S. & Özyürek, A. (2003). What does cross-linguistic variation in semantic coordination of speech and gesture reveal? Evidence for an interface representation of spatial thinking and speaking. Journal of Memory and Language, 48, 16–32.
Krahmer, E. & Swerts, M. (2007). The effects of visual beats on prosodic prominence: Acoustic analyses, auditory perception and visual perception. Journal of Memory and Language, 57 (3), 396–414.
Krauss, R. M. (1998). Why do we gesture when we speak? Current Directions in Psychological Science, 7, 54–59.
Krauss, R. M., Dushay, R. A., Chen, Y., & Rauscher, F. (1995). The communicative value of conversational hand gestures. Journal of Experimental Social Psychology, 31, 533–552.
Krauss, R. M., Morrel-Samuels, P., & Colasante, C. (1991). Do conversational hand gestures communicate? Journal of Personality and Social Psychology, 61, 743–754.
Krauss, R. M., Chen, Y., & Gottesman, R. F. (2000). Lexical gestures and lexical access: a process model. In D. McNeill. (Ed.), Language and gesture (pp. 261–283). New York: Cambridge University Press.
Martuzzi, R., van der Zwaag, W., Farthouat, J., Gruetter, R., & Blanke, O. (2014). Human finger somatotopy in areas 3b, 1, and 2: a 7T fMRI study using a natural stimulus. Human Brain Mapping, 35 (1), 213–226.
Morrel-Samuels, P. & Krauss, R. M. (1992). Word familiarity predicts temporal asynchrony of hand gestures and speech. Journal of Experimental Psychology: Learning, Memory, and Cognition, 18, 615–623.
Ng, M. M., Goh, W. D., Yap, M. J., Tse, C. S., & So, W. C. (2017). How we think about temporal words: A gestural priming study in English and Chinese. Frontiers in Psychology, 8, 974.
Perea, M. & Rosa, E. (2002). The effects of associative and semantic priming in the lexical decision task. Psychological Research, 66 (3), 180–194.
Schneider, W., Eschmann, A., & Zuccolotto, A. (2002). E-Prime user’s guide. Pittsburgh, PA: Psychology Software Tools, Inc.
Sidhu, D. M., Vigliocco, G., & Pexman, P. M. (2020). Effects of iconicity in lexical decision. Language and Cognition, 12 (1), 164–181.
Slobin, D. I. (1987). Thinking for speaking. Proceedings of the Annual Meeting of the Berkeley Linguistics Society, 13, 435–445.
Slobin, D. I. (1996). From “thought and language” to “thinking for speaking”. In J. J. Gumperz & S. C. Levinson (Eds.), Rethinking linguistic relativity (pp. 70–96). Cambridge: Cambridge University Press.
So, W. C., Low, A., Yap, D. F., Kheng, E., & Yap, M. (2013). Iconic gestures prime words: comparison of priming effects when gestures are presented alone and when they are accompanying speech. Frontiers in Psychology, 4, 779.
Vitevitch, M. S. & Luce, P. A. (1998). When words compete: Levels of processing in perception of spoken words. Psychological Science, 9 (4), 325–329.
Wilson, M. & Emmorey, K. (2006). No difference in short-term memory span between sign and speech. Psychological Science, 17 (12), 1093–1094.
Wilson, M. & Knoblich, G. (2005). The case for motor involvement in perceiving conspecifics. Psychological Bulletin, 131 (3), 460–473.
Wu, Y. C. & Coulson, S. (2005). Meaningful gestures: Electrophysiological indices of iconic gesture comprehension. Psychophysiology, 42, 654–667.
Wu, Y. C. & Coulson, S. (2007a). Iconic gestures prime related concepts: An ERP study. Psychonomic Bulletin & Review, 14 (1), 57–63.
