Article published in: Interaction Studies
Vol. 21:3 (2020), pp. 329–352
“Alexa, how are you feeling today?”
Mind perception, smart speakers, and uncanniness
Published online: 9 February 2021
https://doi.org/10.1075/is.19015.tay
Abstract
‘Smart’ devices are becoming increasingly ubiquitous. While these sophisticated machines are useful for various purposes,
they sometimes evoke feelings of eeriness or discomfort that constitute uncanniness, a much-discussed phenomenon in robotics research. Adult
participants (N = 115) rated the uncanniness of a hypothetical future smart speaker that was described as possessing the
mental capacities for experience, agency, neither, or both. The novel condition prompting participants to attribute both agency and
experience to the speaker filled an important theoretical gap in the literature. Consistent with the mind perception hypothesis of
uncanniness (MPH; Gray & Wegner, 2012), participants in the with-experience condition rated
the device significantly higher in uncanniness than those in the control condition and the with-agency condition. Participants in the
with-both (experience and agency) condition also rated the device higher in uncanniness than those in the control condition and the
with-agency condition, although this latter difference only approached statistical significance.
Article outline
- 1. Introduction
- 1.1 Uncanniness and the uncanny valley
- 1.2 The mind perception hypothesis of uncanniness
- 1.3 Current study
- 1.3.1 Hypotheses
- 1.3.2 Exploratory analysis
- 2. Method
- 2.1 Participants
- 2.2 Procedure
- 2.3 Plan of analysis
- 3. Results
- 3.1 Manipulation checks
- 3.2 Primary analysis
- 3.3 Exploratory analysis
- 4. Discussion
- 4.1 Exploratory analysis
- 4.2 Limitations and future directions
- 4.3 Conclusions
- Open science
- Supplementary materials
- SM1. Paragraph describing smart speakers and listing their current features; accompanying images of currently popular smart speakers (presented to all participants)
- SM2. Items presented to participants (dependent variables)
- Primary analyses
- Secondary analysis
- SM3. Regression analyses for uncanniness index
References
Appel, M., Izydorczyk, D., Weber, S., Mara, M., & Lischetzke, T. (2020). The uncanny of mind in a machine: Humanoid robots as tools, agents, and experiencers. Computers in Human Behavior, 102, 274–286.
Apple. (2018, January 23). HomePod arrives February 9, available to order this Friday [Press release]. Retrieved from [URL]
Brink, K. A., Gray, K., & Wellman, H. M. (2019). Creepiness creeps in: Uncanny valley feelings are acquired in childhood. Child Development, 90, 1202–1214.
Broadbent, E., Kumar, V., Li, X., Sollers, J., III, Stafford, R. Q., MacDonald, B. A., & Wegner, D. M. (2013). Robots with display screens: A robot with a more humanlike face display is perceived to have more mind and a better personality. PLoS ONE, 8, e72589.
Broadbent, E., Kuo, I. H., Lee, Y. I., Rabindran, J., Kerse, N., Stafford, R., & MacDonald, B. A. (2010). Attitudes and reactions to a healthcare robot. Telemedicine and e-Health, 16, 608–613.
Ciechanowski, L., Przegalinska, A., Magnuski, M., & Gloor, P. (2019). In the shades of the uncanny valley: An experimental study of human–chatbot interaction. Future Generation Computer Systems, 92, 539–548.
Crandall, C. S., & Sherman, J. W. (2016). On the scientific superiority of conceptual replications for scientific progress. Journal of Experimental Social Psychology, 66, 93–99.
Creed, C., & Beale, R. (2012). User interactions with an affective nutritional coach. Interacting with Computers, 24, 339–350.
Creed, C., Beale, R., & Cowan, B. (2015). The impact of an embodied agent’s emotional expressions over multiple interactions. Interacting with Computers, 27, 172–188.
Deng, E., Mutlu, B., & Mataric, M. J. (2019). Embodiment in socially interactive robots. Foundations and Trends in Robotics, 7, 251–356.
Ferrey, A. E., Burleigh, T. J., & Fenske, M. J. (2015). Stimulus-category competition, inhibition, and affective devaluation: A novel account of the uncanny valley. Frontiers in Psychology, 6, 249.
Gray, K., Jenkins, A. C., Heberlein, A. S., & Wegner, D. M. (2011). Distortions of mind perception in psychopathology. Proceedings of the National Academy of Sciences, 108, 477–479.
Gray, K., & Wegner, D. M. (2012). Feeling robots and human zombies: Mind perception and the uncanny valley. Cognition, 125, 125–130.
Gray, K., Young, L., & Waytz, A. (2012). Mind perception is the essence of morality. Psychological Inquiry, 23, 101–124.
Ho, C. C., MacDorman, K. F., & Pramono, Z. D. (2008, March). Human emotion and the uncanny valley: A GLM, MDS, and Isomap analysis of robot video ratings. In Proceedings of the 3rd ACM/IEEE International Conference on Human-Robot Interaction (HRI) (pp. 169–176). Amsterdam, the Netherlands.
Kätsyri, J., Förger, K., Mäkäräinen, M., & Takala, T. (2015). A review of empirical evidence on different uncanny valley hypotheses: Support for perceptual mismatch as one road to the valley of eeriness. Frontiers in Psychology, 6, 390.
Kawabe, T., Sasaki, K., Ihaya, K., & Yamada, Y. (2017). When categorization-based stranger avoidance explains the uncanny valley: A comment on MacDorman and Chattopadhyay (2016). Cognition, 161, 129–131.
Knobe, J., & Prinz, J. (2008). Intuitions about consciousness: Experimental studies. Phenomenology and the Cognitive Sciences, 7, 67–83.
Kupferberg, A., Glasauer, S., Huber, M., Rickert, M., Knoll, A., & Brandt, T. (2011). Biological movement increases acceptance of humanoid robots as human partners in motor interaction. AI & Society, 26, 339–345.
Lau, J., Zimmerman, B., & Schaub, F. (2018). Alexa, are you listening? Privacy perceptions, concerns and privacy-seeking behaviors with smart speakers. Proceedings of the ACM on Human-Computer Interaction, 2, 102.
Liu, B., & Sundar, S. S. (2018). Should machines express sympathy and empathy? Experiments with a health advice chatbot. Cyberpsychology, Behavior, and Social Networking, 21, 625–636.
Lynch, J. G. Jr., Bradlow, E. T., Huber, J. C., & Lehmann, D. R. (2015). Reflections on the replication corner: In praise of conceptual replications. International Journal of Research in Marketing, 32, 333–342.
MacDorman, K. F., & Chattopadhyay, D. (2016). Reducing consistency in human realism increases the uncanny valley effect; increasing category uncertainty does not. Cognition, 146, 190–205.
MacDorman, K. F., & Entezari, S. O. (2015). Individual differences predict sensitivity to the uncanny valley. Interaction Studies, 16, 141–172.
MacDorman, K. F., Green, R. D., Ho, C. C., & Koch, C. T. (2009). Too real for comfort? Uncanny responses to computer generated faces. Computers in Human Behavior, 25, 695–710.
MacDorman, K. F., & Ishiguro, H. (2006). The uncanny advantage of using androids in cognitive and social science research. Interaction Studies, 7, 297–337.
MacDorman, K. F., Vasudevan, S. K., & Ho, C. C. (2009). Does Japan really have robot mania? Comparing attitudes by implicit and explicit measures. AI & Society, 23, 485–510.
Mitchell, W. J., Szerszen, K. A. Sr., Lu, A. S., Schermerhorn, P. W., Scheutz, M., & MacDorman, K. F. (2011). A mismatch in the human realism of face and voice produces an uncanny valley. i-Perception, 2, 10–12.
Moore, R. K. (2012). A Bayesian explanation of the ‘Uncanny Valley’ effect and related psychological phenomena. Scientific Reports, 2, 864.
Mori, M. (1970/2005). The uncanny valley. (K. F. MacDorman, & T. Minato, Trans.). Energy, 7, 33–35.
Mori, M., MacDorman, K. F., & Kageki, N. (2012). The uncanny valley [from the field]. IEEE Robotics & Automation Magazine, 19, 98–100.
Myers, C. M., Furqan, A., & Zhu, J. (2019, May). The impact of user characteristics and preferences on performance with an unfamiliar voice user interface. In Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems (pp. 1–9). Glasgow, Scotland.
NPR & Edison Research. (2018). The smart audio report, winter 2018. Retrieved from [URL]
Pollick, F. E. (2010). In search of the uncanny valley. Lecture Notes of the Institute for Computer Sciences, Social Informatics and Telecommunications Engineering, 40, 69–78.
R Core Team (2020). R: A language and environment for statistical computing. Vienna, Austria: R Foundation for Statistical Computing. [URL]
Ramey, C. H. (2006). An inventory of reported characteristics for home computers, robots, and human beings: Applications for android science and the uncanny valley. In MacDorman, K. F., & Ishiguro, H. (Eds.), Proceedings of the ICCS/CogSci 2006 Long Symposium: ‘Toward Social Mechanisms of Android Science’ (pp. 21–25). Vancouver, Canada.
Rosenthal-von der Pütten, A., & Weiss, A. (2015). The uncanny valley phenomenon: Does it affect all of us? Interaction Studies, 16, 206–214.
Schein, C., & Gray, K. (2015). The eyes are the window to the uncanny valley: Mind perception, autism and missing souls. Interaction Studies, 16, 173–179.
Seyama, J. I., & Nagayama, R. S. (2007). The uncanny valley: Effect of realism on the impression of artificial human faces. Presence: Teleoperators and Virtual Environments, 16, 337–351.
Stafford, R. Q., Broadbent, E., Jayawardena, C., Unger, U., Kuo, I. H., Igic, A., Wong, R., Kerse, N., Watson, C., & MacDonald, B. A. (2010, September). Improved robot attitudes and emotions at a retirement home after meeting a robot. In Proceedings of the 19th International Symposium in Robot and Human Interactive Communication (pp. 82–87). Viareggio, Italy.
Stafford, R. Q., MacDonald, B. A., Jayawardena, C., Wegner, D. M., & Broadbent, E. (2014). Does the robot have a mind? Mind perception and attitudes towards robots predict use of an eldercare robot. International Journal of Social Robotics, 6, 17–32.
Stein, J. P., & Ohler, P. (2017). Venturing into the uncanny valley of mind – The influence of mind attribution on the acceptance of human-like characters in a virtual reality setting. Cognition, 160, 43–50.
Tharp, M., Holtzman, N. S., & Eadeh, F. R. (2017). Mind perception and individual differences: A replication and extension. Basic and Applied Social Psychology, 39, 68–73.
Waddell, K. (2017, April 21). Chatbots have entered the uncanny valley. The Atlantic. Retrieved from [URL]
Wang, S., Lilienfeld, S. O., & Rochat, P. (2015). The uncanny valley: Existence and explanations. Review of General Psychology, 19, 393–407.
Wang, X., & Krumhuber, E. G. (2018). Mind perception of robots varies with their economic versus social function. Frontiers in Psychology, 9, 1230.
Waytz, A., Cacioppo, J., & Epley, N. (2010). Who sees human? The stability and importance of individual differences in anthropomorphism. Perspectives on Psychological Science, 5, 219–232.
Wilson, M. (2018, March 9). Alexa’s creepy laughter is a bigger problem than Amazon admits. Fast Company. Retrieved from [URL]
