Article published in: Interaction Studies, Vol. 16:2 (2015), pp. 173–179
The eyes are the window to the uncanny valley
Mind perception, autism and missing souls
Published online: 26 November 2015
https://doi.org/10.1075/is.16.2.02sch
Horror movies have discovered an easy recipe for making people creepy: alter their eyes. Instead of normal eyes, zombies’ eyes are vacantly white, vampires’ eyes glow with the color of blood, and those possessed by demons are cavernously black. In the Academy Award-winning Pan’s Labyrinth, director Guillermo del Toro created the creepiest of all creatures by entirely removing its eyes from its face, placing them instead in the palms of its hands. The unease induced by altering eyes may help to explain the uncanny valley, the eeriness of robots that are almost—but not quite—human (Mori, 1970). Much research has explored the uncanny valley, including the research reported by MacDorman and Entezari (in press), which focuses on individual differences that might predict the eeriness of humanlike robots. In their paper, they suggest that a full understanding of this phenomenon needs to synthesize individual differences with features of the robot. One theory that links these two concepts is mind perception, which past research highlights as essential to the uncanny valley (Gray & Wegner, 2012). Mind perception is linked both to individual differences—autism—and to features of the robot—the eyes—and can provide a deeper understanding of this arresting phenomenon. In this paper, we present original data linking uncanniness to the eyes through aberrant perceptions of mind.
Keywords: robots, emotion, human-computer interaction, theory of mind
Article outline
- 1. Mind perception and the uncanny valley
- 2. Original research: Eyes, mind perception and uncanniness
- 3. Autism and the uncanny valley
- 4. Conclusion
References
Baron-Cohen, S., Wheelwright, S., & Jolliffe, T. (1997). Is there a “language of the eyes”? Evidence from normal adults, and adults with autism or Asperger syndrome. Visual Cognition, 4(3), 311–331.
Dautenhahn, K., Nehaniv, C.L., Walters, M.L., Robins, B., Kose-Bagci, H., Mirza, N.A., & Blow, M. (2009). KASPAR–a minimally expressive humanoid robot for human–robot interaction research. Applied Bionics and Biomechanics, 6(3–4), 369–397.
Destephe, M., Zecca, M., Hashimoto, K., & Takanishi, A. (2014, October). Paper presented at the IEEE Robotics and Biomimetics Conference, Bali, Indonesia.
Gray, K., Jenkins, A.C., Heberlein, A.S., & Wegner, D.M. (2011). Distortions of mind perception in psychopathology. PNAS, 108(2), 477–479.
Gray, K., & Wegner, D.M. (2012). Feeling robots and human zombies: Mind perception and the uncanny valley. Cognition, 125(1), 125–130.
Looser, C.E., & Wheatley, T. (2010). The tipping point of animacy: How, when, and where we perceive life in a face. Psychological Science, 21(12), 1854–1862.
MacDorman, K.F., Green, R.D., Ho, C.-C., & Koch, C.T. (2009). Too real for comfort? Uncanny responses to computer generated faces. Computers in Human Behavior, 25(3), 695–710.
Mori, M. (1970). The uncanny valley. Energy, 7(4), 33–35.
Pioggia, G., Ahluwalia, A., Carpi, F., Marchetti, A., Ferro, M., Rocchia, W., & Rossi, D.D. (2004). FACE: Facial automaton for conveying emotions. Applied Bionics and Biomechanics, 1(2), 91–100.
Robins, B., Dautenhahn, K., & Dubowski, J. (2006). Does appearance matter in the interaction of children with autism with a humanoid robot? Interaction Studies, 7(3), 509–542.
Scassellati, B. (2007). How social robots will help us to diagnose, treat, and understand autism. In Robotics research (pp. 552–563).
Senju, A., & Johnson, M.H. (2009). Atypical eye contact in autism: Models, mechanisms and development. Neuroscience & Biobehavioral Reviews, 33(8), 1204–1214.
Seyama, J., & Nagayama, R.S. (2007). The uncanny valley: Effect of realism on the impression of artificial human faces. Presence: Teleoperators and Virtual Environments, 16(4), 337–351.
