Article published in: Interaction Studies
Vol. 17:3 (2016), pp. 408–437
Robots Showing Emotions
Emotion Representation with no bio-inspired body
Published online: 30 March 2017
https://doi.org/10.1075/is.17.3.06ang
Abstract
Robots should be able to represent emotional states in order to interact with people as social agents. In some cases, a robot cannot have a bio-inspired body, for instance because its task requires a special shape, as with home cleaners, package carriers, and many others. In these cases, emotional states have to be conveyed through movements of the body alone. In this paper, we present a set of case studies aimed at identifying specific values that convey emotion through changes in linear and angular velocity and that might be applied to different non-anthropomorphic bodies. This work builds on some of the most widely considered theories of emotion expression and on emotion coding for people. We show that people recognize some emotional expressions better than others, and we propose directions for expressing emotions using only bio-neutral movement.
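The approach sketched in the abstract, encoding emotions as changes in a mobile robot's linear and angular velocities, can be illustrated with a minimal parameter table. The emotion labels, parameter names, and all numeric values below are hypothetical stand-ins chosen for illustration; they are not the values identified in the article's case studies.

```python
# Hypothetical sketch: mapping emotion labels to motion parameters for a
# non-anthropomorphic (e.g. differential-drive) robot. Values are
# illustrative only, NOT the values reported in the article.
from dataclasses import dataclass


@dataclass
class MotionProfile:
    linear_velocity: float   # forward speed, m/s
    angular_velocity: float  # turning rate, rad/s
    oscillation: float       # amplitude of superimposed jitter (0 = smooth)


# Illustrative table: faster, more irregular motion for high-arousal
# emotions; slow, smooth motion for low-arousal ones.
EMOTION_PROFILES = {
    "anger":   MotionProfile(linear_velocity=0.8, angular_velocity=2.0, oscillation=0.6),
    "fear":    MotionProfile(linear_velocity=0.3, angular_velocity=1.5, oscillation=0.8),
    "joy":     MotionProfile(linear_velocity=0.6, angular_velocity=1.0, oscillation=0.3),
    "sadness": MotionProfile(linear_velocity=0.1, angular_velocity=0.2, oscillation=0.0),
}


def profile_for(emotion: str) -> MotionProfile:
    """Return the motion profile for an emotion, defaulting to a neutral gait."""
    return EMOTION_PROFILES.get(emotion, MotionProfile(0.4, 0.5, 0.0))
```

The point of such a table is that it is body-neutral: any platform that accepts linear and angular velocity commands can render the same profiles, regardless of its shape.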
Article outline
- Introduction
- Related work
- System
- Robotic platform
- Case studies
- First case study: Empirical emotion parameters
- Experimental design
- Emotion description
- Experimental setup
- Results
- Second case study: Addition of context
- Experimental design
- Experimental setup
- Results
- Third case study: Literature emotional parameters
- Experimental design
- Emotion description
- Experimental setup
- Results
- Fourth case study: Experiment cross-validation
- Experimental design
- Emotion descriptions
- Experimental setup
- Results
- Conclusions and further work
- Note
