Article published in: Interaction Studies
Vol. 22:2 (2021), pp. 141–176
Impact of nonverbal robot behaviour on human teachers’ perceptions of a learner robot
Available under the Creative Commons Attribution-NonCommercial (CC BY-NC) 4.0 license.
For any use beyond this license, please contact the publisher at rights@benjamins.nl.
Published online: 28 February 2022
https://doi.org/10.1075/is.20036.ali
Abstract
How do we perceive robots practising a task that we have taught them? While learning, human trainees usually provide nonverbal cues that reveal their level of understanding of, and interest in, the task. Similarly, robot learning can be enhanced when trainee robots display nonverbal social cues that humans can interpret naturally. In this article, we investigated a scenario in which a robot practises a physical task in front of human teachers (i.e., participants) who were asked to assume that they had previously taught the robot to perform that task. Through an online experiment with 167 participants, we examined how different gaze patterns and arm movements, performed at multiple speeds and with various kinds of pauses, affected human teachers’ perception of several attributes of the robot. We found that the perception of a trainee robot’s attributes (e.g., its confidence and eagerness to learn) can be systematically affected by its behaviours. The findings of this study can inform the design of more successful nonverbal social interactions for intelligent robots.
Keywords: nonverbal behaviour, kinesics, gaze, perceived robot attributes, social learning
Article outline
- 1. Introduction
- 2. Related work
- 2.1 Nonverbal behaviour for robots
- 2.2 Impact of robots’ behavioural parameters
- 3. Research question and hypotheses
- 4. Experiment
- 4.1 Overall design
- 4.2 Technical implementation
- 4.3 Procedure
- 4.4 Participants
- 4.5 Statistical analysis
- 5. Results
- 5.1 Perceived confidence of the robot
- 5.2 Perceived calmness of the robot
- 5.3 Robot’s perceived liking of the task
- 5.4 Perceived robot’s attention to the task
- 5.5 Perceived proficiency of the robot
- 5.6 Robot’s perceived eagerness to learn
- 5.7 Robot perceived as goal-driven
- 5.8 Perceived robot’s attention to the teacher
- 6. Discussion
- 6.1 Shaping human teachers’ perceptions
- 6.2 Interaction design implications
- 7. Limitations and future work
- 8. Conclusion
- Acknowledgements
- Note
- References
