In:Gaze in Human-Robot Communication
Edited by Frank Broz, Hagen Lehmann, Bilge Mutlu and Yukiko Nakano
[Benjamins Current Topics 81] 2015
► pp. 131–158
Designing robot eyes for communicating gaze
Published online: 16 December 2015
https://doi.org/10.1075/bct.81.07onu
Human eyes not only enable us “to see,” but also play a vital role in non-verbal communication by allowing us “to show” our gaze, for example in establishing eye contact and joint attention. The eyes of service robots should therefore perform both of these functions. Moreover, they should be friendly in appearance so that humans feel comfortable with the robots. We therefore maintain that both gaze-communication capability and friendliness are important considerations in designing the appearance of robot eyes. In this paper, we propose a new robot face with rear-projected eyes that can change their appearance while simultaneously showing gaze, realized by incorporating stereo cameras. In addition, through experiments in which we varied the shape and iris size of the robot eyes, we examine which eye shape is most suitable for gaze reading and gives the friendliest impression.
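The abstract describes rear-projected eyes whose appearance (e.g., iris size) can be varied while a gaze direction is shown. As a minimal illustrative sketch only (not the paper's implementation; all names and parameters here are hypothetical), a renderer for such an eye might map a gaze direction onto a 2-D iris position on the projection image plane, treating the eyeball as a sphere and clamping the iris inside the eye outline:

```python
import math
from dataclasses import dataclass

@dataclass
class EyeParams:
    # All values in pixels on the rear-projection image plane.
    center_x: float        # eye-socket centre, x
    center_y: float        # eye-socket centre, y
    eyeball_radius: float  # radius of the drawn eye outline
    iris_radius: float     # experiment variable: iris size

def iris_position(eye: EyeParams, yaw_deg: float, pitch_deg: float):
    """Project a gaze direction (yaw, pitch) onto the image plane.

    Spherical-eyeball model: the visible iris centre shifts by
    r * sin(angle) from the socket centre, clamped so the iris
    never leaves the sclera outline.
    """
    max_shift = eye.eyeball_radius - eye.iris_radius
    dx = eye.eyeball_radius * math.sin(math.radians(yaw_deg))
    dy = eye.eyeball_radius * math.sin(math.radians(pitch_deg))
    norm = math.hypot(dx, dy)
    if norm > max_shift:  # keep the iris within the eye outline
        dx, dy = dx * max_shift / norm, dy * max_shift / norm
    return eye.center_x + dx, eye.center_y - dy  # image y grows downward

eye = EyeParams(center_x=160, center_y=120, eyeball_radius=60, iris_radius=25)
print(iris_position(eye, 0.0, 0.0))   # straight ahead: (160.0, 120.0)
print(iris_position(eye, 30.0, 0.0))  # 30° to the right: (190.0, 120.0)
```

In a full system, yaw and pitch would come from a gaze planner (e.g., fixating a target detected by the stereo cameras), and the same parameterization lets iris size be varied independently of gaze direction, as in the experiments described above.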
Keywords: facial design, gaze reading, projector camera system
