Article published in: Interaction Studies, Vol. 19:3 (2018), pp. 487–498
Frustration in the face of the driver
A simulator study on facial muscle activity during frustrated driving
Published online: 13 March 2019
https://doi.org/10.1075/is.17005.ihm
Abstract
Frustration in traffic is one cause of aggressive driving. Knowing whether a driver is frustrated could allow future advanced driver assistance systems to counteract this source of crashes. One way to obtain this knowledge is to automatically recognize drivers' facial expressions. However, little is known about the facial expressions of frustrated drivers. Here, we report the results of a driving simulator study investigating the facial muscle activity that accompanies frustration. Twenty-eight participants were videotaped during frustrated and non-frustrated driving situations, and their facial muscle activity was manually coded according to the Facial Action Coding System. During frustrated driving, participants showed significantly more facial muscle activity in the mouth region. Recording facial muscle behavior thus potentially gives traffic researchers and assistance system developers a means to recognize frustration while driving.
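The study's analysis compares, within each participant, how often facial Action Units (AUs) are active during frustrated versus non-frustrated episodes. The following is a minimal sketch of that within-participant comparison; the participant labels, AU event counts, and helper function are invented for illustration and are not the study's data or code.

```python
# Hypothetical sketch: comparing facial Action Unit (AU) activity between
# frustrated and non-frustrated driving episodes, mirroring the study's
# within-participant design. All counts below are fabricated example data.

from statistics import mean

# Per-participant counts of mouth-region AU events observed during each
# condition (fabricated for illustration).
frustrated     = {"P1": 9, "P2": 7, "P3": 11, "P4": 6}
non_frustrated = {"P1": 3, "P2": 4, "P3": 5,  "P4": 2}

def paired_differences(cond_a, cond_b):
    """Within-participant differences (condition A minus condition B)."""
    return [cond_a[p] - cond_b[p] for p in sorted(cond_a)]

diffs = paired_differences(frustrated, non_frustrated)
print("mean difference:", mean(diffs))             # average extra AU events
print("all positive:", all(d > 0 for d in diffs))  # every participant shows more activity
```

In practice, a paired significance test (e.g. a Wilcoxon signed-rank test over these per-participant differences) would replace the simple mean shown here.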
Article outline
- Methods
- Participants
- Experimental set-up
- Experimental design and cover story
- Subjective ratings
- Video coding
- Results
- Ratings
- Activity in facial AUs
- Discussion
- Acknowledgements
- References
