Real-Time Emotion Recognition from Natural Bodily Expressions in Child-Robot Interaction
  • Authors: Weiyi Wang (16)
    Georgios Athanasopoulos (16)
    Georgios Patsis (16)
    Valentin Enescu (16)
    Hichem Sahli (16) (17)

    16. Department of Electronics and Informatics (ETRO) - AVSP, Vrije Universiteit Brussel (VUB), Brussels, Belgium
    17. Interuniversity Microelectronics Centre (IMEC), Heverlee, Belgium
  • Keywords: Spontaneous emotion recognition; Child-robot interaction; Bodily expressions
  • Journal: Lecture Notes in Computer Science
  • Publication year: 2015
  • Volume: 8927
  • Issue: 1
  • Pages: 424-435
  • Full-text size: 1,740 KB
  • Book title: Computer Vision - ECCV 2014 Workshops
  • ISBN: 978-3-319-16198-3
  • Journal category: Computer Science
  • Journal subjects: Artificial Intelligence and Robotics
    Computer Communication Networks
    Software Engineering
    Data Encryption
    Database Management
    Computation by Abstract Devices
    Algorithm Analysis and Problem Complexity
  • Publisher: Springer Berlin / Heidelberg
  • ISSN: 1611-3349
Abstract
Emotion perception and interpretation is one of the key desired capabilities of assistive robots, as it could greatly enhance the quality and naturalness of human-robot interaction. According to psychological studies, bodily communication plays an important role in human social behaviour. However, modelling such affective bodily expressions is very challenging, especially in a naturalistic setting, given the variety of expressive patterns and the difficulty of acquiring reliable data. In this paper, we investigate spontaneous dimensional emotion prediction in a child-robot interaction scenario. The paper presents the emotion elicitation procedure, data acquisition, 3D skeletal representation, feature design and machine learning algorithms. Experimental results show good predictive performance on the variation trends of the emotional dimensions, especially the arousal dimension.
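The abstract describes a pipeline from 3D skeletal data to continuous (dimensional) emotion estimates such as arousal. As an illustration only, the sketch below shows one generic way such a pipeline can be wired: window-level motion statistics computed from 3D joint positions are fed to a regressor that outputs a continuous arousal value. The feature set, window length, joint count and choice of regressor are all assumptions made for illustration, not the authors' actual design.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

def motion_features(window):
    """Illustrative features from a skeleton window of shape (T, J, 3):
    T frames, J joints, xyz coordinates (not the paper's descriptor set)."""
    vel = np.diff(window, axis=0)               # frame-to-frame joint displacements
    speed = np.linalg.norm(vel, axis=2)         # per-joint speed, shape (T-1, J)
    return np.concatenate([speed.mean(axis=0),  # mean speed per joint
                           speed.std(axis=0),   # speed variability per joint
                           speed.max(axis=0)])  # peak speed per joint

# Toy placeholder data: random skeleton windows with random arousal labels.
rng = np.random.default_rng(0)
X = np.stack([motion_features(rng.normal(size=(30, 15, 3))) for _ in range(50)])
y = rng.uniform(-1.0, 1.0, size=50)             # continuous arousal in [-1, 1]

model = GaussianProcessRegressor(kernel=RBF(length_scale=1.0), alpha=1e-2)
model.fit(X, y)
print(model.predict(X[:1]))                     # arousal estimate for one window
```

In a real-time setting, the same feature extraction would run on a sliding window over the incoming skeleton stream, with the regressor evaluated once per window to track the variation trend of the predicted dimension.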
