Background. As several studies have shown, music is associated with different emotions, and some of these studies suggest that music and speech share the same acoustic features for inducing and/or evoking emotions. Most studies rely on self-report and therefore on solely subjective data. Other studies also include objective measurements, such as skin conductance, to complement and support the subjective data.
Especially in cross-cultural studies, self-report can become difficult because concepts of emotion differ between cultures, as do norms about the appropriateness of showing these emotions. In addition, the words used to describe emotions are arbitrary (even when carefully translated) because of cultural and perhaps also individual differences.
As there is also strong evidence that the facial expressions of emotions are universal across cultures, I would like to suggest a new approach: studying emotions through the facial expressions of the listener.
Method. The idea is that music communicates emotions to the listener and that these emotions can be seen in the listener's face. With the help of facial EMG or face-analysis software based on the Facial Action Coding System (FACS), developed by Ekman and Friesen (1978), the muscle movements underlying certain facial expressions can be detected. Each emotion corresponds to a different combination of so-called action units, which could provide a way to decode the emotion perceived while listening to music. Even if there is no visible emotional expression, it may still be possible to detect so-called micro-expressions (emotional expressions that last only 1/25 to 1/15 of a second and are usually masked by another facial expression).
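The decoding step described above can be illustrated with a minimal sketch. The action-unit combinations below are simplified illustrations drawn from common FACS-based descriptions (e.g., happiness as AU6 "cheek raiser" plus AU12 "lip corner puller"); an actual study would rely on the full FACS manual and trained coders or validated software, not on this toy lookup.

```python
# Toy sketch: decode a basic emotion from a set of detected FACS action
# units (AUs). The prototype patterns are simplified illustrations, not
# the full FACS specification.
EMOTION_PROTOTYPES = {
    "happiness": {6, 12},      # cheek raiser + lip corner puller
    "sadness": {1, 4, 15},     # inner brow raiser + brow lowerer + lip corner depressor
    "surprise": {1, 2, 5, 26}, # brow raisers + upper lid raiser + jaw drop
}

def decode_emotion(detected_aus):
    """Return the first prototype emotion whose AU pattern is fully
    contained in the detected AUs, or None if nothing matches."""
    detected = set(detected_aus)
    for emotion, pattern in EMOTION_PROTOTYPES.items():
        if pattern <= detected:
            return emotion
    return None

# Example: AU6 + AU12 observed while the participant listens to music
print(decode_emotion([6, 12]))  # -> happiness
```

In practice the detected AUs would come from frame-by-frame EMG or video analysis, and micro-expressions would require sampling at well above 25 frames per second to capture expressions lasting 1/25 to 1/15 of a second.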
Goal. This project will partly contribute to my master's thesis and is still in development. Its goal is to find out whether facial expressions arise together with emotions perceived in music, and to discuss whether some kind of empathy can be seen as one of the underlying mechanisms that let us understand emotions in music.
Keywords: emotion, facial expressions, music perception
References
Eerola, T., & Vuoskoski, J. K. (2013): A review of music and emotion studies: Approaches, emotion models, and stimuli. Music Perception: An Interdisciplinary Journal, 30(3), 307-340.
Ekman, P., & Friesen, W. V. (1978): Facial Action Coding System. Palo Alto, CA: Consulting Psychologists Press.
Ekman, P. (1992): An argument for basic emotions. Cognition and Emotion, 6, 169-200.
Gabrielsson, A. (2002): Emotion perceived and emotion felt: Same or different? Musicae Scientiae, Special Issue 2001-2002, 123-147.
Gabrielsson, A., & Lindström, E. (2010): The influence of musical structure on emotional expression. In P. N. Juslin & J. A. Sloboda (Eds.), Handbook of music and emotion: Theory, research, applications (pp. 367-400). New York: Oxford University Press.
Ilie, G., & Thompson, W.F. (2006): A comparison of acoustic cues in music and speech for three dimensions of affect. Music Perception, 23, 319-329.
Juslin, P. N., & Laukka, P. (2004): Expression, perception, and induction of musical emotions: A review and a questionnaire study of everyday listening. Journal of New Music Research, 33, 217-238.
Juslin, P. N., Liljeström, S., Västfjäll, D., Barradas, G., & Silva, A. (2008): An experience sampling study of emotional reactions to music: Listener, music, and situation. Emotion, 8, 668-683.
Kohler et al. (2002): Hearing sounds, understanding action: Action representation in mirror neurons. Science, 297, 846-848.
Kreutz, G., Ott, U., Teichmann, D., Osawa, P., & Vaitl, D. (2008): Using music to induce emotions: Influences of musical preference and absorption. Psychology of Music, 36, 101-126.
Leman, M. (2008): Embodied Music Cognition and Mediation Theory. Cambridge, MA: The MIT Press.
Ricci-Bitti, P. O. (1989): The universality in facial expressions of emotion. In W. Schönpflug (Ed.), Bericht über den 36. Kongress der Deutschen Gesellschaft für Psychologie (pp. 332-343). Göttingen: Hogrefe.
Scherer, K. R., Banse, R., & Wallbott, H. G. (2001): Emotion inferences from vocal expression correlate across languages and cultures. Journal of Cross-Cultural Psychology, 32, 76-92.
Västfjäll, D. (2010): Indirect perceptual, cognitive, and behavioural measures. In P. N. Juslin & J. A. Sloboda (Eds.), Handbook of music and emotion: Theory, research, applications. New York: Oxford University Press.
Zentner, M., Grandjean, D., & Scherer, K. R. (2008): Emotions evoked by the sound of music: Differentiation, classification, and measurement. Emotion, 8, 494-521.