New research conducted by Durham University's Department of Music has found that certain "musical cues used by composers and performers are important contributors in shaping the emotional expression conveyed by music".
Interface
Researchers created an interactive computer interface called EmoteControl which allowed users to control six cues (tempo, pitch, articulation, dynamics, brightness, and mode) of a musical piece in real time.
The participants were then asked to show how they thought seven different emotions (sadness, calmness, joy, anger, fear, power, and surprise) should sound in music.
They did this by changing the musical cues in EmoteControl, allowing them to create their own variations of a range of music pieces that portrayed different emotions.
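As an illustration only (the internals of EmoteControl are not described in the article, so the parameter names, ranges, and defaults below are assumptions), the six adjustable cues could be modelled as a set of real-time playback parameters along these lines:

```python
from dataclasses import dataclass


@dataclass
class CueSettings:
    """Hypothetical representation of the six adjustable musical cues.

    The fields, ranges, and defaults are illustrative assumptions,
    not EmoteControl's actual parameters.
    """
    tempo_bpm: float = 100.0   # playback speed in beats per minute
    pitch_shift: int = 0       # transposition in semitones
    articulation: float = 0.5  # 0.0 = staccato ... 1.0 = legato
    dynamics: float = 0.5      # 0.0 = very soft ... 1.0 = very loud
    brightness: float = 0.5    # timbral filter, 0.0 = dark ... 1.0 = bright
    mode: str = "major"        # "major" or "minor"


def apply_cues(settings: CueSettings) -> None:
    """Placeholder for sending the cue values to a real-time playback engine."""
    print(f"Rendering with: {settings}")


# A participant slows the tempo and switches to minor mode while the piece plays.
current = CueSettings()
current.tempo_bpm = 60.0
current.mode = "minor"
apply_cues(current)
```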
Musical cues
In general, musical cues were used in a similar way to represent a specific emotion — conveying sadness in the music using a slow tempo, minor mode, soft dynamics, legato articulation, low pitch level, and a dark timbre.
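A minimal sketch of that sadness profile, expressed as a plain set of cue values (the specific numbers are assumptions for illustration, not the settings measured in the study):

```python
# Illustrative cue profile for sadness, following the description above:
# slow tempo, minor mode, soft dynamics, legato articulation,
# low pitch level, and a dark timbre.
sadness_profile = {
    "tempo_bpm": 55.0,    # slow tempo
    "mode": "minor",      # minor mode
    "dynamics": 0.2,      # soft dynamics (0.0 quiet ... 1.0 loud)
    "articulation": 0.9,  # legato (0.0 staccato ... 1.0 legato)
    "pitch_shift": -5,    # low pitch level, in semitones
    "brightness": 0.2,    # dark timbre (0.0 dark ... 1.0 bright)
}
```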
It was found that tempo and mode were the two cues that most strongly affected the emotion being conveyed, while dynamics and brightness had the least effect on shaping the different emotions in the music.
The researchers also found that sadness and joy were amongst the most accurately recognised emotions, which is consistent with previous studies.
Perceptions
Speaking about the research, Professor Tuomas Eerola of Durham University said: "This interactive approach allowed us to tap into the participants' perception of how different emotions should sound like in music and helped the participants create their own emotional variations of music that encompassed different emotional content."
It is claimed that the research and the EmoteControl interface could have implications for other sectors where emotional content is conveyed through music, such as sound branding (marketing), music in film and TV, and adaptive music in gaming, as well as potential use as a medium for communicating emotion in clinical settings.
Find out more
The full paper can be accessed at: https://journals.sagepub.com/doi/10.1177/20592043211061745
A video demonstration of EmoteControl is available at: https://www.youtube.com/watch?v=fP3tAMGaaZw
Image credit: Durham University