Emotional Speech Prosody Projects

with Joselyn Ho, Melissa Huynh, Suyeon Hwang, Charlie Chubb, and Greg Hickok

How and why do humans perceive emotions through music?

Music and speech share many acoustic qualities when conveying various emotions, suggesting a shared mechanism that evokes similar emotional responses in both systems. These projects explore whether one aspect of the tonal perception of music – the perceived difference between major and minor scales – is rooted in speech perception and production. Using the tone-scramble stimuli found here, we compare listeners’ performance on various prosodic speech perception and production tasks. This lets us directly test whether sensitivity to musical mode is correlated with sensitivity to emotion in speech.

Solena Mednicoff
PhD Candidate

My research interests include music cognition, neuroscience, and user engagement & analytics.
