The relationship between musical features and perceived emotion was investigated using continuous response methodology and time-series analysis. Sixty-seven participants responded to four pieces of Romantic music expressing different emotions. Responses were sampled once per second on a two-dimensional emotion space (happy-sad valence and aroused-sleepy arousal). The musical feature variables loudness, tempo, melodic contour, texture, and spectral centroid (related to perceived timbral sharpness) were coded. These variables were differenced and used as predictors in two univariate linear regression models, one for valence and one for arousal, for each of the four pieces. The models were further adjusted to correct for serial correlation, and they explained 33% to 73% of the variation in each perceived emotion dimension. Changes in loudness and tempo were positively associated with changes in arousal, with loudness dominant. Melodic contour varied positively with valence, though this finding was not conclusive. Texture and spectral centroid did not produce consistent predictions. The methodology permits a more ecologically valid investigation of emotion in music and, importantly for the present study, enabled approximate identification of the lag between musical features and perceived emotion. Responses occurred 1 to 3 s after a change in the causal musical event, with sudden changes in loudness producing response lags of zero (nearly instantaneous) to 1 s. Other findings, interactions, and ramifications of the methodology are also discussed.
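The analysis pipeline summarized in the abstract lends itself to a brief illustration. The Python sketch below is not the author's code; it uses synthetic 1 Hz series with hypothetical names (`loudness`, `tempo`, `arousal`) to show the general idea: first-difference the feature and response series, lag the features by a fixed number of seconds, and fit an ordinary least squares model of differenced arousal. The serial-correlation adjustments applied in the study are omitted here for brevity.

```python
# Minimal sketch of a differenced, lagged regression of continuous
# emotion responses on musical features. Synthetic data and variable
# names are illustrative assumptions, not the study's actual data.
import numpy as np

rng = np.random.default_rng(0)
n = 180  # e.g. a 3-minute excerpt sampled once per second

# Synthetic 1 Hz series standing in for coded musical features and the
# mean continuous arousal response.
loudness = rng.normal(size=n).cumsum()
tempo = rng.normal(size=n).cumsum()
arousal = 0.6 * loudness + 0.2 * tempo + rng.normal(scale=2.0, size=n)

def difference(x):
    """First-difference a series so the model predicts changes."""
    return np.diff(x)

def lagged_design(features, lag):
    """Stack differenced features, shifted back by `lag` seconds."""
    cols = [difference(f)[: len(f) - 1 - lag] for f in features]
    return np.column_stack([np.ones(len(cols[0]))] + cols)

lag = 2  # assume a 2 s response lag, within the 1-3 s range reported
X = lagged_design([loudness, tempo], lag)
y = difference(arousal)[lag:]

# Ordinary least squares on the differenced, lagged series.
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
pred = X @ coef
r_squared = 1 - np.sum((y - pred) ** 2) / np.sum((y - y.mean()) ** 2)
print(f"coefficients (intercept, loudness, tempo): {coef}")
print(f"R^2 of differenced-arousal model: {r_squared:.2f}")
```

Comparing fits across candidate values of `lag` would be one simple way to approximate a response lag of the kind reported in the abstract.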
Research Article | June 2004
Modeling Perceived Emotion With Continuous Musical Features
Emery Schubert
School of Music and Music Education, University of New South Wales, Sydney NSW 2052, Australia; e-mail: E.Schubert@unsw.edu.au
Music Perception (2004) 21 (4): 561–585.
Citation
Emery Schubert; Modeling Perceived Emotion With Continuous Musical Features. Music Perception 1 June 2004; 21 (4): 561–585. doi: https://doi.org/10.1525/mp.2004.21.4.561