Recently, Vuoskoski, Thompson, Clarke, and Spence (2014) demonstrated that visual kinematic performance cues may contribute more than auditory performance cues to observers’ ratings of expressivity in audiovisual excerpts of piano playing, and that visual kinematic performance cues had crossmodal effects on the perception of auditory expressivity. The present study was designed to extend these findings and to provide additional information about the roles of sight and sound in the perception and experience of musical performance. Experiment 1 investigated the relative contributions of auditory and visual kinematic performance features to participants’ subjective emotional reactions evoked by piano performances, while Experiment 2 explored the effect of visual kinematic cues on the perception of loudness and tempo variability. Experiment 1 revealed that visual performance cues appear to be just as important as auditory performance cues for the observer’s subjective emotional reaction, highlighting the importance of non-auditory cues for music-induced emotions. The results of Experiment 2 revealed that visual kinematic cues affected ratings of loudness variability, but not ratings of tempo variability.
Listening to music makes us move in various ways. Several factors can affect the characteristics of these movements, including individual factors and musical features. Music-induced movement may also be shaped by the emotional content of the music, since emotions are an important element of musical expression. This study investigates possible relationships between emotional characteristics of music and music-induced, quasi-spontaneous movement. We recorded the music-induced movement of 60 individuals and computationally extracted features from the movement data. Additionally, the emotional content of the stimuli was assessed in a perceptual experiment. A subsequent correlational analysis revealed characteristic movement features for each emotion, suggesting that the body reflects emotional qualities of music. The results show similarities to the movements of professional musicians and dancers, and to emotion-specific nonverbal behavior in general, and could furthermore be linked to notions of embodied music cognition. The valence and arousal ratings were subsequently projected onto polar coordinates to further investigate connections between the emotions of Russell’s (1980) circumplex model and the movement features.
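The projection of valence and arousal ratings onto polar coordinates can be sketched as follows. This is a minimal illustration on hypothetical rating values, not the study's data: in polar form, the radius reflects overall emotional intensity and the angle locates the emotion on the circumplex.

```python
import math

# Hypothetical mean (valence, arousal) ratings on a -1..1 scale for
# three illustrative stimuli; names and values are invented here.
ratings = {
    "happy_excerpt": (0.8, 0.6),
    "sad_excerpt":   (-0.6, -0.4),
    "tense_excerpt": (-0.5, 0.7),
}

def to_polar(valence, arousal):
    """Project a (valence, arousal) point onto polar coordinates:
    radius = emotional intensity, angle = position on the circumplex
    (0 deg = pure positive valence, 90 deg = pure high arousal)."""
    radius = math.hypot(valence, arousal)
    angle = math.degrees(math.atan2(arousal, valence)) % 360
    return radius, angle

polar = {name: to_polar(v, a) for name, (v, a) in ratings.items()}
```

A point in the upper-right quadrant (positive valence, high arousal) lands near 45 degrees, while a low-valence, low-arousal point lands in the 180–270 degree range, so angular distance between stimuli can be compared directly.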
The visual channel has been shown to be more informative than the auditory channel in perceptual judgments of a performer's level of expression. Previous work has revealed a positive relationship between the amplitude of music-related movement and ratings of expression, for example, and observers have been shown to be sensitive to kinematic features of music-related movement. In this study, we investigate relationships between the kinematics of conductors' expressive gestures and ratings of perceived expression. Point-light representations (totalling 10 minutes) of two professional conductors were presented to participants, who provided continuous ratings of perceived valence, activity, power, and overall expression using a virtual slider interface. Relationships between these ratings and 11 kinematic variables computationally extracted from the movement data were subsequently examined using linear regression. Higher levels of expressivity were found to be conveyed by gestures characterized by increased amplitude, greater variance, and higher speed of movement.
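The regression step described above, relating continuous expression ratings to kinematic variables, can be sketched with ordinary least squares. The data below are simulated stand-ins (three kinematic predictors rather than the study's 11, with invented weights), intended only to show the shape of the analysis:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in data: 200 time frames of 3 kinematic variables
# (amplitude, variance, speed); the real study extracted 11 variables
# from point-light motion capture.
n_frames = 200
kinematics = rng.normal(size=(n_frames, 3))

# Simulated continuous expressivity ratings that, as in the study's
# findings, increase with amplitude, variance, and speed (plus noise).
true_weights = np.array([0.6, 0.3, 0.5])
ratings = kinematics @ true_weights + rng.normal(scale=0.1, size=n_frames)

# Ordinary least-squares fit: ratings ~ intercept + kinematic variables.
X = np.column_stack([np.ones(n_frames), kinematics])
coeffs, *_ = np.linalg.lstsq(X, ratings, rcond=None)
intercept, weights = coeffs[0], coeffs[1:]
```

Positive fitted weights for amplitude, variance, and speed would correspond to the reported pattern: larger, more variable, faster gestures accompany higher perceived expressivity.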
Listening to music is often associated with spontaneous body movements, frequently synchronized with the music's periodic structure. The notion of embodied cognition assumes that intelligent behavior does not emerge from mere passive perception, but requires goal-directed interactions between the organism and its environment. According to this view, one could postulate that we may use our bodily movements to help parse the metric structure of music. The aim of this study was to investigate how pulsations on different metrical levels manifest in music-induced movement. Musicians were presented with a piece of instrumental music in 4/4 time, played at four different tempi ranging from 92 to 138 bpm. Participants were instructed to move to the music, and their movements were recorded with a high-quality optical motion capture system. Subsequently, signal processing methods and principal components analysis were applied to extract movement primitives synchronized with different metrical levels. We found differences between metric levels in terms of the prevalence of synchronized eigenmovements. For instance, mediolateral movements of the arms were found to be frequently synchronized with the tactus-level pulse, while rotation and lateral flexion of the upper torso were commonly found to exhibit periods of two and four beats, respectively. The results imply that periodicities on several metric levels are simultaneously present in music-induced movement. This could suggest that the metric structure of music is encoded in these movements.
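The PCA-based extraction of movement primitives and their metrical periodicities can be sketched on toy data. This is a minimal illustration, not the study's pipeline: four synthetic "marker" channels stand in for motion-capture trajectories, with one pair oscillating at an assumed beat rate and the other at half that rate; PCA (via SVD) recovers the two oscillatory primitives, and a spectral peak picker estimates each one's period.

```python
import numpy as np

rng = np.random.default_rng(1)

fps = 120        # assumed motion-capture frame rate (illustrative)
beat_hz = 2.0    # tactus pulse at 120 bpm (illustrative)
t = np.arange(0, 10, 1 / fps)

# Toy marker data: two channels oscillating at the beat rate (larger
# amplitude) and two at half the beat rate (a two-beat period),
# plus noise -- a stand-in for real mocap trajectories.
data = np.column_stack([
    1.5 * np.sin(2 * np.pi * beat_hz * t),
    1.5 * np.cos(2 * np.pi * beat_hz * t),
    np.sin(2 * np.pi * beat_hz / 2 * t),
    np.cos(2 * np.pi * beat_hz / 2 * t),
]) + rng.normal(scale=0.05, size=(len(t), 4))

# PCA via SVD of the mean-centred data: each right-singular vector is
# a movement primitive (eigenmovement); projecting the data onto it
# gives that primitive's time course.
centred = data - data.mean(axis=0)
U, S, Vt = np.linalg.svd(centred, full_matrices=False)
scores = centred @ Vt.T   # time courses of the components

def dominant_freq(signal, fs):
    """Frequency (Hz) of the largest spectral peak."""
    spectrum = np.abs(np.fft.rfft(signal - signal.mean()))
    freqs = np.fft.rfftfreq(len(signal), 1 / fs)
    return freqs[spectrum.argmax()]

freqs = [dominant_freq(scores[:, k], fps) for k in range(4)]
```

Here the leading components track the beat-rate oscillation and the later ones the two-beat period, mirroring the idea that eigenmovements synchronized with several metrical levels coexist in the same movement data.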