Music elicits profound emotions; however, the time-course of these emotional responses during listening sessions is unclear. We investigated the length of time required for participants to initiate emotional responses ("integration time") to 138 musical samples from a variety of genres by monitoring their real-time continuous ratings of the emotional content and arousal level of the musical excerpts (made using a joystick). On average, participants required 8.31 s (SEM = 0.10) of music before initiating emotional judgments. Additionally, we found that: 1) integration time depended on the familiarity of the songs; 2) soul/funk, jazz, and classical genres were more quickly assessed than other genres; and 3) musicians' responses did not differ significantly from those of listeners with minimal instrumental musical experience. Results were partially explained by the tempo of the musical stimuli and suggest that decisions regarding musical structure, as well as prior knowledge and musical preference, are involved in the emotional response to music.
Several algorithms for finding the tonal center of a musical context are extant in the literature. For use in interactive music systems, we are interested in algorithms that are fast enough to run in real time and that need only make reference to the material as it appears in sequence. In this article, I examine a number of such algorithms and the ways in which their contribution to real-time algorithmic listening can be bolstered by reference to concurrent analyzers working on other tasks. Though as part of the discussion I review my own key finder, the focus here is on the coordination of published methods using control structures for multiprocess analysis and their application in performance.
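To make the class of algorithm under discussion concrete, here is a minimal sketch of one well-known published key finder of the kind such a survey covers: the Krumhansl-Schmuckler profile-correlation method. The profile values are the published Krumhansl-Kessler probe-tone ratings; the 12-bin pitch-class histogram is assumed to be accumulated incrementally from the material as it arrives (that streaming bookkeeping is not shown), which is what makes the approach usable in real time.

```python
# Sketch of a Krumhansl-Schmuckler-style key finder: correlate a pitch-class
# duration histogram against rotated major/minor key profiles and pick the
# best of the 24 candidate keys. Profile values are the Krumhansl-Kessler
# probe-tone ratings.

MAJOR_PROFILE = [6.35, 2.23, 3.48, 2.33, 4.38, 4.09,
                 2.52, 5.19, 2.39, 3.66, 2.29, 2.88]
MINOR_PROFILE = [6.33, 2.68, 3.52, 5.38, 2.60, 3.53,
                 2.54, 4.75, 3.98, 2.69, 3.34, 3.17]

NOTE_NAMES = ["C", "C#", "D", "D#", "E", "F",
              "F#", "G", "G#", "A", "A#", "B"]

def pearson(x, y):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = (sum((a - mx) ** 2 for a in x) *
           sum((b - my) ** 2 for b in y)) ** 0.5
    return num / den

def estimate_key(histogram):
    """Return (tonic, mode) whose profile best correlates with the histogram.

    histogram: 12 durations/counts indexed by pitch class (C = 0 ... B = 11).
    """
    best = None
    for tonic in range(12):
        # Rotate so the candidate tonic lines up with profile position 0.
        rotated = histogram[tonic:] + histogram[:tonic]
        for mode, profile in (("major", MAJOR_PROFILE),
                              ("minor", MINOR_PROFILE)):
            r = pearson(rotated, profile)
            if best is None or r > best[0]:
                best = (r, NOTE_NAMES[tonic], mode)
    return best[1], best[2]

# Usage: a histogram weighted toward the C major scale, tonic emphasized.
hist = [4, 0, 2, 0, 2, 2, 0, 3, 0, 2, 0, 1]
print(estimate_key(hist))  # → ('C', 'major')
```

Because the method needs only a running histogram, it can be re-evaluated after each incoming note; the article's point is that such a core analyzer becomes more reliable when its verdict is coordinated with other concurrent analyzers (e.g., of chords or meter) rather than used in isolation.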