Composers commonly use major or minor scales to create different moods in music. Nonmusicians show poor discrimination and classification of this musical dimension; however, they can perform these tasks if the decision is phrased as happy vs. sad. We created pairs of melodies identical except for mode; the first major or minor third or sixth was the critical note that distinguished major from minor mode. Musicians and nonmusicians judged each melody as major vs. minor or happy vs. sad. We collected ERP waveforms, time-locked to the onset of the critical note. Musicians showed a late positive component (P3) to the critical note only for the minor melodies, in both tasks. Nonmusicians could adequately classify the melodies as happy or sad but showed little evidence of processing the critical information. Major appears to be the default mode in music, and musicians and nonmusicians apparently process mode differently.
Research Article | 1 February 2008
An ERP Study of Major-Minor Classification in Melodies
Andrea R. Halpern;
Jeffrey S. Martin;
Tara D. Reed
Music Perception (2008) 25 (3): 181–191.
Andrea R. Halpern, Jeffrey S. Martin, Tara D. Reed; An ERP Study of Major-Minor Classification in Melodies. Music Perception 1 February 2008; 25 (3): 181–191. doi: https://doi.org/10.1525/mp.2008.25.3.181