Keywords: attention (12 results)
Journal Articles
Music Perception (2020) 38 (1): 78–98.
Published: 09 September 2020
Abstract
Consonance and dissonance are basic phenomena in the perception of chords that can be discriminated very early in sensory processing. Musical expertise has been shown to facilitate neural processing of various musical stimuli, but it is unclear whether this applies to detecting consonance and dissonance. Our study aimed to determine whether sensitivity to increasing levels of dissonance differs between musicians and nonmusicians, using a combination of neural (electroencephalographic mismatch negativity, MMN) and behavioral measurements (conscious discrimination). Furthermore, we wanted to see whether focusing attention on the sounds modulated the neural processing. We used chords composed of either highly consonant or highly dissonant intervals and further manipulated the degree of dissonance to create two levels of dissonant chords. Both groups discriminated dissonant chords from consonant ones neurally and behaviorally. The magnitude of the MMN differed only marginally between the more dissonant and the less dissonant chords. The musicians outperformed the nonmusicians in the behavioral task. As the dissonant chords elicited MMN responses in both groups, sensory dissonance seems to be discriminated at an early sensory level irrespective of musical expertise, and the facilitating effects of musicianship on this discrimination may arise at later stages of auditory processing, appearing only in the behavioral auditory task.
Journal Articles
Music Perception (2020) 37 (4): 263–277.
Published: 11 March 2020
Abstract
The aim of the present study was to investigate whether the perception of time is affected by actively attending to different metrical levels in musical rhythmic patterns. In an experiment with a repeated-measures design, musicians and nonmusicians were presented with musical rhythmic patterns played at three different tempi. They synchronized with multiple metrical levels (half notes, quarter notes, eighth notes) of these patterns using a finger-tapping paradigm and, in a separate condition, listened without tapping. After each trial, stimulus duration was judged using a verbal estimation paradigm. Results show that the metrical level participants synchronized with influenced perceived time: actively attending to a higher metrical level (half notes, longer intertap intervals) led to the shortest time estimations, hence time was experienced as passing more quickly. Listening without tapping led to the longest time estimations. The faster the tempo of the patterns, the longer the time estimation. While there were no differences between musicians and nonmusicians, participants who tapped more consistently and accurately (as analyzed by circular statistics) estimated durations to be shorter. Thus, attending to different metrical levels in music, by deliberately directing attention and motor activity, affects time perception.
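The circular-statistics index of tapping consistency and accuracy mentioned above can be sketched in a few lines. This is a minimal illustration, not the authors' analysis code; the 500 ms beat interval, the simulated tap times, and the function name are hypothetical.

```python
import numpy as np

def circular_tapping_stats(tap_times, beat_times, beat_interval):
    """Summarize tap-to-beat synchronization with circular statistics.

    Each tap's asynchrony is mapped to a phase angle on the beat cycle.
    The mean resultant vector length R (0-1) indexes consistency; the
    circular mean angle, converted back to time, indexes accuracy.
    """
    asynchronies = np.asarray(tap_times) - np.asarray(beat_times)
    phases = 2 * np.pi * asynchronies / beat_interval
    mean_vector = np.mean(np.exp(1j * phases))
    consistency = np.abs(mean_vector)                    # R: 1 = perfectly consistent
    mean_asynchrony = np.angle(mean_vector) * beat_interval / (2 * np.pi)
    return consistency, mean_asynchrony

# Hypothetical data: taps around a 500 ms beat, slightly anticipating it
beats = np.arange(0, 5000, 500, dtype=float)
taps = beats + np.random.normal(-20, 15, size=beats.size)
R, accuracy = circular_tapping_stats(taps, beats, 500)
print(f"consistency R = {R:.3f}, mean asynchrony = {accuracy:.1f} ms")
```

Higher R (closer to 1) corresponds to more consistent tapping, which the study links to shorter duration estimates.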
Journal Articles
Music Perception (2018) 36 (2): 156–174.
Published: 01 December 2018
Abstract
The perception of speech in noise is challenging for children with cochlear implants (CIs). Singing and musical instrument playing have been associated with improved auditory skills in normal-hearing (NH) children. Therefore, we assessed how children with CIs who sing informally develop in the perception of speech in noise compared to those who do not. We also sought evidence of links between speech perception in noise and MMN and P3a brain responses to musical sounds, and studied effects of age and changes over a 14–17 month period in the speech-in-noise performance of children with CIs. Compared to the NH group, the entire CI group was less tolerant of noise in speech perception, but both groups improved similarly. The CI singing group showed better speech-in-noise perception than the CI non-singing group. The perception of speech in noise in children with CIs was associated with the amplitude of the MMN to a change of sound from piano to cymbal and, in the CI singing group only, with an earlier P3a for changes in timbre. While our results cannot address causality, they suggest that singing and musical instrument playing may have the potential to enhance the perception of speech in noise in children with CIs.
Journal Articles
Music Perception (2016) 34 (2): 152–166.
Published: 01 December 2016
Ben Duane
Abstract
This study examines the difference between prominent and non-prominent lines (e.g., melodies and accompaniments). After reviewing research suggesting that lines with few repeating patterns would readily capture attention, the hypothesis that prominent lines tend to be less repetitive is tested. Various probabilistic models are used to quantify the repetitiveness of lines from three musical corpora—two containing Classical string quartets, one including songs by the Beatles. The results suggest that notes from prominent lines tend to have lower probability. This trend, along with others found in the corpora, is consistent with the hypothesis that prominent lines are generally less repetitive.
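The per-note probability measure at the core of this abstract can be approximated with a very simple n-gram model. The sketch below is only illustrative and much simpler than the probabilistic models used in the study; the pitch sequences, the add-one smoothing, and the practice of estimating the model from the line itself are assumptions made for brevity.

```python
from collections import Counter, defaultdict

def mean_note_probability(pitches, order=1):
    """Mean per-note probability of a line under an n-gram model estimated
    from the line itself (add-one smoothing). Repetitive lines score higher;
    less predictable, melody-like lines score lower."""
    contexts = defaultdict(Counter)
    for i in range(order, len(pitches)):
        contexts[tuple(pitches[i - order:i])][pitches[i]] += 1
    vocab_size = len(set(pitches))
    probs = []
    for i in range(order, len(pitches)):
        counts = contexts[tuple(pitches[i - order:i])]
        probs.append((counts[pitches[i]] + 1) / (sum(counts.values()) + vocab_size))
    return sum(probs) / len(probs)

# Hypothetical MIDI pitch sequences
melody = [60, 62, 64, 65, 67, 65, 64, 62, 60, 72, 71, 69]   # varied line
accompaniment = [48, 52, 55] * 4                             # repeating figure
print(mean_note_probability(melody), mean_note_probability(accompaniment))
```

Under this toy model the repeating accompaniment figure receives a higher mean note probability than the varied melody, mirroring the corpus finding that prominent lines tend to have lower-probability notes.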
Journal Articles
Music Perception (2016) 33 (5): 561–570.
Published: 01 June 2016
Martin Norgaard; Samantha N. Emerson; Kimberly Dawn; James D. Fidlon
Abstract
A growing body of research suggests that jazz musicians concatenate stored auditory and motor patterns during improvisation. We hypothesized that this mechanism allows musicians to focus attention more flexibly during improvisation; for example, on interaction with other ensemble members. We tested this idea by analyzing the frequency of repeated melodic patterns in improvisations by artist-level pianists forced to attend to a secondary, unrelated counting task. Indeed, we found that, compared to their own improvisations performed in a baseline control condition, participants used significantly more repeated patterns when their attention was focused on the secondary task. This main effect was independent of whether participants played in a familiar or unfamiliar key and held true across various measures of pattern use.
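One rough way to operationalize "more repeated patterns" is to count how often fixed-length pitch-interval patterns recur within a single improvisation. The sketch below is a simplified stand-in, not the study's actual pattern analysis; the pattern length, the interval encoding, and the example note sequences are hypothetical.

```python
from collections import Counter

def repeated_pattern_rate(pitches, length=4):
    """Fraction of pitch-interval n-grams that occur more than once in the
    same note sequence; a crude index of how much material is reused."""
    intervals = [b - a for a, b in zip(pitches, pitches[1:])]
    ngrams = [tuple(intervals[i:i + length]) for i in range(len(intervals) - length + 1)]
    if not ngrams:
        return 0.0
    counts = Counter(ngrams)
    return sum(1 for g in ngrams if counts[g] > 1) / len(ngrams)

# Hypothetical MIDI pitches from two short improvised lines
baseline = [60, 63, 65, 66, 67, 70, 72, 69, 67, 66, 63, 60, 58, 61]
dual_task = [60, 63, 65, 67, 60, 63, 65, 67, 60, 63, 65, 67, 70, 72]
print(repeated_pattern_rate(baseline), repeated_pattern_rate(dual_task))
```

A higher rate for the second, more formulaic line illustrates the direction of the reported effect.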
Journal Articles
Music Perception (2016) 33 (3): 306–318.
Published: 01 February 2016
Abstract
Repetition and novelty are essential components of tonal music. Previous research suggests that the degree of repetitiveness of a line can determine its relative melodicity within a musical texture. Concordantly, musical accompaniments tend to be highly repetitive, probably facilitating listeners’ tendency to focus on and follow the melodic lines they support. With the aim of contributing to the unexplored area of the relationship between repetition and attention in polyphonic music listening, this paper presents an empirical investigation of the way listeners attend to exact and immediate reiterations of musical fragments in two-part contrapuntal textures. Participants heard original excerpts composed of a repetitive and a nonrepetitive part, continuously rating the relative prominence of the two voices. The results indicate that the line consisting of immediate and exact repetitions of a short musical fragment tends to decrease in perceptual salience for the listener. This suggests that musical repetition plays a significant role in dynamically shaping listeners’ perceptions of musical texture by affecting the relative perceived importance of simultaneous parts.
Journal Articles
Music Perception (2015) 33 (1): 70–82.
Published: 01 September 2015
Abstract
Albert Bregman’s (1990) book Auditory Scene Analysis: The Perceptual Organization of Sound has had a tremendous impact on research in auditory neuroscience. Here, we outline some of the accomplishments. This review is not meant to be exhaustive; rather, it aims to highlight milestones in the brief history of auditory neuroscience. The steady increase in neuroscience research following the book’s pivotal publication has advanced knowledge about how the brain forms representations of auditory objects. This research has far-reaching societal implications for health and quality of life. For instance, it has helped us understand why some people experience difficulties understanding speech in noise, which in turn has led to the development of therapeutic interventions. Importantly, the book acts as a catalyst, providing scientists with a common conceptual framework for research in such diverse fields as speech perception, music perception, neurophysiology, and computational neuroscience. This interdisciplinary approach to research in audition is one of this book’s legacies.
Journal Articles
Music Perception (2013) 30 (4): 369–390.
Published: 01 April 2013
Abstract
The simultaneous presence of different meters is not uncommon in Western art music and the music of various non-Western cultures. However, it is unclear how listeners and performers deal with this situation, and whether it is possible to cognitively establish and maintain different beats simultaneously without integrating them into a single metric framework. The present study is an attempt to address this issue empirically. Two rhythms, distinguished by pitch register and representing different meters (2/4 and 6/8), were presented simultaneously in various phase relationships, and participants (who were classically trained musicians) had to judge whether a probe fell on the beat in one or both rhythms. In a selective attention condition, they had to attend to one rhythm and to ignore the other, whereas in a divided attention condition, they had to attend to both. In Experiment 1, participants performed significantly better in the divided attention condition than predicted if they had been able to attend to only one rhythm at a time. In Experiments 2 and 3, however, which used more complex combinations of rhythms, performance did not differ significantly from chance. These results suggest that in Experiment 1 participants relied on the composite beat pattern (i.e., a nonisochronous sequence corresponding to the serial ordering of the two underlying beats) rather than tracking the two beats independently, while in Experiments 2 and 3, the level of complexity of the composite beat pattern may have prevented participants from tracking both beats simultaneously.
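The composite beat pattern invoked in this interpretation is simply the serial merge of the two beat trains. A minimal sketch, assuming hypothetical beat periods expressed in eighth-note units (a quarter-note beat spans 2 eighths in 2/4, a dotted-quarter beat spans 3 eighths in 6/8); the function and values are illustrative only.

```python
def composite_beat_pattern(period_a, period_b, length, phase_b=0):
    """Merge two isochronous beat trains (periods in eighth-note units) into
    the single nonisochronous onset sequence a listener could track instead
    of maintaining the two beats independently."""
    beats_a = set(range(0, length + 1, period_a))
    beats_b = set(range(phase_b, length + 1, period_b))
    return sorted(beats_a | beats_b)

# In-phase quarter-note vs. dotted-quarter beats over 12 eighth notes
print(composite_beat_pattern(2, 3, 12))               # [0, 2, 3, 4, 6, 8, 9, 10, 12]
# Shifting one beat train changes the composite, as in the phase manipulations
print(composite_beat_pattern(2, 3, 12, phase_b=1))
```

The merged sequence is nonisochronous, so tracking it is a plausible strategy only while it remains simple, consistent with the failure observed for the more complex rhythm combinations.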
Journal Articles
Music Perception (2012) 29 (4): 377–385.
Published: 01 April 2012
Abstract
Although music's repetitiveness has been a perennial topic of theoretical and philosophical interest, we know surprisingly little about the psychological processes underlying it. As one step in the larger enterprise of examining the psychology of musical repetition, a preliminary question addresses repetition detection: Which repetitions are listeners able to identify as such, and how does this ability change across repeated exposures of the same work? In this study, participants with minimal formal training heard short excerpts and were instructed to press a button whenever they heard something from earlier in the piece repeat. Additional exposures facilitated repetition detection for long units, but impaired repetition detection for short ones, exposing an attentional shift toward larger temporal spans across multiple hearings.
Journal Articles
Music Perception (2011) 29 (2): 133–146.
Published: 01 December 2011
Abstract
Human hearing depends on a combination of cognitive and sensory processes that function by means of an interactive circuitry of bottom-up and top-down neural pathways, extending from the cochlea to the cortex and back again. Given that similar neural pathways are recruited to process sounds related to both music and language, it is not surprising that the auditory expertise gained over years of consistent music practice fine-tunes the human auditory system in a comprehensive fashion, strengthening neurobiological and cognitive underpinnings of both music and speech processing. In this review we argue not only that common neural mechanisms for speech and music exist, but that experience in music leads to enhancements in sensory and cognitive contributors to speech processing. Of specific interest is the potential for music training to bolster neural mechanisms that undergird language-related skills, such as reading and hearing speech in background noise, which are critical to academic progress, emotional health, and vocational success.
Journal Articles
Music Perception (2009) 26 (4): 321–334.
Published: 01 April 2009
Abstract
This explorative study investigates the perception of clash of keys in music. It is a replication and extension of an earlier study by Wolpert (2000) on the perception of a harmonic (bitonal) manipulation of a melody and accompaniment. We investigated (a) how reliable results were, (b) how results would change if listeners' attention changed from nondirected (NDL) to directed listening (DL), and (c) whether the perception of clash of keys is influenced by the musical style of the particular composition. Participants included 101 expert listeners and 147 nonexpert listeners who evaluated music of four different styles in two versions each (original and with a pitch difference of 200 cents between melody and accompaniment). On the whole, expert listeners noticed the clash of keys significantly more often than did nonexperts (NDL: 49.30% vs. 9.30%; DL: 78.00% vs. 46.90%). For NDL, the perception of clash of keys differed between musical styles and decreased from classical to rock 'n' roll and from pop to jazz. Differences in responses are mainly explained by acculturation effects (listening expertise, attention, musical style, and familiarity with the particular piece).
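The bitonal manipulation described (a 200-cent offset between melody and accompaniment) corresponds to a fixed frequency ratio. A small conversion sketch with hypothetical values:

```python
def cents_to_ratio(cents):
    """Convert an interval in cents to a frequency ratio (1200 cents = one octave)."""
    return 2 ** (cents / 1200)

print(cents_to_ratio(200))           # ~1.1225, an equal-tempered whole tone
print(440 * cents_to_ratio(200))     # an A4 (440 Hz) melody note shifted by 200 cents: ~493.9 Hz
```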
Journal Articles
Music Perception (2009) 26 (4): 377–386.
Published: 01 April 2009
Abstract
Beat and meter induction are considered important structuring mechanisms underlying the perception of rhythm. Meter comprises two or more levels of hierarchically ordered regular beats with different periodicities. When listening to music, adult listeners weight events within a measure in a hierarchical manner. We tested whether listeners without advanced music training form such hierarchical representations for a rhythmical sound sequence under different attention conditions (Attend, Unattend, and Passive). Participants detected occasional weakly and strongly syncopated rhythmic patterns within the context of a strictly metrical rhythmical sound sequence. Detection performance was better and faster when syncopation occurred in a metrically strong as compared to a metrically weaker position. Compatible electrophysiological differences (earlier and higher-amplitude MMN responses) were obtained when participants did not attend to the rhythmical sound sequences. These data indicate that hierarchical representations for rhythmical sound sequences are formed preattentively in the human auditory system.
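The hierarchical weighting of events within a measure can be made concrete with a small grid of metrical weights, in which a position's weight is the number of beat levels it belongs to. A sketch under assumed values (a 4/4 measure subdivided into eighth notes); the levels and the weighting scheme are illustrative rather than taken from the study.

```python
def metrical_weights(positions=8, levels=(1, 2, 4, 8)):
    """Assign each grid position in one measure a weight equal to the number
    of metrical levels (whole, half, quarter, eighth) whose beats fall on it."""
    weights = [0] * positions
    for beats_per_measure in levels:
        step = positions // beats_per_measure
        for pos in range(0, positions, step):
            weights[pos] += 1
    return weights

# Strong-weak profile for a 4/4 measure in eighth notes
print(metrical_weights())   # [4, 1, 2, 1, 3, 1, 2, 1]
```

On such a grid, a syncopated pattern is one whose events avoid the high-weight positions; the study's result is that deviations at strong (high-weight) positions are detected better and faster.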