Eduardo Coutinho (1-3 of 3 results)
Journal Articles
Music Perception (2018) 35 (3): 376–399.
Published: 01 February 2018
Abstract
Music engagement is complex and is influenced by music training, capacity, preferences, and motivations. A multi-modular self-report instrument (the Music Use and Background Questionnaire, or MUSEBAQ) was developed to measure a diverse set of music engagement constructs. Based on earlier work, a hybrid approach of exploratory and confirmatory analyses was conducted across a series of three independent studies to establish reliability and validity of the modular tool. Module 1 (Musicianship) provides a brief assessment of formal and informal music knowledge and practice. Module 2 (Musical capacity) measures emotional sensitivity to music, listening sophistication, music memory and imagery, and personal commitment to music. Module 3 (Music preferences) captures preferences from six broad genres and utilizes adaptive reasoning to selectively expand subgenres when administered online. Module 4 (Motivations for music use) assesses musical transcendence, emotion regulation, social, and musical identity and expression. The MUSEBAQ offers researchers and practitioners a comprehensive, modular instrument that can be used in whole or by module, as required, to capture an individual’s level of engagement with music and to serve as a background questionnaire to measure and interpret the effects of dispositional differences in emotional reactions to music.
Includes: Supplementary data
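Module 3's online branching can be pictured as a simple rule: only the broad genres a respondent rates highly trigger follow-up subgenre items. The sketch below is purely illustrative; the genre lists, rating scale, and threshold are invented and are not the MUSEBAQ's actual item bank or branching logic.

```python
# Hypothetical sketch of adaptive subgenre expansion in an online questionnaire.
# The genre/subgenre lists and the selection threshold are illustrative only.

SUBGENRES = {
    "Rock": ["Classic rock", "Punk", "Metal"],
    "Jazz": ["Bebop", "Swing", "Fusion"],
    "Classical": ["Baroque", "Romantic", "Contemporary"],
}

def follow_up_items(genre_ratings: dict, threshold: int = 4) -> list:
    """Return subgenre items only for broad genres rated at or above `threshold`
    on a 1-5 preference scale, so low-preference branches are never shown."""
    items = []
    for genre, rating in genre_ratings.items():
        if rating >= threshold:
            items.extend(SUBGENRES.get(genre, []))
    return items

# Example: a respondent who rates Jazz highly is shown only jazz subgenres.
print(follow_up_items({"Rock": 2, "Jazz": 5, "Classical": 3}))
# ['Bebop', 'Swing', 'Fusion']
```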
Journal Articles
Music Perception (2017) 34 (4): 371–386.
Published: 01 April 2017
Abstract
The systematic study of music-induced emotions requires standardized measurement instruments to reliably assess the nature of affective reactions to music, which tend to go beyond garden-variety basic emotions. We describe the development and conceptual validation of a checklist for rapid assessment of music-induced affect, designed to extend and complement the Geneva Emotional Music Scale. The checklist contains a selection of affect and emotion categories that are frequently used in the literature to refer to emotional reactions to music. The development of the checklist focused on an empirical investigation of the semantic structure of the relevant terms, combined with fuzzy classes based on a series of hierarchical cluster analyses. Two versions of the checklist for assessing the intensity and frequency of affective responses to music are proposed.
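The clustering step mentioned above can be illustrated with a generic hierarchical cluster analysis of affect terms. This is a hedged sketch, not the authors' procedure: the terms, the simulated rating matrix, the distance metric, and the three-cluster cut are all assumptions made for the example.

```python
# Illustrative hierarchical cluster analysis over affect terms.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

terms = ["joyful", "tender", "nostalgic", "tense", "sad", "powerful"]
rng = np.random.default_rng(0)
# Stand-in data: rows = raters, columns = affect terms (e.g., usage ratings).
ratings = rng.random((40, len(terms)))

# Cluster the terms (columns) using correlation distance and average linkage.
Z = linkage(ratings.T, method="average", metric="correlation")
labels = fcluster(Z, t=3, criterion="maxclust")  # cut the tree into three clusters
for term, lab in zip(terms, labels):
    print(f"{term}: cluster {lab}")
```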
Journal Articles
Music Perception (2009) 27 (1): 1–15.
Published: 01 September 2009
Abstract
THIS ARTICLE PRESENTS A NOVEL METHODOLOGY TO analyze the dynamics of emotional responses to music. It consists of a computational investigation based on spatiotemporal neural networks, which "mimic" human affective responses to music and predict the responses to novel music sequences. The results provide evidence suggesting that spatiotemporal patterns of sound resonate with affective features underlying judgments of subjective feelings (arousal and valence). A significant part of the listener's affective response is predicted from a set of six psychoacoustic features of sound—âloudness, tempo, texture, mean pitch, pitch variation, and sharpness. A detailed analysis of the network parameters and dynamics also allows us to identify the role of specific psychoacoustic variables (e.g., tempo and loudness) in music emotional appraisal. This work contributes new evidence and insights to the study of musical emotions, with particular relevance to the music perception and cognition research community.