
Fusion of electroencephalographic dynamics and musical contents for estimating emotional responses in music listening

Overview of attention for an article published in Frontiers in Neuroscience, May 2014

About this Attention Score

  • Good Attention Score compared to outputs of the same age (76th percentile)
  • Good Attention Score compared to outputs of the same age and source (71st percentile)

Mentioned by

  • 9 X users
  • 1 Google+ user

Citations

  • 77 Dimensions

Readers on

  • 218 Mendeley
Title
Fusion of electroencephalographic dynamics and musical contents for estimating emotional responses in music listening
Published in
Frontiers in Neuroscience, May 2014
DOI 10.3389/fnins.2014.00094
Pubmed ID
Authors

Yuan-Pin Lin, Yi-Hsuan Yang, Tzyy-Ping Jung

Abstract

Electroencephalography (EEG)-based emotion classification during music listening has gained increasing attention due to its promise for applications such as musical affective brain-computer interfaces (ABCIs), neuromarketing, music therapy, and implicit multimedia tagging and triggering. However, music is an ecologically valid yet complex stimulus that conveys emotions to listeners through compositions of musical elements, and distinguishing emotions using EEG signals alone remains challenging. This study assessed the applicability of a multimodal approach that leverages EEG dynamics and the acoustic characteristics of musical contents to classify emotional valence and arousal. To this end, machine-learning methods were adopted to systematically elucidate the roles of the EEG and music modalities in emotion modeling. The empirical results suggested that when whole-head EEG signals were available, the inclusion of musical contents did not improve classification performance: the accuracy of 74-76% obtained using the EEG modality alone was statistically comparable to that of the multimodal approach. However, when EEG dynamics were available only from a small set of electrodes (the likely case in real-life applications), the music modality played a complementary role, augmenting the EEG results from around 61% to 67% in valence classification and from around 58% to 67% in arousal classification. Musical timbre appeared to replace less-discriminative EEG features, leading to improvements in both valence and arousal classification, whereas musical loudness contributed specifically to arousal classification. The present study not only provided principles for constructing an EEG-based multimodal approach, but also revealed fundamental insights into the interplay of brain activity and musical contents in emotion modeling.
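
As a rough illustration of the feature-level fusion the abstract describes (not the authors' exact pipeline), the sketch below concatenates synthetic EEG features with synthetic musical features and compares cross-validated valence classification for the EEG-only and fused feature sets; the feature dimensions, the linear SVM, and all data are illustrative assumptions.

```python
# Minimal sketch of feature-level EEG + music fusion for binary
# valence classification. NOT the authors' exact pipeline: the
# feature dimensions, the linear SVM, and the synthetic data are
# all illustrative assumptions.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
n_trials = 200

# Hypothetical per-trial features:
#   eeg   -- e.g., spectral power from a reduced electrode montage
#   music -- e.g., timbre (MFCC statistics) and loudness descriptors
eeg = rng.standard_normal((n_trials, 30))
music = rng.standard_normal((n_trials, 12))
valence = rng.integers(0, 2, n_trials)   # 0 = negative, 1 = positive

fused = np.hstack([eeg, music])          # feature-level fusion

clf = make_pipeline(StandardScaler(), SVC(kernel="linear"))
print("EEG only:", cross_val_score(clf, eeg, valence, cv=5).mean())
print("Fused   :", cross_val_score(clf, fused, valence, cv=5).mean())
```

With random labels both accuracies hover near chance; the point is only the shape of the comparison the study makes, in which fusion helps mainly when the EEG feature set is impoverished.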

X Demographics

The data shown below were collected from the profiles of the 9 X users who shared this research output.
Mendeley readers

The data shown below were compiled from readership statistics for the 218 Mendeley readers of this research output.

Geographical breakdown

Country          Count   As %
United States        2    <1%
Portugal             1    <1%
Taiwan               1    <1%
Colombia             1    <1%
Spain                1    <1%
China                1    <1%
Unknown            211    97%

Demographic breakdown

Readers by professional status   Count   As %
Student > Ph. D. Student            38    17%
Student > Master                    25    11%
Student > Bachelor                  22    10%
Researcher                          20     9%
Student > Postgraduate              12     6%
Other                               45    21%
Unknown                             56    26%

Readers by discipline                 Count   As %
Engineering                              37    17%
Computer Science                         28    13%
Neuroscience                             16     7%
Business, Management and Accounting      15     7%
Psychology                               13     6%
Other                                    50    23%
Unknown                                  59    27%
Attention Score in Context

This research output has an Altmetric Attention Score of 6. This is our high-level measure of the quality and quantity of online attention that it has received. This Attention Score, as well as the ranking and number of research outputs shown below, was calculated when the research output was last mentioned on 31 August 2014.
All research outputs                                      #6,443,044 of 25,371,288 outputs
Outputs from Frontiers in Neuroscience                    #4,274 of 11,538 outputs
Outputs of similar age                                    #57,129 of 242,173 outputs
Outputs of similar age from Frontiers in Neuroscience     #29 of 102 outputs
Altmetric has tracked 25,371,288 research outputs across all sources so far. This one has received more attention than most of these and is in the 74th percentile.
So far Altmetric has tracked 11,538 research outputs from this source. They typically receive a lot more attention than average, with a mean Attention Score of 10.9. This one has gotten more attention than average, scoring higher than 62% of its peers.
Older research outputs will score higher simply because they've had more time to accumulate mentions. To account for age we can compare this Altmetric Attention Score to the 242,173 tracked outputs that were published within six weeks on either side of this one in any source. This one has done well, scoring higher than 76% of its contemporaries.
We're also able to compare this research output to 102 others from the same source and published within six weeks on either side of this one. This one has gotten more attention than average, scoring higher than 71% of its contemporaries.
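
For reference, the percentile figures quoted above amount to a simple percentile-rank calculation over each comparison pool. A minimal sketch of that arithmetic (the scores below are made up; the real pools are the much larger ones listed above):

```python
# Minimal sketch of the percentile-rank arithmetic behind statements
# like "scoring higher than 76% of its contemporaries". The pool of
# scores below is hypothetical; Altmetric's real pools are far larger.
def percentile_rank(score: float, pool: list[float]) -> float:
    """Percentage of the comparison pool that this score strictly exceeds."""
    return 100.0 * sum(s < score for s in pool) / len(pool)

pool = [1, 2, 2, 3, 4, 5, 6, 8, 10, 15]   # hypothetical contemporaries
print(f"{percentile_rank(6, pool):.0f}th percentile")   # -> 60th percentile
```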