
Emotional sounds modulate early neural processing of emotional pictures

Overview of attention for article published in Frontiers in Psychology, January 2013

Mentioned by

4 X users

Citations

40 (Dimensions)

Readers on Mendeley

116
Title
Emotional sounds modulate early neural processing of emotional pictures
Published in
Frontiers in Psychology, January 2013
DOI 10.3389/fpsyg.2013.00741
Pubmed ID
Authors

Antje B. M. Gerdes, Matthias J. Wieser, Florian Bublatzky, Anita Kusay, Michael M. Plichta, Georg W. Alpers

Abstract

In our natural environment, emotional information is conveyed by converging visual and auditory information; multimodal integration is of utmost importance. In the laboratory, however, emotion researchers have mostly focused on the examination of unimodal stimuli. Few existing studies on multimodal emotion processing have focused on human communication such as the integration of facial and vocal expressions. Extending the concept of multimodality, the current study examines how the neural processing of emotional pictures is influenced by simultaneously presented sounds. Twenty pleasant, unpleasant, and neutral pictures of complex scenes were presented to 22 healthy participants. On the critical trials these pictures were paired with pleasant, unpleasant, and neutral sounds. Sound presentation started 500 ms before picture onset and each stimulus presentation lasted for 2 s. EEG was recorded from 64 channels and ERP analyses focused on the picture onset. In addition, valence and arousal ratings were obtained. Previous findings for the neural processing of emotional pictures were replicated. Specifically, unpleasant compared to neutral pictures were associated with an increased parietal P200 and a more pronounced centroparietal late positive potential (LPP), independent of the accompanying sound valence. For audiovisual stimulation, increased parietal P100 and P200 were found in response to all pictures which were accompanied by unpleasant or pleasant sounds compared to pictures with neutral sounds. Most importantly, incongruent audiovisual pairs of unpleasant pictures and pleasant sounds enhanced parietal P100 and P200 compared to pairings with congruent sounds. Taken together, the present findings indicate that emotional sounds modulate early stages of visual processing and, therefore, provide an avenue by which multimodal experience may enhance perception.


X Demographics

The data shown below were collected from the profiles of 4 X users who shared this research output.
Mendeley readers

The data shown below were compiled from readership statistics for 116 Mendeley readers of this research output.

Geographical breakdown

Country        Count   As %
Portugal           1    <1%
Switzerland        1    <1%
Netherlands        1    <1%
Belgium            1    <1%
China              1    <1%
Unknown          111    96%

Demographic breakdown

Readers by professional status   Count   As %
Student > Ph. D. Student            28    24%
Researcher                          17    15%
Student > Master                    17    15%
Student > Doctoral Student          11     9%
Student > Bachelor                   7     6%
Other                               21    18%
Unknown                             15    13%

Readers by discipline                  Count   As %
Psychology                                50    43%
Neuroscience                              16    14%
Engineering                                8     7%
Medicine and Dentistry                     5     4%
Agricultural and Biological Sciences       5     4%
Other                                     13    11%
Unknown                                   19    16%
Attention Score in Context

This research output has an Altmetric Attention Score of 3. This is our high-level measure of the quality and quantity of online attention that it has received. This Attention Score, as well as the ranking and number of research outputs shown below, was calculated when the research output was last mentioned on 07 November 2013.
All research outputs: #14,351,359 of 25,182,110 outputs
Outputs from Frontiers in Psychology: #13,045 of 34,011 outputs
Outputs of similar age: #166,382 of 293,942 outputs
Outputs of similar age from Frontiers in Psychology: #497 of 969 outputs
Altmetric has tracked 25,182,110 research outputs across all sources so far. This one is in the 42nd percentile – i.e., 42% of other outputs scored the same or lower than it.
So far Altmetric has tracked 34,011 research outputs from this source. They typically receive a lot more attention than average, with a mean Attention Score of 13.2. This one scored higher than 61% of its peers.
Older research outputs will score higher simply because they've had more time to accumulate mentions. To account for age we can compare this Altmetric Attention Score to the 293,942 tracked outputs that were published within six weeks on either side of this one in any source. This one is in the 43rd percentile – i.e., 43% of its contemporaries scored the same or lower than it.
We're also able to compare this research output to 969 others from the same source and published within six weeks on either side of this one. This one is in the 48th percentile – i.e., 48% of its contemporaries scored the same or lower than it.
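For readers who want to sanity-check these figures, each percentile follows directly from the rank and pool size listed in the context table above. The short Python sketch below is an illustration only: it assumes percentile is roughly (pool size - rank) / pool size, ignores ties, and can therefore differ from Altmetric's published values by about a percentage point. The approx_percentile helper and the variable names are made up for this example.

# Approximate the percentile figures from the ranks listed above.
# Assumption: percentile ~= share of tracked outputs ranked below this one,
# i.e. (total - rank) / total. Ties are ignored, so the result may be off
# by roughly one percentage point from the published figures.

def approx_percentile(rank, total):
    """Approximate percentile for a 1-based rank, where rank 1 is the top output."""
    return 100.0 * (total - rank) / total

comparisons = {
    "All research outputs": (14_351_359, 25_182_110),
    "Outputs from Frontiers in Psychology": (13_045, 34_011),
    "Outputs of similar age": (166_382, 293_942),
    "Outputs of similar age from Frontiers in Psychology": (497, 969),
}

for label, (rank, total) in comparisons.items():
    print(f"{label}: approx. percentile {approx_percentile(rank, total):.1f}")

Run as written, this prints roughly 43.0, 61.6, 43.4, and 48.7 for the four comparisons, in line with the 42nd/61%/43rd/48th figures quoted above once tie handling and rounding at the time of calculation are taken into account.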