
Temporal Reference, Attentional Modulation, and Crossmodal Assimilation

Overview of attention for an article published in Frontiers in Computational Neuroscience, June 2018

About this Attention Score

  • Above-average Attention Score compared to outputs of the same age (63rd percentile)
  • Above-average Attention Score compared to outputs of the same age and source (60th percentile)

Mentioned by

  • 8 X users

Readers on

  • 13 Mendeley readers
Title
Temporal Reference, Attentional Modulation, and Crossmodal Assimilation
Published in
Frontiers in Computational Neuroscience, June 2018
DOI 10.3389/fncom.2018.00039
Pubmed ID
Authors

Yingqi Wan, Lihan Chen

Abstract

The crossmodal assimilation effect refers to the prominent phenomenon by which the ensemble mean extracted from a sequence of task-irrelevant distractor events, such as auditory intervals, assimilates/biases the perception of subsequent task-relevant target events in another sensory modality (such as a visual interval). In the current experiments, using the visual Ternus display, we examined the roles of the temporal reference, operationalized as the time information accumulated before the onset of the target event, and of attentional modulation in crossmodal temporal interaction. Specifically, we examined how the global time interval, the mean of the auditory inter-intervals, and the last interval in the auditory sequence assimilate and bias the subsequent percept of visual Ternus motion (element motion vs. group motion). We demonstrated that both the ensemble (geometric) mean and the last interval in the auditory sequence contribute to biasing the percept of visual motion: a longer mean (or last) interval elicited more reports of group motion, whereas a shorter mean (or last) auditory interval gave rise to a more dominant percept of element motion. Importantly, observers showed dynamic adaptation to the temporal reference of crossmodal assimilation: when the target visual Ternus stimuli were separated from the preceding sound sequence by a long gap interval, the assimilation effect of the ensemble mean was reduced. Our findings suggest that crossmodal assimilation relies on a suitable temporal reference at the adaptation level, and they reveal a general temporal perceptual grouping principle underlying complex audio-visual interactions in everyday dynamic situations.
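
To make the ensemble-statistic idea in the abstract concrete, here is a minimal sketch of how a geometric mean of auditory inter-intervals could be pooled into a temporal reference and compared against a transition threshold for the Ternus percept. The interval values and the threshold are hypothetical illustrations, not figures from the study.

```python
import math

def geometric_mean(intervals_ms):
    """Geometric mean of a sequence of inter-onset intervals (in ms)."""
    return math.exp(sum(math.log(i) for i in intervals_ms) / len(intervals_ms))

# Hypothetical auditory distractor sequence (ms); values are illustrative only,
# not the intervals used in the paper.
auditory_intervals = [80, 120, 100, 140, 90]
ensemble_mean = geometric_mean(auditory_intervals)   # assimilated temporal reference
last_interval = auditory_intervals[-1]               # also shown to bias the percept

# Purely assumed transition threshold for the Ternus display: a longer
# assimilated reference should yield more "group motion" reports.
THRESHOLD_MS = 100
predicted = "group motion" if ensemble_mean > THRESHOLD_MS else "element motion"

print(f"ensemble (geometric) mean = {ensemble_mean:.1f} ms")
print(f"last interval             = {last_interval} ms")
print(f"predicted dominant percept: {predicted}")
```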


X Demographics

The data were collected from the profiles of the 8 X users who shared this research output.
Mendeley readers

The data shown below were compiled from readership statistics for 13 Mendeley readers of this research output.

Geographical breakdown

Country | Count | As %
Unknown | 13    | 100%

Demographic breakdown

Readers by professional status | Count | As %
Student > Doctoral Student     | 2     | 15%
Student > Bachelor             | 2     | 15%
Student > Postgraduate         | 2     | 15%
Student > Master               | 1     | 8%
Student > Ph. D. Student       | 1     | 8%
Other                          | 2     | 15%
Unknown                        | 3     | 23%
Readers by discipline | Count | As %
Psychology            | 5     | 38%
Neuroscience          | 2     | 15%
Linguistics           | 1     | 8%
Computer Science      | 1     | 8%
Philosophy            | 1     | 8%
Other                 | 0     | 0%
Unknown               | 3     | 23%
Attention Score in Context

This research output has an Altmetric Attention Score of 4. This is our high-level measure of the quality and quantity of online attention that it has received. This Attention Score, as well as the ranking and number of research outputs shown below, was calculated when the research output was last mentioned on 13 June 2018.
  • All research outputs: #6,986,082 of 23,055,429 outputs
  • Outputs from Frontiers in Computational Neuroscience: #365 of 1,355 outputs
  • Outputs of similar age: #120,953 of 329,696 outputs
  • Outputs of similar age from Frontiers in Computational Neuroscience: #12 of 30 outputs
Altmetric has tracked 23,055,429 research outputs across all sources so far. This one has received more attention than most of these and is in the 69th percentile.
So far Altmetric has tracked 1,355 research outputs from this source. They typically receive a little more attention than average, with a mean Attention Score of 6.1. This one has received more attention than average, scoring higher than 72% of its peers.
Older research outputs will score higher simply because they've had more time to accumulate mentions. To account for age we can compare this Altmetric Attention Score to the 329,696 tracked outputs that were published within six weeks on either side of this one in any source. This one has received more attention than average, scoring higher than 63% of its contemporaries.
We're also able to compare this research output to 30 others from the same source and published within six weeks on either side of this one. This one has received more attention than average, scoring higher than 60% of its contemporaries.
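
The percentile figures above can be approximately reproduced from the listed ranks, assuming the percentile is simply the share of tracked outputs that rank below this one (an inference about Altmetric's calculation, not documented behaviour). Small discrepancies, such as 73% here versus the reported 72% for the same-source comparison, are expected from rounding and the snapshot date.

```python
def percentile(rank, total):
    """Assumed formula: share of tracked outputs that this output outranks."""
    return int(100 * (total - rank) / total)

# Ranks and totals as reported on this page (snapshot of 13 June 2018).
comparisons = {
    "all research outputs":     (6_986_082, 23_055_429),  # reported: 69th
    "same source":              (365, 1_355),             # reported: 72nd
    "similar age":              (120_953, 329_696),       # reported: 63rd
    "similar age, same source": (12, 30),                 # reported: 60th
}

for label, (rank, total) in comparisons.items():
    print(f"{label}: ~{percentile(rank, total)}th percentile")
```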