
The Influence of Facial Signals on the Automatic Imitation of Hand Actions

Overview of attention for article published in Frontiers in Psychology, October 2016

Mentioned by

1 X user

Citations

13 Dimensions

Readers on

47 Mendeley
Title
The Influence of Facial Signals on the Automatic Imitation of Hand Actions
Published in
Frontiers in Psychology, October 2016
DOI 10.3389/fpsyg.2016.01653
Authors

Emily E. Butler, Robert Ward, Richard Ramsey

Abstract

Imitation and facial signals are fundamental social cues that guide interactions with others, but little is known regarding the relationship between these behaviors. It is clear that during expression detection, we imitate observed expressions by engaging similar facial muscles. It is proposed that a cognitive system, which matches observed and performed actions, controls imitation and contributes to emotion understanding. However, little is known regarding the consequences of recognizing affective states for other forms of imitation, which are not inherently tied to the observed emotion. The current study investigated the hypothesis that facial cue valence would modulate automatic imitation of hand actions. To test this hypothesis, we paired different types of facial cue with an automatic imitation task. Experiments 1 and 2 demonstrated that a smile prompted greater automatic imitation than angry and neutral expressions. Additionally, a meta-analysis of this and previous studies suggests that both happy and angry expressions increase imitation compared to neutral expressions. By contrast, Experiments 3 and 4 demonstrated that invariant facial cues, which signal trait levels of agreeableness, had no impact on imitation. Although participants readily identified trait-based facial signals, levels of agreeableness did not differentially modulate automatic imitation. Further, a Bayesian analysis showed that the null effect was between 2 and 5 times more likely than the experimental effect. Therefore, we show that imitation systems are more sensitive to prosocial facial signals that indicate "in the moment" states than to enduring traits. These data support the view that a smile primes multiple forms of imitation, including the copying of actions that are not inherently affective. The influence of expression detection on wider forms of imitation may contribute to facilitating interactions between individuals, such as building rapport and affiliation.

X Demographics

The data shown below were collected from the profile of 1 X user who shared this research output.
Mendeley readers

The data shown below were compiled from readership statistics for 47 Mendeley readers of this research output.

Geographical breakdown

Country | Count | As %
United States | 1 | 2%
Unknown | 46 | 98%

Demographic breakdown

Readers by professional status | Count | As %
Student > PhD Student | 11 | 23%
Student > Bachelor | 9 | 19%
Student > Master | 9 | 19%
Student > Doctoral Student | 5 | 11%
Researcher | 3 | 6%
Other | 4 | 9%
Unknown | 6 | 13%
Readers by discipline | Count | As %
Psychology | 27 | 57%
Neuroscience | 4 | 9%
Business, Management and Accounting | 3 | 6%
Agricultural and Biological Sciences | 2 | 4%
Sports and Recreations | 1 | 2%
Other | 2 | 4%
Unknown | 8 | 17%
Attention Score in Context

This research output has an Altmetric Attention Score of 1. This is our high-level measure of the quality and quantity of online attention that it has received. This Attention Score, as well as the ranking and number of research outputs shown below, was calculated when the research output was last mentioned on 16 October 2016.
Comparison set | Rank
All research outputs | #20,346,264 of 22,893,031 outputs
Outputs from Frontiers in Psychology | #24,254 of 30,015 outputs
Outputs of similar age | #271,444 of 314,037 outputs
Outputs of similar age from Frontiers in Psychology | #403 of 459 outputs
Altmetric has tracked 22,893,031 research outputs across all sources so far. This one is in the 1st percentile – i.e., 1% of other outputs scored the same or lower than it.
So far Altmetric has tracked 30,015 research outputs from this source. They typically receive a lot more attention than average, with a mean Attention Score of 12.5. This one is in the 1st percentile – i.e., 1% of its peers scored the same or lower than it.
Older research outputs will score higher simply because they've had more time to accumulate mentions. To account for age we can compare this Altmetric Attention Score to the 314,037 tracked outputs that were published within six weeks on either side of this one in any source. This one is in the 1st percentile – i.e., 1% of its contemporaries scored the same or lower than it.
We're also able to compare this research output to 459 others from the same source and published within six weeks on either side of this one. This one is in the 1st percentile – i.e., 1% of its contemporaries scored the same or lower than it.
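
For readers who want to relate the ranks in the table above to a percentile, the sketch below shows the naive rank-to-percentile arithmetic in Python. It is an illustration only, using the ranks quoted on this page as inputs: Altmetric's published percentiles may handle score ties and rounding differently, so the two sets of figures need not agree.

```python
# Naive rank-to-percentile arithmetic for the "Attention Score in Context" ranks above.
# Illustrative only: Altmetric's published percentiles may treat score ties and
# rounding differently, so these figures need not match the ones quoted on the page.

def percent_ranked_lower(rank: int, total: int) -> float:
    """Share of a comparison set ranked strictly below the given rank (rank 1 = most attention)."""
    return 100.0 * (total - rank) / total

comparisons = {
    "All research outputs": (20_346_264, 22_893_031),
    "Outputs from Frontiers in Psychology": (24_254, 30_015),
    "Outputs of similar age": (271_444, 314_037),
    "Outputs of similar age from Frontiers in Psychology": (403, 459),
}

for label, (rank, total) in comparisons.items():
    share = percent_ranked_lower(rank, total)
    print(f"{label}: ranked above {share:.1f}% of {total:,} tracked outputs")
```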