
A Combination of Outcome and Process Feedback Enhances Performance in Simulations of Child Sexual Abuse Interviews Using Avatars

Overview of attention for article published in Frontiers in Psychology, September 2017

Mentioned by

X (Twitter): 2 users
Facebook: 1 page

Citations

Dimensions: 29

Readers on

Mendeley: 44
DOI 10.3389/fpsyg.2017.01474
Authors

Francesco Pompedda, Jan Antfolk, Angelo Zappalà, Pekka Santtila

Abstract

Simulated interviews in alleged child sexual abuse (CSA) cases with computer-generated avatars, paired with feedback, improve interview quality. In the current study, we aimed to better understand the effect of different types of feedback in this context. Feedback was divided into feedback regarding conclusions about what happened to the avatar (outcome feedback) and feedback regarding the appropriateness of the question types used by the interviewer (process feedback). Forty-eight participants each interviewed four different avatars. Participants were divided into four groups (no feedback, outcome feedback, process feedback, and a combination of both feedback types). Compared to the control group, interview quality was generally improved in all the feedback groups on all outcome variables included. Combined feedback produced the strongest effect in increasing recommended questions and correct conclusions. For relevant and neutral details elicited by the interviewers, no statistically significant differences were found between feedback types. For wrong details, the combination of feedback produced the strongest effect, but this did not differ from the other two feedback groups; process feedback did, however, produce better results than outcome feedback. The present study replicated previous findings on the effect of feedback in improving interview quality, and provided new knowledge about the feedback characteristics that maximize training effects. A combination of process and outcome feedback showed the strongest effect in enhancing training in simulated CSA interviews. Further research is, however, needed.

X Demographics

The data shown below were collected from the profiles of the 2 X users who shared this research output.
Mendeley readers

The data shown below were compiled from readership statistics for the 44 Mendeley readers of this research output.

Geographical breakdown

Country  Count  %
Unknown 44 100%

Demographic breakdown

Readers by professional status  Count  %
Student > Master 8 18%
Student > Ph.D. Student 6 14%
Researcher 3 7%
Student > Doctoral Student 2 5%
Student > Bachelor 2 5%
Other 3 7%
Unknown 20 45%
Readers by discipline  Count  %
Psychology 15 34%
Social Sciences 3 7%
Computer Science 1 2%
Business, Management and Accounting 1 2%
Agricultural and Biological Sciences 1 2%
Other 1 2%
Unknown 22 50%
Attention Score in Context

This research output has an Altmetric Attention Score of 1. This is our high-level measure of the quality and quantity of online attention that it has received. This Attention Score, as well as the ranking and number of research outputs shown below, was calculated when the research output was last mentioned on 29 September 2017.
All research outputs
#17,914,959
of 23,001,641 outputs
Outputs from Frontiers in Psychology
#20,734
of 30,230 outputs
Outputs of similar age
#226,714
of 316,063 outputs
Outputs of similar age from Frontiers in Psychology
#466
of 580 outputs
Altmetric has tracked 23,001,641 research outputs across all sources so far. This one is in the 19th percentile – i.e., 19% of other outputs scored the same or lower than it.
So far Altmetric has tracked 30,230 research outputs from this source. They typically receive a lot more attention than average, with a mean Attention Score of 12.5. This one is in the 25th percentile – i.e., 25% of its peers scored the same or lower than it.
Older research outputs will score higher simply because they've had more time to accumulate mentions. To account for age we can compare this Altmetric Attention Score to the 316,063 tracked outputs that were published within six weeks on either side of this one in any source. This one is in the 23rd percentile – i.e., 23% of its contemporaries scored the same or lower than it.
We're also able to compare this research output to 580 others from the same source and published within six weeks on either side of this one. This one is in the 15th percentile – i.e., 15% of its contemporaries scored the same or lower than it.
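The percentile figures above all follow the same rule: an output's percentile is the share of outputs in the comparison pool that scored the same as or lower than it. A minimal sketch of that calculation (the scores in `pool` are made up for illustration, and `percentile_rank` is not Altmetric's actual implementation, which also handles ties and rounding at scale):

```python
def percentile_rank(scores, target):
    """Return the percentage of scores that are <= target."""
    at_or_below = sum(1 for s in scores if s <= target)
    return 100.0 * at_or_below / len(scores)

# Hypothetical attention scores for a small pool of tracked outputs.
pool = [0, 0, 1, 1, 2, 5, 12, 30, 54, 110]
print(percentile_rank(pool, 1))  # 40.0 -> a score of 1 sits at the 40th percentile here
```

Note that under this "same or lower" definition, ties count in the output's favor, which is why a low score such as 1 can still land above the bottom of a pool where many outputs score 0.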