Effects of Prompting in Reflective Learning Tools: Findings from Experimental Field, Lab, and Online Studies

Overview of attention for article published in Frontiers in Psychology, May 2016
Mentioned by: 2 X users

Readers: 51 on Mendeley
Published in
Frontiers in Psychology, May 2016
DOI
10.3389/fpsyg.2016.00820
Authors
Bettina Renner, Michael Prilla, Ulrike Cress, Joachim Kimmerle

Abstract

Reflective learning is an important type of learning in both formal and informal situations: in school, in higher education, at the workplace, and in everyday life. People may benefit from technical support for reflective learning, in particular when they support each other by reflecting not only on their own problems but also on other people's. We refer to this collective approach, in which people come together to think about experiences and find solutions to problems, as "collaborative reflection." We present three empirical studies on the effects of prompting in reflective learning tools in situations where people reflect on others' issues. In Study 1 we applied a three-stage within-group design in a field experiment in which 39 participants from two organizations received different types of prompts while using a reflection app. We found that prompts inviting employees to write down possible solutions led to more comprehensive comments on their colleagues' experiences. In Study 2 we used a three-stage between-group design in a laboratory experiment in which 78 university students discussed problems from work or academic studies in online forums. Here we found that short, abstract prompts showed no superiority over a condition without any prompts with respect to the quantity or quality of contributions. Finally, Study 3 featured a two-stage between-group design in an online experiment in which 60 participants received either general reflection instructions or detailed instructions on how to reflect on other people's problems. We were able to show that detailed reflection instructions supported people in producing more comprehensive comments that included more general advice. The results demonstrate that increasing activity and improving the quality of comments with prompting tools requires detailed instructions and specific wording of the prompts.


X Demographics

The data shown below were collected from the profiles of the 2 X users who shared this research output.
Mendeley readers

The data shown below were compiled from the readership statistics for the 51 Mendeley readers of this research output.

Geographical breakdown

Country          Count   %
United States        1   2%
Austria              1   2%
Unknown             49  96%

Demographic breakdown

Readers by professional status           Count   %
Student > Ph.D. Student                     13  25%
Researcher                                   5  10%
Student > Doctoral Student                   4   8%
Lecturer                                     3   6%
Professor                                    3   6%
Other                                        6  12%
Unknown                                     17  33%

Readers by discipline                    Count   %
Social Sciences                             10  20%
Psychology                                   9  18%
Computer Science                             4   8%
Business, Management and Accounting          3   6%
Nursing and Health Professions               2   4%
Other                                        6  12%
Unknown                                     17  33%
Attention Score in Context


This research output has an Altmetric Attention Score of 1. This is our high-level measure of the quality and quantity of online attention that it has received. This Attention Score, as well as the ranking and number of research outputs shown below, was calculated when the research output was last mentioned on 08 June 2016.
All research outputs: #16,446,399 of 24,220,739 outputs
Outputs from Frontiers in Psychology: #20,306 of 32,552 outputs
Outputs of similar age: #217,969 of 344,732 outputs
Outputs of similar age from Frontiers in Psychology: #309 of 441 outputs
Altmetric has tracked 24,220,739 research outputs across all sources so far. This one is in the 21st percentile – i.e., 21% of other outputs scored the same or lower than it.
So far Altmetric has tracked 32,552 research outputs from this source. They typically receive a lot more attention than average, with a mean Attention Score of 12.8. This one is in the 31st percentile – i.e., 31% of its peers scored the same or lower than it.
Older research outputs will score higher simply because they've had more time to accumulate mentions. To account for age we can compare this Altmetric Attention Score to the 344,732 tracked outputs that were published within six weeks on either side of this one in any source. This one is in the 28th percentile – i.e., 28% of its contemporaries scored the same or lower than it.
We're also able to compare this research output to 441 others from the same source and published within six weeks on either side of this one. This one is in the 23rd percentile – i.e., 23% of its contemporaries scored the same or lower than it.