
Experience in a Climate Microworld: Influence of Surface and Structure Learning, Problem Difficulty, and Decision Aids in Reducing Stock-Flow Misconceptions

Overview of attention for article published in Frontiers in Psychology, March 2018

Mentioned by: 3 X users
Citations: 6 (Dimensions)
Readers: 20 (Mendeley)
Published in: Frontiers in Psychology, March 2018
DOI: 10.3389/fpsyg.2018.00299
Pubmed ID:
Authors: Medha Kumar, Varun Dutt

Abstract

Research shows that people's wait-and-see preferences for action against climate change result from several factors, including cognitive misconceptions. Simulation tools could help reduce these misconceptions about Earth's climate. However, it is still unclear whether learning in these tools concerns the problem's surface features (the dimensions of emissions and absorptions and the cover story used) or its structural features (how emissions and absorptions change CO2 concentration under different CO2 concentration scenarios). Also, little is known about how a problem's difficulty in these tools (the shape of the CO2 concentration trajectory), or the use of these tools as decision aids, influences performance. The primary objective of this paper was to investigate how learning about Earth's climate via simulation tools is influenced by the problem's surface and structural features, the problem's difficulty, and decision aids. In experiment 1, we tested the influence of the problem's surface and structural features in a simulation called the Dynamic Climate Change Simulator (DCCS) on subsequent performance in a paper-and-pencil Climate Stabilization (CS) task (N = 100 across four between-subjects conditions). In experiment 2, we tested the effects of the problem's difficulty in DCCS on subsequent performance in the CS task (N = 90 across three between-subjects conditions). In experiment 3, we tested the influence of DCCS as a decision aid on subsequent performance in the CS task (N = 60 across two between-subjects conditions). Results revealed a significant reduction in people's misconceptions in the CS task after performing in DCCS, compared to performing the CS task without DCCS. The decrease in misconceptions in the CS task was similar for the problem's surface and structural features, showing both surface and structure learning in DCCS. However, the proportion of misconceptions was similar across simple and difficult problems, suggesting that cognitive load may have hampered learning. Finally, misconceptions were reduced when DCCS was used as a decision aid. Overall, these results highlight the role of simulation tools in alleviating climate misconceptions. We discuss the implications of using simulation tools for climate education and policymaking.
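The stock-flow structure at issue in the abstract is that a stock (atmospheric CO2 concentration) accumulates its inflow (emissions) minus its outflow (absorptions). Below is a minimal Python sketch of that generic accumulation, assuming a simple discrete-time model with illustrative numbers; it is not the authors' DCCS or CS-task implementation.

    def co2_stock(initial, emissions, absorptions):
        """Accumulate a CO2 stock: it rises whenever emissions exceed absorptions."""
        stock = initial
        trajectory = [stock]
        for e, a in zip(emissions, absorptions):
            stock += e - a  # the net flow integrates into the stock
            trajectory.append(stock)
        return trajectory

    # Illustrative (made-up) numbers: constant absorption of 10 units/period.
    # The stock keeps growing until emissions fall to the absorption rate.
    print(co2_stock(400, emissions=[12, 11, 10, 10], absorptions=[10, 10, 10, 10]))
    # -> [400, 402, 403, 403, 403]

The common misconception, often called the correlation heuristic, is to expect the stock to track emissions directly; as the sketch shows, stabilization requires emissions to match absorptions, not merely to decline.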

X Demographics

The data shown below were collected from the profiles of 3 X users who shared this research output.
Mendeley readers

The data shown below were compiled from readership statistics for 20 Mendeley readers of this research output.

Geographical breakdown

Country    Count    As %
Unknown    20       100%

Demographic breakdown

Readers by professional status    Count    As %
Researcher                        5        25%
Student > Bachelor                3        15%
Student > Postgraduate            2        10%
Student > Doctoral Student        1        5%
Lecturer                          1        5%
Other                             4        20%
Unknown                           4        20%

Readers by discipline                  Count    As %
Psychology                             4        20%
Social Sciences                        2        10%
Business, Management and Accounting    1        5%
Nursing and Health Professions         1        5%
Computer Science                       1        5%
Other                                  3        15%
Unknown                                8        40%
Attention Score in Context

This research output has an Altmetric Attention Score of 1. This is our high-level measure of the quality and quantity of online attention that it has received. This Attention Score, as well as the ranking and number of research outputs shown below, was calculated when the research output was last mentioned on 11 April 2018.
All research outputs: #17,932,482 of 23,025,074 outputs
Outputs from Frontiers in Psychology: #20,767 of 30,283 outputs
Outputs of similar age: #240,050 of 330,366 outputs
Outputs of similar age from Frontiers in Psychology: #483 of 570 outputs
Altmetric has tracked 23,025,074 research outputs across all sources so far. This one is in the 19th percentile – i.e., 19% of other outputs scored the same or lower than it.
So far Altmetric has tracked 30,283 research outputs from this source. They typically receive a lot more attention than average, with a mean Attention Score of 12.5. This one is in the 25th percentile – i.e., 25% of its peers scored the same or lower than it.
Older research outputs will score higher simply because they've had more time to accumulate mentions. To account for age we can compare this Altmetric Attention Score to the 330,366 tracked outputs that were published within six weeks on either side of this one in any source. This one is in the 22nd percentile – i.e., 22% of its contemporaries scored the same or lower than it.
We're also able to compare this research output to 570 others from the same source and published within six weeks on either side of this one. This one is in the 10th percentile – i.e., 10% of its contemporaries scored the same or lower than it.
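To make the percentile arithmetic above concrete, here is a minimal Python sketch of a percentile rank under the definition used on this page (the share of outputs scoring the same or lower). The sample scores are hypothetical placeholders, not Altmetric data.

    def percentile_rank(score, scores):
        """Percentage of scores that are the same as or lower than `score`."""
        return 100.0 * sum(s <= score for s in scores) / len(scores)

    # Hypothetical example: a score of 1 among peers where most outputs
    # are never mentioned and a few are shared very widely.
    peer_scores = [0, 0, 0, 1, 1, 2, 5, 12, 30, 250]  # made-up Attention Scores
    print(round(percentile_rank(1, peer_scores)))  # -> 50

A skewed distribution like the placeholder above also shows how a source can have a high mean Attention Score (12.5 here) while an output scoring 1 still beats a quarter of its peers: a few heavily shared outputs pull the mean well above the median.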