
Beyond Psychometrics: The Difference between Difficult Problem Solving and Complex Problem Solving

Overview of attention for article published in Frontiers in Psychology, October 2017

About this Attention Score

  • In the top 25% of all research outputs scored by Altmetric
  • Good Attention Score compared to outputs of the same age (71st percentile)
  • Above-average Attention Score compared to outputs of the same age and source (61st percentile)

Mentioned by

9 X users

Citations

20 citations (Dimensions)

Readers on

50 readers (Mendeley)
Title
Beyond Psychometrics: The Difference between Difficult Problem Solving and Complex Problem Solving
Published in
Frontiers in Psychology, October 2017
DOI 10.3389/fpsyg.2017.01739
Authors

Jens F. Beckmann, Damian P. Birney, Natassia Goode

Abstract

In this paper we argue that a synthesis of findings across the various sub-areas of research in complex problem solving (CPS), and consequently progress in theory building, is hampered by an insufficient differentiation of complexity and difficulty. In the proposed framework of person, task, and situation (PTS), complexity is conceptualized as a quality that is determined by the cognitive demands that the characteristics of the task and the situation impose. Difficulty represents the quantifiable level of a person's success in dealing with such demands. We use the well-documented "semantic effect" as an exemplar for testing some of the conceptual assumptions derived from the PTS framework. We demonstrate how a differentiation between complexity and difficulty can help take us beyond a potentially too narrowly defined psychometric perspective and subsequently gain a better understanding of the cognitive mechanisms behind this effect. In an empirical study, a total of 240 university students were randomly allocated to one of four conditions. The four conditions resulted from crossing the semanticity level of the variable labels used in the CPS system (high vs. low) with two instruction conditions for how to explore the CPS system's causal structure (starting with the assumption that all relationships between variables existed vs. starting with the assumption that none of the relationships existed). The variation in the instruction aimed at inducing knowledge acquisition processes of either (1) systematic elimination of presumptions, or (2) systematic compilation of a mental representation of the causal structure underpinning the system. Results indicate that (a) it is more complex to adopt a "blank slate" perspective under high semanticity, as it requires processes of inhibiting prior assumptions, and (b) it seems more difficult to employ a systematic heuristic when testing against presumptions. In combination, situational characteristics, such as the semanticity of variable labels, have the potential to trigger qualitatively different tasks. Failing to differentiate between 'task' and 'situation' as independent sources of complexity and treating complexity and difficulty synonymously threaten the validity of performance scores obtained in CPS research.

X Demographics

The data shown below were collected from the profiles of the 9 X users who shared this research output.
Mendeley readers

The data shown below were compiled from readership statistics for the 50 Mendeley readers of this research output.

Geographical breakdown

Country      Count   As %
Unknown      50      100%

Demographic breakdown

Readers by professional status         Count   As %
Student > Ph.D. Student                11      22%
Researcher                             5       10%
Student > Bachelor                     5       10%
Student > Master                       4       8%
Student > Doctoral Student             3       6%
Other                                  11      22%
Unknown                                11      22%

Readers by discipline                  Count   As %
Psychology                             18      36%
Social Sciences                        7       14%
Engineering                            3       6%
Business, Management and Accounting    2       4%
Nursing and Health Professions         1       2%
Other                                  7       14%
Unknown                                12      24%
Attention Score in Context

This research output has an Altmetric Attention Score of 6. This is our high-level measure of the quality and quantity of online attention that it has received. This Attention Score, as well as the ranking and number of research outputs shown below, was calculated when the research output was last mentioned on 12 June 2018.
All research outputs: #5,796,460 of 23,577,761 outputs
Outputs from Frontiers in Psychology: #8,378 of 31,442 outputs
Outputs of similar age: #91,197 of 325,603 outputs
Outputs of similar age from Frontiers in Psychology: #233 of 601 outputs
Altmetric has tracked 23,577,761 research outputs across all sources so far. Compared to these, this one has done well and is in the 75th percentile: it's in the top 25% of all research outputs ever tracked by Altmetric.
So far Altmetric has tracked 31,442 research outputs from this source. They typically receive a lot more attention than average, with a mean Attention Score of 12.6. This one has received more attention than average, scoring higher than 73% of its peers.
Older research outputs will score higher simply because they've had more time to accumulate mentions. To account for age, we can compare this Altmetric Attention Score to the 325,603 tracked outputs that were published within six weeks on either side of this one in any source. This one has received more attention than average, scoring higher than 71% of its contemporaries.
We're also able to compare this research output to the 601 others from the same source that were published within six weeks on either side of this one. This one has received more attention than average, scoring higher than 61% of its contemporaries.
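
The percentile figures in the paragraphs above line up with the rank and total counts listed in the comparison table. The short sketch below reproduces that arithmetic; it assumes the percentile is simply the share of tracked outputs ranked below this one, truncated to a whole number, which is an assumption rather than documented Altmetric behaviour.

```python
# Minimal sketch: reproduce the quoted percentiles from the rank/total pairs above.
# The formula (share of outputs ranked below this one) and the truncation to a
# whole number are assumptions, not documented Altmetric behaviour.

def percentile_from_rank(rank: int, total: int) -> int:
    """Percentage of outputs that rank below the given position, truncated."""
    return int((1 - rank / total) * 100)

comparisons = {
    "All research outputs": (5_796_460, 23_577_761),
    "Outputs from Frontiers in Psychology": (8_378, 31_442),
    "Outputs of similar age": (91_197, 325_603),
    "Outputs of similar age from the same source": (233, 601),
}

for label, (rank, total) in comparisons.items():
    # Prints the percentile implied by each rank/total pair.
    print(f"{label}: {percentile_from_rank(rank, total)}")
```

Under that assumption the sketch yields 75, 73, 71, and 61, matching the percentiles quoted in the text above.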