
An Examination of Recording Accuracy and Precision From Eye Tracking Data From Toddlerhood to Adulthood

Overview of attention for an article published in Frontiers in Psychology, May 2018

About this Attention Score

  • Average Attention Score compared to outputs of the same age
  • Average Attention Score compared to outputs of the same age and source

Mentioned by

  • X: 5 users

Citations

  • Dimensions: 55

Readers on

  • Mendeley: 160
Title
An Examination of Recording Accuracy and Precision From Eye Tracking Data From Toddlerhood to Adulthood
Published in
Frontiers in Psychology, May 2018
DOI 10.3389/fpsyg.2018.00803
Pubmed ID
Authors

Kirsten A. Dalrymple, Marie D. Manner, Katherine A. Harmelink, Elayne P. Teska, Jed T. Elison

Abstract

The quantitative assessment of eye tracking data quality is critical for ensuring accuracy and precision of gaze position measurements. However, researchers often report the eye tracker's optimal manufacturer's specifications rather than empirical data about the accuracy and precision of the eye tracking data being presented. Indeed, a recent report indicates that less than half of eye tracking researchers surveyed take the eye tracker's accuracy into account when determining areas of interest for analysis, an oversight that could impact the validity of reported results and conclusions. Accordingly, we designed a calibration verification protocol to augment independent quality assessment of eye tracking data and examined whether accuracy and precision varied between three age groups of participants. We also examined the degree to which our externally quantified quality assurance metrics aligned with those reported by the manufacturer. We collected data in standard laboratory conditions to demonstrate our method, to illustrate how data quality can vary with participant age, and to give a simple example of the degree to which data quality can differ from manufacturer reported values. In the sample data we collected, accuracy for adults was within the range advertised by the manufacturer, but for school-aged children, accuracy and precision measures were outside this range. Data from toddlers were less accurate and less precise than data from adults. Based on an a priori inclusion criterion, we determined that we could exclude approximately 20% of toddler participants for poor calibration quality quantified using our calibration assessment protocol. We recommend implementing and reporting quality assessment protocols for any eye tracking tasks with participants of any age or developmental ability. We conclude with general observations about our data, recommendations for what factors to consider when establishing data inclusion criteria, and suggestions for stimulus design that can help accommodate variability in calibration. The methods outlined here may be particularly useful for developmental psychologists who use eye tracking as a tool, but who are not experts in eye tracking per se. The calibration verification stimuli and data processing scripts that we developed, along with step-by-step instructions, are freely available for other researchers.
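
The abstract refers to externally quantified accuracy and precision metrics; the authors' own calibration-verification stimuli and processing scripts are available from the article itself. As a rough illustration only, not the authors' released code, the sketch below shows one common way such metrics are computed from calibration-check data: accuracy as the mean offset between gaze samples and a known target, and precision as the root-mean-square of successive sample-to-sample distances. The function name, argument layout, and the assumption that coordinates are already in degrees of visual angle are ours, not the paper's.

import numpy as np

def accuracy_and_precision(gaze_x, gaze_y, target_x, target_y):
    """Return (accuracy, RMS precision) for gaze samples recorded while a
    participant fixates one known verification target.

    gaze_x, gaze_y     -- arrays of gaze coordinates, in degrees of visual angle
    target_x, target_y -- the target's true position, in the same units
    """
    gx = np.asarray(gaze_x, dtype=float)
    gy = np.asarray(gaze_y, dtype=float)

    # Accuracy: mean Euclidean offset between each gaze sample and the target
    # (smaller = more accurate).
    accuracy = np.mean(np.hypot(gx - target_x, gy - target_y))

    # Precision: root-mean-square of successive sample-to-sample distances
    # (smaller = less dispersion, i.e., more precise).
    s2s = np.hypot(np.diff(gx), np.diff(gy))
    precision_rms = np.sqrt(np.mean(s2s ** 2))

    return accuracy, precision_rms

Values computed this way could then be compared against an a priori inclusion criterion of the kind the abstract describes, for example excluding a recording whose mean offset exceeds a chosen threshold.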

X Demographics

The data shown below were collected from the profiles of 5 X users who shared this research output.
Mendeley readers

The data shown below were compiled from readership statistics for 160 Mendeley readers of this research output.

Geographical breakdown

Country | Count | As %
Unknown | 160 | 100%

Demographic breakdown

Readers by professional status | Count | As %
Student > Ph. D. Student | 31 | 19%
Student > Master | 23 | 14%
Researcher | 14 | 9%
Student > Doctoral Student | 13 | 8%
Student > Bachelor | 10 | 6%
Other | 25 | 16%
Unknown | 44 | 28%
Readers by discipline | Count | As %
Psychology | 33 | 21%
Computer Science | 16 | 10%
Neuroscience | 12 | 8%
Engineering | 9 | 6%
Social Sciences | 6 | 4%
Other | 30 | 19%
Unknown | 54 | 34%
Attention Score in Context

This research output has an Altmetric Attention Score of 3. This is our high-level measure of the quality and quantity of online attention that it has received. This Attention Score, as well as the ranking and number of research outputs shown below, was calculated when the research output was last mentioned on 23 May 2018.
All research outputs: #15,406,777 of 26,130,653 outputs
Outputs from Frontiers in Psychology: #14,478 of 35,009 outputs
Outputs of similar age: #180,615 of 346,804 outputs
Outputs of similar age from Frontiers in Psychology: #389 of 658 outputs
Altmetric has tracked 26,130,653 research outputs across all sources so far. This one is in the 40th percentile – i.e., 40% of other outputs scored the same or lower than it.
So far Altmetric has tracked 35,009 research outputs from this source. They typically receive a lot more attention than average, with a mean Attention Score of 13.6. This one has received more attention than average, scoring higher than 56% of its peers.
Older research outputs will score higher simply because they've had more time to accumulate mentions. To account for age we can compare this Altmetric Attention Score to the 346,804 tracked outputs that were published within six weeks on either side of this one in any source. This one is in the 46th percentile – i.e., 46% of its contemporaries scored the same or lower than it.
We're also able to compare this research output to 658 others from the same source and published within six weeks on either side of this one. This one is in the 38th percentile – i.e., 38% of its contemporaries scored the same or lower than it.
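
As a rough arithmetic check, assuming each percentile is derived from the corresponding rank as (1 − rank / total) × 100: for all research outputs this gives (1 − 15,406,777 / 26,130,653) × 100 ≈ 41, consistent with the reported 40th percentile; small differences presumably reflect tied scores.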