
Accuracy and Feasibility of an Android-Based Digital Assessment Tool for Post Stroke Visual Disorders—The StrokeVision App

Overview of attention for an article published in Frontiers in Neurology, March 2018

About this Attention Score

  • Average Attention Score compared to outputs of the same age
  • Above-average Attention Score compared to outputs of the same age and source (54th percentile)

Mentioned by

3 X users

Citations

16 Dimensions

Readers on

59 Mendeley
Title
Accuracy and Feasibility of an Android-Based Digital Assessment Tool for Post Stroke Visual Disorders—The StrokeVision App
Published in
Frontiers in Neurology, March 2018
DOI 10.3389/fneur.2018.00146
Pubmed ID
Authors

Terence J. Quinn, Iain Livingstone, Alexander Weir, Robert Shaw, Andrew Breckenridge, Christine McAlpine, Claire M. Tarbert

Abstract

Visual impairment affects up to 70% of stroke survivors. We designed an app (StrokeVision) to facilitate screening for common post-stroke visual issues (acuity, visual fields, and visual inattention). We sought to describe the test time, feasibility, acceptability, and accuracy of our app-based digital visual assessments against (a) current methods used for bedside screening and (b) gold standard measures. Patients were prospectively recruited from acute stroke settings. Index tests were app-based assessments of fields and inattention performed by a trained researcher. We compared these against usual clinical screening practice: visual fields to confrontation, including inattention assessment (simultaneous stimuli). We also compared the app to gold standard assessments of formal kinetic perimetry (Goldmann or Octopus visual field assessment) and pencil-and-paper tests of inattention (Albert's, Star Cancellation, and Line Bisection). Results of inattention and field tests were adjudicated by a specialist neuro-ophthalmologist. All assessors were masked to each other's results. Participants and assessors graded acceptability using a bespoke scale that ranged from 0 (completely unacceptable) to 10 (perfect acceptability). Of 48 stroke survivors recruited, the complete battery of index and reference tests for fields was successfully completed in 45. Similar acceptability scores were observed for app-based [assessor median score 10 (IQR: 9-10); patient 9 (IQR: 8-10)] and traditional bedside testing [assessor 10 (IQR: 9-10); patient 10 (IQR: 9-10)]. Median test time was longer for app-based testing [combined time to completion of all digital tests 420 s (IQR: 390-588)] when compared with conventional bedside testing [70 s (IQR: 40-70)], but shorter than gold standard testing [1,260 s (IQR: 1,005-1,620)]. Compared with gold standard assessments, usual screening practice demonstrated 79% sensitivity and 82% specificity for detection of a stroke-related field defect. This compares with 79% sensitivity and 88% specificity for StrokeVision digital assessment. StrokeVision shows promise as a screening tool for visual complications in the acute phase of stroke. The app is at least as good as usual screening and offers other functionality that may make it attractive for use in acute stroke. Trial registration: https://ClinicalTrials.gov/ct2/show/NCT02539381.
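For readers unfamiliar with the screening metrics quoted above, sensitivity and specificity follow the standard confusion-matrix definitions: sensitivity is the share of true field defects the screen detects, and specificity is the share of defect-free patients it correctly clears. A minimal Python sketch of that arithmetic follows; the counts are hypothetical, chosen only to reproduce the abstract's rounded figures, and are not the paper's actual 2x2 data.

```python
def sensitivity_specificity(tp, fn, tn, fp):
    """Standard screening-test metrics from a 2x2 confusion matrix."""
    sensitivity = tp / (tp + fn)  # proportion of true defects detected
    specificity = tn / (tn + fp)  # proportion of defect-free patients correctly cleared
    return sensitivity, specificity

# Hypothetical counts for illustration only -- not taken from the paper.
sens, spec = sensitivity_specificity(tp=19, fn=5, tn=21, fp=3)
print(f"sensitivity = {sens:.0%}, specificity = {spec:.0%}")  # 79%, 88%
```

Read against the abstract: at matched sensitivity (79% for both), the higher specificity reported for the app (88% vs. 82%) would mean fewer false-positive screens than usual bedside practice.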

X Demographics

The data shown below were collected from the profiles of the 3 X users who shared this research output.
Mendeley readers

The data shown below were compiled from readership statistics for the 59 Mendeley readers of this research output.

Geographical breakdown

Country  Count  As %
Unknown     59  100%

Demographic breakdown

Readers by professional status  Count  As %
Researcher                         11   19%
Student > PhD Student               8   14%
Student > Doctoral Student          5    8%
Student > Bachelor                  4    7%
Student > Postgraduate              4    7%
Other                              10   17%
Unknown                            17   29%
Readers by discipline           Count  As %
Medicine and Dentistry             10   17%
Neuroscience                        7   12%
Nursing and Health Professions      4    7%
Engineering                         4    7%
Social Sciences                     4    7%
Other                               9   15%
Unknown                            21   36%
Attention Score in Context

This research output has an Altmetric Attention Score of 2. This is our high-level measure of the quality and quantity of online attention that it has received. This Attention Score, as well as the ranking and number of research outputs shown below, was calculated when the research output was last mentioned on 24 June 2019.
All research outputs: #14,096,200 of 23,031,582 outputs
Outputs from Frontiers in Neurology: #5,517 of 11,923 outputs
Outputs of similar age: #180,534 of 329,889 outputs
Outputs of similar age from Frontiers in Neurology: #115 of 263 outputs
Altmetric has tracked 23,031,582 research outputs across all sources so far. This one is in the 37th percentile – i.e., 37% of other outputs scored the same or lower than it.
So far Altmetric has tracked 11,923 research outputs from this source. They typically receive a little more attention than average, with a mean Attention Score of 7.3. This one has received more attention than average, scoring higher than 52% of its peers (attention scores are heavily right-skewed, so an output can outscore the majority of its peers while still falling below the mean).
Older research outputs will score higher simply because they've had more time to accumulate mentions. To account for age, we can compare this Altmetric Attention Score to the 329,889 tracked outputs that were published within six weeks on either side of this one in any source. This one is in the 43rd percentile – i.e., 43% of its contemporaries scored the same or lower than it.
We're also able to compare this research output to 263 others from the same source and published within six weeks on either side of this one. This one has received more attention than average, scoring higher than 54% of its contemporaries.
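All of the percentile statements above follow the same rule: the percentage of comparator outputs that scored the same or lower. A minimal sketch of that computation (an illustration, not Altmetric's actual implementation; the peer scores below are hypothetical):

```python
from bisect import bisect_right

def percentile_rank(score, peer_scores):
    """Percentage of peer outputs scoring the same or lower than `score`."""
    ordered = sorted(peer_scores)
    # bisect_right counts every peer whose score is <= `score`
    return 100 * bisect_right(ordered, score) / len(ordered)

# Hypothetical peer scores for illustration only.
peers = [0, 0, 1, 1, 1, 2, 2, 3, 5, 12]
print(f"{percentile_rank(2, peers):.0f}th percentile")  # -> 70th percentile
```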