Flaws in current human training protocols for spontaneous Brain-Computer Interfaces: lessons learned from instructional design

Overview of attention for article published in Frontiers in Human Neuroscience, January 2013

Mentioned by
2 X users

Citations
240 (Dimensions)

Readers
306 (Mendeley)
Title
Flaws in current human training protocols for spontaneous Brain-Computer Interfaces: lessons learned from instructional design
Published in
Frontiers in Human Neuroscience, January 2013
DOI 10.3389/fnhum.2013.00568
Authors

Fabien Lotte, Florian Larrue, Christian Mühl

Abstract

While recent research on Brain-Computer Interfaces (BCI) has highlighted their potential for many applications, they remain barely used outside laboratories. The main reason is their lack of robustness: with current BCIs, mental state recognition is usually slow and often incorrect. Spontaneous BCIs (i.e., mental imagery-based BCIs) often rely on mutual learning efforts by the user and the machine, with BCI users learning to produce stable ElectroEncephaloGraphy (EEG) patterns (spontaneous BCI control being widely acknowledged as a skill) while the computer learns to recognize these EEG patterns automatically, using signal processing. Most research so far has focused on signal processing, largely neglecting the human in the loop. However, how well the user masters the BCI skill is also a key element explaining BCI robustness: if the user is not able to produce stable and distinct EEG patterns, then no signal processing algorithm will be able to recognize them. Unfortunately, despite the importance of BCI training protocols, they have scarcely been studied so far and have been used mostly unchanged for years. In this paper, we advocate that current human training approaches for spontaneous BCI are most likely inappropriate. In particular, we study the instructional design literature in order to identify the key requirements and guidelines for a successful training procedure that promotes good and efficient skill learning. This literature study highlights that current spontaneous BCI user training procedures satisfy very few of these requirements and hence are likely to be suboptimal. We therefore identify the flaws in BCI training protocols according to instructional design principles, at several levels: in the instructions provided to the user, in the tasks he/she has to perform, and in the feedback provided. For each level, we propose new research directions that are theoretically expected to address some of these flaws and to help users learn the BCI skill more efficiently.
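The machine side of the mutual learning described above is typically a signal-processing pipeline that extracts band-power features from the EEG and feeds them to a classifier. The snippet below is a minimal, purely illustrative Python sketch of that idea, not the method proposed in the paper: it substitutes synthetic data for recorded EEG epochs, assumes a 250 Hz sampling rate, and uses a standard band-power plus linear discriminant analysis pipeline.

```python
# Illustrative sketch only: band-power features + linear classifier,
# a common recipe for motor-imagery (spontaneous) BCI decoding.
import numpy as np
from scipy.signal import butter, filtfilt
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(0)
fs = 250                                     # sampling rate in Hz (assumed)
n_trials, n_channels, n_samples = 100, 8, 2 * fs
X = rng.standard_normal((n_trials, n_channels, n_samples))  # placeholder EEG epochs
y = rng.integers(0, 2, size=n_trials)                       # two imagery classes

# Band-pass filter in the 8-30 Hz range, where motor imagery modulates EEG power
b, a = butter(4, [8, 30], btype="bandpass", fs=fs)
X_filt = filtfilt(b, a, X, axis=-1)

# Log-variance of each channel as a simple band-power feature
features = np.log(np.var(X_filt, axis=-1))

# Train a linear classifier on part of the trials and test on the rest
clf = LinearDiscriminantAnalysis()
clf.fit(features[:80], y[:80])
print("held-out accuracy:", clf.score(features[80:], y[80:]))
```

In a real system the classifier's output would drive the feedback shown to the user, which is exactly the part of the loop the paper argues is under-designed from an instructional point of view.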

X Demographics

The data shown below were collected from the profiles of 2 X users who shared this research output.
Mendeley readers

The data shown below were compiled from readership statistics for 306 Mendeley readers of this research output.

Geographical breakdown

Country          Count   As %
Germany              2    <1%
United States        2    <1%
Russia               2    <1%
France               1    <1%
Netherlands          1    <1%
Italy                1    <1%
Unknown            297    97%

Demographic breakdown

Readers by professional status   Count   As %
Student > Ph. D. Student            60    20%
Student > Master                    54    18%
Researcher                          44    14%
Student > Bachelor                  33    11%
Student > Doctoral Student          21     7%
Other                               39    13%
Unknown                             55    18%
Readers by discipline                  Count   As %
Engineering                               76    25%
Psychology                                37    12%
Computer Science                          36    12%
Neuroscience                              36    12%
Agricultural and Biological Sciences      13     4%
Other                                     32    10%
Unknown                                   76    25%
Attention Score in Context

This research output has an Altmetric Attention Score of 1. This is our high-level measure of the quality and quantity of online attention that it has received. This Attention Score, as well as the ranking and number of research outputs shown below, was calculated when the research output was last mentioned on 01 June 2017.
All research outputs: #15,280,625 of 22,723,682 outputs
Outputs from Frontiers in Human Neuroscience: #5,257 of 7,131 outputs
Outputs of similar age: #181,551 of 280,761 outputs
Outputs of similar age from Frontiers in Human Neuroscience: #681 of 862 outputs
Altmetric has tracked 22,723,682 research outputs across all sources so far. This one is in the 22nd percentile – i.e., 22% of other outputs scored the same or lower than it.
So far Altmetric has tracked 7,131 research outputs from this source. They typically receive a lot more attention than average, with a mean Attention Score of 14.5. This one is in the 20th percentile – i.e., 20% of its peers scored the same or lower than it.
Older research outputs will score higher simply because they've had more time to accumulate mentions. To account for age we can compare this Altmetric Attention Score to the 280,761 tracked outputs that were published within six weeks on either side of this one in any source. This one is in the 25th percentile – i.e., 25% of its contemporaries scored the same or lower than it.
We're also able to compare this research output to 862 others from the same source and published within six weeks on either side of this one. This one is in the 15th percentile – i.e., 15% of its contemporaries scored the same or lower than it.