
Multimodal Neural Network for Rapid Serial Visual Presentation Brain Computer Interface

Overview of attention for article published in Frontiers in Computational Neuroscience, December 2016

About this Attention Score

  • Average Attention Score compared to outputs of the same age
  • Above-average Attention Score compared to outputs of the same age and source (52nd percentile)

Mentioned by

  • X: 2 users

Citations

  • Dimensions: 30 citations

Readers on

  • Mendeley: 61 readers
Title
Multimodal Neural Network for Rapid Serial Visual Presentation Brain Computer Interface
Published in
Frontiers in Computational Neuroscience, December 2016
DOI 10.3389/fncom.2016.00130
Pubmed ID
Authors

Ran Manor, Liran Mishali, Amir B. Geva

Abstract

Brain computer interfaces (BCIs) allow users to perform various tasks using only the electrical activity of the brain. BCI applications often present the user with a set of stimuli and record the corresponding electrical response. The BCI algorithm then has to decode the acquired brain response and perform the desired task. In rapid serial visual presentation (RSVP) tasks, the subject is presented with a continuous stream of images containing rare target images among standard images, and the algorithm has to detect the brain activity associated with target images. In this work, we suggest a multimodal neural network for RSVP tasks. The network operates on the brain response and on the initiating stimulus simultaneously, providing more information for the BCI application. We present two variants of the multimodal network: a supervised model, for the case when the targets are known in advance, and a semi-supervised model for when the targets are unknown. We test the neural networks with an RSVP experiment on satellite imagery carried out with two subjects. The multimodal networks achieve a significant performance improvement in classification metrics. We visualize what the networks have learned and discuss the advantages of using neural network models for BCI applications.
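The abstract describes a two-branch ("multimodal") architecture that consumes the EEG response and the initiating stimulus image at the same time and fuses them for target vs. non-target classification. The following is only a rough sketch of that idea, not the authors' published implementation: the PyTorch framing, layer shapes, and sizes below are assumptions for illustration, and the paper's actual fusion scheme and semi-supervised variant are not reproduced here.

```python
import torch
import torch.nn as nn

class MultimodalRSVPNet(nn.Module):
    """Hypothetical two-branch network: one branch for the EEG epoch,
    one for the stimulus image, fused before a target/non-target head.
    All layer sizes are illustrative, not taken from the paper."""
    def __init__(self, n_channels=64, n_samples=128):
        super().__init__()
        # EEG branch: temporal convolution, then spatial convolution over channels
        self.eeg_branch = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=(1, 7), padding=(0, 3)),
            nn.ReLU(),
            nn.Conv2d(16, 32, kernel_size=(n_channels, 1)),
            nn.ReLU(),
            nn.AdaptiveAvgPool2d((1, 8)),
            nn.Flatten(),            # -> 32 * 8 = 256 features
        )
        # Image branch: small CNN over the initiating stimulus
        self.img_branch = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, stride=2, padding=1),
            nn.ReLU(),
            nn.Conv2d(16, 32, kernel_size=3, stride=2, padding=1),
            nn.ReLU(),
            nn.AdaptiveAvgPool2d((4, 4)),
            nn.Flatten(),            # -> 32 * 4 * 4 = 512 features
        )
        # Fusion + classifier: concatenate both feature vectors
        self.classifier = nn.Sequential(
            nn.Linear(256 + 512, 128),
            nn.ReLU(),
            nn.Linear(128, 2),       # target vs. non-target
        )

    def forward(self, eeg, image):
        # eeg:   (batch, 1, n_channels, n_samples)
        # image: (batch, 3, H, W)
        fused = torch.cat([self.eeg_branch(eeg), self.img_branch(image)], dim=1)
        return self.classifier(fused)

# Example forward pass with dummy tensors
if __name__ == "__main__":
    net = MultimodalRSVPNet()
    eeg = torch.randn(4, 1, 64, 128)
    img = torch.randn(4, 3, 64, 64)
    print(net(eeg, img).shape)  # torch.Size([4, 2])
```

Concatenating the two feature vectors before a shared classifier is one common fusion choice; the published model may combine the modalities differently.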


X Demographics

The data shown below were collected from the profiles of 2 X users who shared this research output.
Mendeley readers

The data shown below were compiled from readership statistics for 61 Mendeley readers of this research output.

Geographical breakdown

Country Count As %
Denmark 1 2%
Unknown 60 98%

Demographic breakdown

Readers by professional status Count As %
Student > Ph.D. Student 12 20%
Student > Master 10 16%
Researcher 7 11%
Student > Doctoral Student 4 7%
Student > Bachelor 3 5%
Other 9 15%
Unknown 16 26%
Readers by discipline Count As %
Computer Science 16 26%
Engineering 15 25%
Agricultural and Biological Sciences 3 5%
Neuroscience 3 5%
Linguistics 2 3%
Other 6 10%
Unknown 16 26%
Attention Score in Context

This research output has an Altmetric Attention Score of 1. This is our high-level measure of the quality and quantity of online attention that it has received. This Attention Score, as well as the ranking and number of research outputs shown below, was calculated when the research output was last mentioned on 28 December 2016.
All research outputs: #14,422,753 of 23,566,295 outputs
Outputs from Frontiers in Computational Neuroscience: #643 of 1,380 outputs
Outputs of similar age: #226,865 of 424,159 outputs
Outputs of similar age from Frontiers in Computational Neuroscience: #15 of 34 outputs
Altmetric has tracked 23,566,295 research outputs across all sources so far. This one is in the 37th percentile – i.e., 37% of other outputs scored the same or lower than it.
So far Altmetric has tracked 1,380 research outputs from this source. They typically receive a little more attention than average, with a mean Attention Score of 6.3. This one is in the 49th percentile – i.e., 49% of its peers scored the same or lower than it.
Older research outputs will score higher simply because they've had more time to accumulate mentions. To account for age we can compare this Altmetric Attention Score to the 424,159 tracked outputs that were published within six weeks on either side of this one in any source. This one is in the 45th percentile – i.e., 45% of its contemporaries scored the same or lower than it.
We're also able to compare this research output to 34 others from the same source and published within six weeks on either side of this one. This one has gotten more attention than average, scoring higher than 52% of its contemporaries.