
Neuromorphic Event-Based 3D Pose Estimation

Overview of attention for article published in Frontiers in Neuroscience, January 2016

Mentioned by

googleplus
2 Google+ users
video
1 YouTube creator

Readers on

mendeley
88 Mendeley
Title
Neuromorphic Event-Based 3D Pose Estimation
Published in
Frontiers in Neuroscience, January 2016
DOI 10.3389/fnins.2015.00522
Pubmed ID
Authors

David Reverter Valeiras, Garrick Orchard, Sio-Hoi Ieng, Ryad B. Benosman

Abstract

Pose estimation is a fundamental step in many artificial vision tasks. It consists of estimating the 3D pose of an object with respect to a camera from the object's 2D projection. Current state-of-the-art implementations operate on images; they are computationally expensive, especially for real-time applications. Scenes with fast dynamics exceeding 30-60 Hz can rarely be processed in real-time using conventional hardware. This paper presents a new method for event-based 3D object pose estimation, making full use of the high temporal resolution (1 μs) of asynchronous visual events output from a single neuromorphic camera. Given an initial estimate of the pose, each incoming event is used to update the pose by combining both 3D and 2D criteria. We show that the asynchronous high temporal resolution of the neuromorphic camera allows us to solve the problem in an incremental manner, achieving real-time performance at an update rate of several hundred kHz on a conventional laptop. We show that the high temporal resolution of neuromorphic cameras is a key feature for performing accurate pose estimation. Experiments are provided showing the performance of the algorithm on real data, including fast moving objects, occlusions, and cases where the neuromorphic camera and the object are both in motion.
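The incremental, per-event scheme described in the abstract can be sketched as follows. This is an illustrative toy, not the authors' algorithm: the event tuple layout and the pull-toward-reprojection update rule are assumptions made purely for demonstration of the "one small update per event" idea.

```python
def process_events(events, pose0, update):
    """Event-driven estimation: the pose is refined once per incoming
    event, pose_{k+1} = update(pose_k, e_k), rather than once per frame."""
    pose = list(pose0)
    for ev in events:
        pose = update(pose, ev)
    return pose

def toy_update(pose, ev, gain=0.1):
    """Hypothetical update rule: nudge the translation (tx, ty, tz) so the
    pinhole projection (tx/tz, ty/tz) moves toward the event pixel (x, y)."""
    t_us, x, y, polarity = ev  # microsecond timestamp, pixel coords, polarity
    tx, ty, tz = pose
    return [tx + gain * (x * tz - tx),   # small step toward consistency in x
            ty + gain * (y * tz - ty),   # small step toward consistency in y
            tz]                          # depth left untouched in this toy

# 50 events all firing at pixel (1.0, 2.0); the estimate converges there.
events = [(k, 1.0, 2.0, +1) for k in range(50)]
pose = process_events(events, pose0=(0.0, 0.0, 1.0), update=toy_update)
```

Because each update touches only one event, the cost per update is tiny, which is what makes the several-hundred-kHz update rates reported in the abstract plausible on conventional hardware.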


Mendeley readers


The data shown below were compiled from readership statistics for 88 Mendeley readers of this research output.

Geographical breakdown

Country         Count   As %
Singapore           2     2%
United States       1     1%
Switzerland         1     1%
Unknown            84    95%

Demographic breakdown

Readers by professional status     Count   As %
Student > Ph.D. Student               22    25%
Researcher                            16    18%
Student > Master                      13    15%
Professor > Associate Professor        6     7%
Professor                              3     3%
Other                                  8     9%
Unknown                               20    23%

Readers by discipline              Count   As %
Engineering                           31    35%
Computer Science                      26    30%
Neuroscience                           3     3%
Social Sciences                        2     2%
Energy                                 1     1%
Other                                  2     2%
Unknown                               23    26%
Attention Score in Context


This research output has an Altmetric Attention Score of 3. This is our high-level measure of the quality and quantity of online attention that it has received. This Attention Score, as well as the ranking and number of research outputs shown below, was calculated when the research output was last mentioned on 18 February 2016.
All research outputs:                                    #14,914,476 of 25,374,647 outputs
Outputs from Frontiers in Neuroscience:                  #6,085 of 11,542 outputs
Outputs of similar age:                                  #199,820 of 403,895 outputs
Outputs of similar age from Frontiers in Neuroscience:   #85 of 157 outputs
Altmetric has tracked 25,374,647 research outputs across all sources so far. This one is in the 40th percentile – i.e., 40% of other outputs scored the same or lower than it.
So far Altmetric has tracked 11,542 research outputs from this source. They typically receive a lot more attention than average, with a mean Attention Score of 10.9. This one is in the 45th percentile – i.e., 45% of its peers scored the same or lower than it.
Older research outputs will score higher simply because they've had more time to accumulate mentions. To account for age we can compare this Altmetric Attention Score to the 403,895 tracked outputs that were published within six weeks on either side of this one in any source. This one is in the 49th percentile – i.e., 49% of its contemporaries scored the same or lower than it.
We're also able to compare this research output to 157 others from the same source and published within six weeks on either side of this one. This one is in the 43rd percentile – i.e., 43% of its contemporaries scored the same or lower than it.
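The percentile figures above follow directly from the ranks listed. A sketch of the arithmetic, under the simplifying assumption that the percentile is just the fraction of tracked outputs ranked below this one (Altmetric's own rounding and tie handling likely explain why its published figures land a point or two away from this naive computation):

```python
def percentile_from_rank(rank, total):
    """Share of tracked outputs scoring the same or lower, given a 1-based
    rank where #1 is the highest-scoring output. Simplified relative to
    whatever tie handling Altmetric applies internally."""
    return 100.0 * (total - rank) / total

# "All research outputs": ranked #14,914,476 of 25,374,647 tracked outputs
p_all = percentile_from_rank(14_914_476, 25_374_647)  # close to the published 40th percentile
```

The same formula applied to the other three rows reproduces the reported percentiles to within a couple of points.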