
Low-Latency Line Tracking Using Event-Based Dynamic Vision Sensors

Overview of attention for article published in Frontiers in Neurorobotics, February 2018

About this Attention Score

  • Above-average Attention Score compared to outputs of the same age (64th percentile)
  • Good Attention Score compared to outputs of the same age and source (78th percentile)

Mentioned by

  • X (Twitter): 3 users
  • Patents: 3

Readers on

  • Mendeley: 61 readers
Title
Low-Latency Line Tracking Using Event-Based Dynamic Vision Sensors
Published in
Frontiers in Neurorobotics, February 2018
DOI 10.3389/fnbot.2018.00004
Authors

Lukas Everding, Jörg Conradt

Abstract

In order to safely navigate and orient in their local surroundings, autonomous systems need to rapidly extract and persistently track visual features from the environment. While there are many algorithms tackling these tasks for traditional frame-based cameras, they have to deal with the fact that conventional cameras sample their environment at a fixed frequency. Most prominently, the same features have to be found in consecutive frames, and corresponding features then need to be matched using elaborate techniques, as any information between the two frames is lost. We introduce a novel method to detect and track line structures in the data streams of event-based silicon retinae [also known as dynamic vision sensors (DVS)]. In contrast to conventional cameras, these biologically inspired sensors generate a quasi-continuous stream of visual information analogous to the information stream created by the ganglion cells in mammalian retinae. All pixels of a DVS operate asynchronously, without a periodic sampling rate, and emit a so-called DVS address event as soon as they perceive a luminance change exceeding an adjustable threshold. We use the high temporal resolution of the DVS to track features continuously through time instead of only at fixed points in time. The focus of this work lies on tracking lines in a mostly static environment observed by a moving camera, a typical setting in mobile robotics. Since DVS events are mostly generated at object boundaries and edges, which in man-made environments often form lines, lines were chosen as the feature to track. Our method is based on detecting planes of DVS address events in x-y-t space and tracing these planes through time. It is robust against noise and runs in real time on a standard computer, making it suitable for low-latency robotics. The efficacy and performance are evaluated on real-world data sets showing artificial structures in an office building, using event data for tracking and frame data from a DAVIS240C sensor for ground-truth estimation.
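The core geometric idea in the abstract — that address events generated by a moving line lie approximately on a plane in x-y-t space — can be sketched with an ordinary least-squares plane fit. This is an illustrative reconstruction under simplifying assumptions, not the authors' implementation; the function names and the synthetic event data below are hypothetical.

```python
import numpy as np

def fit_event_plane(events):
    """Least-squares fit of a plane t = a*x + b*y + c to DVS address events.

    events: (N, 3) array of (x, y, t) address events, assumed to originate
    from a single moving line/edge.  Returns the plane parameters (a, b, c).
    """
    events = np.asarray(events, dtype=float)
    # Design matrix [x, y, 1]; solve for the time coordinate.
    A = np.column_stack([events[:, 0], events[:, 1], np.ones(len(events))])
    (a, b, c), *_ = np.linalg.lstsq(A, events[:, 2], rcond=None)
    return a, b, c

def line_at_time(a, b, c, t):
    """Slice the fitted plane at time t: the tracked line is the set of
    pixels satisfying A*x + B*y + C = 0 with (A, B, C) = (a, b, c - t)."""
    return a, b, c - t

# Synthetic events from a vertical edge sweeping rightward at 1 px per
# time unit (edge at x = t), so every event satisfies t = 1*x + 0*y + 0.
x = np.arange(10.0)
y = np.array([3., 7., 1., 9., 4., 8., 2., 6., 0., 5.])  # arbitrary pixel rows
t = x.copy()
a, b, c = fit_event_plane(np.column_stack([x, y, t]))
# a ≈ 1, b ≈ 0, c ≈ 0: the plane's slope along x encodes the edge's motion.
```

Intersecting the fitted plane with a time slice recovers the line's image position at any instant, which is what allows continuous tracking rather than frame-by-frame re-detection.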


X Demographics


The data shown below were collected from the profiles of the 3 X users who shared this research output.
Mendeley readers


The data shown below were compiled from readership statistics for the 61 Mendeley readers of this research output.

Geographical breakdown

Country    Count   As %
Unknown    61      100%

Demographic breakdown

Readers by professional status   Count   As %
Student > Ph. D. Student         17      28%
Student > Master                 10      16%
Researcher                        6      10%
Student > Doctoral Student        4       7%
Lecturer                          1       2%
Other                             4       7%
Unknown                          19      31%
Readers by discipline                   Count   As %
Engineering                             25      41%
Computer Science                         8      13%
Agricultural and Biological Sciences     2       3%
Arts and Humanities                      1       2%
Unspecified                              1       2%
Other                                    6      10%
Unknown                                 18      30%
Attention Score in Context


This research output has an Altmetric Attention Score of 4. This is our high-level measure of the quality and quantity of online attention that it has received. This Attention Score, as well as the ranking and number of research outputs shown below, was calculated when the research output was last mentioned on 07 August 2024.
All research outputs: #6,578,356 of 23,275,636 outputs
Outputs from Frontiers in Neurorobotics: #168 of 901 outputs
Outputs of similar age: #115,922 of 331,559 outputs
Outputs of similar age from Frontiers in Neurorobotics: #3 of 14 outputs
Altmetric has tracked 23,275,636 research outputs across all sources so far. This one has received more attention than most of these and is in the 70th percentile.
So far Altmetric has tracked 901 research outputs from this source. They receive a mean Attention Score of 4.1. This one has done well, scoring higher than 81% of its peers.
Older research outputs will score higher simply because they've had more time to accumulate mentions. To account for age we can compare this Altmetric Attention Score to the 331,559 tracked outputs that were published within six weeks on either side of this one in any source. This one has received more attention than average, scoring higher than 64% of its contemporaries.
We're also able to compare this research output to 14 others from the same source and published within six weeks on either side of this one. This one has done well, scoring higher than 78% of its contemporaries.