
Conversion of Continuous-Valued Deep Networks to Efficient Event-Driven Networks for Image Classification

Overview of attention for an article published in Frontiers in Neuroscience, December 2017

About this Attention Score

  • In the top 25% of all research outputs scored by Altmetric
  • High Attention Score compared to outputs of the same age (93rd percentile)
  • High Attention Score compared to outputs of the same age and source (94th percentile)

Mentioned by

  • 2 blogs
  • 3 X users
  • 7 patents
  • 1 Redditor

Citations

  • 675 Dimensions

Readers on

  • 425 Mendeley
Title
Conversion of Continuous-Valued Deep Networks to Efficient Event-Driven Networks for Image Classification
Published in
Frontiers in Neuroscience, December 2017
DOI 10.3389/fnins.2017.00682
Pubmed ID
Authors

Bodo Rueckauer, Iulia-Alexandra Lungu, Yuhuang Hu, Michael Pfeiffer, Shih-Chii Liu

Abstract

Spiking neural networks (SNNs) can potentially offer an efficient way of doing inference because the neurons in the networks are sparsely activated and computations are event-driven. Previous work showed that simple continuous-valued deep Convolutional Neural Networks (CNNs) can be converted into accurate spiking equivalents. These networks did not include certain common operations such as max-pooling, softmax, batch-normalization and Inception-modules. This paper presents spiking equivalents of these operations, thereby allowing conversion of nearly arbitrary CNN architectures. We show conversion of popular CNN architectures, including VGG-16 and Inception-v3, into SNNs that produce the best results reported to date on MNIST, CIFAR-10 and the challenging ImageNet dataset. SNNs can trade off classification error rate against the number of available operations, whereas deep continuous-valued neural networks require a fixed number of operations to achieve their classification error rate. From the examples of LeNet for MNIST and BinaryNet for CIFAR-10, we show that with an increase in error rate of a few percentage points, the SNNs can achieve more than 2x reductions in operations compared to the original CNNs. This highlights the potential of SNNs, in particular when deployed on power-efficient neuromorphic spiking neuron chips for use in embedded applications.
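
The conversion principle summarized above can be illustrated with a small, self-contained sketch. This is an illustrative toy model rather than the authors' published code: the neuron model, threshold, and simulation length are assumptions. The idea is that an integrate-and-fire neuron with reset-by-subtraction, driven by a constant input, fires at a rate proportional to the ReLU activation of the corresponding CNN unit, so spike counts accumulated over a simulation window approximate the analog activations of the original network.

```python
# Minimal sketch of the rate-coding idea behind ANN-to-SNN conversion.
# Illustrative only: threshold, timestep count, and input range are assumed.
import numpy as np

def relu(x):
    return np.maximum(0.0, x)

def if_firing_rate(input_current, timesteps=1000, v_thresh=1.0):
    """Simulate an integrate-and-fire neuron with reset-by-subtraction and
    return its firing rate in spikes per timestep."""
    v = 0.0
    spikes = 0
    for _ in range(timesteps):
        v += input_current            # integrate a constant input current
        if v >= v_thresh:
            spikes += 1
            v -= v_thresh             # subtract the threshold to keep residual charge
    return spikes / timesteps

# For inputs in [0, 1] the spike rate approximates the ReLU activation;
# negative inputs never reach threshold, matching ReLU's zero output.
for a in (-0.3, 0.0, 0.25, 0.5, 0.9):
    print(f"activation {a:+.2f} | ReLU {relu(a):.2f} | IF rate {if_firing_rate(a):.2f}")
```

In a full conversion pipeline along these lines, weights and thresholds are additionally normalized so that activations stay within the range a bounded firing rate can represent during the finite simulation window.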

X Demographics

The data shown below were collected from the profiles of 3 X users who shared this research output.
Mendeley readers

The data shown below were compiled from readership statistics for 425 Mendeley readers of this research output.

Geographical breakdown

Country    Count    As %
Unknown      425    100%

Demographic breakdown

Readers by professional status    Count    As %
Student > Ph. D. Student             75     18%
Student > Master                     67     16%
Researcher                           39      9%
Student > Bachelor                   23      5%
Student > Doctoral Student           21      5%
Other                                44     10%
Unknown                             156     37%

Readers by discipline                    Count    As %
Computer Science                           106     25%
Engineering                                 93     22%
Neuroscience                                29      7%
Physics and Astronomy                       10      2%
Agricultural and Biological Sciences         3     <1%
Other                                       22      5%
Unknown                                    162     38%
Attention Score in Context

This research output has an Altmetric Attention Score of 30. This is our high-level measure of the quality and quantity of online attention that it has received. This Attention Score, as well as the ranking and number of research outputs shown below, was calculated when the research output was last mentioned on 25 January 2024.
  • All research outputs: #1,384,749 of 26,383,000 outputs
  • Outputs from Frontiers in Neuroscience: #632 of 11,827 outputs
  • Outputs of similar age: #29,816 of 452,431 outputs
  • Outputs of similar age from Frontiers in Neuroscience: #11 of 184 outputs
Altmetric has tracked 26,383,000 research outputs across all sources so far. Compared to these, this one has done particularly well and is in the 94th percentile: it's in the top 10% of all research outputs ever tracked by Altmetric.
So far Altmetric has tracked 11,827 research outputs from this source. They typically receive a lot more attention than average, with a mean Attention Score of 11.3. This one has done particularly well, scoring higher than 94% of its peers.
Older research outputs will score higher simply because they've had more time to accumulate mentions. To account for age we can compare this Altmetric Attention Score to the 452,431 tracked outputs that were published within six weeks on either side of this one in any source. This one has done particularly well, scoring higher than 93% of its contemporaries.
We're also able to compare this research output to 184 others from the same source and published within six weeks on either side of this one. This one has done particularly well, scoring higher than 94% of its contemporaries.
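
As a rough cross-check of the percentile statements above (an illustrative calculation only; Altmetric's exact methodology is not reproduced here), each percentile follows from the rank and total listed earlier:

```python
# Illustrative arithmetic: percentage of outputs ranked below this one,
# truncated to a whole number. Not Altmetric's internal scoring method.
def percentile_from_rank(rank, total):
    return int((1 - rank / total) * 100)

print(percentile_from_rank(1_384_749, 26_383_000))  # 94 -- all research outputs
print(percentile_from_rank(632, 11_827))            # 94 -- outputs from Frontiers in Neuroscience
print(percentile_from_rank(29_816, 452_431))        # 93 -- outputs of similar age
print(percentile_from_rank(11, 184))                # 94 -- similar-age outputs from the same source
```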