
Accelerating Inference of Convolutional Neural Networks Using In-memory Computing

Overview of attention for article published in Frontiers in Computational Neuroscience, August 2021

About this Attention Score

  • In the top 25% of all research outputs scored by Altmetric
  • High Attention Score compared to outputs of the same age (85th percentile)
  • High Attention Score compared to outputs of the same age and source (95th percentile)

Mentioned by

  • 10 X users
  • 3 patents

Citations

  • 18 Dimensions

Readers on

  • 22 Mendeley
Title
Accelerating Inference of Convolutional Neural Networks Using In-memory Computing
Published in
Frontiers in Computational Neuroscience, August 2021
DOI 10.3389/fncom.2021.674154
Authors

Martino Dazzi, Abu Sebastian, Luca Benini, Evangelos Eleftheriou

Abstract

In-memory computing (IMC) is a non-von Neumann paradigm that has recently established itself as a promising approach for energy-efficient, high-throughput hardware for deep learning applications. One prominent application of IMC is that of performing matrix-vector multiplication in O(1) time complexity by mapping the synaptic weights of a neural-network layer to the devices of an IMC core. However, because of the significantly different pattern of execution compared to previous computational paradigms, IMC requires a rethinking of the architectural design choices made when designing deep-learning hardware. In this work, we focus on application-specific IMC hardware for inference of Convolutional Neural Networks (CNNs), and provide methodologies for implementing the various architectural components of the IMC core. Specifically, we present methods for mapping synaptic weights and activations on the memory structures and give evidence of the various trade-offs therein, such as the one between on-chip memory requirements and execution latency. Lastly, we show how to employ these methods to implement a pipelined dataflow that offers throughput and latency beyond the state of the art for image classification tasks.
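
The central operation described in the abstract, mapping a CNN layer's synaptic weights onto an IMC core so that a full matrix-vector multiplication completes in a single analog step, can be illustrated with a short simulation. The sketch below is not the authors' implementation: the class IMCCrossbar, the helper conv_layer_as_matrix, and the additive-noise model are illustrative assumptions, with the O(1) analog operation emulated by a single matrix multiply.

```python
# Minimal sketch (not the paper's implementation): a CNN layer's kernels are
# unrolled into a 2-D weight matrix, "programmed" onto an in-memory-computing
# crossbar, and a matrix-vector multiplication is carried out in one step.
import numpy as np


class IMCCrossbar:
    """Toy model of an IMC core: weights stay stationary on the array."""

    def __init__(self, weights: np.ndarray, noise_std: float = 0.01):
        # Programming the devices once corresponds to mapping the synaptic
        # weights of one network layer onto the memory array.
        self.conductances = weights.astype(np.float32)
        self.noise_std = noise_std

    def mvm(self, activations: np.ndarray) -> np.ndarray:
        # The analog array computes the whole matrix-vector product at once
        # (O(1) in time with respect to the matrix dimensions); here it is
        # emulated with a single matmul plus additive Gaussian noise standing
        # in for device and circuit non-idealities.
        ideal = self.conductances @ activations
        return ideal + np.random.normal(0.0, self.noise_std, ideal.shape)


def conv_layer_as_matrix(kernels: np.ndarray) -> np.ndarray:
    """Unroll conv kernels (out_ch, in_ch, kh, kw) into a 2-D weight matrix,
    one row per output channel, as is commonly done when mapping a
    convolutional layer onto a crossbar."""
    out_ch = kernels.shape[0]
    return kernels.reshape(out_ch, -1)


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    kernels = rng.standard_normal((64, 32, 3, 3))      # a 3x3 conv layer
    core = IMCCrossbar(conv_layer_as_matrix(kernels))  # program weights once

    patch = rng.standard_normal(32 * 3 * 3)            # one im2col input patch
    out = core.mvm(patch)                              # one "analog" MVM
    print(out.shape)                                   # (64,)
```

Because the weights remain stationary on the array, the cost of a forward pass is dominated by how activations are moved between cores, which is the kind of mapping and dataflow trade-off the paper examines.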

X Demographics

The data shown below were collected from the profiles of the 10 X users who shared this research output.
Mendeley readers

The data shown below were compiled from readership statistics for the 22 Mendeley readers of this research output.

Geographical breakdown

Country Count As %
Unknown 22 100%

Demographic breakdown

Readers by professional status Count As %
Student > Ph. D. Student 4 18%
Researcher 3 14%
Student > Bachelor 1 5%
Lecturer 1 5%
Student > Doctoral Student 1 5%
Other 1 5%
Unknown 11 50%
Readers by discipline Count As %
Engineering 4 18%
Biochemistry, Genetics and Molecular Biology 1 5%
Chemical Engineering 1 5%
Materials Science 1 5%
Computer Science 1 5%
Other 0 0%
Unknown 14 64%
Attention Score in Context

This research output has an Altmetric Attention Score of 13. This is our high-level measure of the quality and quantity of online attention that it has received. This Attention Score, as well as the ranking and number of research outputs shown below, was calculated when the research output was last mentioned on 04 April 2024.
All research outputs: #2,877,794 of 26,485,427 outputs
Outputs from Frontiers in Computational Neuroscience: #103 of 1,498 outputs
Outputs of similar age: #63,956 of 444,436 outputs
Outputs of similar age from Frontiers in Computational Neuroscience: #1 of 24 outputs
Altmetric has tracked 26,485,427 research outputs across all sources so far. Compared to these, this one has done well and is in the 89th percentile: it's in the top 25% of all research outputs ever tracked by Altmetric.
So far Altmetric has tracked 1,498 research outputs from this source. They typically receive a little more attention than average, with a mean Attention Score of 6.9. This one has done particularly well, scoring higher than 93% of its peers.
Older research outputs will score higher simply because they've had more time to accumulate mentions. To account for age we can compare this Altmetric Attention Score to the 444,436 tracked outputs that were published within six weeks on either side of this one in any source. This one has done well, scoring higher than 85% of its contemporaries.
We're also able to compare this research output to 24 others from the same source and published within six weeks on either side of this one. This one has done particularly well, scoring higher than 95% of its contemporaries.