TraNNsformer: Neural network transformation for memristive crossbar based neuromorphic system design

Aayush Ankit, Abhronil Sengupta, Kaushik Roy

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

13 Citations (Scopus)

Abstract

Implementation of neuromorphic systems using Memristive Crossbar Arrays (MCAs) based on post-Complementary Metal-Oxide-Semiconductor (CMOS) technology has emerged as a promising solution for low-power acceleration of neural networks. However, the recent trend of designing Deep Neural Networks (DNNs) to achieve human-like cognitive abilities poses significant challenges to the scalable design of neuromorphic systems, owing to the increase in computation and storage demands. Network pruning [7] is a powerful technique that removes redundant connections to produce optimally connected (maximally sparse) DNNs. However, such pruning techniques induce irregular connectivity that is incompatible with the crossbar structure, ultimately yielding DNNs with highly inefficient hardware realizations in terms of area and energy. In this work, we propose TraNNsformer, an integrated training framework that transforms DNNs to enable their efficient realization on MCA-based systems. TraNNsformer first prunes the connectivity matrix while forming clusters from the remaining connections. Subsequently, it retrains the network to fine-tune the connections and reinforce the clusters. This process is applied iteratively to transform the original connectivity into an optimally pruned and maximally clustered mapping. We evaluated the proposed framework by transforming different Multi-Layer Perceptron (MLP) based Spiking Neural Networks (SNNs) on a wide range of datasets (MNIST, SVHN and CIFAR10) and executing them on MCA-based systems to analyze the area and energy benefits. Without accuracy loss, TraNNsformer reduces the area (energy) consumption by 28%-55% (49%-67%) with respect to the original network. Compared to network pruning alone, TraNNsformer achieves 28%-49% (15%-29%) area (energy) savings. Furthermore, TraNNsformer is a technology-aware framework that allows mapping a given DNN to any MCA size permitted by the memristive technology for reliable operation.
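The key observation in the abstract is that pruning alone leaves nonzero weights scattered across the weight matrix, so nearly every fixed-size crossbar tile still holds at least one connection, while clustering packs the survivors into few tiles. The sketch below illustrates that cost model only; the names (`CROSSBAR`, `prune`, `crossbars_needed`) and the tile-counting rule are hypothetical illustrations, not the paper's actual algorithm or API.

```python
# Illustrative sketch (assumed names, not the paper's implementation):
# count how many fixed-size MCA tiles a sparse weight matrix occupies.

CROSSBAR = 4  # assumed crossbar dimension: each tile is CROSSBAR x CROSSBAR

def prune(W, threshold):
    """Magnitude-based pruning: zero out weights with |w| below threshold."""
    return [[w if abs(w) >= threshold else 0.0 for w in row] for row in W]

def crossbars_needed(W, size=CROSSBAR):
    """Count size x size tiles containing at least one surviving weight.
    Under this simple model, every occupied tile costs one physical
    crossbar (and its peripheral circuitry) in the hardware mapping."""
    rows, cols = len(W), len(W[0])
    occupied = 0
    for r0 in range(0, rows, size):
        for c0 in range(0, cols, size):
            if any(W[r][c] != 0.0
                   for r in range(r0, min(r0 + size, rows))
                   for c in range(c0, min(c0 + size, cols))):
                occupied += 1
    return occupied

# Scattered sparsity (what unconstrained pruning tends to leave):
# four weights, one per tile, keep all four tiles of an 8x8 matrix busy.
scattered = [[0.0] * 8 for _ in range(8)]
for r, c in [(0, 0), (0, 4), (4, 0), (4, 4)]:
    scattered[r][c] = 1.0

# Clustered sparsity (what the transform aims for): the same four
# weights packed into a single tile.
clustered = [[0.0] * 8 for _ in range(8)]
for r, c in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    clustered[r][c] = 1.0

print(crossbars_needed(scattered))  # 4 tiles -> 4 crossbars
print(crossbars_needed(clustered))  # 1 tile  -> 1 crossbar
```

Both matrices have identical sparsity, yet the clustered layout needs a quarter of the crossbars; this is the area/energy gap the iterative prune-cluster-retrain loop is designed to close.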

Original language: English (US)
Title of host publication: 2017 IEEE/ACM International Conference on Computer-Aided Design, ICCAD 2017
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 533-540
Number of pages: 8
ISBN (Electronic): 9781538630938
DOIs: 10.1109/ICCAD.2017.8203823
State: Published - Dec 13 2017
Event: 36th IEEE/ACM International Conference on Computer-Aided Design, ICCAD 2017 - Irvine, United States
Duration: Nov 13 2017 - Nov 16 2017

Publication series

Name: IEEE/ACM International Conference on Computer-Aided Design, Digest of Technical Papers, ICCAD
Volume: 2017-November
ISSN (Print): 1092-3152

Other

Other: 36th IEEE/ACM International Conference on Computer-Aided Design, ICCAD 2017
Country: United States
City: Irvine
Period: 11/13/17 - 11/16/17

Fingerprint

  • Systems analysis
  • Neural networks
  • Multilayer neural networks
  • Energy conservation
  • Energy utilization
  • Deep neural networks
  • Hardware
  • Metals

All Science Journal Classification (ASJC) codes

  • Software
  • Computer Science Applications
  • Computer Graphics and Computer-Aided Design

Cite this

Ankit, A., Sengupta, A., & Roy, K. (2017). TraNNsformer: Neural network transformation for memristive crossbar based neuromorphic system design. In 2017 IEEE/ACM International Conference on Computer-Aided Design, ICCAD 2017 (pp. 533-540). (IEEE/ACM International Conference on Computer-Aided Design, Digest of Technical Papers, ICCAD; Vol. 2017-November). Institute of Electrical and Electronics Engineers Inc. https://doi.org/10.1109/ICCAD.2017.8203823