Gaussian synapses for probabilistic neural networks

Amritanand Sebastian, Andrew Pannone, Shiva Subbulakshmi Radhakrishnan, Saptarshi Das

Research output: Contribution to journal › Article

4 Scopus citations

Abstract

The recent decline in energy, size and complexity scaling of the traditional von Neumann architecture has resurrected considerable interest in brain-inspired computing. Artificial neural networks (ANNs) based on emerging devices, such as memristors, achieve brain-like computing but lack energy efficiency. Furthermore, slow learning, incremental adaptation and false convergence remain unresolved challenges for ANNs. In this article, we therefore introduce Gaussian synapses based on heterostructures of atomically thin two-dimensional (2D) layered materials, namely molybdenum disulfide and black phosphorus field effect transistors (FETs), as a class of analog and probabilistic computational primitives for the hardware implementation of statistical neural networks. We also demonstrate complete tunability of the amplitude, mean and standard deviation of the Gaussian synapse via threshold engineering in dual-gated molybdenum disulfide and black phosphorus FETs. Finally, we show simulation results for the classification of brainwaves using Gaussian-synapse-based probabilistic neural networks.
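As an illustration of the idea described in the abstract, the following is a minimal software sketch, not the authors' circuit-level model: it assumes an idealized Gaussian transfer curve standing in for the threshold-engineered MoS2/BP FET pair, and a textbook Specht-style probabilistic neural network whose pattern units are those Gaussian kernels. All parameter values and names (e.g. `gaussian_synapse`, `ProbabilisticNN`, `sigma`) are illustrative assumptions, not measured device characteristics.

```python
import numpy as np

def gaussian_synapse(v_in, amplitude=1.0, mean=0.0, sigma=1.0):
    """Idealized transfer curve of a Gaussian synapse: output current
    follows A * exp(-(v_in - mean)^2 / (2 * sigma^2)). Amplitude, mean and
    sigma stand in for the tunable, threshold-engineered parameters of the
    dual-gated MoS2/BP FET pair (illustrative values only)."""
    return amplitude * np.exp(-((v_in - mean) ** 2) / (2.0 * sigma ** 2))

class ProbabilisticNN:
    """Minimal Specht-style probabilistic neural network whose pattern
    units are Gaussian kernels, i.e. software stand-ins for Gaussian
    synapses; a hypothetical sketch, not the paper's simulation."""

    def __init__(self, sigma=0.5):
        self.sigma = sigma

    def fit(self, X, y):
        # A PNN simply memorizes the training set; each stored sample
        # becomes one pattern unit.
        self.X_ = np.asarray(X, dtype=float)
        self.y_ = np.asarray(y)
        self.classes_ = np.unique(self.y_)
        return self

    def predict(self, X):
        X = np.atleast_2d(np.asarray(X, dtype=float))
        preds = []
        for x in X:
            # Pattern layer: Gaussian kernel response of every stored sample.
            d2 = np.sum((self.X_ - x) ** 2, axis=1)
            k = np.exp(-d2 / (2.0 * self.sigma ** 2))
            # Summation layer: mean response per class; output layer: argmax.
            scores = [k[self.y_ == c].mean() for c in self.classes_]
            preds.append(self.classes_[int(np.argmax(scores))])
        return np.array(preds)
```

In this reading, each pattern unit plays the role of a Gaussian synapse whose mean and standard deviation are set by gate biasing rather than by software parameters; the classifier structure itself (pattern, summation and output layers) is the standard PNN arrangement the abstract alludes to for brainwave classification.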

Original language: English (US)
Article number: 4199
Journal: Nature Communications
Volume: 10
Issue number: 1
DOIs: https://doi.org/10.1038/s41467-019-12035-6
State: Published - Dec 1 2019


All Science Journal Classification (ASJC) codes

  • Chemistry (all)
  • Biochemistry, Genetics and Molecular Biology (all)
  • Physics and Astronomy (all)

Cite this

Sebastian, A., Pannone, A., Subbulakshmi Radhakrishnan, S., & Das, S. (2019). Gaussian synapses for probabilistic neural networks. Nature Communications, 10(1), [4199]. https://doi.org/10.1038/s41467-019-12035-6