Fast computation with spikes in a recurrent neural network

Dezhe Z. Jin, H. Sebastian Seung

Research output: Contribution to journal › Article

22 Citations (Scopus)

Abstract

Neural networks with recurrent connections are sometimes regarded as too slow at computation to serve as models of the brain. Here we analytically study a counterexample, a network consisting of N integrate-and-fire neurons with self excitation, all-to-all inhibition, instantaneous synaptic coupling, and constant external driving inputs. When the inhibition and/or excitation are large enough, the network performs a winner-take-all computation for all possible external inputs and initial states of the network. The computation is done very quickly: As soon as the winner spikes once, the computation is completed since no other neurons will spike. For some initial states, the winner is the first neuron to spike, and the computation is done at the first spike of the network. In general, there are M potential winners, corresponding to the top M external inputs. When the external inputs are close in magnitude, M tends to be larger. If M>1, the selection of the actual winner is strongly influenced by the initial states. If a special relation between the excitation and inhibition is satisfied, the network always selects the neuron with the maximum external input as the winner.
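
The dynamics described above can be sketched with an event-driven simulation. The following is a minimal illustration, not the paper's exact model: it assumes non-leaky integrate-and-fire neurons with threshold 1, a post-spike reset boosted by self-excitation `alpha`, and an instantaneous all-to-all inhibition `beta` subtracted from every other neuron at each spike; the parameter values are hypothetical.

```python
import numpy as np

def simulate(b, alpha=0.2, beta=2.0, theta=1.0, t_max=20.0, v0=None, seed=0):
    """Event-driven winner-take-all sketch: N non-leaky integrate-and-fire
    neurons with constant drives b, self-excitation alpha (head start after
    reset), and instantaneous all-to-all inhibition beta."""
    b = np.asarray(b, dtype=float)
    n = len(b)
    rng = np.random.default_rng(seed)
    v = rng.uniform(0.0, theta, n) if v0 is None else np.array(v0, dtype=float)
    t, spikes = 0.0, []
    while t < t_max:
        dt = (theta - v) / b          # time for each neuron to reach threshold
        j = int(np.argmin(dt))        # next neuron to spike
        t += dt[j]
        v += b * dt[j]                # integrate the constant drive
        v[j] = alpha                  # reset; self-excitation gives a head start
        v[np.arange(n) != j] -= beta  # instantaneous inhibition of all others
        spikes.append((t, j))
    return spikes

spikes = simulate([1.0, 0.9, 0.8])
print(sorted({j for _, j in spikes}))
```

With inhibition this strong, the run reproduces the abstract's claim: after the first spike, no other neuron can catch up (each loser must regain `beta` but accumulates less drive than that between winner spikes), so the spike train contains a single neuron index.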

Original language: English (US)
Article number: 051922
Pages (from-to): 051922/1-051922/4
Journal: Physical Review E - Statistical, Nonlinear, and Soft Matter Physics
Volume: 65
Issue number: 5
DOI: 10.1103/PhysRevE.65.051922
State: Published - May 1 2002


All Science Journal Classification (ASJC) codes

  • Statistical and Nonlinear Physics
  • Statistics and Probability
  • Condensed Matter Physics

Cite this

@article{90b4bb2de6ac41f9b7d09abcf6da9116,
title = "Fast computation with spikes in a recurrent neural network",
abstract = "Neural networks with recurrent connections are sometimes regarded as too slow at computation to serve as models of the brain. Here we analytically study a counterexample, a network consisting of N integrate-and-fire neurons with self excitation, all-to-all inhibition, instantaneous synaptic coupling, and constant external driving inputs. When the inhibition and/or excitation are large enough, the network performs a winner-take-all computation for all possible external inputs and initial states of the network. The computation is done very quickly: As soon as the winner spikes once, the computation is completed since no other neurons will spike. For some initial states, the winner is the first neuron to spike, and the computation is done at the first spike of the network. In general, there are M potential winners, corresponding to the top M external inputs. When the external inputs are close in magnitude, M tends to be larger. If M>1, the selection of the actual winner is strongly influenced by the initial states. If a special relation between the excitation and inhibition is satisfied, the network always selects the neuron with the maximum external input as the winner.",
author = "Jin, {Dezhe Z.} and Seung, {H. Sebastian}",
year = "2002",
month = may,
day = "1",
doi = "10.1103/PhysRevE.65.051922",
language = "English (US)",
volume = "65",
pages = "051922/1--051922/4",
journal = "Physical Review E",
issn = "2470-0045",
publisher = "American Physical Society",
number = "5",
}

Fast computation with spikes in a recurrent neural network. / Jin, Dezhe Z.; Seung, H. Sebastian.

In: Physical Review E - Statistical, Nonlinear, and Soft Matter Physics, Vol. 65, No. 5, 051922, 01.05.2002, p. 051922/1-051922/4.

Research output: Contribution to journal › Article

TY - JOUR

T1 - Fast computation with spikes in a recurrent neural network

AU - Jin, Dezhe Z.

AU - Seung, H. Sebastian

PY - 2002/5/1

Y1 - 2002/5/1

N2 - Neural networks with recurrent connections are sometimes regarded as too slow at computation to serve as models of the brain. Here we analytically study a counterexample, a network consisting of N integrate-and-fire neurons with self excitation, all-to-all inhibition, instantaneous synaptic coupling, and constant external driving inputs. When the inhibition and/or excitation are large enough, the network performs a winner-take-all computation for all possible external inputs and initial states of the network. The computation is done very quickly: As soon as the winner spikes once, the computation is completed since no other neurons will spike. For some initial states, the winner is the first neuron to spike, and the computation is done at the first spike of the network. In general, there are M potential winners, corresponding to the top M external inputs. When the external inputs are close in magnitude, M tends to be larger. If M>1, the selection of the actual winner is strongly influenced by the initial states. If a special relation between the excitation and inhibition is satisfied, the network always selects the neuron with the maximum external input as the winner.

AB - Neural networks with recurrent connections are sometimes regarded as too slow at computation to serve as models of the brain. Here we analytically study a counterexample, a network consisting of N integrate-and-fire neurons with self excitation, all-to-all inhibition, instantaneous synaptic coupling, and constant external driving inputs. When the inhibition and/or excitation are large enough, the network performs a winner-take-all computation for all possible external inputs and initial states of the network. The computation is done very quickly: As soon as the winner spikes once, the computation is completed since no other neurons will spike. For some initial states, the winner is the first neuron to spike, and the computation is done at the first spike of the network. In general, there are M potential winners, corresponding to the top M external inputs. When the external inputs are close in magnitude, M tends to be larger. If M>1, the selection of the actual winner is strongly influenced by the initial states. If a special relation between the excitation and inhibition is satisfied, the network always selects the neuron with the maximum external input as the winner.

UR - http://www.scopus.com/inward/record.url?scp=41349115003&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=41349115003&partnerID=8YFLogxK

U2 - 10.1103/PhysRevE.65.051922

DO - 10.1103/PhysRevE.65.051922

M3 - Article

AN - SCOPUS:41349115003

VL - 65

SP - 051922/1-051922/4

JO - Physical Review E

JF - Physical Review E

SN - 2470-0045

IS - 5

M1 - 051922

ER -