Hebbian learning with winner take all for spiking neural networks

Ankur Gupta, Lyle N. Long

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

27 Scopus citations

Abstract

Learning methods for spiking neural networks are not as well developed as those for traditional rate-based networks, which widely use the back-propagation learning algorithm. We propose and implement an efficient Hebbian learning method with homeostasis for a network of spiking neurons. As in STDP, the timing between spikes is used for synaptic modification. Homeostasis ensures that the synaptic weights are bounded and that learning is stable. A winner-take-all mechanism is also implemented to promote competitive learning among output neurons. We have implemented this method in an object-oriented C++ code (called CSpike). We have tested the code on four images of Gabor filters and found bell-shaped tuning curves using 36 test-set images of Gabor filters at different orientations. These bell-shaped curves are similar to those experimentally observed in the V1 and MT/V5 areas of the mammalian brain.
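The mechanisms the abstract names — spike-timing Hebbian updates, homeostatic bounding of the weights, and winner-take-all competition among output neurons — can be sketched as below. This is a hypothetical illustration, not the authors' CSpike implementation: the exponential timing kernel, the normalization-based homeostasis, and all names (`OutputNeuron`, `hebbianUpdate`, `pickWinner`) are assumptions for the sketch.

```cpp
#include <cstddef>
#include <cmath>
#include <numeric>
#include <vector>

// Hypothetical sketch (not the authors' CSpike code) of spike-timing
// Hebbian learning with homeostasis and winner-take-all competition.
struct OutputNeuron {
    std::vector<double> w;   // synaptic weights from each input
    double potential = 0.0;  // membrane potential accumulated for this pattern
};

// Winner-take-all: the output neuron with the largest potential wins and is
// the only one whose synapses are modified (competitive learning).
std::size_t pickWinner(const std::vector<OutputNeuron>& layer) {
    std::size_t best = 0;
    for (std::size_t i = 1; i < layer.size(); ++i)
        if (layer[i].potential > layer[best].potential) best = i;
    return best;
}

// Hebbian rule: strengthen synapses whose presynaptic spike preceded the
// postsynaptic spike (dt > 0), weaken the others, as in STDP.
void hebbianUpdate(OutputNeuron& n,
                   const std::vector<double>& preSpikeTimes,
                   double postSpikeTime, double lr, double tau) {
    for (std::size_t i = 0; i < n.w.size(); ++i) {
        double dt = postSpikeTime - preSpikeTimes[i];
        if (dt > 0) n.w[i] += lr * std::exp(-dt / tau);  // potentiate
        else        n.w[i] -= lr * std::exp( dt / tau);  // depress
    }
    // Homeostasis (one common realization): rescale so the total weight
    // stays constant, which keeps individual weights bounded and stable.
    double sum = std::accumulate(n.w.begin(), n.w.end(), 0.0);
    if (sum > 0)
        for (double& wi : n.w) wi /= sum;
}
```

In this sketch, only the winning neuron returned by `pickWinner` would have `hebbianUpdate` applied after each input pattern, so different output neurons specialize on different stimuli.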

Original language: English (US)
Title of host publication: 2009 International Joint Conference on Neural Networks, IJCNN 2009
Pages: 1054-1060
Number of pages: 7
DOI: 10.1109/IJCNN.2009.5178751
State: Published - Nov 18 2009
Event: 2009 International Joint Conference on Neural Networks, IJCNN 2009 - Atlanta, GA, United States
Duration: Jun 14 2009 - Jun 19 2009

Publication series

Name: Proceedings of the International Joint Conference on Neural Networks

Other

Other: 2009 International Joint Conference on Neural Networks, IJCNN 2009
Country: United States
City: Atlanta, GA
Period: 6/14/09 - 6/19/09

All Science Journal Classification (ASJC) codes

  • Software
  • Artificial Intelligence


Cite this

    Gupta, A., & Long, L. N. (2009). Hebbian learning with winner take all for spiking neural networks. In 2009 International Joint Conference on Neural Networks, IJCNN 2009 (pp. 1054-1060). [5178751] (Proceedings of the International Joint Conference on Neural Networks). https://doi.org/10.1109/IJCNN.2009.5178751