Training Spiking Deep Networks for Neuromorphic Hardware

arXiv:1611.05141, 2016

Eric Hunsberger, Chris Eliasmith

Abstract

We describe a method to train spiking deep networks that can be run using leaky integrate-and-fire (LIF) neurons, achieving state-of-the-art results for spiking LIF networks on five datasets, including the large ImageNet ILSVRC-2012 benchmark. Our method for transforming deep artificial neural networks into spiking networks is scalable and works with a wide range of neural nonlinearities. We achieve these results by softening the neural response function, such that its derivative remains bounded, and by training the network with noise to provide robustness against the variability introduced by spikes. Our analysis shows that implementations of these networks on neuromorphic hardware will be many times more power-efficient than the equivalent non-spiking networks on traditional hardware.
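As a rough illustration of the two ideas summarized above (a softened LIF response whose derivative stays bounded, plus noise injected during training), here is a minimal NumPy sketch. The softplus smoothing follows the soft-LIF idea the abstract describes, but the function names, parameter values, and the simple noise model are illustrative assumptions, not the authors' code.

```python
import numpy as np

def lif_rate(j, tau_rc=0.02, tau_ref=0.002, v_th=1.0):
    """Hard LIF steady-state firing rate: zero below threshold, with a
    derivative that blows up just above it, which is what makes backprop hard."""
    j = np.asarray(j, dtype=float)
    rate = np.zeros_like(j)
    above = j > v_th
    rate[above] = 1.0 / (tau_ref + tau_rc * np.log1p(v_th / (j[above] - v_th)))
    return rate

def soft_lif_rate(j, tau_rc=0.02, tau_ref=0.002, v_th=1.0, gamma=0.02):
    """Softened LIF rate: the rectification max(j - v_th, 0) is replaced by a
    softplus with smoothing gamma, so the response and its derivative remain
    bounded and the unit can be trained with standard backpropagation."""
    j = np.asarray(j, dtype=float)
    # numerically stable softplus: gamma * log(1 + exp((j - v_th) / gamma))
    rho = gamma * np.logaddexp((j - v_th) / gamma, 0.0)
    rho = np.maximum(rho, 1e-12)  # avoid divide-by-zero far below threshold
    return 1.0 / (tau_ref + tau_rc * np.log1p(v_th / rho))

# Gaussian noise on the soft rates during training stands in for the
# variability that spiking introduces at run time (the variance here is an
# arbitrary illustrative choice).
def noisy_soft_lif_rate(j, sigma=10.0, rng=np.random.default_rng(0), **kwargs):
    r = soft_lif_rate(j, **kwargs)
    return np.maximum(r + sigma * rng.standard_normal(r.shape), 0.0)
```

Far above threshold the softened rate approaches the hard LIF rate, which is what lets weights trained on the smooth model transfer to the spiking network.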

Journal: arXiv:1611.05141
DOI: 10.13140/RG.2.2.10967.06566
arXiv: 1611.05141
