Surrogate-Assisted Evolutionary Search of Spiking Neural Architectures in Liquid State Machines
Zhou Y, Jin Y, Ding J (2020)
Neurocomputing 406: 12-23.
Journal Article
| Published | English
Download
No files have been uploaded. Publication record only!
Author(s)
Zhou, Yan;
Jin, Yaochu (UniBi);
Ding, Jinliang
Abstract / Remarks
Spiking neural networks (SNNs) are believed to be a powerful neural computation framework inspired by in vivo neurons. As a class of recurrent SNNs, liquid state machines (LSMs) are biologically more plausible models that imitate the architecture and functions of the human brain for information processing. However, few LSM models can outperform conventional analogue neural networks on real-world classification or regression problems, which can mainly be attributed to the sensitivity of the training performance to the architecture of the reservoir and the parameters of the spiking neuron models. Recently, many algorithms have been proposed for automated machine learning, which aims to design the architecture and parameters of deep neural networks automatically, without much human intervention. Although automated machine learning and neural architecture search have been extremely successful for conventional neural networks, little research on searching for the optimal architecture and hyperparameters of LSMs has been reported. This work proposes a surrogate-assisted evolutionary search method for optimizing the hyperparameters and the neural architecture of the reservoir of LSMs using the covariance matrix adaptation evolution strategy (CMA-ES). To reduce the search space, the architecture of the LSM is encoded by a connectivity probability together with the hyperparameters of the spiking neuron models. To enhance the computational efficiency, a Gaussian process (GP) is adopted as the surrogate to assist the CMA-ES. The proposed GP-assisted CMA-ES is compared with the canonical CMA-ES and a Bayesian optimization algorithm on two popular datasets covering image and action recognition. Our results confirm that the proposed algorithm is efficient and effective in optimizing the parameters and architecture of LSMs.
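For illustration only, the following is a minimal Python sketch of a GP-assisted CMA-ES loop in the spirit of the abstract, not the authors' implementation. It assumes the decision vector bundles the reservoir connectivity probability with spiking-neuron hyperparameters; evaluate_lsm is a hypothetical placeholder for the expensive LSM training-and-validation step, and the choice to re-evaluate only the better-ranked half of each generation on the true objective is an illustrative pre-screening heuristic. The pycma and scikit-learn packages are used.

import numpy as np
import cma                                            # pycma package
from sklearn.gaussian_process import GaussianProcessRegressor

def evaluate_lsm(x):
    # Hypothetical placeholder for the expensive objective: build a reservoir
    # with connectivity probability x[0] and neuron hyperparameters x[1:],
    # train the readout, and return the validation error. A toy quadratic
    # stands in for it here.
    return float(np.sum((x - 0.3) ** 2))

dim = 4                                               # e.g. connectivity prob. plus 3 neuron params
es = cma.CMAEvolutionStrategy(dim * [0.5], 0.2)       # canonical CMA-ES
gp = GaussianProcessRegressor()                       # GP surrogate of the objective
archive_x, archive_y = [], []                         # truly evaluated points only

for _ in range(30):                                   # small illustrative budget
    candidates = es.ask()
    xs = np.array(candidates)
    if len(archive_y) >= 2 * dim:
        # Surrogate pre-screening: fit the GP on past true evaluations, rank
        # the new candidates by predicted error, and spend real LSM evaluations
        # only on the better-ranked half; the rest keep their GP estimates.
        gp.fit(np.array(archive_x), np.array(archive_y))
        predicted = gp.predict(xs)
        fitnesses = [float(f) for f in predicted]
        for idx in np.argsort(predicted)[: max(1, len(candidates) // 2)]:
            fitnesses[idx] = evaluate_lsm(xs[idx])
            archive_x.append(xs[idx])
            archive_y.append(fitnesses[idx])
    else:
        # Too little data for a trustworthy surrogate: evaluate everything.
        fitnesses = [evaluate_lsm(x) for x in xs]
        archive_x.extend(xs)
        archive_y.extend(fitnesses)
    es.tell(candidates, fitnesses)

print("best solution:", es.result.xbest, "fitness:", es.result.fbest)

In this sketch the GP only replaces some true evaluations within each generation; the split ratio, archive size threshold, and toy objective are placeholder choices, not values taken from the paper.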
Year of Publication
2020
Journal Title
Neurocomputing
Volume
406
Page(s)
12-23
ISSN
0925-2312
Page URI
https://pub.uni-bielefeld.de/record/2978395
Cite
Zhou, Y., Jin, Y., & Ding, J. (2020). Surrogate-Assisted Evolutionary Search of Spiking Neural Architectures in Liquid State Machines. Neurocomputing, 406, 12-23. https://doi.org/10.1016/j.neucom.2020.04.079
Link(s) to Full Text
Access Level
Closed Access