A Gradient-Guided Evolutionary Approach to Training Deep Neural Networks

Yang S, Tian Y, He C, Zhang X, Tan KC, Jin Y (2022)
IEEE Transactions on Neural Networks and Learning Systems 33(9): 4861-4875.

Journal Article | Published | English
 
Download
No files have been uploaded. Publication record only!
Author(s)
Yang, Shangshang; Tian, Ye; He, Cheng; Zhang, Xingyi; Tan, Kay Chen; Jin, Yaochu
Abstract / Remarks
It has been widely recognized that the efficient training of neural networks (NNs) is crucial to classification performance. While a series of gradient-based approaches have been extensively developed, they are criticized for easily becoming trapped in local optima and for their sensitivity to hyperparameters. Owing to their high robustness and wide applicability, evolutionary algorithms (EAs) have been regarded as a promising alternative for training NNs in recent years. However, EAs suffer from the curse of dimensionality and are inefficient in training deep NNs (DNNs). By inheriting the advantages of both gradient-based approaches and EAs, this article proposes a gradient-guided evolutionary approach to train DNNs. The proposed approach introduces a novel genetic operator to optimize the weights in the search space, where the search direction is determined by the gradient of the weights. Moreover, network sparsity is considered in the proposed approach, which greatly reduces network complexity and alleviates overfitting. Experimental results on single-layer NNs, deep-layer NNs, recurrent NNs, and convolutional NNs (CNNs) demonstrate the effectiveness of the proposed approach. In short, this work not only introduces a novel approach for training DNNs but also enhances the performance of EAs in solving large-scale optimization problems.
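To illustrate the core idea described in the abstract (the paper's actual operator and hyperparameters are not specified in this record), the following is a minimal sketch of a gradient-guided mutation: offspring are generated by stepping along the negative gradient with randomly sampled step sizes, and elitist selection keeps the best candidate. All function names and the toy linear-model loss are hypothetical stand-ins, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def loss(w, X, y):
    # mean squared error of a linear model (toy stand-in for an NN loss)
    return np.mean((X @ w - y) ** 2)

def grad(w, X, y):
    # gradient of the MSE loss with respect to the weights
    return 2 * X.T @ (X @ w - y) / len(y)

def gradient_guided_mutation(parent, X, y, n_offspring=8, max_step=0.5):
    """Hypothetical operator: the gradient fixes the search direction,
    while evolution explores random step sizes along it (elitist selection)."""
    g = grad(parent, X, y)
    direction = -g / (np.linalg.norm(g) + 1e-12)
    offspring = [parent + rng.uniform(0.0, max_step) * direction
                 for _ in range(n_offspring)]
    # keep the parent as a candidate so the loss never increases
    candidates = [parent] + offspring
    return min(candidates, key=lambda w: loss(w, X, y))

# toy problem: recover w_true from noise-free linear data
X = rng.normal(size=(64, 5))
w_true = rng.normal(size=5)
y = X @ w_true

w = np.zeros(5)
for _ in range(200):
    w = gradient_guided_mutation(w, X, y)

print(loss(w, X, y))
```

Because the parent always survives selection, the loss is monotonically non-increasing, which mirrors the robustness the abstract attributes to EAs while the gradient supplies an informed search direction in the high-dimensional weight space.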
Publication Year
2022
Journal Title
IEEE Transactions on Neural Networks and Learning Systems
Volume
33
Issue
9
Page(s)
4861-4875
ISSN
2162-237X
eISSN
2162-2388
Page URI
https://pub.uni-bielefeld.de/record/2978334

Cite

Yang S, Tian Y, He C, Zhang X, Tan KC, Jin Y. A Gradient-Guided Evolutionary Approach to Training Deep Neural Networks. IEEE Transactions on Neural Networks and Learning Systems. 2022;33(9):4861-4875.
Yang, S., Tian, Y., He, C., Zhang, X., Tan, K. C., & Jin, Y. (2022). A Gradient-Guided Evolutionary Approach to Training Deep Neural Networks. IEEE Transactions on Neural Networks and Learning Systems, 33(9), 4861-4875. https://doi.org/10.1109/TNNLS.2021.3061630
Yang, Shangshang, Tian, Ye, He, Cheng, Zhang, Xingyi, Tan, Kay Chen, and Jin, Yaochu. 2022. “A Gradient-Guided Evolutionary Approach to Training Deep Neural Networks”. IEEE Transactions on Neural Networks and Learning Systems 33 (9): 4861-4875.
Yang, S., Tian, Y., He, C., Zhang, X., Tan, K. C., and Jin, Y. (2022). A Gradient-Guided Evolutionary Approach to Training Deep Neural Networks. IEEE Transactions on Neural Networks and Learning Systems 33, 4861-4875.
Yang, S., et al., 2022. A Gradient-Guided Evolutionary Approach to Training Deep Neural Networks. IEEE Transactions on Neural Networks and Learning Systems, 33(9), pp. 4861-4875.
S. Yang, et al., “A Gradient-Guided Evolutionary Approach to Training Deep Neural Networks”, IEEE Transactions on Neural Networks and Learning Systems, vol. 33, 2022, pp. 4861-4875.
Yang, S., Tian, Y., He, C., Zhang, X., Tan, K.C., Jin, Y.: A Gradient-Guided Evolutionary Approach to Training Deep Neural Networks. IEEE Transactions on Neural Networks and Learning Systems. 33, 4861-4875 (2022).
Yang, Shangshang, Tian, Ye, He, Cheng, Zhang, Xingyi, Tan, Kay Chen, and Jin, Yaochu. “A Gradient-Guided Evolutionary Approach to Training Deep Neural Networks”. IEEE Transactions on Neural Networks and Learning Systems 33.9 (2022): 4861-4875.

Link(s) to Full Text(s)
Access Level
Restricted Closed Access
