Publication

A weight initialization method based on neural network with asymmetric activation function

Liu, J.
Liu, Y.
Zhang, Qichun
Publication Date
2022-04
Rights
© 2022 Elsevier. Reproduced in accordance with the publisher's self-archiving policy. This manuscript version is made available under the CC-BY-NC-ND 4.0 license.
Peer-Reviewed
Yes
Open Access status
openAccess
Accepted for publication
2022-01-23
Abstract
Weight initialization has an important influence on the learning process of neural networks, and the choice of initial weights is related to the activation interval of the activation function. An improved and extended weight initialization method for neural networks with asymmetric activation functions is proposed as an extension of the linear interval tolerance (LIT) method, called ‘GLIT’ (generalized LIT), which is better suited to higher-dimensional inputs. The aim is to widen the admissible range of the activation function so that the inputs fall within its unsaturated region, thereby improving network performance. A tolerance solution theorem for neural network systems is then stated and proved, and an algorithm for determining the initial weight interval is given. The validity of the theorem and the algorithm is verified by numerical experiments. Under the GLIT method, the input falls into any preset interval in the sense of probability. More broadly, the GLIT method provides a theoretical basis for further study of neural networks.
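The core idea described in the abstract — choosing initial weights so that pre-activations land inside a preset (unsaturated) interval of the activation function — can be sketched in a few lines. The sketch below is not the paper's GLIT algorithm; it is a minimal illustration of interval-based initialization, assuming bounded inputs (|x_i| ≤ input_bound) and a target pre-activation interval chosen by the user. The function name and parameters are illustrative, not taken from the paper.

```python
import numpy as np

def interval_weight_init(n_in, n_out, act_interval=(-2.0, 2.0),
                         input_bound=1.0, seed=None):
    """Sketch of an interval-based weight initialization.

    Weights and biases are drawn uniformly, then each output unit is
    rescaled so that, for any input with |x_i| <= input_bound, its
    pre-activation s = w . x + b lies inside act_interval (intended as
    the unsaturated region of the activation function).
    """
    rng = np.random.default_rng(seed)
    lo, hi = act_interval
    half_width = (hi - lo) / 2.0
    center = (hi + lo) / 2.0

    # Raw weights and biases in [-1, 1].
    W = rng.uniform(-1.0, 1.0, size=(n_out, n_in))
    b = rng.uniform(-1.0, 1.0, size=n_out)

    # Worst-case |w . x + b| per output unit for bounded inputs.
    worst = np.abs(W).sum(axis=1) * input_bound + np.abs(b)

    # Scale each row so the worst case fits the interval half-width,
    # then shift the bias so pre-activations are centered on the interval.
    scale = half_width / worst
    W *= scale[:, None]
    b = b * scale + center
    return W, b
```

For an asymmetric activation such as ELU or leaky ReLU, `act_interval` would be chosen to cover the region where the activation is far from saturation; the guarantee here is deterministic (worst-case), whereas the GLIT method of the paper provides the containment in the sense of probability.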
Version
Accepted manuscript
Citation
Liu J, Liu Y & Zhang Q (2022) A weight initialization method based on neural network with asymmetric activation function. Neurocomputing. 483: 171-182.
Type
Article