Abstract
A novel activation function, referred to as tanhLU, that integrates the hyperbolic tangent function (tanh) with a linear unit is proposed as a promising alternative to tanh for neural networks. tanhLU is inspired by the unboundedness of the rectified linear unit (ReLU) and the symmetry of tanh. Its three variable parameters, which control activation values and gradients, can be preconfigured as constants or adaptively optimized during training. The capacity of tanhLU is first investigated by examining the weight gradients in error backpropagation. Experiments on five types of neural networks and seven benchmark datasets from different domains are then conducted to validate the improvement brought by tanhLU, and tanhLU is further applied to predict the highly nonlinear stress–strain relationship of soils using the multiscale stress–strain (MSS) dataset. The experimental results indicate that constant tanhLU yields clear improvements on fully connected neural networks (FCNNs) and long short-term memory (LSTM) networks, with lower loss and higher accuracy than tanh, while adaptive tanhLU achieves state-of-the-art performance for multiple deep neural networks in image classification and face recognition.
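For readers who wish to experiment, a minimal PyTorch sketch of an activation in the spirit of tanhLU is given below. The exact parameterization is not stated in the abstract; the form f(x) = α·tanh(β·x) + γ·x, with α, β, and γ either fixed constants or trainable parameters, is an assumption consistent with the abstract's description of three parameters blending tanh with a linear unit.

```python
import torch
import torch.nn as nn


class TanhLU(nn.Module):
    """Sketch of a tanh + linear-unit activation.

    Assumes f(x) = alpha * tanh(beta * x) + gamma * x, an illustrative
    guess consistent with the abstract (three parameters combining tanh
    with a linear term), not necessarily the paper's exact definition.
    """

    def __init__(self, alpha=1.0, beta=1.0, gamma=1.0, adaptive=False):
        super().__init__()
        if adaptive:
            # Adaptive variant: alpha, beta, gamma are learned jointly
            # with the network weights during backpropagation.
            self.alpha = nn.Parameter(torch.tensor(float(alpha)))
            self.beta = nn.Parameter(torch.tensor(float(beta)))
            self.gamma = nn.Parameter(torch.tensor(float(gamma)))
        else:
            # Constant variant: parameters are fixed hyperparameters
            # (registered as buffers so they move with the module's device).
            self.register_buffer("alpha", torch.tensor(float(alpha)))
            self.register_buffer("beta", torch.tensor(float(beta)))
            self.register_buffer("gamma", torch.tensor(float(gamma)))

    def forward(self, x):
        return self.alpha * torch.tanh(self.beta * x) + self.gamma * x


# Usage: drop-in replacement for nn.Tanh() in a network definition.
act = TanhLU(adaptive=True)
y = act(torch.randn(4, 8))
```

Note that the linear term γ·x keeps the output unbounded (the ReLU-inspired property noted in the abstract), while the tanh term preserves symmetry about the origin.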
| Original language | English |
| --- | --- |
| Article number | 117181 |
| Journal | Expert Systems with Applications |
| Volume | 199 |
| DOIs | |
| Publication status | Published - 1 Aug 2022 |
Keywords
- Activation function
- Neural networks
- tanhLUs
ASJC Scopus subject areas
- General Engineering
- Computer Science Applications
- Artificial Intelligence