Enhancement of neural networks with an alternative activation function tanhLU

Shui Long Shen, Ning Zhang, Annan Zhou, Zhen Yu Yin

Research output: Journal article publication › Journal article › Academic research › peer-review

62 Citations (Scopus)

Abstract

A novel activation function (referred to as tanhLU) that integrates the hyperbolic tangent function (tanh) with a linear unit is proposed as a promising alternative to tanh for neural networks. tanhLU is inspired by the boundlessness of the rectified linear unit (ReLU) and the symmetry of tanh. Three variable parameters in tanhLU, which control activation values and gradients, can either be preconfigured as constants or adaptively optimized during training. The capacity of tanhLU is first investigated by examining the weight gradients in error backpropagation. Experiments are then conducted to validate the improvement offered by tanhLU on five types of neural networks across seven benchmark datasets from different domains. tanhLU is further applied to predict the highly nonlinear stress–strain relationship of soils using the multiscale stress–strain (MSS) dataset. The experimental results indicate that the constant tanhLU yields a clear improvement over tanh on fully connected neural networks (FCNNs) and long short-term memory (LSTM) networks, with lower loss and higher accuracy, while adaptive tanhLUs achieve state-of-the-art performance for multiple deep neural networks in image classification and face recognition.
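The abstract does not state the closed-form expression of tanhLU, so the following PyTorch-style sketch assumes one plausible form, f(x) = alpha * tanh(beta * x) + gamma * x, consistent with the description of a tanh term combined with a linear unit and three variable parameters. The parameter names (alpha, beta, gamma) and the module interface are illustrative assumptions, not the paper's implementation; the sketch only shows the distinction between the constant (preconfigured) and adaptive (learned) variants described above.

```python
import torch
import torch.nn as nn

class TanhLU(nn.Module):
    """Illustrative tanhLU-style activation: a bounded tanh term plus an
    unbounded linear term. The form alpha * tanh(beta * x) + gamma * x
    is an assumption based on the abstract, not the paper's exact formula."""

    def __init__(self, alpha=1.0, beta=1.0, gamma=1.0, adaptive=False):
        super().__init__()
        if adaptive:
            # Adaptive variant: the three parameters are optimized
            # jointly with the network weights during training.
            self.alpha = nn.Parameter(torch.tensor(float(alpha)))
            self.beta = nn.Parameter(torch.tensor(float(beta)))
            self.gamma = nn.Parameter(torch.tensor(float(gamma)))
        else:
            # Constant variant: the parameters are preconfigured and fixed.
            self.register_buffer("alpha", torch.tensor(float(alpha)))
            self.register_buffer("beta", torch.tensor(float(beta)))
            self.register_buffer("gamma", torch.tensor(float(gamma)))

    def forward(self, x):
        # tanh contributes symmetry and saturation; the linear unit keeps
        # the output unbounded, in the spirit of ReLU.
        return self.alpha * torch.tanh(self.beta * x) + self.gamma * x

# Example: dropping the activation into a small fully connected network
# in place of nn.Tanh().
model = nn.Sequential(
    nn.Linear(10, 64),
    TanhLU(adaptive=True),
    nn.Linear(64, 1),
)
```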

Original language: English
Article number: 117181
Journal: Expert Systems with Applications
Volume: 199
Publication status: Published - 1 Aug 2022

Keywords

  • Activation function
  • Neural networks
  • tanhLUs

ASJC Scopus subject areas

  • General Engineering
  • Computer Science Applications
  • Artificial Intelligence
