Abstract
New dynamical models for solving the matrix equations BX=D and XC=D in the time-invariant case are developed. These models are derived as a combination of the GNN and ZNN models. Owing to their implicit dynamics, they do not possess the GNN dynamics. Formally, they can be derived by multiplying the right-hand side of the ZNN dynamics by an appropriate symmetric positive definite matrix, which improves the convergence rate. For this reason, these models are termed HZNN. The convergence of the HZNN models is global and exponential. Moreover, the convergence rate of the HZNN models is superior to that of the classical GNN model as well as to that of ZNN models in the time-invariant case. The capability of the HZNN models to overcome unavoidable implementation noise is examined both theoretically and numerically. A Matlab implementation of the HZNN models is proposed and used in numerical experiments for solving matrix equations and computing various classes of outer inverses with prescribed range and null space.
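The abstract describes the HZNN construction only informally (ZNN dynamics scaled on the right-hand side by a symmetric positive definite matrix). The sketch below is a minimal illustration of that idea for the time-invariant equation BX=D, not the authors' Matlab implementation: the assumed dynamics B·dX/dt = −γ·S·(BX − D), the illustrative choice S = BBᵀ, and all variable names (`hznn_rhs`, `gamma`, etc.) are hypothetical and chosen here only to make the scaling idea concrete for a square, nonsingular B.

```python
# Illustrative sketch of an HZNN-style dynamic for the time-invariant
# equation B X = D (an interpretation of the abstract, not the paper's code).
# Assumed implicit dynamics:  B * dX/dt = -gamma * S * (B X - D),
# with S symmetric positive definite (here, illustratively, S = B B^T).
# B is assumed square and nonsingular so the implicit left factor can be
# resolved by a linear solve at every integration step.
import numpy as np
from scipy.integrate import solve_ivp

def hznn_rhs(t, x_flat, B, D, S, gamma):
    """Vectorized right-hand side dX/dt = -gamma * B^{-1} S (B X - D)."""
    n, m = D.shape
    X = x_flat.reshape(n, m)
    E = B @ X - D                               # residual error matrix
    dX = -gamma * np.linalg.solve(B, S @ E)     # resolve the implicit dynamics
    return dX.ravel()

rng = np.random.default_rng(0)
n, m = 4, 3
B = rng.standard_normal((n, n)) + n * np.eye(n)   # well-conditioned square B
D = rng.standard_normal((n, m))
S = B @ B.T                                       # SPD scaling matrix (assumed choice)
gamma = 10.0                                      # convergence gain

X0 = np.zeros((n, m))                             # arbitrary initial state
sol = solve_ivp(hznn_rhs, (0.0, 5.0), X0.ravel(),
                args=(B, D, S, gamma), rtol=1e-9, atol=1e-9)

X_final = sol.y[:, -1].reshape(n, m)
print("residual ||B X - D|| =", np.linalg.norm(B @ X_final - D))
```

With the gain γ acting on the scaled residual, the residual norm decays exponentially toward zero, which is the qualitative behaviour the abstract attributes to the HZNN models; the exact model, activation functions, and noise-handling analysis are given in the paper itself.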
| Original language | English |
|---|---|
| Pages (from-to) | 124-134 |
| Number of pages | 11 |
| Journal | Neurocomputing |
| Volume | 316 |
| DOIs | |
| Publication status | Published - 17 Nov 2018 |
Keywords
- Activation function
- Dynamic equation
- Gradient neural network
- Moore-Penrose inverse
- Outer inverse
- Zhang neural network
ASJC Scopus subject areas
- Computer Science Applications
- Cognitive Neuroscience
- Artificial Intelligence