Performance Analyses of Recurrent Neural Network Models Exploited for Online Time-Varying Nonlinear Optimization

Mei Liu, Bolin Liao, Lei Ding, Lin Xiao

In this paper, a special recurrent neural network (RNN), namely the Zhang neural network (ZNN), is presented and investigated for online time-varying nonlinear optimization (OTVNO). In contrast to previous work by others, this paper analyzes both continuous-time and discrete-time ZNN models theoretically via rigorous proofs. Theoretical results show that the residual errors of the continuous-time ZNN model possess a global exponential convergence property, and that the maximal steady-state residual errors of any method designed intrinsically for solving the static optimization problem and employed for the online solution of OTVNO are of order O(τ), where τ denotes the sampling gap. In the presence of noises, the residual errors of the continuous-time ZNN model can be made arbitrarily small for both constant and random noises. Moreover, an optimal sampling-gap formula is proposed for the discrete-time ZNN model in noisy environments. Finally, computer-simulation results further substantiate the performance analyses of the ZNN models exploited for online time-varying nonlinear optimization.
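To make the discrete-time setting concrete, the following is a minimal sketch (not the paper's own implementation) of an Euler-type discrete-time ZNN iteration tracking the minimizer of an assumed toy time-varying objective f(x, t) = (x − sin t)². The update x⁺ = x − H⁻¹(h·∇f + τ·∂t∇f), with step size h = τ·γ, follows the standard ZNN design principle of forcing the gradient error toward zero; the objective, the gains τ and h, and the initial state are all illustrative assumptions.

```python
import math

def dt_znn_step(x, t, tau, h):
    """One Euler-type discrete-time ZNN step for the toy objective
    f(x, t) = (x - sin t)^2 (an illustrative assumption)."""
    grad = 2.0 * (x - math.sin(t))   # spatial gradient  df/dx
    hess = 2.0                       # Hessian           d^2 f / dx^2
    grad_t = -2.0 * math.cos(t)      # time derivative of the gradient
    # ZNN update: x+ = x - H^{-1} (h * grad + tau * grad_t)
    return x - (h * grad + tau * grad_t) / hess

tau, h = 0.01, 0.1   # sampling gap tau and step size h = tau * gamma (assumed)
x, t = 0.5, 0.0      # arbitrary initial state, away from the optimum
for _ in range(2000):
    x = dt_znn_step(x, t, tau, h)
    t += tau
residual = abs(x - math.sin(t))  # distance to the moving optimum x*(t) = sin t
print(residual)
```

After the transient decays, the residual settles at a small steady-state level governed by the sampling gap τ, consistent with the O(τ)-type bounds the abstract discusses for sampled methods.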