Our goal is to investigate and exploit an analogy between the scaled hyperpower family (SHPI family) of iterative methods for computing the matrix inverse and the discretization of Zhang Neural Network (ZNN) models. On the basis of the discovered analogy, a class of ZNN models corresponding to the family of hyperpower iterative methods for computing generalized inverses is defined. The Matlab Simulink implementation of the introduced ZNN models is described for the scaled hyperpower methods of orders 2 and 3. Convergence properties of the proposed ZNN models are investigated, as is their numerical behavior.
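
To make the discrete side of the analogy concrete, the following is a minimal sketch (not the paper's implementation) of the classical hyperpower iterations of orders 2 and 3 for the matrix inverse: order 2 is the Newton-Schulz scheme X_{k+1} = X_k(2I - A X_k), and order 3 uses X_{k+1} = X_k(3I - 3 A X_k + (A X_k)^2). The function name and the starting guess X_0 = A^T / (||A||_1 ||A||_inf) are illustrative assumptions, not taken from the text.

```python
import numpy as np

def hyperpower_inverse(A, order=2, tol=1e-10, max_iter=100):
    """Sketch of the hyperpower iteration for the inverse of a square matrix A.

    order=2: Newton-Schulz, X_{k+1} = X_k (2I - A X_k)  (quadratic convergence).
    order=3: X_{k+1} = X_k (3I - 3 A X_k + (A X_k)^2)   (cubic convergence).
    """
    n = A.shape[0]
    I = np.eye(n)
    # A common starting guess guaranteeing ||I - A X_0|| < 1 (an assumption here).
    X = A.T / (np.linalg.norm(A, 1) * np.linalg.norm(A, np.inf))
    for _ in range(max_iter):
        R = A @ X  # residual factor reused by both orders
        if order == 2:
            X = X @ (2 * I - R)
        else:
            X = X @ (3 * I - 3 * R + R @ R)
        if np.linalg.norm(I - A @ X) < tol:
            break
    return X

A = np.array([[4.0, 1.0], [2.0, 3.0]])
X = hyperpower_inverse(A, order=3)
```

Higher-order members of the family trade more matrix multiplications per step for a higher order of convergence; the scaled variants and their ZNN counterparts are developed in the body of the paper.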