I am reading Simon Haykin's cornerstone book, "Neural Networks: A Comprehensive Foundation, Second Edition", and I cannot understand the paragraph below:
The analysis of the dynamic behavior of neural networks involving the application of feedback is unfortunately complicated by virtue of the fact that the processing units used for the construction of the network are usually nonlinear. Further consideration of this issue is deferred to the latter part of the book.
Before this paragraph, the author analyzes the effect of the synaptic weight on the network's stability. Roughly speaking, he says that if |w| >= 1 the network becomes unstable.
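To make that stability claim concrete, here is a minimal sketch of my reading of the book's single-loop example, assuming a linear feedback neuron y(n) = w * y(n-1) + x(n) with a constant unit input (the specific recursion is my own illustration, not the book's exact notation):

```python
# Single-neuron linear feedback loop:
#   y(n) = w * y(n-1) + x(n)
# Unrolling gives y(n) = sum_{k=0}^{n} w^k * x(n-k), a geometric
# series that stays bounded only when |w| < 1.

def simulate(w, steps=50):
    y = 0.0
    x = 1.0  # constant unit input
    outputs = []
    for _ in range(steps):
        y = w * y + x  # feed the previous output back through weight w
        outputs.append(y)
    return outputs

for w in (0.5, 1.0, 1.5):
    ys = simulate(w)
    print(f"w = {w}: y after 50 steps = {ys[-1]:.4g}")

# w = 0.5 converges (to x / (1 - w) = 2), w = 1.0 grows without
# bound linearly, and w = 1.5 blows up geometrically -- the
# instability for |w| >= 1 that the author refers to.
```

This works cleanly only because the loop is linear; with a nonlinear activation inside the loop, no such closed-form series exists, which I gather is what makes the analysis "complicated".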
Could you please explain the paragraph? Thanks in advance.