The Hopfield network and the Boltzmann machine both belong to this type.

Learning. Learning is an essential part of neural network research: it is the means by which a network realizes adaptability. As the environment changes, the network adjusts its weights to improve the behavior of the system. The learning rule proposed by Hebb laid the foundation for neural network learning algorithms. The Hebb rule states that learning ultimately takes place at the synapses between neurons, and that the strength of a synaptic connection changes with the activity of the neurons on either side of the synapse. On this basis, a variety of learning rules and algorithms have been proposed to suit the needs of different network models. An effective learning algorithm enables a neural network to form, through the adjustment of its connection weights, an internal representation of the structure of the objective world, with a distinctive style of information processing in which both storage and computation are reflected in the network's connections.

Depending on the learning environment, neural network learning can be divided into supervised learning and unsupervised learning. In supervised learning, the training sample data are applied to the network input, the actual network output is compared with the corresponding expected output to obtain an error signal, and this signal controls the adjustment of the connection strengths; after many training iterations the weights converge to definite values. If the sample conditions later change, further learning can modify the weights to adapt to the new environment. Neural network models that use supervised learning include the perceptron, among others. In unsupervised learning, the network is placed directly in the environment with the given samples, and the learning and working stages merge into one. The evolution of the weights then obeys the evolution equation of the learning rule. The simplest example of unsupervised learning is the Hebb learning rule.
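The two families of rules just described can be made concrete with a small sketch. The following Python fragment is an illustration, not from the source; the learning rate, the threshold activation, and the AND-function training set are arbitrary choices. It contrasts a Hebb-style update, which strengthens a weight when presynaptic and postsynaptic activity coincide, with a perceptron-style supervised update driven by the error between expected and actual output.

```python
def hebb_update(w, x, y, lr=0.1):
    """Hebb rule (unsupervised): strengthen each weight in proportion to
    the joint activity of presynaptic input x and postsynaptic output y."""
    return [wi + lr * xi * y for wi, xi in zip(w, x)]

def supervised_update(w, x, target, lr=0.1):
    """Perceptron-style supervised rule: compute the actual output,
    compare it with the expected output, and adjust weights by the error."""
    y = 1 if sum(wi * xi for wi, xi in zip(w, x)) > 0 else 0
    err = target - y
    return [wi + lr * err * xi for wi, xi in zip(w, x)], err

# Example of the Hebb rule: coincident activity (x=1, y=1) raises both weights.
print(hebb_update([0.0, 0.0], [1, 1], 1))

# Supervised training of a single unit on the logical AND function.
# Each sample is ([bias, in1, in2], expected_output).
samples = [([1, 0, 0], 0), ([1, 0, 1], 0), ([1, 1, 0], 0), ([1, 1, 1], 1)]
w = [0.0, 0.0, 0.0]            # bias weight plus two input weights
for epoch in range(25):        # "many training iterations" from the text
    total_err = 0
    for x, t in samples:
        w, err = supervised_update(w, x, t)
        total_err += abs(err)
    if total_err == 0:         # error signal has vanished: weights converged
        break
print(w)
```

On this linearly separable task the error signal reaches zero after a few epochs, matching the text's description of weights converging through repeated training and adjustment.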
Competitive learning is a more complex example of unsupervised learning; it adjusts the weights according to an established clustering. Self-organizing maps and adaptive-resonance-theory networks are typical models related to competitive learning.

Analysis methods. The study of the nonlinear dynamical properties of neural networks mainly applies dynamical systems theory, nonlinear programming theory, and statistical theory to analyze the evolution of a neural network and the nature of its attractors, to explore the synergetic behavior and collective computing functions of neural networks, and to understand the mechanisms of neural information processing. In discussing how neural networks might handle comprehensive, possibly fuzzy, information, the concepts and methods of chaos theory have a role to play. Chaos is a mathematical concept that is rather difficult to define precisely. In general, chaos refers to the seemingly uncertain behavior exhibited by a deterministic system described by dynamical equations; one may call it deterministic randomness. It is deterministic because it arises from intrinsic causes rather than from external noise or interference, and it is random in that the behavior is irregular and unpredictable and can only be described statistically. The main feature of a chaotic dynamical system is the sensitive dependence of its state on the initial conditions; chaos reflects the system's inherent randomness. Chaos theory refers to the basic concepts and methods used to describe the nonlinear dynamical behavior of systems that exhibit chaos. It understands the complex behavior of a dynamical system as intrinsic structural behavior arising from the system's own exchange of matter, energy, and information with the outside world, rather than as external or accidental behavior; chaos is a kind of steady state. The steady states of a chaotic dynamical system include rest points, steady quantities, periodicity, quasi-periodicity, and chaotic solutions… A chaotic trajectory is the combined result of overall stability and local instability, and it is called a strange attractor.
A strange attractor has the following features: (1) a strange attractor is an attractor, but it is neither a fixed point nor a periodic solution. The intelligence of the brain is a nonlinear phenomenon. Not only can the information processed by a neural network vary in many ways, but the nonlinear dynamical system itself is also changing continually while it processes information. The connection weights between neurons reflect the strength of the coupling between units, and the representation and processing of information are embodied in the connection relations among the network's processing units. After the functions and limitations of neural network systems typified by the perceptron had been carefully analyzed, the book Perceptron, published in 1969, pointed out that the perceptron cannot solve higher-order predicate problems. In Japan's "Real World Computing (RWC)" project, artificial-intelligence research became an important component. The information processing of such a neural network is a transformation of states, which can be treated with dynamical systems theory.
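The sensitive dependence on initial conditions discussed above is easy to demonstrate. The sketch below is illustrative (the logistic map and the particular constants are my choice, not the source's): it iterates the logistic map x_{n+1} = r·x·(1−x) at r = 4, a standard chaotic system, from two starting points that differ by only one part in a billion, and records how far apart the two trajectories drift.

```python
def logistic(x, r=4.0):
    """One step of the logistic map, a classic chaotic system at r = 4."""
    return r * x * (1.0 - x)

# Two trajectories whose initial conditions differ by 1e-9.
a, b = 0.2, 0.2 + 1e-9
max_gap = 0.0
for step in range(60):
    a, b = logistic(a), logistic(b)
    max_gap = max(max_gap, abs(a - b))

# The tiny initial discrepancy is amplified exponentially, so within a few
# dozen steps the gap between the trajectories grows to order one.
print(max_gap)
```

Because the map is deterministic, each trajectory is perfectly reproducible; yet the exponential divergence makes long-term prediction from imprecise initial data impossible. This is the "deterministic randomness" described in the text.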