[Main Text]
Perceptron training algorithm (P172)
Step 1: Initialisation
Set initial weights w1, w2, …, wn and threshold θ to random numbers in the range [ , ].
Step 2: Activation
Activate the perceptron by applying inputs x1(p), x2(p), …, xn(p) and desired output Yd(p). Calculate the actual output at iteration p = 1:
    Y(p) = step[ Σ_{i=1}^{n} x_i(p) w_i(p) − θ ]
where n is the number of the perceptron inputs, and step is a step activation function.
Step 3: Weight training
Update the weights of the perceptron:
    w_i(p+1) = w_i(p) + Δw_i(p)
where Δw_i(p) is the weight correction at iteration p. The weight correction is computed by the delta rule:
    Δw_i(p) = α × x_i(p) × e(p),  where e(p) = Yd(p) − Y(p) and α is the learning rate.
Step 4: Iteration
Increase iteration p by one, go back to Step 2 and repeat the process until convergence.

1. The difference between supervised and unsupervised learning. (P200)
Supervised, or active, learning is learning with an external "teacher" (a supervisor) who presents a training set to the network. In contrast, unsupervised or self-organised learning does not require a teacher. During a training session, the neural network receives a number of different input patterns, discovers significant features in these patterns, and learns how to classify input data.

Chapter 7
1. Definition of genetic algorithms and the evolution process (crossover and mutation operations).
(1) Genetic algorithms are a class of stochastic search algorithms based on biological evolution. (P222) The basic genetic algorithm (P223).
(2) The roulette wheel selection technique (P227). The genetic algorithm (GA) cycle.
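The four perceptron training steps above can be sketched in Python. This is a minimal illustration, not the textbook's exact procedure: the learning rate `alpha`, the AND-gate training data, and the initial-weight range [−0.5, 0.5] are my assumptions (the notes leave the range blank), and I additionally adapt the threshold with the delta rule (treating it as a weight on a constant −1 input) so that training converges regardless of the random initial θ.

```python
import random

def step(x):
    """Step activation function: 1 if x >= 0, else 0."""
    return 1 if x >= 0 else 0

def train_perceptron(data, alpha=0.1, epochs=1000):
    """Train a single perceptron with the delta rule.
    data: list of (inputs, desired_output) pairs.
    alpha is an assumed learning rate; the [-0.5, 0.5] weight
    range is also an assumption (the notes leave it blank)."""
    n = len(data[0][0])
    # Step 1: Initialisation -- random weights and threshold
    w = [random.uniform(-0.5, 0.5) for _ in range(n)]
    theta = random.uniform(-0.5, 0.5)
    for _ in range(epochs):
        converged = True
        for x, yd in data:
            # Step 2: Activation -- actual output Y(p)
            y = step(sum(xi * wi for xi, wi in zip(x, w)) - theta)
            e = yd - y  # error e(p) = Yd(p) - Y(p)
            if e != 0:
                converged = False
            # Step 3: Weight training -- delta rule
            for i in range(n):
                w[i] += alpha * x[i] * e
            # Extension beyond the notes: adapt the threshold too,
            # treating it as a weight on a constant -1 input.
            theta -= alpha * e
        # Step 4: Iteration -- stop once a full pass makes no errors
        if converged:
            break
    return w, theta

# Usage: learn the logical AND operation (hypothetical training data)
random.seed(42)
and_data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w, theta = train_perceptron(and_data)
```

Because AND is linearly separable, the perceptron convergence theorem guarantees this loop eventually stops making errors.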
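The GA cycle named above (selection, crossover, mutation) can also be sketched. This is a toy illustration under my own assumptions, not the textbook's code: the fitness function (count of 1-bits), chromosome length, population size, and mutation rate are all hypothetical choices; the `roulette_select` function implements roulette wheel selection by picking individuals with probability proportional to their fitness share.

```python
import random

def roulette_select(pop, fitness):
    """Roulette wheel selection: spin a wheel whose slices are
    proportional to each chromosome's share of total fitness."""
    total = sum(fitness(c) for c in pop)
    spin = random.uniform(0, total)  # where the "ball" lands
    running = 0.0
    for c in pop:
        running += fitness(c)
        if spin <= running:
            return c
    return pop[-1]  # guard against floating-point rounding

def crossover(a, b):
    """Single-point crossover of two bit-string chromosomes."""
    point = random.randint(1, len(a) - 1)
    return a[:point] + b[point:], b[:point] + a[point:]

def mutate(c, rate=0.05):
    """Flip each bit with a small probability."""
    out = []
    for bit in c:
        out.append(("0" if bit == "1" else "1")
                   if random.random() < rate else bit)
    return "".join(out)

def fitness(c):
    return c.count("1")  # toy objective: maximise the number of 1-bits

# GA cycle: evaluate -> select -> crossover -> mutate -> next generation
random.seed(1)
pop = ["".join(random.choice("01") for _ in range(8)) for _ in range(20)]
best = max(pop, key=fitness)
for generation in range(50):
    next_pop = []
    while len(next_pop) < len(pop):
        p1 = roulette_select(pop, fitness)
        p2 = roulette_select(pop, fitness)
        c1, c2 = crossover(p1, p2)
        next_pop += [mutate(c1), mutate(c2)]
    pop = next_pop
    best = max([best] + pop, key=fitness)
```

Selection pressure from the roulette wheel, recombination via crossover, and diversity from mutation together drive `best` toward the all-ones string over the generations.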