2. Multi-weights neuron network architecture

As a new general-purpose theoretical model of pattern recognition, BPR is realized here by multi-weights neuron (MWN) networks. For each class of training samples, a multi-weights neuron subnetwork is established. The subnetwork consists of one input layer, one multi-weights neuron hidden layer, and one output layer. Such a subnetwork can be regarded as a mapping F: R^512 → R,

F(X) = min(Y_1, Y_2, ..., Y_m),

where Y_i is the output of the i-th multi-weights neuron, i = 1, 2, ..., m, there are m hidden multi-weights neurons, and X ∈ R^512 is the input vector.

IV. Training of MWN Networks

1. Basics of MWN network training

Training one multi-weights neuron subnetwork requires calculating the weights of the multi-weights neuron layer. The multi-weights neuron and the training algorithm used are those of Ref. [4]. In this algorithm, if the number of training samples of each class is N, we can use N - 2 neurons. In this paper, N = 30. Each neuron computes

Y_i = f[(s_i1, s_i2, s_i3), x],

a function with a multi-vector input and a single scalar output.

2. Optimization method

As noted above, if there are many training samples, the number of neurons becomes very large, which reduces the recognition speed. When learning several classes of samples, knowledge of the class membership of the training samples is available; we use this information in a supervised training algorithm to reduce the network scale. When training class A, the training samples of the other 14 classes are treated as class B, so there are 30 training samples in set A = {a_1, a_2, ..., a_30} and 420 training samples in set B = {b_1, b_2, ..., b_420}. First, select 3 samples from A, giving a neuron Y_1 = f[(a_k1, a_k2, a_k3), x]. Then compute

Y_A1_i = f[(a_k1, a_k2, a_k3), a_i], i = 1, 2, ..., 30;
Y_B1_j = f[(a_k1, a_k2, a_k3), b_j], j = 1, 2, ..., 420;
V = min_j(Y_B1_j).

We specify a value r, 0 < r < 1. If Y_A1_i < r · V
, sample a_i is removed from set A, giving a new set A^(1). We continue in this way until the remaining set A^(k) is empty, A^(k) = ∅; the training then ends, and the subnetwork of class A has a hidden layer of k neurons.

V. Experimental Results

A speech database consisting of 15 Chinese dish names was developed for this study. The length of each name is 4 Chinese words; that is to say, each speech sample is a continuous string of 4 words, such as "yu xiang rou si" or "gong bao ji ding". The database was organized into two sets: a training set and a test set. The speech signal was sampled at 16 kHz with 16-bit resolution.

Table 1. Experimental results at different values of r. Columns: r; MWN number; accuracy (%) as first-choice recognition rate and first-two-choices recognition rate, each on the training set and the test set. Only the MWN-number column is legible in the source: 448 for the basic algorithm, and 132, 126, 115, 110, 96, 93, 84, 65, 52, 44 for the optimized network at successive values of r.

The training set consists of 450 utterances used to train the multi-weights neuron networks. These 450 utterances come from 10 speakers (5 male and 5 female) from different Chinese provinces; each speaker uttered each of the 15 names 3 times. The test set contains a total of 539 utterances from another 4 speakers, who uttered the 15 names arbitrarily. The tests made to evaluate the recognition system were carried out at a series of values of r increasing by a fixed step; the experimental results at the different values of r are shown in Table 1. Obviously, the network was able to achieve full recognition of the training set at every r. From the experiments, it was found
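The subnetwork decision rule of Sec. 2 and the pruning loop of Sec. IV.2 can be sketched as follows. This is a minimal illustration under stated assumptions, not the method of Ref. [4]: the multi-weights neuron function f is replaced by a placeholder (distance from the input to the nearest of the three weight vectors), and the rule for selecting the 3 samples, which the text does not specify, simply takes the first three remaining samples.

```python
import numpy as np

def mwn_output(s1, s2, s3, x):
    """Placeholder for the MWN function f[(s1, s2, s3), x] of Ref. [4]
    (not specified here): distance from x to the nearest of the three
    weight vectors.  A smaller output means x is closer to the neuron."""
    return min(np.linalg.norm(x - s) for s in (s1, s2, s3))

def train_class_subnetwork(A, B, r=0.5):
    """Pruning loop of Sec. IV.2 for one class.

    A: class-A training samples; B: samples of all other classes.
    Each iteration turns 3 remaining A samples into one neuron, computes
    V = min_j Y_B1_j over B, and removes every a_i with Y_A1_i < r * V.
    Returns the list of neuron weight triples (the hidden layer)."""
    assert 0.0 < r < 1.0
    neurons = []
    remaining = list(A)
    while remaining:
        # Selection rule is an assumption: take the first 3 remaining
        # samples, padding by repetition if fewer than 3 are left.
        k = remaining[:3]
        while len(k) < 3:
            k.append(k[-1])
        s1, s2, s3 = k
        neurons.append((s1, s2, s3))
        V = min(mwn_output(s1, s2, s3, b) for b in B)
        # Drop the 3 defining samples, then every covered sample.
        remaining = [a for a in remaining[3:]
                     if not (mwn_output(s1, s2, s3, a) < r * V)]
    return neurons

def classify(X, subnetworks):
    """Decision rule of Sec. 2: F(X) = min_i Y_i for each class
    subnetwork; the class with the smallest output wins."""
    scores = {label: min(mwn_output(s1, s2, s3, X)
                         for (s1, s2, s3) in neurons)
              for label, neurons in subnetworks.items()}
    return min(scores, key=scores.get)
```

With well-separated classes, the loop typically covers all of A in a few iterations, so the optimized hidden layer has far fewer neurons than the one-neuron-per-sample-triple basic algorithm, which is the point of the optimization.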