[Main text]
c2 = in(:,2);
c6 = in(:,6);
c2_max = max(c2);
c6_max = max(c6);
% c2 = c2/c2_max;
% c6 = c6/c6_max;
% in(:,2) = c2;
% in(:,6) = c6;
c1_min = min(c1);
c3_min = min(c3);

...and returns an N layer feed-forward backprop network.

Parameter description
The transfer functions TFi can be any differentiable transfer function such as TANSIG, LOGSIG, or PURELIN.
The training function BTF can be any of the backprop training functions such as TRAINLM, TRAINBFG, TRAINRP, TRAINGD, etc.
*WARNING*: TRAINLM is the default training function because it is very fast, but it requires a lot of memory to run. If you get an "out of memory" error when training, try one of the following:
(1) Slow TRAINLM training, but reduce memory requirements, by setting NET.trainParam.mem_reduc to 2 or more. (See HELP TRAINLM.)
(2) Use TRAINBFG, which is slower but more memory efficient than TRAINLM.
(3) Use TRAINRP, which is slower but more memory efficient than TRAINBFG.
The learning function BLF can be either of the backpropagation learning functions such as LEARNGD or LEARNGDM.
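The parameters listed above map directly onto the older NEWFF calling syntax. The following is a minimal sketch only: the data matrices p and t, the layer sizes, and the epoch and mem_reduc values are hypothetical choices for illustration, not taken from the example above.

    % Sketch of the old NEWFF syntax described above (hypothetical data).
    p = rand(2, 50);                          % 2 x N input matrix
    t = rand(1, 50);                          % 1 x N target matrix
    net = newff(minmax(p), [10 1], ...        % hidden layer: 10 neurons, output layer: 1
                {'tansig' 'purelin'}, ...     % TF1, TF2 : transfer functions per layer
                'trainlm', ...                % BTF      : training function
                'learngdm');                  % BLF      : learning function
    net.trainParam.epochs    = 100;           % number of training epochs
    net.trainParam.mem_reduc = 2;             % trade TRAINLM speed for lower memory use
    net = train(net, p, t);                   % train the network by backpropagation
    y   = sim(net, p);                        % simulate the trained network on the inputs

Note that this matches the help text quoted here; newer toolbox releases infer the output layer size from the targets and use a different newff signature.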