Mine-hoist fault detection based on the wavelet packet transform and kernel PCA

Abstract: A new algorithm is applied to detect and monitor the fault conditions of a mine hoist. The signal is decomposed by the wavelet packet transform (WPT) and the useful characteristic information is extracted as feature vectors; kernel principal component analysis (KPCA) is then used for dimensionality reduction of the feature vectors. The results show that the proposed method provides reliable fault detection and identification.

Keywords: fault detection

1 Introduction

Because the mine hoist is a complex system with large variability, faults inevitably arise during hoisting and long periods of overload. There are many complex interrelations between the monitored condition of the mine hoist and the working equipment, so researchers have developed a series of nonlinear detection techniques, such as principal component analysis (PCA) and partial least squares (PLS).

2 Feature extraction based on the wavelet packet transform

The wavelet packet transform (WPT) [3] is a generalization of the wavelet decomposition that offers a rich range of possibilities for signal analysis, and the frequency band of the signals collected by the hoist sensor system is very wide. The WPT decomposes a signal over many levels and gives a better resolution in the time-frequency domain, which makes it suitable for analyzing time-varying or transient signals. Consider that we have made only a 3-layer wavelet packet decomposition of the echo signals.

Step 2: Reconstruct the coefficients of the wavelet packet decomposition. The discrete points of the reconstructed signal S_3j are x_jk (j = 0, 1, ..., 7; k = 1, 2, ..., n), where n is the length of the signal.

Step 3: Compute the energy of each frequency band. Then we can get:

E_3j = ∫ |S_3j(t)|^2 dt = Σ_{k=1}^{n} |x_jk|^2    (2)

The energy distribution of the echo signal differs from one frequency band to another, so the energies E_3j (j = 0, 1, ..., 7) can represent the corresponding reconstructed signals S_3j (j = 0, 1, ..., 7). To describe the change of each frequency component in more detail, the second-order statistical characteristic of the reconstructed signal is also regarded as a feature:

D_3j = (1/n) Σ_{k=1}^{n} (x_jk − x̄_j)^2    (3)

Step 4: The E_3j are often large, so we normalize them. Let E = (Σ_{j=0}^{7} E_3j^2)^{1/2}; thus the derived feature vector is, at last:

T = [E_30/E, E_31/E, ..., E_36/E, E_37/E]    (4)

The signal is decomposed by the wavelet packet transform and then the useful characteristic information is extracted as feature vectors through the process given above. Compared with other traditional methods, such as the Hilbert transform, approaches based on WPT analysis are preferable because of the flexibility of the procedure and its systematic decomposition.

3 Kernel principal component analysis

The method of kernel principal component analysis applies kernel methods to principal component analysis [4–5]. Let x_k ∈ R^N, k = 1, 2, ..., M, with Σ_{k=1}^{M} x_k = 0. The principal components are the elements on the diagonal after the covariance matrix

C = (1/M) Σ_{j=1}^{M} x_j x_j^T

has been diagonalized. Generally speaking, the first few values along the diagonal, corresponding to the large eigenvalues, carry the useful information for data analysis. PCA solves for the eigenvalues and eigenvectors of the covariance matrix through the characteristic equation [6]:

λv = Cv = (1/M) Σ_{j=1}^{M} (x_j · v) x_j    (5)

where the eigenvalues λ ≥ 0 and the eigenvectors v ∈ R^N \ {0}; this is the essence of PCA.

Let a nonlinear transformation Φ: R^N → F, x ↦ X, project the original space into a feature space F. The covariance matrix C of the original space then has the following form in the feature space:

C = (1/M) Σ_{j=1}^{M} Φ(x_j) Φ(x_j)^T    (6)

Nonlinear principal component analysis can be considered to be principal component analysis of C in the feature space F. Obviously, all the eigenvalues λ ≥ 0 of C and the eigenvectors V ∈ F \ {0} satisfy λV = CV. All of the solutions lie in the subspace spanned by Φ(x_j), j = 1, 2, ..., M, so that

λ(Φ(x_k) · V) = Φ(x_k) · CV,  k = 1, 2, ..., M    (7)

and there exist coefficients α_i (i = 1, 2, ..., M) such that

V = Σ_{i=1}^{M} α_i Φ(x_i)    (8)

From Eqs. (6), (7) and (8) we can obtain:

λ Σ_{i=1}^{M} α_i (Φ(x_k) · Φ(x_i)) = (1/M) Σ_{i=1}^{M} Σ_{j=1}^{M} α_i (Φ(x_k) · Φ(x_j)) (Φ(x_j) · Φ(x_i))    (9)

where k = 1, 2, ..., M. Define A as an M × M matrix whose elements are

A_ij = Φ(x_i) · Φ(x_j)    (10)

From Eqs. (9) and (10) we can obtain
MλAα = A²α. This is equivalent to

Mλα = Aα    (11)

To extract the principal components of a test point x, we only need to compute the projections of Φ(x) onto those eigenvectors V^k in F that correspond to nonzero eigenvalues:

(V^k · Φ(x)) = Σ_{i=1}^{M} α_i^k (Φ(x_i) · Φ(x))    (12)

For a Hilbert space F we can find a kernel function that satisfies K(x_i, x) = Φ(x_i) · Φ(x), so that Eq. (12) can be rewritten as

(V^k · Φ(x)) = Σ_{i=1}^{M} α_i^k K(x_i, x)

where α^k is the coefficient vector of V^k from Eq. (8).
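The feature-extraction procedure of Eqs. (2)–(4) can be sketched as follows. This is a minimal NumPy illustration, not the paper's implementation: it assumes the Haar wavelet (the text does not specify which wavelet is used) and computes the band energies directly from the level-3 packet coefficients, which for an orthonormal wavelet equals the energy of the reconstructed band signal by Parseval's theorem; the bands appear in the natural order of the packet tree rather than in frequency order.

```python
import numpy as np

def haar_step(x):
    # One Haar analysis step: orthonormal low-pass (a) and high-pass (d) halves.
    x = x.reshape(-1, 2)
    a = (x[:, 0] + x[:, 1]) / np.sqrt(2.0)
    d = (x[:, 0] - x[:, 1]) / np.sqrt(2.0)
    return a, d

def wavelet_packet_bands(x, levels=3):
    # Full wavelet packet tree: unlike the plain wavelet transform,
    # every node (detail as well as approximation) is split at each level.
    bands = [np.asarray(x, dtype=float)]
    for _ in range(levels):
        nxt = []
        for b in bands:
            a, d = haar_step(b)
            nxt.extend([a, d])
        bands = nxt
    return bands  # 2**levels coefficient vectors

def wpt_feature_vector(x, levels=3):
    # Eq. (2): band energies as sums of squared coefficients;
    # Eq. (4): normalize by E = sqrt(sum of squared band energies).
    bands = wavelet_packet_bands(x, levels)
    E = np.array([np.sum(b ** 2) for b in bands])
    return E, E / np.sqrt(np.sum(E ** 2))
```

Because the Haar filters are orthonormal, the eight band energies sum to the total signal energy, so the normalized vector T compactly describes how the echo signal's energy is distributed across frequency bands.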
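The kernel PCA derivation of Eqs. (10)–(12) can likewise be sketched in NumPy. This is a minimal illustration under stated assumptions, not the paper's implementation: the Gaussian (RBF) kernel and its width gamma are assumptions, and the mapped data are taken as centered in feature space (Σ_k Φ(x_k) = 0), exactly as the derivation requires; practical implementations center the kernel matrix explicitly.

```python
import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    # K(x_i, x) = phi(x_i) . phi(x) for an implicit feature map phi;
    # the Gaussian kernel is one common choice (an assumption here).
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-gamma * d2)

def kpca_fit(X, n_components, gamma=1.0):
    # Solve M*lambda*alpha = A*alpha (Eq. 11) for the leading eigenpairs.
    M = X.shape[0]
    A = rbf_kernel(X, X, gamma)          # Eq. (10): A_ij = phi(x_i).phi(x_j)
    eigval, alpha = np.linalg.eigh(A)    # eigval = M*lambda
    order = np.argsort(eigval)[::-1][:n_components]
    eigval, alpha = eigval[order], alpha[:, order]
    # Scale alpha so each V^k = sum_i alpha_i^k phi(x_i) has unit norm in F:
    # V.V = alpha^T A alpha = eigval * |alpha|^2 = 1.
    alpha = alpha / np.sqrt(eigval)
    return alpha, eigval / M             # eigval/M are the lambdas of Eq. (11)

def kpca_project(X_train, X_new, alpha, gamma=1.0):
    # Eq. (12): (V^k . phi(x)) = sum_i alpha_i^k K(x_i, x)
    return rbf_kernel(X_new, X_train, gamma) @ alpha
```

In the paper's setting, the rows of X would be the normalized WPT energy vectors T of Eq. (4), and the projections returned by `kpca_project` are the nonlinear principal components used for fault detection.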