

數(shù)學(xué)統(tǒng)計(jì)學(xué)專業(yè)文獻(xiàn)翻譯中英文對(duì)照-資料下載頁


[Overview] In this course we are often interested in computing maximum likelihood estimates. Because the likelihood is typically a complicated nonlinear function of the observed data, a closed-form solution is often unavailable and the estimates must be computed numerically. The Newton-Raphson algorithm is an iterative procedure that can be used to compute maximum likelihood estimates. The basic idea of the algorithm is as follows: first, construct a quadratic approximation to the objective function around some initial parameter value; second, adjust the parameter value so as to maximize this quadratic approximation. The process is repeated until the parameter values stabilize. Starting from the one-variable case, we then turn to the more general problem of maximizing a function of k variables. A quadratic function can easily be maximized analytically; to exploit this we rely on Taylor's theorem, which guarantees a point w in the interval from x to x + h such that the higher-order remainder term goes to 0 faster than h does. This highlights the fact that the second-order Taylor approximation is a second-degree polynomial in h, whose maximizer is the value of x at which the function f attains its maximum. The text also works through a simple example in which the parameter is a success probability: starting from an initial guess of the maximum likelihood estimate, the Newton-Raphson algorithm returns a value of pi close to the analytical value.
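To make the scalar procedure sketched above concrete, here is a minimal Python illustration (not code from the source text): the function name newton_raphson_1d and the data of 7 successes in 10 trials are assumptions chosen for illustration, since the numerical values used in the original example are not preserved in this excerpt. It applies the update $\theta_{i+1} = \theta_i - y'(\theta_i)/y''(\theta_i)$ to the Bernoulli log-likelihood, whose analytical maximizer is y/n.

def newton_raphson_1d(d1, d2, theta0, tolerance=1e-6, max_iter=100):
    """Maximize a twice-differentiable scalar function given its derivatives.

    d1, d2 -- first and second derivative functions of the objective
    theta0 -- initial guess
    """
    theta = theta0
    for _ in range(max_iter):
        if abs(d1(theta)) <= tolerance:          # first derivative near 0: stop
            break
        theta = theta - d1(theta) / d2(theta)    # theta_{i+1} = theta_i - y'(theta_i)/y''(theta_i)
    return theta

# Bernoulli log-likelihood for y successes in n trials (illustrative data):
# l(pi) = y*log(pi) + (n - y)*log(1 - pi)
y, n = 7, 10
d1 = lambda p: y / p - (n - y) / (1 - p)         # l'(pi)
d2 = lambda p: -y / p**2 - (n - y) / (1 - p)**2  # l''(pi)

pi_hat = newton_raphson_1d(d1, d2, theta0=0.5)
print(pi_hat)  # about 0.7, matching the analytical MLE y/n

Starting from the initial guess 0.5, the first step already lands on 0.7, where the first derivative is 0, so the loop terminates.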


[Main Text] $\theta_1 = \theta_0 - y'(\theta_0)/y''(\theta_0)$. Now we calculate $y'(\theta_1)$, which is still larger in absolute value than our tolerance. Thus we set $\theta_2 = \theta_1 - y'(\theta_1)/y''(\theta_1)$. $y'(\theta_2)$ is smaller in absolute value than our tolerance, so we can stop. The Newton-Raphson algorithm here returns a value of $\pi$ that is reasonably close to the analytical value. Note that we can make the Newton-Raphson procedure more accurate (to within machine precision) by setting the tolerance level closer to 0.

3 The Newton-Raphson Algorithm for Finding the Maximum of a Function of k Variables

Taylor Series Approximations in k Dimensions

Consider a function $f:\mathbb{R}^k \to \mathbb{R}$ that is at least twice continuously differentiable. Suppose $x \in \mathbb{R}^k$ and $h \in \mathbb{R}^k$. Then the first-order Taylor approximation to $f$ at $x$ is given by

  $f(x+h) \approx f(x) + \nabla f(x)'h$

and the second-order Taylor approximation to $f$ at $x$ is given by

  $f(x+h) \approx f(x) + \nabla f(x)'h + \frac{1}{2} h' D^2 f(x)\, h$

where $\nabla f(x)$ is the gradient (vector of first derivatives) of $f$ at $x$, and $D^2 f(x)$ is the Hessian (matrix of second derivatives) of $f$ at $x$.

Finding the Maximum of a Second-Order Polynomial in k Variables

Consider

  $f(x) = a + b'x + x'Cx$

where $a$ is a scalar, $b$ and $x$ are $k$-vectors, and $C$ is a $k \times k$ symmetric, negative definite matrix. The gradient of $f$ at $x$ is

  $\nabla f(x) = b + 2Cx.$

Since the gradient at the value that maximizes $f$ will be a vector of zeros, we know that the maximizer $x^*$ satisfies

  $0 = b + 2Cx^*.$

Solving for $x^*$ we find that

  $x^* = -\tfrac{1}{2} C^{-1} b.$

Since $C$ is assumed to be negative definite, we know that this is a maximum.

The Newton-Raphson Algorithm in k Dimensions

Suppose we want to find the $x^* \in \mathbb{R}^k$ that maximizes the twice continuously differentiable function $f:\mathbb{R}^k \to \mathbb{R}$. Recall

  $f(x+h) \approx a + b'h + \tfrac{1}{2} h'Ch$

where $a = f(x)$, $b = \nabla f(x)$, and $C = D^2 f(x)$. Note that $C$ will be symmetric. This implies

  $\nabla_h f(x+h) \approx b + Ch.$

Once again, the first-order condition for a maximum is

  $0 = b + Ch^*$

which implies that

  $h^* = -C^{-1} b.$

In other words, the vector that maximizes the second-order Taylor approximation to $f$ at $x$ is

  $x + h^* = x - C^{-1} b = x - (D^2 f(x))^{-1} \nabla f(x).$

With this in mind we can specify the Newton-Raphson algorithm for $k$-dimensional function optimization.

Algorithm: NewtonRaphsonKD$(f, x_0, \text{tolerance})$
Comment: find the value $x^*$ of $x$ that maximizes $f(x)$.
  $i \leftarrow 0$
  while $\|\nabla f(x_i)\| > \text{tolerance}$ do
    $x_{i+1} \leftarrow x_i - (D^2 f(x_i))^{-1} \nabla f(x_i)$
    $i \leftarrow i + 1$
  $x^* \leftarrow x_i$
  Return($x^*$)
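As a companion to the NewtonRaphsonKD pseudocode above, here is a minimal Python/NumPy sketch, assuming the caller supplies the gradient and Hessian and that the Hessian is invertible at every iterate; the function and variable names (newton_raphson_kd, grad, hess) and the quadratic test problem are illustrative assumptions, not part of the source text. A linear solve is used in place of an explicit matrix inverse, which is the standard numerical choice.

import numpy as np

def newton_raphson_kd(grad, hess, x0, tolerance=1e-8, max_iter=100):
    """Sketch of NewtonRaphsonKD: maximize f given its gradient and Hessian.

    grad(x) -- returns the gradient vector of f at x
    hess(x) -- returns the Hessian matrix D^2 f(x)
    x0      -- initial k-vector
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) <= tolerance:     # gradient near zero: stop
            break
        # x_{i+1} = x_i - (D^2 f(x_i))^{-1} grad f(x_i), via a linear solve
        x = x - np.linalg.solve(hess(x), g)
    return x

# Test on the quadratic f(x) = a + b'x + x'Cx with C symmetric negative definite,
# whose maximizer x* = -(1/2) C^{-1} b was derived above.
b = np.array([1.0, -2.0])
C = np.array([[-2.0, 0.5],
              [0.5, -1.0]])
grad = lambda x: b + 2 * C @ x   # gradient of a + b'x + x'Cx
hess = lambda x: 2 * C           # Hessian of a quadratic is constant
x_star = newton_raphson_kd(grad, hess, x0=np.zeros(2))
print(x_star, -0.5 * np.linalg.solve(C, b))  # the two should agree

Because the objective here is exactly quadratic, the algorithm converges in a single iteration; for a general twice continuously differentiable f it repeats the step until the gradient norm falls below the tolerance.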