Python and Matrix Theory: Eigenvalues and Eigenvectors


The Unknown Word

Word | Meaning
receptive field /rɪˈseptɪv fiːld/ | receptive field
filter /ˈfɪltər/ | filter
toggle movement /ˈtɑːɡl ˈmuːvmənt/ | switching action
recurrent /rɪˈkɜːrənt/ | recurring, cyclic
ReLU | the Rectified Linear Unit
rectified /ˈrektɪfaɪd/ | rectified; (as a noun) rectifier
leaky /ˈliːki/ | leaky
tuning /ˈtjuːnɪŋ/ | tuning
SGD | Stochastic Gradient Descent
stochastic /stəˈkæstɪk/ | random
hypothesis /haɪˈpɑːθəsɪs/ | hypothesis, assumption
SVD | Singular Value Decomposition, a general-purpose matrix factorization
singular /ˈsɪŋɡjələr/ | singular, peculiar
decomposition /ˌdiːkɑːmpəˈzɪʃn/ | decomposition
format | n. layout; v. to format

PCA Dimensionality-Reduction Example

1. Data matrix and its covariance matrix: \(X=\begin{bmatrix} -1 & -1 & 0 & 2 & 0 \\ -2 & 0 & 0 & 1 & 1 \end{bmatrix}\), \(C_x=\frac{1}{5}XX^T=\begin{bmatrix} 6/5 & 4/5 \\ 4/5 & 6/5 \end{bmatrix}\)
2. The eigenvalues of \(C_x\) are \(\lambda_1=2\) and \(\lambda_2=2/5\), with unit eigenvectors \(\begin{bmatrix} 1/\sqrt{2} \\ 1/\sqrt{2} \end{bmatrix}\) and \(\begin{bmatrix} -1/\sqrt{2} \\ 1/\sqrt{2} \end{bmatrix}\); taking them as the columns of \(U\), one can verify that \(\Lambda=U^T C_x U\).
3. Dimensionality reduction: project onto the eigenvector of the larger eigenvalue, \(\begin{bmatrix} 1/\sqrt{2} & 1/\sqrt{2} \end{bmatrix}X=\begin{bmatrix} -3/\sqrt{2} & -1/\sqrt{2} & 0 & 3/\sqrt{2} & 1/\sqrt{2} \end{bmatrix}\) (a NumPy sketch follows this list).
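Below is a minimal NumPy sketch of the three steps above. It is my own illustration rather than part of the original notes; the variable names (X, C_x, U, Y) are mine, and it relies on np.linalg.eigh returning eigenvalues in ascending order.

```python
import numpy as np

# Step 0: data matrix, one sample per column; each row already has zero mean.
X = np.array([[-1, -1, 0, 2, 0],
              [-2,  0, 0, 1, 1]], dtype=float)

# Step 1: covariance matrix C_x = (1/m) X X^T with m = number of samples.
m = X.shape[1]
C_x = X @ X.T / m                # [[6/5, 4/5], [4/5, 6/5]]

# Step 2: eigen-decomposition; eigh returns eigenvalues in ascending order.
eigvals, U = np.linalg.eigh(C_x)
print(eigvals)                   # [0.4 2. ], i.e. lambda_2 = 2/5 and lambda_1 = 2
print(U.T @ C_x @ U)             # ~diag(2/5, 2): verifies Lambda = U^T C_x U

# Step 3: project onto the eigenvector of the largest eigenvalue (last column).
p = U[:, -1]                     # [1/sqrt(2), 1/sqrt(2)] up to an overall sign
Y = p @ X
print(Y)                         # [-3, -1, 0, 3, 1] / sqrt(2), possibly sign-flipped
```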

Gradient vector and Hessian matrix

  • Function: \(f(x)=2x_1^3+3x_2^2+3x_1^2x_2-24x_2\)
  • Gradient vector and Hessian matrix: \(\nabla f(x)=\begin{bmatrix} 6x_1^2+6x_1x_2 \\ 6x_2+3x_1^2-24 \end{bmatrix}\), \(\nabla^2 f(x)=6\begin{bmatrix} 2x_1+x_2 & x_1 \\ x_1 & 1 \end{bmatrix}\) (a SymPy check follows below)
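A short SymPy cross-check of the two results above; this is my own sketch, and the symbol names x1, x2 are assumptions rather than part of the original notes.

```python
import sympy as sp

x1, x2 = sp.symbols('x1 x2')
f = 2*x1**3 + 3*x2**2 + 3*x1**2*x2 - 24*x2

# Gradient: column vector of first-order partial derivatives.
grad = sp.Matrix([sp.diff(f, v) for v in (x1, x2)])
print(grad)   # [6*x1**2 + 6*x1*x2, 3*x1**2 + 6*x2 - 24], matching the gradient above

# Hessian: matrix of second-order partial derivatives.
H = sp.hessian(f, (x1, x2))
print(H)      # [[12*x1 + 6*x2, 6*x1], [6*x1, 6]] = 6 * [[2*x1 + x2, x1], [x1, 1]]
```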

The Unknown Word

Word | Meaning
convex optimization | convex programming
optimization /ˌɑːptɪməˈzeɪʃn/ | optimization

