Neural Networks: The Cost Function


Notation:

Training set: \[\left\{ \left( x^{(1)}, y^{(1)} \right), \left( x^{(2)}, y^{(2)} \right), \ldots, \left( x^{(m)}, y^{(m)} \right) \right\}\]

L = total number of layers in the network

s_l = number of units (not counting the bias unit) in layer l


Binary classification: y = 0 or 1, with 1 output unit.

Multi-class classification (K classes): y ∈ R^K is a one-hot vector, e.g. [1 0 0 0], [0 1 0 0], with K output units.
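To make the multi-class label format concrete, here is a small sketch (the `one_hot` helper name is my own, not from the notes) that converts integer class labels into the one-hot vectors y ∈ R^K described above:

```python
import numpy as np

def one_hot(labels, K):
    """Return an (m, K) matrix whose row i is the one-hot encoding of labels[i]."""
    m = len(labels)
    Y = np.zeros((m, K))
    Y[np.arange(m), labels] = 1.0  # set the k-th entry of each row to 1
    return Y

Y = one_hot([0, 1, 3], K=4)
# Row 0 is [1, 0, 0, 0]; row 1 is [0, 1, 0, 0]; row 2 is [0, 0, 0, 1]
```

Each row plays the role of one label vector y^(i), and the network's K output units are trained to match it.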


\[\left( h_\Theta\left( x \right) \right)_i = \text{the } i\text{-th output of the neural network}\]

Cost function of the neural network

\[J\left( \Theta \right) = -\frac{1}{m}\left[ \sum\limits_{i = 1}^{m} \sum\limits_{k = 1}^{K} y_k^{(i)} \log \left( h_\Theta\left( x^{(i)} \right) \right)_k + \left( 1 - y_k^{(i)} \right) \log \left( 1 - \left( h_\Theta\left( x^{(i)} \right) \right)_k \right) \right] + \frac{\lambda}{2m} \sum\limits_{l = 1}^{L - 1} \sum\limits_{i = 1}^{s_l} \sum\limits_{j = 1}^{s_{l + 1}} \left( \Theta_{ji}^{(l)} \right)^2\]

Note that the inner sum runs over all K output units, and the regularization term sums the squares of all weights except those multiplying the bias units.
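The formula can be sketched directly in code. The following is a minimal NumPy implementation of J(Θ), assuming sigmoid activations in every layer and weight matrices Θ^(l) stored with the bias column first (the function name and argument layout are illustrative, not from the notes):

```python
import numpy as np

def nn_cost(Theta_list, X, Y, lam):
    """
    Regularized cross-entropy cost for a feed-forward network.
    Theta_list: weight matrices Theta^(l), each of shape (s_{l+1}, s_l + 1),
                where column 0 multiplies the bias unit.
    X: (m, n) input matrix; Y: (m, K) one-hot label matrix; lam: lambda.
    """
    m = X.shape[0]

    # Forward propagation with sigmoid activations to get h_Theta(x).
    A = X
    for Theta in Theta_list:
        A = np.hstack([np.ones((m, 1)), A])       # prepend the bias unit
        A = 1.0 / (1.0 + np.exp(-(A @ Theta.T)))  # sigmoid activation
    H = A  # (m, K) matrix of outputs (h_Theta(x^(i)))_k

    # Unregularized term: double sum over examples i and output units k.
    eps = 1e-12  # guard against log(0)
    J = -(Y * np.log(H + eps) + (1 - Y) * np.log(1 - H + eps)).sum() / m

    # Regularization: squared weights of every layer, excluding bias columns.
    J += (lam / (2 * m)) * sum((Theta[:, 1:] ** 2).sum() for Theta in Theta_list)
    return J
```

A quick sanity check: with all weights zero, every sigmoid output is 0.5, so each of the K output units contributes log 2 per example and J = K log 2 regardless of m (and the regularization term vanishes).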

 

