Preface
Recently several students have come to me with questions about training multi-label models with sigmoid, or about using it in object detection. I wanted to write something on these topics, and remembered this article sitting in my old blog (a homework assignment from my machine learning course in grad school, 2015). It is not entirely rigorous, but many parts are fairly intuitive, so I am moving it over here first.
Note: this article discusses only the cross-entropy of Logistic regression; the treatment for Softmax regression is analogous (Logistic regression and Softmax regression are essentially the same thing; I will write a dedicated article on their relationship later, so consider this a placeholder). First, without further ado, here is the cross-entropy formula for logistic regression:
$$J(\theta)=-\frac{1}{m}\sum_{i=1}^{m}\left[y^{(i)}\log\bigl(h_\theta(x^{(i)})\bigr)+(1-y^{(i)})\log\bigl(1-h_\theta(x^{(i)})\bigr)\right],$$

as well as its partial derivative with respect to the parameter $\theta_j$ (used for parameter updates in optimization algorithms such as gradient descent):

$$\frac{\partial}{\partial\theta_{j}}J(\theta)=\frac{1}{m}\sum_{i=1}^{m}\bigl(h_\theta(x^{(i)})-y^{(i)}\bigr)x_j^{(i)}.$$
Most papers and textbooks, however, simply state these two formulas without the derivation, which can confuse beginners. The cross-entropy formula can be arrived at through several interpretations, which even differ across fields: a mathematician might use maximum likelihood estimation, someone in information engineering might start from coding theory, and more commonly it is explained via the KL divergence. Here, though, I assume you know none of that, and show how a plainer, more intuitive argument leads to the cross-entropy loss of Logistic Regression and justifies why it makes sense (I will write a separate article later summarizing the so-called "orthodox" interpretations of cross-entropy; another placeholder). My knowledge is limited, so corrections are welcome.
Without further ado, the rest of this post derives the cross-entropy loss of Logistic Regression step by step, computes its derivative, and gives the compact vector form together with the derivation of its derivative.
The cross-entropy loss function (the Logistic Regression cost function)
Suppose we have $m$ known samples $(x^{(i)}, y^{(i)})$, where the pair denotes the $i$-th data point together with its class label. Here $x^{(i)}$ is a $(p+1)$-dimensional vector (to account for the bias term), while $y^{(i)}$ is a number indicating the class:

- in logistic regression (yes/no problems), $y^{(i)}$ takes the value 0 or 1;
- in softmax regression (multi-class problems), $y^{(i)}$ takes one of the values $1, 2, \dots, k$, a number indexing the class (assuming $k$ classes in total).
Here we discuss only logistic regression. For an input sample $x^{(i)}$ and model parameters $\theta$, we therefore have

$$\theta^T x^{(i)}:=\theta_0+\theta_1 x^{(i)}_1+\dots+\theta_p x^{(i)}_p.$$
In binary problems, the sigmoid is commonly used as the hypothesis function, defined as:

$$h_\theta(x^{(i)})=\frac{1}{1+e^{-\theta^T x^{(i)}}}.$$
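As an aside, here is a minimal NumPy sketch of this hypothesis function (my own, not from the original post; the branching on the sign of $z$ is a standard trick to keep $e^{-z}$ from overflowing):

```python
import numpy as np

def sigmoid(z):
    """Numerically stable sigmoid: h(z) = 1 / (1 + exp(-z))."""
    z = np.asarray(z, dtype=float)
    out = np.empty_like(z)
    pos = z >= 0
    # For z >= 0, exp(-z) <= 1, so the textbook formula is safe.
    out[pos] = 1.0 / (1.0 + np.exp(-z[pos]))
    # For z < 0, rewrite as exp(z) / (1 + exp(z)) to avoid overflow in exp(-z).
    ez = np.exp(z[~pos])
    out[~pos] = ez / (1.0 + ez)
    return out

print(sigmoid(np.array([-800.0, 0.0, 800.0])))  # ~[0, 0.5, 1], no overflow
```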
Since logistic regression is precisely the 0/1 binary classification problem, we can write

$$P(\hat{y}^{(i)}=1\mid x^{(i)};\theta)=h_\theta(x^{(i)}),\qquad P(\hat{y}^{(i)}=0\mid x^{(i)};\theta)=1-h_\theta(x^{(i)}).$$
Now, without invoking the concept of "entropy" at all, the following simple and intuitive reasoning is enough to arrive at the loss function we want. Taking the logarithm of these probabilities leaves their monotonicity unchanged, so

$$\log P(\hat{y}^{(i)}=1\mid x^{(i)};\theta)=\log h_\theta(x^{(i)})=\log\frac{1}{1+e^{-\theta^T x^{(i)}}},$$

$$\log P(\hat{y}^{(i)}=0\mid x^{(i)};\theta)=\log\bigl(1-h_\theta(x^{(i)})\bigr)=\log\frac{e^{-\theta^T x^{(i)}}}{1+e^{-\theta^T x^{(i)}}}.$$
Then for the $i$-th sample, the combined log-probability that the hypothesis is correct is:

$$\begin{aligned}
&I\{y^{(i)}=1\}\log P(\hat{y}^{(i)}=1\mid x^{(i)};\theta)+I\{y^{(i)}=0\}\log P(\hat{y}^{(i)}=0\mid x^{(i)};\theta)\\
&=y^{(i)}\log P(\hat{y}^{(i)}=1\mid x^{(i)};\theta)+(1-y^{(i)})\log P(\hat{y}^{(i)}=0\mid x^{(i)};\theta)\\
&=y^{(i)}\log\bigl(h_\theta(x^{(i)})\bigr)+(1-y^{(i)})\log\bigl(1-h_\theta(x^{(i)})\bigr)
\end{aligned}$$

Here $I\{y^{(i)}=1\}$ and $I\{y^{(i)}=0\}$ are indicator functions: simply put, the expression equals 1 when the condition inside the braces holds and 0 otherwise; we won't belabor the point (the quick check below makes the step from indicators to $y^{(i)}$ and $1-y^{(i)}$ concrete).
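A tiny sketch of mine verifying that the compact form really selects the correct log-probability for each label (the values of `h` and `y` are made up for illustration):

```python
import numpy as np

h = np.array([0.9, 0.2, 0.7])   # model outputs P(y=1|x) for three samples
y = np.array([1, 0, 1])         # true labels

# Indicator form: pick log(h) when y == 1, log(1 - h) when y == 0.
indicator = np.where(y == 1, np.log(h), np.log(1 - h))

# Compact form used in the text: y*log(h) + (1-y)*log(1-h).
compact = y * np.log(h) + (1 - y) * np.log(1 - h)

print(np.allclose(indicator, compact))  # True
```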
Then, over all $m$ samples, we obtain a measure of how well the model expresses the whole training set:

$$\sum_{i=1}^{m}\left[y^{(i)}\log\bigl(h_\theta(x^{(i)})\bigr)+(1-y^{(i)})\log\bigl(1-h_\theta(x^{(i)})\bigr)\right]$$
Since this quantity expresses the probability of being correct, the larger its value, the better the model captures the data. On the other hand, when updating parameters or judging how good a model is, we need a loss function (or cost function) that adequately reflects the model's error, and we want that loss to be as small as possible. These two pulls point in opposite directions, so we may simply let the cost function be the negative of the combined log-probability above, averaged over the $m$ samples:

$$J(\theta)=-\frac{1}{m}\sum_{i=1}^{m}\left[y^{(i)}\log\bigl(h_\theta(x^{(i)})\bigr)+(1-y^{(i)})\log\bigl(1-h_\theta(x^{(i)})\bigr)\right]$$
This is the famous cross-entropy loss function. (Note: if you are already familiar with the concept of "information entropy", it can help in understanding the cross-entropy loss.)
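As a sanity check, a direct NumPy transcription of this loss might look as follows (a sketch of mine; `X` is assumed to be an $m\times(p+1)$ design matrix whose first column is all ones for the bias term):

```python
import numpy as np

def cross_entropy_loss(theta, X, y, eps=1e-12):
    """J(theta) = -(1/m) * sum[ y*log(h) + (1-y)*log(1-h) ]."""
    h = 1.0 / (1.0 + np.exp(-(X @ theta)))  # h_theta(x^(i)) for every sample
    h = np.clip(h, eps, 1.0 - eps)          # guard against log(0)
    return -np.mean(y * np.log(h) + (1 - y) * np.log(1 - h))

# Toy example: 3 samples, a bias column plus 2 features.
X = np.array([[1.0, 0.5, 1.2],
              [1.0, -1.0, 0.3],
              [1.0, 2.0, -0.7]])
y = np.array([1.0, 0.0, 1.0])
theta = np.zeros(3)
print(cross_entropy_loss(theta, X, y))  # log(2) ≈ 0.693 at theta = 0
```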
Differentiating the cross-entropy loss
This step uses a few simple logarithm identities, listed here with labels; whenever one of them is used in the derivation below, its label is marked beside that step, to keep the derivation easy to follow.
① $\log\frac{a}{b}=\log a-\log b$
② $\log a+\log b=\log(ab)$
③ $\log e^{x}=x$ (for convenience, $\log$ here means $\log_e$, i.e., $\ln$; for any other base such as 2 or 10, the change-of-base formula shows the result differs only by a constant factor in front, which has no effect whatsoever on the conclusion)
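These identities are elementary, but for completeness, a quick numeric spot check (my own addition):

```python
import numpy as np

a, b = 3.7, 1.9
print(np.isclose(np.log(a / b), np.log(a) - np.log(b)))  # ① True
print(np.isclose(np.log(a) + np.log(b), np.log(a * b)))  # ② True
print(np.isclose(np.log(np.exp(2.5)), 2.5))              # ③ True
```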
Additionally, it is worth mentioning that the derivatives involved here are all derivatives of matrices and vectors (matrix calculus). There is a tutorial on the subject that is concise yet comprehensive, really excellent; I recommend it to anyone who needs it.
Now the derivation itself.

The cross-entropy loss is:

$$J(\theta)=-\frac{1}{m}\sum_{i=1}^{m}\left[y^{(i)}\log\bigl(h_\theta(x^{(i)})\bigr)+(1-y^{(i)})\log\bigl(1-h_\theta(x^{(i)})\bigr)\right]\tag{1}$$
where

$$\log h_\theta(x^{(i)})=\log\frac{1}{1+e^{-\theta^T x^{(i)}}}=-\log\bigl(1+e^{-\theta^T x^{(i)}}\bigr),$$

$$\begin{aligned}
\log\bigl(1-h_\theta(x^{(i)})\bigr)&=\log\Bigl(1-\frac{1}{1+e^{-\theta^T x^{(i)}}}\Bigr)\\
&=\log\frac{e^{-\theta^T x^{(i)}}}{1+e^{-\theta^T x^{(i)}}}\\
&=\log\bigl(e^{-\theta^T x^{(i)}}\bigr)-\log\bigl(1+e^{-\theta^T x^{(i)}}\bigr) && \text{①}\\
&=-\theta^T x^{(i)}-\log\bigl(1+e^{-\theta^T x^{(i)}}\bigr). && \text{③}
\end{aligned}$$
Substituting these in, we get

$$\begin{aligned}
J(\theta)&=-\frac{1}{m}\sum_{i=1}^m\left[-y^{(i)}\log\bigl(1+e^{-\theta^T x^{(i)}}\bigr)+(1-y^{(i)})\Bigl(-\theta^T x^{(i)}-\log\bigl(1+e^{-\theta^T x^{(i)}}\bigr)\Bigr)\right]\\
&=-\frac{1}{m}\sum_{i=1}^m\left[y^{(i)}\theta^T x^{(i)}-\theta^T x^{(i)}-\log\bigl(1+e^{-\theta^T x^{(i)}}\bigr)\right]\\
&=-\frac{1}{m}\sum_{i=1}^m\left[y^{(i)}\theta^T x^{(i)}-\log e^{\theta^T x^{(i)}}-\log\bigl(1+e^{-\theta^T x^{(i)}}\bigr)\right] && \text{③}\\
&=-\frac{1}{m}\sum_{i=1}^m\left[y^{(i)}\theta^T x^{(i)}-\Bigl(\log e^{\theta^T x^{(i)}}+\log\bigl(1+e^{-\theta^T x^{(i)}}\bigr)\Bigr)\right] && \text{②}\\
&=-\frac{1}{m}\sum_{i=1}^m\left[y^{(i)}\theta^T x^{(i)}-\log\bigl(1+e^{\theta^T x^{(i)}}\bigr)\right]
\end{aligned}$$
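The last line is worth pausing on: it says the per-sample loss can be computed as $\log\bigl(1+e^{\theta^T x^{(i)}}\bigr)-y^{(i)}\theta^T x^{(i)}$, without ever forming $h_\theta$ explicitly. A sketch of mine confirming numerically that this matches the original form (`np.logaddexp(0, z)` evaluates $\log(1+e^{z})$ in an overflow-safe way):

```python
import numpy as np

rng = np.random.default_rng(0)
z = rng.normal(size=5)             # z = theta^T x for five made-up samples
y = rng.integers(0, 2, size=5)     # made-up 0/1 labels

h = 1.0 / (1.0 + np.exp(-z))
naive = -(y * np.log(h) + (1 - y) * np.log(1 - h))

# Simplified per-sample loss from the derivation: log(1 + e^z) - y*z.
simplified = np.logaddexp(0.0, z) - y * z

print(np.allclose(naive, simplified))  # True
```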
Next, take the partial derivative of $J(\theta)$ with respect to the $j$-th parameter component $\theta_j$:

$$\begin{aligned}
\frac{\partial}{\partial\theta_{j}}J(\theta)&=\frac{\partial}{\partial\theta_{j}}\left(\frac{1}{m}\sum_{i=1}^m\left[\log\bigl(1+e^{\theta^T x^{(i)}}\bigr)-y^{(i)}\theta^T x^{(i)}\right]\right)\\
&=\frac{1}{m}\sum_{i=1}^m\left[\frac{\partial}{\partial\theta_{j}}\log\bigl(1+e^{\theta^T x^{(i)}}\bigr)-\frac{\partial}{\partial\theta_{j}}\bigl(y^{(i)}\theta^T x^{(i)}\bigr)\right]\\
&=\frac{1}{m}\sum_{i=1}^m\left(\frac{x^{(i)}_j e^{\theta^T x^{(i)}}}{1+e^{\theta^T x^{(i)}}}-y^{(i)}x^{(i)}_j\right)\\
&=\frac{1}{m}\sum_{i=1}^{m}\bigl(h_\theta(x^{(i)})-y^{(i)}\bigr)x_j^{(i)}
\end{aligned}$$
The last step uses $\frac{e^{z}}{1+e^{z}}=\frac{1}{1+e^{-z}}=h_\theta(x^{(i)})$ with $z=\theta^T x^{(i)}$. This is the derivative of the cross-entropy with respect to the parameters:

$$\frac{\partial}{\partial\theta_{j}}J(\theta)=\frac{1}{m}\sum_{i=1}^{m}\bigl(h_\theta(x^{(i)})-y^{(i)}\bigr)x_j^{(i)}$$
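To build confidence in this result, here is a sketch of mine (under the same design-matrix assumption as before) that compares the analytic gradient with a centered finite-difference approximation:

```python
import numpy as np

def loss(theta, X, y):
    h = 1.0 / (1.0 + np.exp(-(X @ theta)))
    return -np.mean(y * np.log(h) + (1 - y) * np.log(1 - h))

def grad(theta, X, y):
    """(1/m) * sum_i (h_theta(x^(i)) - y^(i)) * x_j^(i), for every j at once."""
    h = 1.0 / (1.0 + np.exp(-(X @ theta)))
    return X.T @ (h - y) / len(y)

rng = np.random.default_rng(1)
X = np.hstack([np.ones((10, 1)), rng.normal(size=(10, 2))])  # bias + 2 features
y = rng.integers(0, 2, size=10).astype(float)
theta = rng.normal(size=3)

analytic = grad(theta, X, y)
numeric = np.zeros_like(theta)
eps = 1e-6
for j in range(len(theta)):
    e = np.zeros_like(theta)
    e[j] = eps
    numeric[j] = (loss(theta + e, X, y) - loss(theta - e, X, y)) / (2 * eps)

print(np.allclose(analytic, numeric, atol=1e-6))  # True
```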
Vector form
Everything so far was written element-wise; the vector form goes through essentially the same steps, only the notation differs, but it reads much more cleanly: both the sample superscript $(i)$ and the summation sign $\sum$ drop out. Ignoring the fixed coefficient $\frac{1}{m}$ in front, the cross-entropy loss (1) can then be written as:

$$J(\theta)=-\left[y^T\log h_\theta(x)+(1-y^T)\log\bigl(1-h_\theta(x)\bigr)\right]\tag{2}$$
Substituting $h_\theta(x)=\frac{1}{1+e^{-\theta^T x}}$, we get:

$$\begin{aligned}
J(\theta)&=-\left[y^T\log\frac{1}{1+e^{-\theta^T x}}+(1-y^T)\log\frac{e^{-\theta^T x}}{1+e^{-\theta^T x}}\right]\\
&=-\left[-y^T\log\bigl(1+e^{-\theta^T x}\bigr)+(1-y^T)\log e^{-\theta^T x}-(1-y^T)\log\bigl(1+e^{-\theta^T x}\bigr)\right]\\
&=-\left[(1-y^T)\log e^{-\theta^T x}-\log\bigl(1+e^{-\theta^T x}\bigr)\right]\\
&=-\left[(1-y^T)(-\theta^T x)-\log\bigl(1+e^{-\theta^T x}\bigr)\right]
\end{aligned}$$
Differentiating with respect to $\theta$, the leading minus sign simply gets absorbed:

$$\begin{aligned}
\frac{\partial}{\partial\theta}J(\theta)&=-\frac{\partial}{\partial\theta}\left[(1-y^T)(-\theta^T x)-\log\bigl(1+e^{-\theta^T x}\bigr)\right]\\
&=(1-y^T)x-\frac{e^{-\theta^T x}}{1+e^{-\theta^T x}}x\\
&=\Bigl(\frac{1}{1+e^{-\theta^T x}}-y^T\Bigr)x\\
&=\bigl(h_\theta(x)-y^T\bigr)x
\end{aligned}$$
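In code, the payoff of the vector form is that the sum over samples collapses into a single matrix product. A small sketch of mine comparing the element-wise gradient with the vectorized one (with the $\frac{1}{m}$ factor restored):

```python
import numpy as np

rng = np.random.default_rng(2)
X = np.hstack([np.ones((8, 1)), rng.normal(size=(8, 2))])  # design matrix
y = rng.integers(0, 2, size=8).astype(float)
theta = rng.normal(size=3)
m = len(y)

h = 1.0 / (1.0 + np.exp(-(X @ theta)))

# Element-wise form: one component theta_j at a time.
loop_grad = np.array([np.sum((h - y) * X[:, j]) / m for j in range(3)])

# Vector form: the sum over samples becomes a matrix product.
vec_grad = X.T @ (h - y) / m

print(np.allclose(loop_grad, vec_grad))  # True
```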
Gradient descent parameter update


Please credit the source when reposting: Jason Zhao's Zhihu column "人工+智能". Article link:
Jason Zhao: 交叉熵損失函數的求導(Logistic回歸)
