Original post: https://blog.csdn.net/qq_21904665/article/details/52315642
ElasticNet is a linear regression model that uses both L1 and L2 priors as regularizers. This combination makes it possible to learn a sparse model in which only a few weights are non-zero, as with Lasso, while still keeping the regularization properties of Ridge. The mix of L1 and L2 is controlled with the l1_ratio parameter, which sets their convex combination (a special kind of linear combination).

Elastic-net is useful when several features are correlated with one another. Lasso tends to pick one of them at random, whereas elastic-net tends to keep both.

In practice, one advantage of this trade-off between Lasso and Ridge is that it lets elastic-net inherit some of Ridge's stability under rotation.
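To make the correlated-features point concrete, here is a small sketch (hypothetical data with a duplicated feature; the alpha and l1_ratio values are arbitrary, not from the original post) comparing the coefficients learned by Lasso and ElasticNet:

import numpy as np
from sklearn.linear_model import Lasso, ElasticNet

# Hypothetical data: features 0 and 1 are identical copies, feature 2 is pure noise.
rng = np.random.RandomState(0)
x = rng.randn(100)
X = np.column_stack([x, x, rng.randn(100)])
y = 3 * x + 0.1 * rng.randn(100)

lasso = Lasso(alpha=0.1).fit(X, y)
enet = ElasticNet(alpha=0.1, l1_ratio=0.5).fit(X, y)

print("Lasso coefficients:      ", lasso.coef_)  # tends to put all the weight on one copy
print("Elastic-Net coefficients:", enet.coef_)   # tends to share the weight across both copies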
The objective function that elastic-net minimizes is:
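Writing α for alpha and ρ for l1_ratio, this is the form given in the scikit-learn documentation:

$$\min_{w}\; \frac{1}{2\,n_{\text{samples}}}\,\lVert Xw - y \rVert_2^2 \;+\; \alpha\,\rho\,\lVert w \rVert_1 \;+\; \frac{\alpha\,(1-\rho)}{2}\,\lVert w \rVert_2^2$$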
The parameters alpha (α) and l1_ratio (ρ) can be set by cross-validation using ElasticNetCV.
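A minimal sketch of that workflow on the same diabetes data used below (the l1_ratio candidate list here is an arbitrary choice, not a value from the original post):

from sklearn import datasets
from sklearn.linear_model import ElasticNetCV

diabetes = datasets.load_diabetes()
X, y = diabetes.data, diabetes.target

# Cross-validate l1_ratio over a small candidate grid; a grid of alpha values
# is generated automatically along the regularization path for each candidate.
enet_cv = ElasticNetCV(l1_ratio=[0.1, 0.5, 0.7, 0.9, 0.95, 0.99, 1.0], cv=5)
enet_cv.fit(X, y)

print("best alpha:   ", enet_cv.alpha_)
print("best l1_ratio:", enet_cv.l1_ratio_)

The full example below computes the Lasso and Elastic-Net regularization paths on the diabetes dataset, with and without a positivity constraint on the coefficients, and plots the coefficients against -log10(alpha).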
print(__doc__)

import numpy as np
import matplotlib.pyplot as plt

from sklearn.linear_model import lasso_path, enet_path
from sklearn import datasets

diabetes = datasets.load_diabetes()
X = diabetes.data
y = diabetes.target

X /= X.std(axis=0)  # Standardize data (easier to set the l1_ratio parameter)

# Compute paths

eps = 5e-3  # the smaller eps is, the longer the path

# Note: fit_intercept=False matches the original post; newer scikit-learn
# releases may no longer accept this argument in lasso_path/enet_path.
print("Computing regularization path using the lasso...")
alphas_lasso, coefs_lasso, _ = lasso_path(X, y, eps=eps, fit_intercept=False)

print("Computing regularization path using the positive lasso...")
alphas_positive_lasso, coefs_positive_lasso, _ = lasso_path(
    X, y, eps=eps, positive=True, fit_intercept=False)

print("Computing regularization path using the elastic net...")
alphas_enet, coefs_enet, _ = enet_path(
    X, y, eps=eps, l1_ratio=0.8, fit_intercept=False)

print("Computing regularization path using the positive elastic net...")
alphas_positive_enet, coefs_positive_enet, _ = enet_path(
    X, y, eps=eps, l1_ratio=0.8, positive=True, fit_intercept=False)

# Display results

plt.figure(1)
ax = plt.gca()
# set_prop_cycle replaces set_color_cycle, which was removed from matplotlib
ax.set_prop_cycle(color=2 * ['b', 'r', 'g', 'c', 'k'])
l1 = plt.plot(-np.log10(alphas_lasso), coefs_lasso.T)
l2 = plt.plot(-np.log10(alphas_enet), coefs_enet.T, linestyle='--')

plt.xlabel('-Log(alpha)')
plt.ylabel('coefficients')
plt.title('Lasso and Elastic-Net Paths')
plt.legend((l1[-1], l2[-1]), ('Lasso', 'Elastic-Net'), loc='lower left')
plt.axis('tight')

plt.figure(2)
ax = plt.gca()
ax.set_prop_cycle(color=2 * ['b', 'r', 'g', 'c', 'k'])
l1 = plt.plot(-np.log10(alphas_lasso), coefs_lasso.T)
l2 = plt.plot(-np.log10(alphas_positive_lasso), coefs_positive_lasso.T,
              linestyle='--')

plt.xlabel('-Log(alpha)')
plt.ylabel('coefficients')
plt.title('Lasso and positive Lasso')
plt.legend((l1[-1], l2[-1]), ('Lasso', 'positive Lasso'), loc='lower left')
plt.axis('tight')

plt.figure(3)
ax = plt.gca()
ax.set_prop_cycle(color=2 * ['b', 'r', 'g', 'c', 'k'])
l1 = plt.plot(-np.log10(alphas_enet), coefs_enet.T)
l2 = plt.plot(-np.log10(alphas_positive_enet), coefs_positive_enet.T,
              linestyle='--')

plt.xlabel('-Log(alpha)')
plt.ylabel('coefficients')
plt.title('Elastic-Net and positive Elastic-Net')
plt.legend((l1[-1], l2[-1]), ('Elastic-Net', 'positive Elastic-Net'),
           loc='lower left')
plt.axis('tight')
plt.show()