《PyTorch深度學習實踐》 ("PyTorch Deep Learning Practice") complete series – bilibili
Multiple Dimension Input
1、Diabetes prediction example
2、Input: 8 feature variables
3、Mini-batch
N samples, each with 8 feature variables; with an 8-dimensional input and a 1-dimensional output, only the model code needs to change.
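The mini-batch idea above can be sketched as follows: a `Linear(8, 1)` layer applied to an (N, 8) tensor computes all N outputs in one matrix operation (a minimal sketch; N=4 and the random data are arbitrary).

```python
import torch

# A Linear(8, 1) layer applied to a mini-batch:
# N samples, each with 8 features, go in as one (N, 8) tensor.
linear = torch.nn.Linear(8, 1)   # weight: (1, 8), bias: (1,)
x = torch.randn(4, 8)            # mini-batch of N=4 samples
y = torch.sigmoid(linear(x))     # shape (4, 1), each value in (0, 1)
print(y.shape)                   # torch.Size([4, 1])
```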
4、Build the neural network
Adding layers increases the network's capacity.
Layer 1: 8D down to 6D
Layer 2: 6D down to 4D
Layer 3: 4D down to 1D
Note: dimensions can also increase from layer to layer!
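The note that dimensions may also increase can be sketched with hypothetical layer sizes, expanding from 8D to 16D before shrinking back down:

```python
import torch

# Sketch (hypothetical sizes): a hidden dimension may expand before shrinking.
model = torch.nn.Sequential(
    torch.nn.Linear(8, 16),   # 8D -> 16D: dimension increases here
    torch.nn.Sigmoid(),
    torch.nn.Linear(16, 4),   # 16D -> 4D
    torch.nn.Sigmoid(),
    torch.nn.Linear(4, 1),    # 4D -> 1D
    torch.nn.Sigmoid(),
)
out = model(torch.randn(5, 8))
print(out.shape)              # torch.Size([5, 1])
```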
5、Different activation functions
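A quick way to compare activation functions is to apply a few of PyTorch's built-ins to the same values and look at how each reshapes them:

```python
import torch

# Compare built-in activations on the same inputs.
x = torch.tensor([-2.0, 0.0, 2.0])
print(torch.nn.Sigmoid()(x))  # squashes values into (0, 1)
print(torch.nn.ReLU()(x))     # clamps negatives to 0
print(torch.nn.Tanh()(x))     # squashes values into (-1, 1)
```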
6、Code implementation
import torch
import numpy as np

# Load the dataset; delimiter is the field separator
xy = np.loadtxt('diabetes.csv.gz', delimiter=',', dtype=np.float32)
# Build Tensors from the numpy array
x_data = torch.from_numpy(xy[:, :-1])
y_data = torch.from_numpy(xy[:, [-1]])

# Design Model
# Define a class that inherits from torch.nn.Module
class Model(torch.nn.Module):
    # Constructor: initialize the object
    def __init__(self):
        # super() calls the parent-class constructor
        super(Model, self).__init__()
        # Three linear layers
        self.linear1 = torch.nn.Linear(8, 6)
        self.linear2 = torch.nn.Linear(6, 4)
        self.linear3 = torch.nn.Linear(4, 1)
        # Activation function for the nonlinear transform
        self.sigmoid = torch.nn.Sigmoid()

    # Forward pass
    def forward(self, x):
        x = self.sigmoid(self.linear1(x))
        x = self.sigmoid(self.linear2(x))
        x = self.sigmoid(self.linear3(x))
        return x

# =============================================================================
#     # Alternative: ReLU activation for the hidden layers
#     self.activate = torch.nn.ReLU()
#
#     # Forward pass
#     def forward(self, x):
#         x = self.activate(self.linear1(x))
#         x = self.activate(self.linear2(x))
#         # The last layer must use sigmoid so the output (a probability) stays in [0, 1]
#         x = self.sigmoid(self.linear3(x))
#         return x
# =============================================================================

model = Model()

# Construct Loss and Optimizer
# BCE loss takes y_pred and y; reduction='mean' averages over the batch
# (the older size_average=True argument is deprecated)
criterion = torch.nn.BCELoss(reduction='mean')
# Optimizer; model.parameters() collects all model parameters, lr is the learning rate
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)

# Training cycle
for epoch in range(100):
    # Forward pass
    y_pred = model(x_data)
    loss = criterion(y_pred, y_data)
    print(epoch, loss.item())
    # Zero the gradients
    optimizer.zero_grad()
    # Backward pass
    loss.backward()
    # Update the parameters
    optimizer.step()
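The course code above stops at printing the loss. Predictions can also be thresholded at 0.5 to get class labels and a training-set accuracy, a sketch of which follows (random stand-in data here; with the course code, reuse its `model`, `x_data`, and `y_data` instead):

```python
import torch

# Threshold sigmoid outputs at 0.5 to get 0/1 labels, then score accuracy.
# Random stand-in data for self-containment; not the diabetes dataset.
torch.manual_seed(0)
x_data = torch.randn(10, 8)
y_data = (torch.rand(10, 1) > 0.5).float()
model = torch.nn.Sequential(torch.nn.Linear(8, 1), torch.nn.Sigmoid())

with torch.no_grad():                  # no gradients needed for evaluation
    y_pred = model(x_data)
    labels = (y_pred >= 0.5).float()   # probability -> 0/1 label
    acc = (labels == y_data).float().mean()
print(acc.item())
```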