A Model of Animal Heart Rate versus Body Weight
The energy \(P\) an animal consumes is mainly used to maintain its body temperature, and body heat is dissipated through the surface area \(S\). Writing \(w\) for the animal's body weight, \(P \propto S \propto w^{\alpha}\). \(P\) is also proportional to the blood flow \(Q\), where \(Q = qr\), \(q\) is the volume of blood pumped per heartbeat and \(r\) is the heart rate. Assuming \(q\) is proportional to \(w\), we get \(P \propto wr\). It follows that \(r \propto w^{\alpha-1} = w^{a}\), so we fit the model \(r = kw^{a} + b\).
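Written out as a single chain of proportionalities, the argument above is:

\[
P \propto S \propto w^{\alpha}, \qquad
P \propto Q = q\,r, \qquad
q \propto w
\;\Longrightarrow\;
w^{\alpha} \propto w\,r
\;\Longrightarrow\;
r \propto w^{\alpha-1} = w^{a}.
\]

Since \(\alpha\) is not known exactly, the exponent \(a = \alpha - 1\) is treated as a candidate value to be chosen empirically below, while \(k\) and \(b\) are fitted from the data.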
```python
import numpy as np
import matplotlib.pyplot as plt
import torch
import math
%matplotlib inline

# Measured heart rates (beats/min) and body weights (g) of eight animals
r = np.array([[670], [420], [205], [120], [85], [70], [72], [38]])
w = np.array([[25], [200], [2000], [5000], [30000], [50000], [70000], [450000]])

plt.plot(w, r, 'bo')

# Overlay candidate power-law curves r ~ w**(-1/n) for n = 1..5
x_sample = np.arange(85, 450000, 0.1)
bottom_range = [1, 2, 3, 4, 5]
color = ['red', 'green', 'pink', 'black', 'blue']
for i in range(5):
    y_sample = 5000 * x_sample ** (-1 / bottom_range[i])
    plt.plot(x_sample, y_sample, color[i], label='exponent -1/%d' % bottom_range[i])
plt.legend()
```

Based on the preliminary simulation above, we consider \(-1\), \(-\frac{1}{2}\), \(-\frac{1}{3}\), \(-\frac{1}{4}\), and \(-\frac{1}{5}\) as candidate exponents for \(r\) and select the one with the smallest fitting error.
```python
from torch.autograd import Variable
from torch import nn
from torch import optim

# Build the training data from the measurements above
x_train = Variable(torch.from_numpy(w).float())
y_train = Variable(torch.from_numpy(r).float())

# Build the model: r = k * w**(-1/bottom) + b
class poly_model(nn.Module):
    def __init__(self, bottom):
        super(poly_model, self).__init__()
        self.k = nn.Parameter(torch.randn(1))
        self.b = nn.Parameter(torch.zeros(1))
        self.bottom = bottom

    def forward(self, x):
        out = x ** (-1 / self.bottom) * self.k + self.b
        return out

# Fit one model per candidate exponent and compare the losses
for i in range(5):
    print("exponent is -1/%d" % (bottom_range[i]))
    model = poly_model(bottom_range[i])
    criterion = nn.MSELoss()
    optimizer = optim.SGD(model.parameters(), lr=1e-3)
    # Update the parameters
    for j in range(150000):
        output = model(x_train)
        loss = criterion(output, y_train)
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
        if j % 50000 == 0:
            print(loss.item())
        if loss.item() < 1e-3:
            break
    print(list(model.parameters()))
    y_pred = model(x_train)
    plt.plot(x_train.data.numpy()[:, 0], y_pred.data.numpy(),
             label='fit -1/%d' % bottom_range[i], color=color[i])
plt.plot(w, r, label='real curve', color='orange')
plt.legend()
```
After 150000 iterations of preliminary training for each candidate, we obtain the figure above; the table below lists the exponent corresponding to each curve color together with its final error.
| Exponent | Color | Error (MSE) |
|---|---|---|
| -1/1 | red | 41184 |
| -1/2 | green | 10599 |
| -1/3 | pink | 1195 |
| -1/4 | black | 360 |
| -1/5 | blue | 468 |
Among these, the exponent with the smallest error is \(-\frac{1}{4}\).

Here we can do some cross-validation to find a good learning rate; the code is omitted. It is enough to generate learning rates at random: after 100 trials, the best learning rate I obtained was 0.20485, which converges very quickly.
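The original omits the search code; the following is a minimal sketch of such a random search over learning rates, reusing `poly_model`, `x_train`, and `y_train` defined above. The sampling range and the per-trial iteration budget are assumptions, not the settings actually used.

```python
# Sketch of a random search for the learning rate (assumed settings, not the
# author's exact code): sample 100 random learning rates, train briefly with
# each, and keep the one with the lowest final loss.
best_lr, best_loss = None, float('inf')
for _ in range(100):
    lr = float(10 ** np.random.uniform(-3, 0))   # assumed log-uniform range [1e-3, 1]
    model = poly_model(4)
    criterion = nn.MSELoss()
    optimizer = optim.SGD(model.parameters(), lr=lr)
    for _ in range(5000):                         # assumed short budget per trial
        loss = criterion(model(x_train), y_train)
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
    if loss.item() < best_loss:
        best_lr, best_loss = lr, loss.item()
print(best_lr, best_loss)
```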
```python
# Retrain with the exponent -1/4 and the tuned learning rate
model = poly_model(4)
criterion = nn.MSELoss()
optimizer = optim.SGD(model.parameters(), lr=0.20485)
for j in range(50001):
    output = model(x_train)
    loss = criterion(output, y_train)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    if j % 50000 == 0:
        print(loss.item())

y_pred = model(x_train)
plt.plot(x_train.data.numpy()[:, 0], y_pred.data.numpy(), label='fitting curve', color='black')
plt.plot(w, r, label='real curve', color='orange')
plt.legend()
```

Print the model parameters:
```python
param = list(model.parameters())
print(param)
```
```
[Parameter containing:
tensor([1591.8446], requires_grad=True), Parameter containing:
tensor([-33.6434], requires_grad=True)]
```
Through cross-validation, after training for 50000 iterations with a learning rate of 0.20485, the final model is \(r = 1591.84\,w^{-\frac{1}{4}} - 33.64\), with a mean squared error of 304.288.
| Animal | Actual heart rate (beats/min) | Predicted heart rate (beats/min) | Deviation |
|---|---|---|---|
| Vole | 670 | 680 | +10 |
| Rat | 420 | 390 | -30 |
| Rabbit | 205 | 204 | -1 |
| Small dog | 120 | 155 | +35 |
| Large dog | 85 | 87 | +2 |
| Sheep | 70 | 72 | +2 |
| Human | 72 | 63 | -9 |
| Horse | 38 | 27 | -11 |
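As a quick sanity check (not part of the original code), the fitted formula can be evaluated directly on the eight weights to reproduce the predicted column above; the English animal names are translations, and small differences from the table come from rounding the fitted parameters.

```python
# Evaluate r = 1591.84 * w**(-1/4) - 33.64 on the original weights and compare
# with the measured heart rates.
k, b = 1591.84, -33.64
animals = ['vole', 'rat', 'rabbit', 'small dog', 'large dog', 'sheep', 'human', 'horse']
for name, weight, rate in zip(animals, w.ravel(), r.ravel()):
    pred = k * weight ** (-1 / 4) + b
    print('%-10s actual %4d  predicted %6.1f  deviation %+6.1f' % (name, rate, pred, pred - rate))
```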

