The x.norm() function:
http://www.pythonheidong.com/blog/article/170104/
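For a quick illustration (a minimal sketch; the tensors and values below are made up for demonstration), x.norm() returns the p-norm of a tensor, defaulting to the L2 norm, and can also be taken along a single dimension:

import torch

x = torch.tensor([3.0, 4.0])
print(x.norm())        # L2 norm by default: sqrt(3^2 + 4^2) = 5.0
print(x.norm(p=1))     # L1 norm: |3| + |4| = 7.0

m = torch.tensor([[3.0, 4.0], [6.0, 8.0]])
print(m.norm(dim=1))   # per-row L2 norm: tensor([ 5., 10.])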
The torch.autograd.grad function:
Computes the gradients of tensors; each returned gradient has the same shape as the corresponding input.
outputs: the outputs of the function being differentiated
inputs: the inputs with respect to which the gradients are computed
grad_outputs: the weights applied to each output; in the code below it is an all-ones tensor with the same size as the output
# Critic scores on the interpolated samples
disc_interpolates = netD(interpolates)
# Gradient of the critic output w.r.t. the interpolated inputs;
# the result has the same shape as interpolates
gradients = torch.autograd.grad(outputs=disc_interpolates, inputs=interpolates,
                                grad_outputs=torch.ones(disc_interpolates.size()).to(device),
                                create_graph=True, retain_graph=True, only_inputs=True)[0]
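As a standalone sketch of the same call (the toy function y = (x ** 2).sum(dim=1) and the variable names here are assumptions chosen only for illustration), the snippet below shows that the returned gradient has the same shape as the input, and that norm() can then be applied to the result:

import torch

x = torch.randn(4, 3, requires_grad=True)
y = (x ** 2).sum(dim=1)                        # y has shape (4,)

grad_x = torch.autograd.grad(
    outputs=y, inputs=x,
    grad_outputs=torch.ones(y.size()),         # weights: all-ones tensor, same size as y
    create_graph=True, retain_graph=True, only_inputs=True)[0]

print(grad_x.shape)          # torch.Size([4, 3]) -- same shape as the input x
print(grad_x.norm(dim=1))    # per-row L2 norm of the gradient, computed with norm()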