When training a GAN, you often run into this error:
RuntimeError: Trying to backward through the graph a second time, but the saved intermediate results have already been freed. Specify retain_graph=True when calling .backward() or autograd.grad() the first time.
In plain terms: on the second backward pass through the graph, the saved intermediate results have already been freed.
The cause is clear: in a GAN there is one tensor shared between the generator and the discriminator, namely `fake`. When the discriminator loss calls `.backward()`, autograd frees the generator's part of the graph too, so the later generator update tries to backward through an already-freed graph. Calling `detach()` on `fake` in the discriminator step cuts it off from the generator's graph and fixes the error.
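A minimal sketch of the fix, using hypothetical toy networks (plain `nn.Linear` stand-ins for the real generator and discriminator, which are not shown in the post):

```python
import torch
import torch.nn as nn

# Toy stand-ins for the generator and discriminator (hypothetical models).
gen = nn.Linear(4, 4)
disc = nn.Linear(4, 1)
opt_d = torch.optim.SGD(disc.parameters(), lr=0.1)
opt_g = torch.optim.SGD(gen.parameters(), lr=0.1)

noise = torch.randn(8, 4)
fake = gen(noise)

# Discriminator step: fake.detach() severs fake from the generator's graph,
# so this backward() does not free anything the generator step still needs.
loss_d = disc(fake.detach()).mean()
opt_d.zero_grad()
loss_d.backward()
opt_d.step()

# Generator step: the generator's graph is still intact, so this backward()
# no longer raises "Trying to backward through the graph a second time".
loss_g = -disc(fake).mean()
opt_g.zero_grad()
loss_g.backward()
opt_g.step()
```

Without the `detach()`, the first `backward()` would consume the shared graph and the second would raise the error above; `retain_graph=True` would also silence it, but `detach()` is the right fix here because the discriminator update should not propagate gradients into the generator anyway.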