Error: RuntimeError: Trying to backward through the graph a second time, but the saved intermediate results have already been freed


This error comes up frequently when training GAN networks:

RuntimeError: Trying to backward through the graph a second time, but the saved intermediate results have already been freed. Specify retain_graph=True when calling .backward() or autograd.grad() the first time.

In plain terms: on the second attempt to backpropagate through the graph, the saved intermediate results have already been freed.
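For context, here is a minimal reproduction (an illustrative sketch, not from the original post): calling backward() twice on the same graph triggers exactly this error, because the first call frees the saved buffers unless retain_graph=True is passed.

```python
import torch

x = torch.randn(3, requires_grad=True)
y = (x * 2).sum()

y.backward()  # first backward frees the graph's intermediate buffers
y.backward()  # second backward over the same graph -> this RuntimeError
```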

Clearly, in a GAN there is one tensor that lives between the generator (gen) and the discriminator (disc): the fake sample.

The fix is to call detach() on it when feeding it to the discriminator, as in the sketch below.
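Below is a minimal sketch of one GAN training iteration, assuming a toy generator and discriminator; all names and shapes here (gen, disc, latent_dim, and so on) are illustrative, not taken from the original post. Detaching fake before the discriminator's backward() keeps that pass out of the generator's graph, so the later generator backward() still finds its saved intermediate results.

```python
import torch
import torch.nn as nn

# Toy models; real architectures would be larger (illustrative assumption).
latent_dim, data_dim, batch = 16, 8, 4
gen = nn.Sequential(nn.Linear(latent_dim, data_dim))
disc = nn.Sequential(nn.Linear(data_dim, 1), nn.Sigmoid())
opt_g = torch.optim.Adam(gen.parameters(), lr=1e-3)
opt_d = torch.optim.Adam(disc.parameters(), lr=1e-3)
bce = nn.BCELoss()

real = torch.randn(batch, data_dim)
z = torch.randn(batch, latent_dim)
fake = gen(z)  # fake is shared between the disc step and the gen step

# Discriminator step: fake.detach() cuts the graph at the generator output,
# so d_loss.backward() never touches (or frees) the generator's graph.
d_loss = bce(disc(real), torch.ones(batch, 1)) + \
         bce(disc(fake.detach()), torch.zeros(batch, 1))
opt_d.zero_grad()
d_loss.backward()
opt_d.step()

# Generator step: use the non-detached fake so gradients flow back into gen.
# Without the detach() above, the generator's graph would already be freed
# here and this backward() would raise the RuntimeError.
g_loss = bce(disc(fake), torch.ones(batch, 1))
opt_g.zero_grad()
g_loss.backward()
opt_g.step()
```

An alternative is to pass retain_graph=True to the first backward(), but for a standard GAN loop, detaching fake in the discriminator step is the usual and cheaper fix.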

 

