TensorFlow 2.0: Automatic Differentiation with GradientTape

`tf.GradientTape` records operations as they run so that gradients can be computed afterwards. Its `watch_accessed_variables` parameter controls whether trainable variables are watched automatically: if it is `False`, variables accessed inside the tape are not tracked, and `tape.gradient` returns `None` for them unless you add them to the tape manually with `tape.watch()`.

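A minimal sketch of that behaviour (assuming TensorFlow 2.x; the variable name `x` is illustrative):

```python
import tensorflow as tf

# Sketch: with watch_accessed_variables=False the tape records nothing
# automatically, so the gradient comes back as None unless the tensor
# is watched explicitly with tape.watch().
x = tf.Variable(3.)

with tf.GradientTape(watch_accessed_variables=False) as tape:
    y = x * x
print(tape.gradient(y, x))   # None: x was never watched

with tf.GradientTape(watch_accessed_variables=False) as tape:
    tape.watch(x)            # add x to the tape manually
    y = x * x
print(tape.gradient(y, x))   # 2x = 6.0
```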

import tensorflow as tf

############################### tf.GradientTape(persistent, watch_accessed_variables)
print('############### Derivative of a single-variable function ##############')
x = tf.Variable(3.)
# x = tf.constant(3.)
with tf.GradientTape(persistent=True, watch_accessed_variables=True) as tape:  # persistent=True keeps the tape alive so it can be used more than once
    # tape.watch(x)                  # watch the tensor manually
    y = 3 * pow(x, 3) + 2 * x
    z = pow(x, 4)
dy_dx = tape.gradient(y, x)
dz_dx = tape.gradient(z, x)
print('y:', y)
print('dy/dx:', dy_dx)
print('z:', z)
print('dz/dx:', dz_dx)
print()
del tape                             # release the persistent tape explicitly
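The commented-out `tf.constant(3.)` line above hints at a related point: a constant is not trainable, so even with the default `watch_accessed_variables=True` it is not tracked, and `tape.watch` becomes necessary. A minimal sketch:

```python
import tensorflow as tf

# Sketch: a tf.constant is not trainable, so the tape does not track it
# automatically; tape.watch() makes the gradient available.
x = tf.constant(3.)
with tf.GradientTape() as tape:
    tape.watch(x)                 # without this, the gradient would be None
    y = 3 * pow(x, 3) + 2 * x     # same function as in the example above
dy_dx = tape.gradient(y, x)       # 9*x**2 + 2 = 83.0 at x = 3
print(dy_dx)
```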
print('############### Second-order derivative of a single-variable function ##############')
x = tf.Variable(10.)
with tf.GradientTape() as tape1:
    with tf.GradientTape() as tape2:
        y = pow(x, 2)
    y2 = tape2.gradient(y, x)        # first derivative: 2x
y3 = tape1.gradient(y2, x)           # second derivative: 2
print('Second derivative of x**2 at x = 10:', y3)
print()

print('############### Partial derivatives of a multivariate function ##############')
x = tf.Variable(4.)
y = tf.Variable(2.)
with tf.GradientTape(persistent=True) as tape:
    z = pow(x, 2) + x * y
# dz_dx = tape.gradient(z, x)
# dz_dy = tape.gradient(z, y)
dz_dx, dz_dy = tape.gradient(z, [x, y])   # both partials in one call
result = tape.gradient(z, [x, y])
print('z:', z)
print('dz/dx:', dz_dx)
print('dz/dy:', dz_dy)
print('result:\n', result)
print()
print('############### Derivative with respect to a vector ##############')
x = tf.Variable([[1., 2., 3.]])
with tf.GradientTape() as tape:
    y = 3 * pow(x, 2)
dy_dx = tape.gradient(y, x)
print('Vector derivative dy_dx:', dy_dx)
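Note that for a vector output, `tape.gradient` returns the sum of the per-element gradients; here that coincides with the element-wise derivative 6x only because each `y[i]` depends on the matching `x[i]` alone. To get the full matrix of partials dy_i/dx_j, `tape.jacobian` can be used instead; a sketch:

```python
import tensorflow as tf

# Sketch: tape.jacobian returns every partial derivative dy[i]/dx[j];
# off-diagonal entries are zero because each y element depends only on
# the matching x element.
x = tf.Variable([[1., 2., 3.]])
with tf.GradientTape() as tape:
    y = 3 * pow(x, 2)
jac = tape.jacobian(y, x)         # shape (1, 3, 1, 3)
print(jac[0, :, 0, :])            # diagonal holds 6*x = [6, 12, 18]
```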

 

