The tf.train.AdamOptimizer optimizer


Adam stands for adaptive moment estimation.

tf.train.AdamOptimizer(
    learning_rate=0.001,
    beta1=0.9,
    beta2=0.999,
    epsilon=1e-08,
    use_locking=False,
    name='Adam'
)
Parameters:
learning_rate: a Tensor or a floating-point value; the learning rate.
beta1: a float or a constant float tensor; the exponential decay rate for the 1st moment estimates.
beta2: a float or a constant float tensor; the exponential decay rate for the 2nd moment estimates.
epsilon: a small constant for numerical stability. This epsilon is "epsilon hat" in the Kingma and Ba paper (in the formula just before Section 2.1), not the epsilon in Algorithm 1 of the paper.
use_locking: if True, use locks for the update operations.
name: optional name for the operations created when applying gradients; defaults to "Adam".
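To see how beta1, beta2, and epsilon enter the update rule, here is a minimal NumPy sketch of one Adam step in the bias-corrected form of Algorithm 1 from the Kingma and Ba paper (note that tf.train.AdamOptimizer itself uses the equivalent "epsilon hat" formulation mentioned above; this standalone version is for illustration only, and the toy problem f(x) = x² is an assumption, not from the original text):

```python
import numpy as np

def adam_step(theta, g, m, v, t,
              learning_rate=0.001, beta1=0.9, beta2=0.999, epsilon=1e-8):
    """One Adam update (bias-corrected form, Algorithm 1 of Kingma & Ba)."""
    m = beta1 * m + (1 - beta1) * g          # 1st moment estimate (decay rate beta1)
    v = beta2 * v + (1 - beta2) * g * g      # 2nd moment estimate (decay rate beta2)
    m_hat = m / (1 - beta1 ** t)             # bias-corrected 1st moment
    v_hat = v / (1 - beta2 ** t)             # bias-corrected 2nd moment
    theta = theta - learning_rate * m_hat / (np.sqrt(v_hat) + epsilon)
    return theta, m, v

# Toy example: minimize f(x) = x**2, whose gradient is 2*x.
theta, m, v = 5.0, 0.0, 0.0
for t in range(1, 5001):
    g = 2 * theta
    theta, m, v = adam_step(theta, g, m, v, t, learning_rate=0.01)
print(theta)  # converges close to the minimum at 0
```

In TensorFlow 1.x you would not implement this by hand; you would construct the optimizer and call its minimize() method on a loss tensor, and the moment accumulators are created and updated for you.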

