Adam (adaptive moment estimation)
tf.train.AdamOptimizer(
learning_rate=0.001,
beta1=0.9,
beta2=0.999,
epsilon=1e-08,
use_locking=False,
name='Adam'
)
Parameters:
learning_rate: A Tensor or a floating point value; the learning rate.
beta1: A float value or a constant float tensor; the exponential decay rate for the 1st moment estimates.
beta2: A float value or a constant float tensor; the exponential decay rate for the 2nd moment estimates.
epsilon: A small constant for numerical stability. This epsilon is "epsilon hat" in the Kingma and Ba paper (in the formula just before Section 2.1), not the epsilon in Algorithm 1 of the paper; see the update-rule sketch after this list.
use_locking: If True, use locks for update operations.
name: Optional name for the operations created when applying gradients; defaults to "Adam".
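For reference, the update rule the epsilon note points to (the reformulation just before Section 2.1 of Kingma and Ba), written here in LaTeX with learning_rate as \alpha, beta1 as \beta_1, beta2 as \beta_2, and the "epsilon hat" constant as \hat{\epsilon}:

m_t = \beta_1 m_{t-1} + (1 - \beta_1) g_t
v_t = \beta_2 v_{t-1} + (1 - \beta_2) g_t^2
\alpha_t = \alpha \sqrt{1 - \beta_2^t} / (1 - \beta_1^t)
\theta_t = \theta_{t-1} - \alpha_t \, m_t / (\sqrt{v_t} + \hat{\epsilon})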
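A minimal usage sketch under TensorFlow 1.x. The toy linear-regression loss, the placeholders x and y, and the variables w and b are hypothetical and only illustrate how the optimizer is wired into a graph:

import tensorflow as tf  # assumes TensorFlow 1.x

# Hypothetical toy model: y_pred = x * w + b
x = tf.placeholder(tf.float32, shape=[None, 1])
y = tf.placeholder(tf.float32, shape=[None, 1])
w = tf.Variable(tf.zeros([1, 1]))
b = tf.Variable(tf.zeros([1]))
y_pred = tf.matmul(x, w) + b
loss = tf.reduce_mean(tf.square(y_pred - y))

# Create the optimizer with the default hyperparameters shown above
# and build the training op that applies the Adam update.
optimizer = tf.train.AdamOptimizer(learning_rate=0.001, beta1=0.9,
                                   beta2=0.999, epsilon=1e-08)
train_op = optimizer.minimize(loss)

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    # One training step on a single toy sample.
    _, loss_val = sess.run([train_op, loss],
                           feed_dict={x: [[1.0]], y: [[2.0]]})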