Usage of tf.layers.dense
dense: fully connected layer
Equivalent to adding one layer to the network, i.e., the add_layer() function from beginner tutorials
tf.layers.dense(
    inputs,
    units,
    activation=None,
    use_bias=True,
    kernel_initializer=None,
    bias_initializer=tf.zeros_initializer(),
    kernel_regularizer=None,
    bias_regularizer=None,
    activity_regularizer=None,
    kernel_constraint=None,
    bias_constraint=None,
    trainable=True,
    name=None,
    reuse=None
)
Where:
inputs: the input to this layer
units: size (dimensionality) of the output; an integer or long
activation: the activation function to use (the network's nonlinearity); defaults to None, meaning no activation is applied
use_bias: whether to use a bias; True by default, set it to False to disable the bias
kernel_initializer: Initializer function for the weight matrix. If None (default), weights are initialized using the default initializer used by tf.get_variable.
bias_initializer: Initializer function for the bias.
kernel_regularizer: Regularizer function for the weight matrix.
bias_regularizer: Regularizer function for the bias.
activity_regularizer: Regularizer function for the output.
kernel_constraint: An optional projection function to be applied to the kernel after being updated by an Optimizer (e.g. used to implement norm constraints or value constraints for layer weights). The function must take as input the unprojected variable and must return the projected variable (which must have the same shape). Constraints are not safe to use when doing asynchronous distributed training.
bias_constraint: An optional projection function to be applied to the bias after being updated by an Optimizer.
trainable: Boolean, if True also add variables to the graph collection GraphKeys.TRAINABLE_VARIABLES (see tf.Variable).
name: String, the name of the layer.
reuse: Boolean, whether to reuse the weights of a previous layer by the same name (a weight-sharing sketch follows the basic example below).
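
To make the arguments concrete, here is a minimal usage sketch in the TensorFlow 1.x style. The placeholder shape, layer sizes, and layer names are assumptions chosen for illustration, not part of the API description above.

import tensorflow as tf

# Minimal sketch (TF 1.x): a two-layer fully connected network.
# The 784-dimensional input shape is an illustrative assumption.
x = tf.placeholder(tf.float32, shape=[None, 784], name='x')

# Hidden layer: 128 units with ReLU activation, roughly what a hand-written
# add_layer(x, 784, 128, tf.nn.relu) would build.
hidden = tf.layers.dense(inputs=x, units=128, activation=tf.nn.relu, name='hidden')

# Output layer: 10 units, activation=None, so the raw logits are returned.
logits = tf.layers.dense(inputs=hidden, units=10, activation=None, name='logits')

Each call creates a kernel (weight matrix) and, because use_bias=True by default, a bias variable; both are added to GraphKeys.TRAINABLE_VARIABLES since trainable also defaults to True.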
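
And a short sketch of the reuse behaviour described in the last list item; again, the names and shapes are illustrative assumptions.

import tensorflow as tf

# Sketch of weight sharing via reuse (TF 1.x).
x1 = tf.placeholder(tf.float32, shape=[None, 64])
x2 = tf.placeholder(tf.float32, shape=[None, 64])

# The first call creates the variables of the layer named 'shared'.
out1 = tf.layers.dense(x1, units=32, activation=tf.nn.relu, name='shared')

# The second call with reuse=True applies the same kernel and bias to a
# different input instead of creating a new set of variables.
out2 = tf.layers.dense(x2, units=32, activation=tf.nn.relu, name='shared', reuse=True)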