tf.layers.dense()


Usage of tf.layers.dense

dense: a fully connected layer

It is equivalent to adding one layer to the network, i.e. the hand-written add_layer() function from beginner tutorials (see the sketch below).
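As a quick illustration of that equivalence, here is a minimal sketch, assuming TensorFlow 1.x graph mode; add_layer is the usual hand-written tutorial helper, not a library function, and the layer sizes are just examples:

    import tensorflow as tf

    # Hand-written helper from beginner tutorials: create a weight matrix and a
    # bias, compute x*W + b, and optionally apply an activation function.
    def add_layer(inputs, in_size, out_size, activation_function=None):
        weights = tf.Variable(tf.random_normal([in_size, out_size]))
        biases = tf.Variable(tf.zeros([1, out_size]) + 0.1)
        wx_plus_b = tf.matmul(inputs, weights) + biases
        if activation_function is None:
            return wx_plus_b
        return activation_function(wx_plus_b)

    x = tf.placeholder(tf.float32, [None, 784])

    # Hand-written version: a 784 -> 100 fully connected layer with ReLU.
    h_manual = add_layer(x, 784, 100, activation_function=tf.nn.relu)

    # The same layer with tf.layers.dense; the input size is inferred from x.
    h_dense = tf.layers.dense(inputs=x, units=100, activation=tf.nn.relu)

The full signature of tf.layers.dense is: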

 

    tf.layers.dense(
        inputs,
        units,
        activation=None,
        use_bias=True,
        kernel_initializer=None,
        bias_initializer=tf.zeros_initializer(),
        kernel_regularizer=None,
        bias_regularizer=None,
        activity_regularizer=None,
        kernel_constraint=None,
        bias_constraint=None,
        trainable=True,
        name=None,
        reuse=None
    )

Where:

        inputs: the input to this layer

        units: the size (dimensionality) of the output; an integer or long

        activation: the activation function to apply (the network's nonlinearity); defaults to None, i.e. no activation is used

        use_bias: True to use a bias term (the default); set it to False to disable the bias

 

  • kernel_initializer: Initializer function for the weight matrix. If None (default), weights are initialized using the default initializer used by tf.get_variable.
  • bias_initializer: Initializer function for the bias.
  • kernel_regularizer: Regularizer function for the weight matrix.
  • bias_regularizer: Regularizer function for the bias.
  • activity_regularizer: Regularizer function for the output.
  • kernel_constraint: An optional projection function to be applied to the kernel after being updated by an Optimizer (e.g. used to implement norm constraints or value constraints for layer weights). The function must take as input the unprojected variable and must return the projected variable (which must have the same shape). Constraints are not safe to use when doing asynchronous distributed training.
  • bias_constraint: An optional projection function to be applied to the bias after being updated by an Optimizer.
  • trainable: Boolean, if True also add variables to the graph collection GraphKeys.TRAINABLE_VARIABLES (see tf.Variable).
  • name: String, the name of the layer.
  • reuse: Boolean, whether to reuse the weights of a previous layer by the same name.
(Too lazy to translate the rest for now; I'll get to it when I have time. [smile])
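For reference, here is a minimal usage sketch that exercises several of these arguments, assuming TensorFlow 1.x; the layer names and sizes are made up for illustration, and tf.contrib.layers.l2_regularizer is just one possible choice of regularizer function:

    import tensorflow as tf

    x = tf.placeholder(tf.float32, [None, 784])

    # Hidden layer with explicit initializers, an L2 penalty on the weights,
    # and an explicit name for its variable scope.
    hidden = tf.layers.dense(
        inputs=x,
        units=256,
        activation=tf.nn.relu,
        kernel_initializer=tf.truncated_normal_initializer(stddev=0.1),
        bias_initializer=tf.zeros_initializer(),
        kernel_regularizer=tf.contrib.layers.l2_regularizer(1e-4),
        name="fc1",
    )

    # Output layer: activation=None leaves the outputs linear (raw logits).
    logits = tf.layers.dense(inputs=hidden, units=10, activation=None, name="fc2")

    # reuse=True makes a second call with the same name share the weights of "fc1".
    hidden_shared = tf.layers.dense(inputs=x, units=256, activation=tf.nn.relu,
                                    name="fc1", reuse=True)

    # Losses produced by kernel_regularizer are collected here and should be
    # added to the training objective.
    reg_losses = tf.losses.get_regularization_losses()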

