TensorFlow Study Notes (2): tf.nn.dropout vs. tf.layers.dropout


A quick glance through tensorflow/python/layers/core.py and tensorflow/python/ops/nn_ops.py reveals that tf.layers.dropout is a wrapper for tf.nn.dropout.

You want to use the dropout() function in tensorflow.contrib.layers, not the one in tensorflow.nn.

The only differences between the two functions are:

  1. tf.nn.dropout has the parameter keep_prob: "Probability that each element is kept."
    tf.layers.dropout has the parameter rate: "The dropout rate."
    Thus, keep_prob = 1 - rate.
  2. tf.layers.dropout has a training parameter: "Whether to return the output in training mode (apply dropout) or in inference mode (return the input untouched)." tf.layers.dropout becomes a no-op when not training, which is what you want, while tf.nn.dropout applies dropout unconditionally.
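The two differences above can be illustrated without TensorFlow itself. The sketch below is a minimal pure-Python implementation of inverted dropout (the scheme both TF functions use): `rate` is the probability an element is dropped, so the equivalent tf.nn.dropout argument would be `keep_prob = 1 - rate`, and a `training` flag mirrors tf.layers.dropout's inference-mode no-op. The function name and signature are illustrative, not TensorFlow's actual API.

```python
import random

def dropout(xs, rate=0.5, training=True, seed=None):
    """Inverted-dropout sketch mirroring tf.layers.dropout semantics.

    rate is the probability each element is DROPPED, so the
    corresponding tf.nn.dropout argument is keep_prob = 1 - rate.
    With training=False the input is returned untouched, like
    tf.layers.dropout's `training` switch.
    """
    if not training or rate == 0.0:
        return list(xs)  # inference mode: no-op

    keep_prob = 1.0 - rate
    rng = random.Random(seed)
    # Surviving elements are scaled by 1/keep_prob so the expected
    # value of each element is unchanged by dropout.
    return [x / keep_prob if rng.random() < keep_prob else 0.0
            for x in xs]
```

For example, with `rate=0.5` each surviving element is doubled (scaled by 1/0.5), while `training=False` returns the input unchanged, which is exactly the behavior tf.nn.dropout lacks.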



