tf.nn.dropout() and tf.layers.dropout()
A quick glance through tensorflow/python/layers/core.py and tensorflow/python/ops/nn_ops.py reveals that tf.layers.dropout is a wrapper for tf.nn.dropout.
The only differences between the two functions are:
- tf.nn.dropout has the parameter keep_prob: "Probability that each element is kept", whereas tf.layers.dropout has the parameter rate: "The dropout rate". Thus keep_prob = 1 - rate.
- tf.layers.dropout has a training parameter: "Whether to return the output in training mode (apply dropout) or in inference mode (return the input untouched)." tf.nn.dropout has no such switch; it always applies dropout.
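The relation keep_prob = 1 - rate can be made concrete with a minimal sketch of inverted dropout in NumPy (this is an illustration of the technique, not TensorFlow's actual implementation): elements survive with probability keep_prob and are scaled by 1/keep_prob so the expected value of the output matches the input.

```python
import numpy as np

def dropout(x, rate=0.5, seed=None):
    """Minimal inverted-dropout sketch: rate is the fraction dropped."""
    keep_prob = 1.0 - rate                      # tf.nn.dropout's keep_prob
    rng = np.random.default_rng(seed)
    mask = rng.random(x.shape) < keep_prob      # True = element is kept
    return np.where(mask, x / keep_prob, 0.0)   # scale survivors by 1/keep_prob

x = np.ones((1000, 100))
y = dropout(x, rate=0.1, seed=0)
# roughly 10% of elements are zeroed; kept elements equal 1/0.9
```

Because of the 1/keep_prob scaling, the mean of the output stays close to the mean of the input, which is why no rescaling is needed at inference time.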
tf.layers.dropout(
    inputs,
    rate=0.5,
    noise_shape=None,
    seed=None,
    training=False,  # True: training, False: test/inference
    name=None
)
tf.nn.dropout(
    x,
    keep_prob=None,
    noise_shape=None,
    seed=None,
    name=None,
    rate=None
)
# With tf.layers.dropout: rate is the fraction dropped
hidden = tf.layers.dense(inputs=input, units=1024, activation=tf.nn.relu)
dropout = tf.layers.dropout(inputs=hidden, rate=0.1, training=True)

# Equivalent with tf.nn.dropout: keep_prob = 1 - rate
hidden = tf.layers.dense(inputs=input, units=1024, activation=tf.nn.relu)
dropout = tf.nn.dropout(hidden, keep_prob=1 - 0.1)
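The training flag is the other difference worth seeing in action. A hypothetical NumPy sketch (names and structure are my own, not TensorFlow's) of how such a switch behaves: with training=False the input passes through untouched, with training=True inverted dropout is applied.

```python
import numpy as np

def layers_dropout(x, rate=0.5, training=False, seed=None):
    """Sketch of a tf.layers.dropout-style training switch (illustrative only)."""
    if not training:
        return x                                # inference: input untouched
    keep_prob = 1.0 - rate
    rng = np.random.default_rng(seed)
    mask = rng.random(x.shape) < keep_prob
    return np.where(mask, x / keep_prob, 0.0)   # training: apply inverted dropout
```

This is why forgetting to set training=True in a TF1-style graph silently disables dropout: the default is False, the inference path.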