TensorFlow language model tutorial: why is dropout applied twice?


I'm working through TensorFlow's language model tutorial. My question is:

In these lines, a wrapper is used to apply dropout to the RNN cells:

lstm_cell = tf.nn.rnn_cell.BasicLSTMCell(size, forget_bias=0.0)
if is_training and config.keep_prob < 1:
    lstm_cell = tf.nn.rnn_cell.DropoutWrapper(
        lstm_cell, output_keep_prob=config.keep_prob)

Why does it then have to apply dropout again to the inputs, in this line?

if is_training and config.keep_prob < 1:
    inputs = tf.nn.dropout(inputs, config.keep_prob)

Thanks!

EDIT: OK, I didn't understand the paper at the time I wrote the question. Zaremba et al. suggest applying dropout everywhere except on the hidden-to-hidden (recurrent) connections. However, since each layer's output is the next layer's input, it is enough to apply dropout to every layer's output, plus to the input of the first layer; see the sketch below.
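To make that concrete, here is a minimal sketch of how the two dropout applications fit together in a stacked LSTM, written against the TF 1.x-era tf.nn.rnn_cell API that the tutorial uses. The sizes, keep probability, and the embedding/input tensors are hypothetical placeholders, not the tutorial's actual variables.

import tensorflow as tf  # assumes the TF 1.x-era tf.nn.rnn_cell API from the tutorial

# Placeholder hyperparameters (not the tutorial's config)
keep_prob = 0.5
is_training = True
hidden_size = 200
num_layers = 2
vocab_size = 10000
batch_size, num_steps = 20, 35

def make_cell():
    cell = tf.nn.rnn_cell.BasicLSTMCell(hidden_size, forget_bias=0.0)
    if is_training and keep_prob < 1:
        # output_keep_prob drops the cell's output, i.e. exactly the
        # non-recurrent connection feeding the next layer (or the softmax).
        cell = tf.nn.rnn_cell.DropoutWrapper(cell, output_keep_prob=keep_prob)
    return cell

# Layer 1's (already dropped-out) output is layer 2's input, so the wrapper
# above covers every layer-to-layer connection once.
stacked_cell = tf.nn.rnn_cell.MultiRNNCell([make_cell() for _ in range(num_layers)])

# Hypothetical embedding lookup: these embedded inputs feed the *first* layer,
# and no wrapper has touched them yet, hence the separate tf.nn.dropout call.
embedding = tf.get_variable("embedding", [vocab_size, hidden_size])
input_data = tf.placeholder(tf.int32, [batch_size, num_steps])
inputs = tf.nn.embedding_lookup(embedding, input_data)
if is_training and keep_prob < 1:
    inputs = tf.nn.dropout(inputs, keep_prob)

The upshot is that every non-recurrent connection gets dropout exactly once: the DropoutWrapper handles the layer-to-layer (and layer-to-softmax) connections, while the explicit tf.nn.dropout handles the embedding-to-first-layer connection.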

