I'm trying to gain a solid understanding of the mechanics of how neural nets execute. To that end, I'm working through some TensorFlow examples and attempting to write my own execution implementation.
While playing around with the MNIST example, I noticed that the Dropout(0.2)
(here) layer
is actually part of the structure of the trained net in the end. You can examine its weights using model.get_weights()[2]
. The documentation suggests that Dropout
is a no-op at execution time. If that's true, are those weights ignored? Or are the trained weights used just like in a regular Dense
layer at execution time?
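For concreteness, here's a minimal sketch of the kind of model I'm looking at (my approximation of the beginner tutorial; the exact layer sizes are assumptions on my part):

```python
# Sketch of the tutorial-style MNIST model in question
# (layer sizes are my assumptions, not the exact tutorial values).
from tensorflow import keras
from tensorflow.keras import layers

model = keras.Sequential([
    keras.Input(shape=(28, 28)),           # MNIST image shape
    layers.Flatten(),
    layers.Dense(128, activation="relu"),
    layers.Dropout(0.2),                   # the layer I'm asking about
    layers.Dense(10),
])

# Inspect the parameter arrays the model carries around.
for i, w in enumerate(model.get_weights()):
    print(i, w.shape)
```

This is what I'm poking at when I call model.get_weights() and index into the result.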