Dropout is meant to drop (i.e., turn off) some of the neurons. But the number of neurons in each layer stays the same as before applying dropout. Why is that?
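To make what I'm seeing concrete, here is a toy NumPy sketch of (inverted) dropout as I understand it — the names and the rate 0.5 are just my own example, not from any specific library:

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=(4, 8))   # a batch of 4 samples, 8 neurons each
p = 0.5                       # dropout rate

# Inverted dropout: zero each activation with probability p,
# and rescale the survivors by 1 / (1 - p)
mask = (rng.random(x.shape) >= p).astype(x.dtype)
dropped = x * mask / (1.0 - p)

print(dropped.shape)  # same shape as x: (4, 8)
```

So the output still has 8 neurons per sample, even though some of them are zeroed out.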
And how does a dense layer work? For instance, how does it transform 8 neurons into 10?
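Here is the kind of 8-to-10 transformation I mean, sketched in NumPy (the weight and bias values are random placeholders, just to show the shapes):

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=(8,))      # 8 input neurons
W = rng.normal(size=(10, 8))   # weight matrix: one row per output neuron
b = np.zeros(10)               # one bias per output neuron

# Each of the 10 outputs is a weighted sum of all 8 inputs, plus a bias
y = W @ x + b

print(y.shape)  # (10,)
```

Is this matrix multiplication all that a dense layer does before the activation function?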