How to judge when to add dropout, batch normalization, etc.?

While building AlexNet, how can we decide when to add a dropout layer or a batch normalization layer?

Hey @Kush-Ghilothia-201796510454222, there is no rule of thumb that says a particular model architecture is the right one. To build intuition, read as many published architectures as you can. Also, you can't settle on an architecture for a problem in a single go; you need to try several architectures and compare their results before finalizing one.

Generally, a batch normalization layer is added right after a convolutional or dense layer, and dropout is added after batch normalization.
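As a rough sketch of that ordering, here is a hypothetical AlexNet-style block in PyTorch (the layer sizes and dropout rates are illustrative, not the exact AlexNet configuration):

```python
import torch
import torch.nn as nn

# Illustrative ordering: Conv/Dense -> BatchNorm -> activation -> Dropout.
# Sizes below are made up for a tiny 8x8 RGB input, just to show placement.
block = nn.Sequential(
    nn.Conv2d(3, 64, kernel_size=3, padding=1),  # convolutional layer
    nn.BatchNorm2d(64),                          # batch norm right after the conv
    nn.ReLU(),
    nn.Dropout(p=0.25),                          # dropout after batch norm + activation
    nn.Flatten(),
    nn.Linear(64 * 8 * 8, 128),                  # dense layer
    nn.BatchNorm1d(128),                         # batch norm after the dense layer
    nn.ReLU(),
    nn.Dropout(p=0.5),
    nn.Linear(128, 10),                          # no dropout/BN on the output layer
)

x = torch.randn(4, 3, 8, 8)  # dummy batch of 4 RGB 8x8 images
out = block(x)
print(out.shape)  # torch.Size([4, 10])
```

Note that dropout is usually kept off the final output layer, and in practice people also experiment with placing dropout relative to the activation, so treat this as one common convention rather than a fixed rule.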

Hope this resolves your doubt.
Please mark the doubt as resolved in the My Doubts section. :blush:
