Why are we doing batch normalization? Generally, when should batch normalization be used?
BatchNormalization in the Generator in DCGANs
Hey @raunaqsingh10, batch normalization is a relatively recent layer that normalizes each layer's inputs across the mini-batch. Research has shown it can speed up training by roughly 1.5x-2x, and it is used in most newly built models. It is typically inserted after a convolutional or dense layer, usually before the activation. In a DCGAN generator specifically, batch normalization follows every transposed-convolution layer except the output layer; keeping the activations well scaled helps gradients flow through the deep generator and makes the otherwise unstable GAN training much more reliable.
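If it helps, here is a minimal sketch of a DCGAN-style generator in tf.keras (the framework, layer sizes, and output resolution are my own assumptions for illustration, not taken from the course code) showing where the BatchNormalization layers usually sit:

```python
# Minimal sketch of a DCGAN-style generator (illustrative sizes, assumed tf.keras).
import tensorflow as tf
from tensorflow.keras import layers

def build_generator(latent_dim=100):
    model = tf.keras.Sequential([
        tf.keras.Input(shape=(latent_dim,)),

        # Project the noise vector and reshape it into a small feature map.
        layers.Dense(7 * 7 * 128),
        layers.BatchNormalization(),  # batch norm after the dense layer, before the activation
        layers.ReLU(),
        layers.Reshape((7, 7, 128)),

        # Upsample with a transposed convolution; batch norm follows it.
        layers.Conv2DTranspose(64, kernel_size=4, strides=2, padding="same"),
        layers.BatchNormalization(),
        layers.ReLU(),

        # Output layer: no batch norm here, tanh maps pixels to [-1, 1].
        layers.Conv2DTranspose(1, kernel_size=4, strides=2, padding="same", activation="tanh"),
    ])
    return model

generator = build_generator()
generator.summary()
```

Note how the output layer deliberately skips batch normalization, which is the usual DCGAN recommendation.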
Hope this resolves your doubt.
Please mark it as resolved in the doubts section.