In a CNN we keep the number of filters non-decreasing to grow the receptive field, but what about here in an MLP? Is there any logic or practical reason behind the choice of 16 units, as sir did, or is it random?
Is there any rule of thumb for MLPs like the one for CNNs?
Hey @nikhil_sarda, no, there is no rule like the receptive-field logic in CNNs. The number of neurons in a layer is a hyperparameter, and hyperparameters have to be decided by experimenting. Sir took 16 units simply so that the model trains quickly and without any delay. You can experiment with different numbers of neurons in a single layer and compare how each choice performs on validation data.
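For example, here is a minimal sketch of how you might run such an experiment. It assumes a Keras-style one-hidden-layer MLP on MNIST; the dataset, layer count, and training settings are my assumptions for illustration, not the exact model from the lecture:

```python
import tensorflow as tf

def build_mlp(hidden_units):
    """One-hidden-layer MLP; `hidden_units` is the hyperparameter
    being experimented with (16 in the lecture)."""
    return tf.keras.Sequential([
        tf.keras.layers.Flatten(input_shape=(28, 28)),
        tf.keras.layers.Dense(hidden_units, activation="relu"),
        tf.keras.layers.Dense(10, activation="softmax"),
    ])

(x_train, y_train), (x_val, y_val) = tf.keras.datasets.mnist.load_data()
x_train, x_val = x_train / 255.0, x_val / 255.0

# Try a few widths and compare validation accuracy. There is no formula
# like the CNN receptive-field rule; you pick whatever works best.
for units in [8, 16, 32, 64]:
    model = build_mlp(units)
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    hist = model.fit(x_train, y_train, epochs=3,
                     validation_data=(x_val, y_val), verbose=0)
    print(f"{units:>3} units -> val_acc = "
          f"{hist.history['val_accuracy'][-1]:.4f}")
```

Typically you will see diminishing returns beyond a certain width, which is why a small value like 16 is a reasonable starting point for a quick demo.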
I hope this helps
Happy Learning