Regarding val_loss while training a Deep Learning Model

I want to ask: I have a dataset of 45k images with a label associated with each image, so it is a regression problem. When I train a CNN model on it, the val_loss first decreases and then fluctuates (sometimes increasing, sometimes decreasing); sometimes it gets very low and then suddenly jumps. Is this normal behaviour until we reach the minimum val_loss, or should we stop at the point where val_loss increases after an epoch? I am confused about training deep learning models and can't figure out whether I am going in the right direction. Please help me with a brief answer.

The pattern of val_loss I am getting after each epoch while training my model is [0.36, 0.29, 0.28, 0.23, 0.25, 0.30, 0.24, 0.21, 0.15].

That indicates the model has started to overfit.

No, not at all. Some fluctuation in val_loss from epoch to epoch is normal; what matters is the overall trend, not a single bad epoch.
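One common way to handle a noisy val_loss like this in Keras is to let a callback watch it with some patience instead of stopping by hand after the first bad epoch. Here is a minimal sketch; the model, the data arrays, and the numbers are placeholders, not something from your post:

```python
# Minimal sketch using tf.keras callbacks; "model", the arrays, and the
# hyper-parameter values below are placeholders you would replace.
import tensorflow as tf

callbacks = [
    # Stop only after val_loss has failed to improve for several epochs,
    # so one noisy epoch does not end training prematurely.
    tf.keras.callbacks.EarlyStopping(
        monitor="val_loss",
        patience=5,
        restore_best_weights=True,  # roll back to the best epoch when training stops
    ),
    # Also keep the best weights on disk as a safety net.
    tf.keras.callbacks.ModelCheckpoint(
        "best_model.keras",
        monitor="val_loss",
        save_best_only=True,
    ),
]

# model, x_train/y_train and x_val/y_val stand in for your own CNN and data.
history = model.fit(
    x_train, y_train,
    validation_data=(x_val, y_val),
    epochs=100,
    callbacks=callbacks,
)
```

With this setup you can let training run well past the first increase in val_loss and still end up with the weights from the best epoch.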

To train your deep learning model properly, a few conditions should hold.
loss and val_loss should both trend downward, and accuracy and val_accuracy should trend upward (for a regression problem like yours there is no accuracy, so track a metric such as MAE instead).
There should be some margin between the training and validation values, but they should not be exactly equal, as that suggests the model is underfitting. Plotting both curves makes this easy to check, as in the sketch below.
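A rough sketch of how to eyeball that margin, assuming model.fit(...) returned a Keras History object named history:

```python
# Compare training loss and validation loss per epoch to see the gap.
import matplotlib.pyplot as plt

plt.plot(history.history["loss"], label="loss")
plt.plot(history.history["val_loss"], label="val_loss")
plt.xlabel("epoch")
plt.ylabel("loss")
plt.legend()
plt.show()
```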

And to train correctly, you first need to understand the problem, then design your model: the right activation for the final output layer, the number of layers, and so on. Choose a suitable optimizer and loss function so the model can learn, and, most importantly, a sensible learning rate.
And to check performance, use a correct metric to examine how it is learning. A rough sketch of such a setup is given below.
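For example, a minimal sketch of this kind of setup for an image-regression CNN might look like the following; the architecture, input size, learning rate, and metric are placeholder choices to adapt to your data, not details from your post:

```python
# Minimal sketch of a CNN for regression in tf.keras; all sizes are placeholders.
import tensorflow as tf
from tensorflow.keras import layers

model = tf.keras.Sequential([
    layers.Input(shape=(128, 128, 3)),        # assumed 128x128 RGB images
    layers.Conv2D(32, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Conv2D(64, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Flatten(),
    layers.Dense(64, activation="relu"),
    layers.Dense(1),                          # linear output: one continuous value per image
])

model.compile(
    optimizer=tf.keras.optimizers.Adam(learning_rate=1e-3),  # learning rate matters a lot
    loss="mse",        # a standard regression loss
    metrics=["mae"],   # an interpretable metric to track learning
)
```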

If you set all of these up correctly and train without overfitting, then bingo!
You have your model trained properly.

I hope this helps.
Thank you 🙂
