Back Propagation

Why do we use backpropagation? Why do we backpropagate our loss, and what does it have to do with forward propagation?

What if we don’t use backpropagation?

What is the difference in results with and without backpropagation?

hey @amanbh_123 ,
whenever you learn something new, you rarely get everything right on the first attempt. So you work harder, revise things again and again,
and as a result you score well.

That is exactly how our model learns. Initially we give it some random weights, make predictions, and check how close the model is to the real answer. On the first attempt the model usually scores very, very badly. Now we want to teach the model so that it gets closer to the correct answer, meaning it should actually understand the task it is doing.
This is where backpropagation comes in: we work out how much each weight contributed to the current loss, and adjust the weights in the direction that reduces that loss, which improves performance.
Applying this again and again, the model gradually learns and performs well in the end.
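To make that loop concrete, here is a toy sketch (not from the original post) of the forward-pass / loss / backward-pass / update cycle for a single weight `w` in the model `y = w * x`, using a mean-squared-error loss. The learning rate, epoch count, and data are all made-up values for illustration.

```python
# Toy example: learn w in y = w * x by gradient descent.
# Backpropagation here is just computing d(loss)/dw by the chain rule.

def train(xs, ys, lr=0.01, epochs=200):
    w = 0.0  # arbitrary starting guess (a real model starts random)
    loss = 0.0
    for _ in range(epochs):
        # forward pass: predictions with the current weight
        preds = [w * x for x in xs]
        # loss: how far the predictions are from the real answers (MSE)
        loss = sum((p - y) ** 2 for p, y in zip(preds, ys)) / len(xs)
        # backward pass: gradient of the loss with respect to w
        grad = sum(2 * (p - y) * x for p, x, y in zip(preds, xs, ys)) / len(xs)
        # update: nudge w in the direction that reduces the loss
        w -= lr * grad
    return w, loss

# data generated from y = 3 * x, so w should move toward 3
w, loss = train([1, 2, 3, 4], [3, 6, 9, 12])
```

Without the backward pass and update, `w` would stay at its initial guess and the loss would never shrink; that is the practical difference between training with and without backpropagation.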

It’s the same as having to take a physics exam without ever studying: you don’t know what is going to come.

When you study, you score well,
and when you don’t study, you don’t score well,
and the same happens with the model too.

Note: this assumes there is no cheating when we haven’t studied anything for the exam.

I hope this helped you :slightly_smiling_face:.
Thank You and Happy Learning :slightly_smiling_face:.

I hope I’ve cleared your doubt. Please rate your experience here.
Your feedback is very important. It helps us improve our platform and provide you
the learning experience you deserve.

If you still have some questions or do not find the answers satisfactory, you may reopen
the doubt.