Why use mean squared error over the average absolute error?

I didn’t get why we are using the mean squared value rather than the average absolute error. Can you explain a bit more about it?

hey @Mohit2000 ,
While training models, specifically for regression tasks, we need some loss function / error function that helps us improve the model’s performance, i.e. reach convergence.
Since it is a regression task, the output values can be anything and are continuous in nature.
Hence, to check our model’s performance, we use root mean squared error (RMSE) or mean/average absolute error (MAE), and based on that metric we improve the model’s performance.
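For illustration, here is a minimal NumPy sketch (with made-up targets and predictions, not from the course material) showing how the two metrics are computed:

```python
import numpy as np

# Hypothetical ground-truth targets and model predictions for a regression task
y_true = np.array([3.0, -0.5, 2.0, 7.0])
y_pred = np.array([2.5,  0.0, 2.0, 8.0])

errors = y_pred - y_true

# Mean Absolute Error: average of |error|
mae = np.mean(np.abs(errors))

# Root Mean Squared Error: square root of the average of error^2
rmse = np.sqrt(np.mean(errors ** 2))

print(f"MAE  = {mae:.3f}")   # MAE  = 0.500
print(f"RMSE = {rmse:.3f}")  # RMSE = 0.612
```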

I hope this helps.

I am asking why we use the mean squared value over the average absolute error?

There is no perfect answer to this.

RMSE has often been found to reach convergence faster and more reliably.
These metrics also depend on the task.
For example, for a price prediction task, where the output cannot be negative, RMSE is generally a better choice than MAE.
For a normal regression task whose output contains both positive and negative values, MAE can be the better way to check performance.
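To make the comparison concrete, here is a small sketch (again with made-up numbers) showing that the squared-error term grows quadratically, so a single large mistake dominates MSE/RMSE far more than it dominates MAE:

```python
import numpy as np

y_true = np.array([3.0, -0.5, 2.0, 7.0])

# Two prediction sets: the second contains one large (outlier-like) error
pred_small = np.array([2.5, 0.0, 2.0, 8.0])
pred_large = np.array([2.5, 0.0, 2.0, 17.0])

def mae(y, p):
    # average absolute error: penalizes all errors linearly
    return np.mean(np.abs(p - y))

def mse(y, p):
    # mean squared error: penalizes large errors quadratically
    return np.mean((p - y) ** 2)

for name, p in [("small errors", pred_small), ("one large error", pred_large)]:
    print(f"{name}: MAE = {mae(y_true, p):.2f}, MSE = {mse(y_true, p):.2f}")

# small errors:    MAE = 0.50, MSE = 0.38
# one large error: MAE = 2.75, MSE = 25.12
```

In other words, the quadratic penalty gives MSE/RMSE a strong pull toward correcting large mistakes and a smooth gradient as the error shrinks, whereas MAE weights every unit of error the same; this is one reason squared-error losses often behave better during gradient-based training.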

I hope I’ve cleared your doubt.