Preprocessing doubt

During preprocessing, why did sir normalise the X values while Y remains the same?

https://online.codingblocks.com/player/5219/content/1228?s=672

X is our input and Y is our output. We normalise the input data so that our model is not skewed.
Also, we want our model to predict the output on the same scale as it was given to us, which is why we do not normalise it; normalising Y doesn't really make sense.
If you want, you can normalise Y, but after making predictions you will have to map them back to the original scale. Doing so will not improve the result, so there is no point in normalising Y.
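To make the point concrete, here is a minimal sketch (with made-up numbers, using NumPy standardization, not anything from the course code) of what normalising Y would force you to do: every prediction would need an inverse transform back to the original scale before it means anything.

```python
import numpy as np

# Hypothetical training targets on their original scale (e.g. prices).
y = np.array([200_000.0, 350_000.0, 500_000.0, 275_000.0])

# Standardize Y: subtract the mean, divide by the standard deviation.
y_mean, y_std = y.mean(), y.std()
y_norm = (y - y_mean) / y_std

# Suppose a model trained on y_norm produced these normalized predictions.
y_pred_norm = np.array([0.1, -0.5, 1.2])

# Extra step forced on us: map every prediction back to the original
# scale. It buys no accuracy, which is why we skip normalising Y.
y_pred = y_pred_norm * y_std + y_mean
```

The inverse transform exactly undoes the standardization, so the model's quality is unchanged; you have simply added bookkeeping.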

If we are changing the input, shouldn't we change the output too? I didn't understand completely.

Normalisation is done to bring uniformity to our data. It is done when there is more than one feature in X and they are on different scales, so we normalise the data to bring it within a certain range of values, because having a very large range can deteriorate the learning speed of our model. Since Y has only a single feature, we don't need to normalise it.

Also, the weights will be adjusted according to the new X after normalisation, in such a way that when they are multiplied by the normalised X, they give the same predicted Y on the same scale as the Y given to us.

Normalisation doesn't change our input, it just scales it. For example, I can say an ant is 100,000 micrometres in length or say it is 10 cm in length; both represent the same data in different styles. That is what we are doing in normalisation: we make sure our inputs are on the same scale irrespective of the feature, while still representing the same features.
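A short sketch of the idea above, assuming min-max normalisation and two invented features on very different scales (area in square feet and room count; the numbers are illustrative only):

```python
import numpy as np

# Two hypothetical features on very different scales:
# column 0 = area in square feet, column 1 = number of rooms.
X = np.array([[1400.0, 3.0],
              [2800.0, 5.0],
              [ 800.0, 2.0]])

# Min-max normalisation per feature (column): maps each column to [0, 1].
X_min = X.min(axis=0)
X_max = X.max(axis=0)
X_norm = (X - X_min) / (X_max - X_min)

# Both columns now share the same [0, 1] range, so neither feature
# dominates the gradient updates, but the ordering of the samples
# within each column -- the actual information -- is unchanged.
```

Note that the relative order of values inside each column is preserved; only the units have changed, just like micrometres versus centimetres for the ant.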


This was very clear. Now I understand it completely. Thanks for explaining it in a simple and clear way.