ResNet and Skip Connections

How do skip connections help with the vanishing gradient problem?

Hey @Bhawna,

Skip connections effectively simplify the network by letting it behave as if it had fewer layers in the early stages of training. This speeds up learning and reduces the impact of vanishing gradients, since gradients have fewer transformations to propagate through. As training progresses, the network gradually makes use of the skipped layers as it learns the feature space.
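
To make this concrete, here is a minimal sketch of a residual block with a skip connection. PyTorch is assumed here, and the layer shapes and names are illustrative rather than taken from any particular ResNet variant:

```python
# Minimal residual (skip-connection) block sketch, assuming PyTorch.
import torch
import torch.nn as nn

class ResidualBlock(nn.Module):
    def __init__(self, channels: int):
        super().__init__()
        # The layers the skip connection "bypasses".
        self.conv1 = nn.Conv2d(channels, channels, kernel_size=3, padding=1)
        self.conv2 = nn.Conv2d(channels, channels, kernel_size=3, padding=1)
        self.relu = nn.ReLU()

    def forward(self, x):
        residual = self.relu(self.conv1(x))
        residual = self.conv2(residual)
        # Skip connection: the input is added back to the output of the
        # convolutional path, so gradients can flow directly through this
        # addition even when the conv path contributes little.
        return self.relu(x + residual)
```

Because the output is `x + residual` rather than just `residual`, the gradient of the loss with respect to `x` always has a direct path through the addition, which is what keeps it from vanishing as depth grows.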

I hope you get the point!
Thank you.

Does that mean the network gradually learns the weights of the skipped layers?

Yes. As you can see in the image above, we add two extra layers to the model and create a skip connection around them. Initially, the weights and biases of the added layers are zero, so the block's output is the same as its input regardless of the layers added. After a certain number of epochs, the added layers have their weights tuned, and the block becomes capable of learning more than just the identity function.
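
A small sketch of this "starts as the identity" behaviour, again assuming PyTorch; zero-initializing the last layer of the residual branch is one way to realize what is described above, and the linear layers and sizes are hypothetical:

```python
# Sketch: a residual block whose added layers start at zero, so the block
# initially computes the identity and only later learns extra features.
import torch
import torch.nn as nn

class ZeroInitResidualBlock(nn.Module):
    def __init__(self, dim: int):
        super().__init__()
        self.fc1 = nn.Linear(dim, dim)
        self.fc2 = nn.Linear(dim, dim)
        self.relu = nn.ReLU()
        # Weights and biases of the added layer start at zero, so the
        # residual branch contributes nothing at the start of training.
        nn.init.zeros_(self.fc2.weight)
        nn.init.zeros_(self.fc2.bias)

    def forward(self, x):
        return x + self.fc2(self.relu(self.fc1(x)))

x = torch.randn(4, 16)
block = ZeroInitResidualBlock(16)
print(torch.allclose(block(x), x))  # True: the block is the identity at init
```

Once training updates `fc1` and `fc2`, the residual branch becomes non-zero and the block learns features beyond the identity mapping.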