I’m confused about what optimizers such as Adam are used for when compiling a model.
I know gradient descent is used to update model parameters. Is Adam also an optimization technique? What is the difference between gradient descent and Adam?
If I compile a model with the optimizer set to Adam, and I also pass a batch size while training, am I using gradient descent or Adam to update the weights?
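To make what I’m asking concrete, here is a minimal sketch (my own illustration, not library code) of the two update rules as I understand them, minimizing a toy function f(w) = w² whose gradient is 2w. The learning rate and Adam hyperparameters are the usual defaults, chosen just for illustration:

```python
import math

def gd_step(w, grad, lr=0.1):
    # Vanilla gradient descent: step against the raw gradient.
    return w - lr * grad

def adam_step(w, grad, m, v, t, lr=0.1, beta1=0.9, beta2=0.999, eps=1e-8):
    # Adam: keep running averages of the gradient (m) and its square (v),
    # then take a bias-corrected, per-parameter scaled step.
    m = beta1 * m + (1 - beta1) * grad
    v = beta2 * v + (1 - beta2) * grad ** 2
    m_hat = m / (1 - beta1 ** t)          # bias-corrected first moment
    v_hat = v / (1 - beta2 ** t)          # bias-corrected second moment
    w = w - lr * m_hat / (math.sqrt(v_hat) + eps)
    return w, m, v

# Run both optimizers from the same starting point.
w_gd = w_adam = 5.0
m = v = 0.0
for t in range(1, 101):
    w_gd = gd_step(w_gd, 2 * w_gd)
    w_adam, m, v = adam_step(w_adam, 2 * w_adam, m, v, t)

print(w_gd, w_adam)  # both move toward the minimum at w = 0
```

Is my understanding right that Adam is doing the second kind of update here, rather than the plain gradient-descent one?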