We have to take the sigmoid of (input*weight + bias) to get the activation, but here the bias is a zero matrix, so adding it seems insignificant.
So how and where do we update the bias? And if we don't update it, why are we adding a matrix of zeros as bias at all?
Hi, please note that the bias is created as a matrix of zeros only once, inside the init method, when self.model is constructed. That means self.model's biases are declared a single time, but over n forward and backward passes those same biases get adjusted.
While calculating the activation, we take the sigmoid of z, which is nothing but input*w + bias. After the forward and backward pass of each epoch, the bias is updated: for the first epoch b = 0, for the second it might be 0.1, for the third 0.2, and so on. Since the bias is updated after every epoch, adding it before the sigmoid activation is significant.
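To make this concrete, here is a minimal sketch (with hypothetical variable names, not the course's actual code) of a single sigmoid unit whose bias starts as zeros and is then adjusted by gradient descent on every pass:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(0)
X = rng.normal(size=(8, 3))                 # 8 samples, 3 features
y = (X.sum(axis=1) > 0).astype(float).reshape(-1, 1)

W = rng.normal(scale=0.1, size=(3, 1))      # weights: small random init
b = np.zeros((1, 1))                        # bias: declared ONCE as zeros

lr = 0.5
for epoch in range(100):
    z = X @ W + b                           # z = input*w + bias
    a = sigmoid(z)                          # activation
    grad_z = a - y                          # gradient of binary cross-entropy w.r.t. z
    W -= lr * (X.T @ grad_z) / len(X)       # backward pass updates the weights...
    b -= lr * grad_z.mean(axis=0, keepdims=True)  # ...and the bias

print(b)  # no longer the zero matrix after training
```

Although b is initialized to zeros, each backward pass subtracts the bias gradient from it, so after a few epochs it drifts to a nonzero value that shifts the sigmoid's decision boundary.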
I hope I've cleared your doubt. Please rate your experience here; your feedback is very important and helps us improve our platform so we can provide you the learning experience you deserve.
If you still have questions or do not find the answer satisfactory, you may reopen the doubt.