17/10/30 MWhite's learning notes
Recap
Gradient descent algorithm
Goal: find w and b that minimize the cost J(w,b).
Repeat until convergence (update both parameters simultaneously):
  w := w - a * dJ(w,b)/dw
  b := b - a * dJ(w,b)/db
(a = learning rate; controls the step size)
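Quick code sketch for myself (not from the lecture): gradient descent on a toy linear model with a squared-error cost J(w,b) = (1/2m) * sum((w*x_i + b - y_i)^2). The data, learning rate, and iteration count below are just illustrative assumptions.

# Minimal gradient descent sketch, assuming a squared-error cost
# for a toy linear model y = w*x + b. All specifics (data, alpha,
# iteration count) are my own assumptions for practice.
def gradient_descent(x, y, alpha=0.05, iterations=2000):
    m = len(x)
    w, b = 0.0, 0.0
    for _ in range(iterations):
        # Predictions under the current parameters
        preds = [w * xi + b for xi in x]
        # Gradients of J(w,b) with respect to w and b
        dJ_dw = sum((p - yi) * xi for p, yi, xi in zip(preds, y, x)) / m
        dJ_db = sum(p - yi for p, yi in zip(preds, y)) / m
        # Simultaneous update: w := w - a*dJ/dw, b := b - a*dJ/db
        w -= alpha * dJ_dw
        b -= alpha * dJ_db
    return w, b

# Toy usage: data from y = 2x + 1, should recover w ~ 2, b ~ 1
w, b = gradient_descent([0, 1, 2, 3, 4], [1, 3, 5, 7, 9])
print(w, b)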
Derivatives & calculus
Skip