Linear regression with one variable

Housing Prices

Hypothesis:

h_{\theta}(x)=\theta_0+\theta_1x
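The hypothesis is just a straight line in the input x. As a minimal sketch (the parameter values 50 and 0.1 are made up purely for illustration, not from the course):

```python
def hypothesis(theta0, theta1, x):
    """Predicted value h_theta(x) = theta0 + theta1 * x."""
    return theta0 + theta1 * x

# e.g. predicted price for a house of size x = 1000 (units hypothetical)
print(hypothesis(50, 0.1, 1000))  # → 150.0
```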

Parameters:

\theta_0, \theta_1

Cost function:

Squared error
J(\theta_0,\theta_1)=\frac{1}{2m}\sum_{i=1}^{m}(h_\theta(x^{(i)})-y^{(i)})^2
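The cost function averages the squared prediction errors over the m training examples (with the conventional extra factor of 1/2). A direct translation into Python, with toy data chosen so that the perfect fit y = x gives zero cost:

```python
def compute_cost(theta0, theta1, xs, ys):
    """Squared-error cost J(theta0, theta1) = 1/(2m) * sum((h(x^(i)) - y^(i))^2)."""
    m = len(xs)
    return sum((theta0 + theta1 * x - y) ** 2 for x, y in zip(xs, ys)) / (2 * m)

xs, ys = [1, 2, 3], [1, 2, 3]       # toy data where y = x exactly
print(compute_cost(0, 1, xs, ys))   # → 0.0 (perfect fit)
print(compute_cost(0, 0.5, xs, ys)) # nonzero: the line underestimates every y
```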

Goal

\min_{\theta_0,\theta_1} J(\theta_0,\theta_1)

Gradient descent

  • start with some \theta_0,\theta_1
  • keep changing \theta_0,\theta_1 to reduce J(\theta_0,\theta_1) until we hopefully end up at a minimum

Gradient descent algorithm

repeat until convergence {
\theta_j := \theta_j - \alpha\frac{\partial}{\partial\theta_j}J(\theta_0,\theta_1) \quad (\text{for } j=0 \text{ and } j=1)
}
Note: the correct way is a simultaneous update:

  • temp0 := \theta_0-\alpha\frac{\partial}{\partial\theta_0}J(\theta_0,\theta_1)
  • temp1 := \theta_1-\alpha\frac{\partial}{\partial\theta_1}J(\theta_0,\theta_1)
  • \theta_0:=temp0
  • \theta_1:=temp1
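The four steps above can be sketched in code. The key point of the simultaneous update is that both gradients are computed from the *old* (theta0, theta1) before either parameter is overwritten. The learning rate and toy data below are illustrative choices, not from the course:

```python
def gradient_descent_step(theta0, theta1, xs, ys, alpha):
    """One simultaneous gradient-descent update for the squared-error cost."""
    m = len(xs)
    errors = [theta0 + theta1 * x - y for x, y in zip(xs, ys)]
    # Partial derivatives of J with respect to theta0 and theta1
    grad0 = sum(errors) / m
    grad1 = sum(e * x for e, x in zip(errors, xs)) / m
    # Compute both temps from the old parameters, then assign: simultaneous update
    temp0 = theta0 - alpha * grad0
    temp1 = theta1 - alpha * grad1
    return temp0, temp1

theta0, theta1 = 0.0, 0.0
xs, ys = [1, 2, 3], [1, 2, 3]   # toy data with exact fit y = x
for _ in range(1000):
    theta0, theta1 = gradient_descent_step(theta0, theta1, xs, ys, alpha=0.1)
print(theta0, theta1)  # converges toward theta0 ≈ 0, theta1 ≈ 1
```

Updating theta0 in place and then using the new value when computing grad1 would be the "incorrect" non-simultaneous variant the note warns against.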