4. Multiple features

Notation:

  • n = number of features
  • x^{(i)} = input (features) of i^{th} training example.
  • x^{(i)}_j = value of feature j in i^{th} training example.

h_\Theta (x)=\Theta_0+\Theta_1x_1+\Theta_2x_2+...+\Theta_nx_n
For convenience of notation, define x_0=1
x= \left[ \begin{matrix} x_0 \\ x_1 \\ x_2 \\ \vdots \\ x_n \\ \end{matrix} \right] \qquad \Theta = \left[ \begin{matrix} \Theta_0 \\ \Theta_1 \\ \Theta_2 \\ \vdots \\ \Theta_n \\ \end{matrix} \right]
h_\Theta(x)=\Theta^Tx

(The hypothesis h_\Theta(x)=\Theta^Tx is the inner product of the vectors \Theta and x.)
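As a vectorized sketch (NumPy here, with hypothetical numbers purely for illustration), the hypothesis is just this inner product:

```python
import numpy as np

# Hypothetical example values: n = 2 features, plus the convention x_0 = 1.
theta = np.array([1.0, 2.0, 3.0])   # [Theta_0, Theta_1, Theta_2]
x = np.array([1.0, 4.0, 5.0])       # [x_0, x_1, x_2], with x_0 = 1

# h_Theta(x) = Theta^T x, the inner product of the two vectors
h = theta @ x                       # 1*1 + 2*4 + 3*5 = 24.0
```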
Multivariate linear regression

Gradient descent for multiple variables

Hypothesis: h_\Theta (x)=\Theta_0+\Theta_1x_1+\Theta_2x_2+...+\Theta_nx_n
Parameters: \Theta (an (n+1)-dimensional vector)
Cost function: J(\Theta_0,\Theta_1,...,\Theta_n)=\frac{1}{2m}\sum^m_{i=1}(h_\Theta(x^{(i)})-y^{(i)})^2
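The cost function can be sketched in a few vectorized lines (NumPy; the data below are toy values chosen for illustration):

```python
import numpy as np

def cost(theta, X, y):
    """J(Theta) = (1 / 2m) * sum_i (h_Theta(x^(i)) - y^(i))^2.
    X is the m x (n+1) design matrix whose first column is all ones (x_0 = 1)."""
    m = len(y)
    residuals = X @ theta - y        # h_Theta(x^(i)) - y^(i) for every i
    return residuals @ residuals / (2 * m)

# Toy data (hypothetical): y = 2 * x_1 exactly, so Theta = [0, 2] gives J = 0.
X = np.array([[1.0, 1.0], [1.0, 2.0], [1.0, 3.0]])
y = np.array([2.0, 4.0, 6.0])
```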

Gradient descent:
Repeat {
\Theta_j:=\Theta_j-\alpha\frac{\partial}{\partial\Theta_j}J(\Theta_0,...,\Theta_n)
} (simultaneously update \Theta_j for every j=0,...,n)
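Since \frac{\partial}{\partial\Theta_j}J = \frac{1}{m}\sum^m_{i=1}(h_\Theta(x^{(i)})-y^{(i)})x^{(i)}_j, the whole update can be written as one vectorized step. A minimal sketch (NumPy, toy data, hyperparameters chosen for illustration):

```python
import numpy as np

def gradient_descent(X, y, alpha=0.1, iters=5000):
    """Repeatedly apply Theta_j := Theta_j - alpha * dJ/dTheta_j.
    The partial derivative is (1/m) * sum_i (h(x^(i)) - y^(i)) * x_j^(i),
    so one vectorized step updates every Theta_j simultaneously."""
    m, n = X.shape
    theta = np.zeros(n)
    for _ in range(iters):
        theta = theta - (alpha / m) * (X.T @ (X @ theta - y))
    return theta

# On toy data where y = 2 * x_1 exactly, Theta converges toward [0, 2].
X = np.array([[1.0, 1.0], [1.0, 2.0], [1.0, 3.0]])
y = np.array([2.0, 4.0, 6.0])
theta = gradient_descent(X, y)
```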

Feature Scaling

Idea: Make sure features are on a similar scale.

As a rule of thumb, a feature range within roughly -3\le x\le +3 is considered acceptable.
Feature Scaling: Get every feature into approximately a -1\le x_i\le 1 range.
Mean normalization: Replace x_i with x_i-\mu_i to make features have approximately zero mean. (Do not apply to x_0=1.)
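The notes only mention subtracting \mu_i; dividing by the feature's range s_i=\max-\min (a common variant from the same lecture) additionally scales each feature into roughly [-0.5, 0.5]. A sketch, assuming that variant:

```python
import numpy as np

def mean_normalize(X):
    """Replace each feature x_i with (x_i - mu_i) / s_i, where s_i is the
    range max - min; the x_0 = 1 column (column 0) is left untouched."""
    X = X.astype(float).copy()
    features = X[:, 1:]
    mu = features.mean(axis=0)
    s = features.max(axis=0) - features.min(axis=0)
    X[:, 1:] = (features - mu) / s
    return X

# Hypothetical house sizes in the second column:
X = np.array([[1.0, 1000.0], [1.0, 2000.0], [1.0, 3000.0]])
Xn = mean_normalize(X)   # second column becomes [-0.5, 0.0, 0.5]
```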

Learning rate

  1. Make sure gradient descent is working correctly.
    J(\theta) should decrease on every iteration. If \alpha is too large, J(\theta) may fail to decrease on every iteration; but if \alpha is too small, gradient descent can be slow to converge.
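This debugging rule can be checked numerically by recording J(\theta) at each iteration (a NumPy sketch on hypothetical toy data):

```python
import numpy as np

def gradient_descent_history(X, y, alpha, iters):
    """Record J(Theta) at every iteration so convergence can be checked:
    with a suitable alpha, J should decrease on each step."""
    m, n = X.shape
    theta = np.zeros(n)
    history = []
    for _ in range(iters):
        residuals = X @ theta - y
        history.append(residuals @ residuals / (2 * m))
        theta -= (alpha / m) * (X.T @ residuals)
    return theta, history

X = np.array([[1.0, 1.0], [1.0, 2.0], [1.0, 3.0]])
y = np.array([2.0, 4.0, 6.0])
theta, history = gradient_descent_history(X, y, alpha=0.1, iters=200)
decreasing = all(b <= a for a, b in zip(history, history[1:]))
```

With a badly chosen (too large) alpha, the same check would show `history` increasing, signalling divergence.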

Features and Polynomial Regression
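The idea here is that the same linear machinery fits nonlinear models if we define new features. A hypothetical sketch: a cubic hypothesis in house size, h = \Theta_0+\Theta_1\,size+\Theta_2\,size^2+\Theta_3\,size^3, is linear in \Theta once we set x_1=size, x_2=size^2, x_3=size^3:

```python
import numpy as np

# Treat x_1 = size, x_2 = size^2, x_3 = size^3 as three features, so the
# cubic hypothesis is still linear in Theta. (Values are hypothetical.)
size = np.array([1.0, 2.0, 3.0, 4.0])
X = np.column_stack([np.ones_like(size), size, size**2, size**3])
```

Feature scaling becomes especially important with such features, since size^3 spans a far larger range than size.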

Normal Equation

Normal equation: Method to solve for \theta analytically.

\Theta =(X^TX)^{-1}X^Ty
Octave: pinv(x'*x)*x'*y
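The NumPy equivalent of that Octave one-liner (toy data below is hypothetical) is:

```python
import numpy as np

def normal_equation(X, y):
    """Solve Theta = (X^T X)^(-1) X^T y via the pseudoinverse,
    mirroring the Octave one-liner pinv(x'*x)*x'*y."""
    return np.linalg.pinv(X.T @ X) @ X.T @ y

# Toy data where y = 2 * x_1 exactly: the exact solution is Theta = [0, 2].
X = np.array([[1.0, 1.0], [1.0, 2.0], [1.0, 3.0]])
y = np.array([2.0, 4.0, 6.0])
theta = normal_equation(X, y)
```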

m training examples, n features

Gradient Descent

  • Need to choose \alpha
  • Needs many iterations
  • Works well even when n is large

Normal Equation

  • No need to choose \alpha
  • Don't need to iterate
  • Need to compute (X^TX)^{-1}
  • Slow if n is very large (works well while n is less than roughly 10,000)

Normal Equation Noninvertibility (Optional)

What if X^TX is non-invertible?

  • Redundant features (linearly dependent)
  • Too many features (e.g. m\le n)
    This situation rarely occurs in practice.
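A small sketch of the redundant-feature case (hypothetical data): when one column is a multiple of another, X^TX is singular, yet the pseudoinverse still yields a usable solution, which is why the course recommends pinv over inv.

```python
import numpy as np

# x_2 = 2 * x_1 is a redundant (linearly dependent) feature, so X^T X is
# singular; np.linalg.inv(X.T @ X) would fail, but pinv still returns
# the minimum-norm least-squares Theta.
X = np.array([[1.0, 1.0, 2.0],
              [1.0, 2.0, 4.0],
              [1.0, 3.0, 6.0]])
y = np.array([2.0, 4.0, 6.0])

rank = np.linalg.matrix_rank(X.T @ X)      # 2, not 3: non-invertible
theta = np.linalg.pinv(X.T @ X) @ X.T @ y  # still fits the data
```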