Machine Learning - Multivariate Linear Regression

Multiple Features


With n features x1, …, xn, the hypothesis function becomes

hθ(x) = θ0 + θ1x1 + θ2x2 + … + θnxn

Defining x0 = 1 for convenience, this can be written in matrix form as

hθ(x) = θ'x

where θ and x are (n+1)-dimensional vectors and ' denotes the transpose.

1. Gradient Descent


The gradient descent update rule generalizes directly to n features: repeat until convergence

θj := θj − (α/m) · Σ_{i=1..m} (hθ(x^(i)) − y^(i)) · xj^(i)

updating every θj (for j = 0, …, n) simultaneously.
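A minimal vectorized NumPy sketch of this loop, assuming X is the m × (n+1) design matrix with a leading column of ones; the function name and the alpha/num_iters defaults are illustrative:

```python
import numpy as np

def gradient_descent(X, y, alpha=0.01, num_iters=1000):
    """Batch gradient descent for linear regression, vectorized over all m examples."""
    m = len(y)
    theta = np.zeros(X.shape[1])
    J_history = []
    for _ in range(num_iters):
        error = X @ theta - y                        # hθ(x^(i)) − y^(i) for every example
        theta = theta - (alpha / m) * (X.T @ error)  # simultaneous update of every θj
        J_history.append(float(error @ error) / (2 * m))  # cost J(θ) before this update
    return theta, J_history
```

Returning J_history makes the convergence checks described below easy to run.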


Feature Scaling

We can speed up gradient descent by having each of our input values in roughly the same range. This is because θ will descend quickly on small ranges and slowly on large ranges, and so will oscillate inefficiently down to the optimum when the variables are very uneven.

xi := xi / si, where si is the range of feature i (max − min).

Mean normalization

xi := (xi − μi) / si, where μi is the average value of feature i over the training set and si is its range (max − min); the standard deviation is also commonly used for si.
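A short NumPy sketch of mean normalization under the range convention above; returning μ and s so that new inputs can be scaled identically is an implementation choice, not part of the formula:

```python
import numpy as np

def mean_normalize(X):
    """Mean normalization: xi := (xi - mu_i) / s_i, per feature column."""
    mu = X.mean(axis=0)
    s = X.max(axis=0) - X.min(axis=0)  # range of each feature; assumes no constant column
    return (X - mu) / s, mu, s
```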

Learning Rate

Debugging gradient descent. Make a plot with the number of iterations on the x-axis and the cost function J(θ) on the y-axis. If J(θ) ever increases as the iterations proceed, you probably need to decrease α (see the sketch after the summary below).

Automatic convergence test. Declare convergence if J(θ) decreases by less than some small threshold E, such as 10^−3, in one iteration. In practice, however, it is difficult to choose this threshold value.

To summarize:

If α is too small: slow convergence.

If α is too large: J(θ) may not decrease on every iteration and thus may not converge.
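One way to run the diagnostic plot described above, reusing the gradient_descent sketch from earlier (X, y and the candidate values of α are assumed):

```python
import matplotlib.pyplot as plt

# Plot J(θ) against the iteration number for several learning rates.
for alpha in (0.001, 0.01, 0.1):
    _, J_history = gradient_descent(X, y, alpha=alpha, num_iters=100)
    plt.plot(J_history, label="alpha = %g" % alpha)
plt.xlabel("iteration")
plt.ylabel("J(theta)")
plt.legend()
plt.show()
```

A curve that rises or zig-zags signals that its α is too large; a nearly flat, slowly falling curve signals that its α is too small.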

Features and Polynomial Regression

Sometimes a straight line does not fit the data well, and a polynomial hypothesis, such as a quadratic, cubic, or square-root function, can do better.

For example, from a single feature x1 we can create the new features x2 = x1^2 and x3 = x1^3, giving the polynomial regression hypothesis

hθ(x) = θ0 + θ1x1 + θ2x1^2 + θ3x1^3

One important thing to keep in mind: if you choose your features this way, then feature scaling becomes very important, because the new features have very different ranges (if x1 ranges over 1–1000, then x1^2 ranges over 1–10^6 and x1^3 over 1–10^9).
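A small sketch of building such features before scaling them; the helper name and default degree are illustrative:

```python
import numpy as np

def polynomial_features(x, degree=3):
    """Build the columns [x, x^2, ..., x^degree] from a single feature vector x.

    The columns have very different ranges, so mean-normalize them
    (see Feature Scaling above) before running gradient descent.
    """
    x = np.asarray(x, dtype=float)
    return np.column_stack([x ** d for d in range(1, degree + 1)])
```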

2. Normal Equation


The normal equation is another way to minimize J: instead of iterating, we explicitly take the derivatives of J with respect to the θj's, set them to zero, and solve. This gives the optimal θ in one step, without iteration.

θ = (X'X)^(−1) X'y

where X is the m × (n+1) design matrix (one training example per row, with x0 = 1) and y is the m-vector of targets.
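A one-line NumPy sketch of this formula; using the pseudoinverse (np.linalg.pinv) rather than a plain inverse is a choice that also covers the noninvertible case discussed below:

```python
import numpy as np

def normal_equation(X, y):
    """Closed-form solution θ = (X'X)^(−1) X'y via the pseudoinverse."""
    return np.linalg.pinv(X.T @ X) @ X.T @ y
```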

Compared to gradient descent, the normal equation:

(1) needs no choice of α;

(2) needs no iteration;

(3) costs O(n^3), since it computes the inverse of X'X;

(4) is slow if n is large.

If X'X is noninvertible, there are two common causes:

- Redundant features, where two features are very closely related (i.e. they are linearly dependent).

- Too many features (e.g. m ≤ n). In this case, delete some features or use "regularization" (to be explained in a later lesson).

