Stanford OpenClassroom: Machine Learning Video Course


Machine Learning

by Andrew Ng


Course Description

In this course, you'll learn about some of the most widely used and successful machine learning techniques. You'll have the opportunity to implement these algorithms yourself, and gain practice with them. You will also learn some of the practical, hands-on tricks and techniques (rarely discussed in textbooks) that help get learning algorithms to work well. This is an "applied" machine learning class, and we emphasize the intuitions and know-how needed to get learning algorithms to work in practice, rather than the mathematical derivations.
Familiarity with programming, basic linear algebra (matrices, vectors, matrix-vector multiplication), and basic probability (random variables, basic properties of probability) is assumed. Basic calculus (derivatives and partial derivatives) would be helpful and would give you additional intuitions about the algorithms, but isn't required to fully complete this course.

I. INTRODUCTION


Welcome

What is Machine Learning?

Supervised Learning Introduction

Unsupervised Learning Introduction

Installing Octave

II. LINEAR REGRESSION I


Supervised Learning Introduction

Model Representation

Cost Function

Gradient Descent

Gradient Descent for Linear Regression

Vectorized Implementation

Exercise 2
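
The programming exercises in this course are done in Octave, but the idea behind the gradient-descent and vectorized-implementation lectures above can be sketched in NumPy. Everything here — the toy dataset, the learning rate `alpha=0.1`, the iteration count — is made up for illustration and is not the course's own code:

```python
import numpy as np

# Made-up dataset for illustration: y = 2x + 1 exactly.
X = np.c_[np.ones(5), np.array([0.0, 1.0, 2.0, 3.0, 4.0])]  # intercept column + feature
y = np.array([1.0, 3.0, 5.0, 7.0, 9.0])

def gradient_descent(X, y, alpha=0.1, iters=1000):
    """Vectorized batch gradient descent for linear regression.

    Minimizes J(theta) = (1/2m) * ||X @ theta - y||^2.
    """
    m, n = X.shape
    theta = np.zeros(n)
    for _ in range(iters):
        grad = X.T @ (X @ theta - y) / m  # gradient of J, computed for all examples at once
        theta -= alpha * grad
    return theta

theta = gradient_descent(X, y)  # converges toward [1.0, 2.0] on this exact-fit data
```

The vectorized form `X.T @ (X @ theta - y)` updates every parameter in one matrix operation rather than looping over examples, which is the point of the "Vectorized Implementation" lecture.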

III. LINEAR REGRESSION II


Feature Scaling

Learning Rate

Features and Polynomial Regression

Normal Equations

Exercise 3
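
The normal-equations lecture above gives linear regression a closed-form solution, theta = (XᵀX)⁻¹Xᵀy, with no learning rate and no feature scaling needed. A rough NumPy sketch on a made-up dataset (not the course's own code):

```python
import numpy as np

# Made-up dataset for illustration: y = 2x + 1 exactly.
X = np.c_[np.ones(4), np.array([1.0, 2.0, 3.0, 4.0])]
y = np.array([3.0, 5.0, 7.0, 9.0])

# Normal equations: solve (X^T X) theta = X^T y directly
# rather than forming an explicit matrix inverse.
theta = np.linalg.solve(X.T @ X, X.T @ y)  # -> approximately [1.0, 2.0]
```

Unlike gradient descent, this needs no iteration or alpha tuning, but solving the system costs roughly O(n³) in the number of features, which is why the lectures keep both methods.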

IV. LOGISTIC REGRESSION


Classification

Model

Optimization Objective I

Optimization Objective II

Gradient Descent

Newton's Method I

Newton's Method II

Gradient Descent vs Newton's Method

Exercise 4
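
The Newton's-method lectures above fit logistic regression by repeatedly solving against the Hessian instead of taking small gradient steps. A rough NumPy sketch, with a tiny made-up (and deliberately non-separable) dataset that is not from the course:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Toy non-separable 1-D dataset (made up for illustration).
X = np.c_[np.ones(6), np.array([0.0, 1.0, 2.0, 3.0, 4.0, 5.0])]
y = np.array([0.0, 0.0, 1.0, 0.0, 1.0, 1.0])
m = len(y)

theta = np.zeros(2)
for _ in range(10):                       # Newton's method: a handful of iterations suffices
    h = sigmoid(X @ theta)                # current predicted probabilities
    grad = X.T @ (h - y) / m              # gradient of the logistic cost
    H = (X.T * (h * (1.0 - h))) @ X / m   # Hessian of the logistic cost
    theta -= np.linalg.solve(H, grad)     # Newton step: theta -= H^{-1} grad
```

This is the trade-off the "Gradient Descent vs Newton's Method" lecture covers: Newton's method converges in far fewer iterations, but each iteration solves a linear system in the Hessian.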

V. REGULARIZATION


The Problem Of Overfitting

Optimization Objective

Common Variations

Regularized Linear Regression

Regularized Logistic Regression

Exercise 5
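
For regularized linear regression, the lectures above add a penalty lambda·‖theta‖² (excluding the intercept) to the cost, which changes the normal equation to theta = (XᵀX + lambda·L)⁻¹Xᵀy, where L is the identity with its top-left entry zeroed. A rough NumPy sketch; the polynomial dataset and lambda value are invented for illustration:

```python
import numpy as np

# Made-up polynomial-feature dataset; lam is an arbitrary illustration value.
x = np.linspace(0.0, 1.0, 5)
X = np.c_[np.ones(5), x, x ** 2]
y = np.array([1.0, 1.2, 1.5, 1.9, 2.4])
lam = 1.0

n = X.shape[1]
L = np.eye(n)
L[0, 0] = 0.0  # by convention, the intercept theta_0 is not regularized

# Regularized normal equation vs. the unregularized one.
theta_reg = np.linalg.solve(X.T @ X + lam * L, X.T @ y)
theta_unreg = np.linalg.solve(X.T @ X, X.T @ y)
```

Comparing the two solutions shows the effect of the penalty: the non-intercept coefficients of `theta_reg` are shrunk toward zero relative to `theta_unreg`, which is how regularization combats the overfitting problem the section opens with.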

VI. NAIVE BAYES


Generative Learning Algorithms

Text Classification

Exercise 6
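
The text-classification lecture above applies naive Bayes, a generative model that classifies a document by comparing log P(class) + Σ log P(word | class) across classes, with Laplace smoothing so unseen counts never become zero. A rough multinomial sketch in NumPy; the tiny spam/ham corpus is entirely made up:

```python
import numpy as np

# Made-up corpus: (text, label) with 1 = spam, 0 = ham.
docs = [("buy cheap pills now", 1),
        ("cheap pills buy buy", 1),
        ("meeting schedule today", 0),
        ("project meeting notes today", 0)]

vocab = sorted({w for text, _ in docs for w in text.split()})
index = {w: i for i, w in enumerate(vocab)}

counts = np.ones((2, len(vocab)))  # Laplace smoothing: start every word count at 1
class_docs = np.zeros(2)
for text, c in docs:
    class_docs[c] += 1
    for w in text.split():
        counts[c, index[w]] += 1

log_phi = np.log(counts / counts.sum(axis=1, keepdims=True))  # log P(word | class)
log_prior = np.log(class_docs / class_docs.sum())              # log P(class)

def predict(text):
    """Pick the class maximizing log P(class) + sum of log P(word | class)."""
    score = log_prior.copy()
    for w in text.split():
        if w in index:  # ignore out-of-vocabulary words for simplicity
            score += log_phi[:, index[w]]
    return int(np.argmax(score))
```

Working in log-probabilities avoids the numerical underflow that multiplying many small per-word probabilities would cause.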

VII.


Exercise 7

VIII.


Exercise 8

IX.

Exercise 9
From the Stanford Machine Learning OpenClassroom.
