Collection of Mistakes (错题集)

1. You are training a classification model with logistic regression. Which of the following statements are true? Check all that apply.【E】
A. Introducing regularization to the model always results in equal or better performance on the training set.
【Explanation】False. The regularization term penalizes large parameters, so it can only leave the fit to the training set the same or make it worse, never better.
B. Introducing regularization to the model always results in equal or better performance on examples not in the training set.
【Explanation】False. If the regularization parameter λ is too large, the model will underfit, and the final results will be worse.
C. Adding many new features to the model helps prevent overfitting on the training set.
【Explanation】False. Adding many new features lets the model fit the training data better, but too many features can cause overfitting, so the model fails to generalize to the data it needs to predict and its predictions become less accurate.
D. Adding a new feature to the model always results in equal or better performance on examples not in the training set.
【Explanation】False. Adding a new feature can cause overfitting, which leads to worse predictions on new examples rather than a better fit beyond the training set.
E. Adding a new feature to the model always results in equal or better performance on the training set.
【Explanation】True. Adding a new feature makes the model more expressive and therefore better able to fit the training set. By adding a new feature, our model must be more (or just as) expressive, thus allowing it to learn more complex hypotheses to fit the training set.
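A quick numerical check of the regularization option: a minimal NumPy sketch (toy data invented for illustration, not from the course) of L2-regularized logistic regression trained by gradient descent, showing that turning regularization on can only leave the training-set fit the same or worse.

```python
import numpy as np

# Minimal sketch (assumed toy data): L2-regularized logistic regression.
# A larger lambda can only make the fit to the *training* set worse or equal.

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train(X, y, lam, alpha=0.1, iters=5000):
    theta = np.zeros(X.shape[1])
    m = len(y)
    for _ in range(iters):
        grad = X.T @ (sigmoid(X @ theta) - y) / m
        grad[1:] += (lam / m) * theta[1:]      # bias term is not regularized
        theta -= alpha * grad
    return theta

def train_logloss(X, y, theta):
    h = np.clip(sigmoid(X @ theta), 1e-12, 1 - 1e-12)
    return -np.mean(y * np.log(h) + (1 - y) * np.log(1 - h))

rng = np.random.default_rng(0)
X = np.c_[np.ones(40), rng.normal(size=(40, 2))]   # bias column + 2 features
y = (X[:, 1] + X[:, 2] > 0).astype(float)

loss_plain = train_logloss(X, y, train(X, y, lam=0.0))
loss_reg = train_logloss(X, y, train(X, y, lam=10.0))
print(loss_plain <= loss_reg)  # regularization does not improve the training fit
```

The unregularized optimum of the data-fit term is, by definition, at least as good on the training set as any regularized solution, which is exactly what the comparison shows.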


2. Which of the following statements are true? Check all that apply.【BD】
A. Suppose you have a multi-class classification problem with three classes, trained with a 3-layer network. Let a_1^(3) = (h_Θ(x))_1 be the activation of the first output unit, and similarly a_2^(3) = (h_Θ(x))_2 and a_3^(3) = (h_Θ(x))_3. Then for any input x, it must be the case that a_1^(3) + a_2^(3) + a_3^(3) = 1.

B. In a neural network with many layers, we think of each successive layer as being able to use the earlier layers as features, so as to be able to compute increasingly complex functions.

C. If a neural network is overfitting the data, one solution would be to decrease the regularization parameter λ.

D. If a neural network is overfitting the data, one solution would be to increase the regularization parameter λ.
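Why option A is false: with sigmoid output units the three activations are computed independently, so they need not sum to 1 (that property would hold for a softmax output layer instead). A tiny sketch with made-up pre-activation values:

```python
import numpy as np

# Sketch with hypothetical numbers: three independent sigmoid output units.
# Each activation lies in (0, 1), but the three do not have to sum to 1.
def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

z = np.array([2.0, -1.0, 0.5])   # assumed inputs to the output layer
a3 = sigmoid(z)                  # a_1^(3), a_2^(3), a_3^(3)
print(a3.sum())                  # ~1.77, not 1
```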


3. You are using the neural network pictured below and have learned the parameters Θ^(1) = [1 -1.5 3.7; 1 5.1 2.3] (used to compute a^(2)) and Θ^(2) = [1 0.6 -0.8] (used to compute a^(3) as a function of a^(2)). Suppose you swap the parameters for the first hidden layer between its two units so Θ^(1) = [1 5.1 2.3; 1 -1.5 3.7] and also swap the output layer so Θ^(2) = [1 -0.8 0.6]. How will this change the value of the output h_Θ(x)?【A】

A. It will stay the same.
B. It will increase.
C. It will decrease.
D. Insufficient information to tell: it may increase or decrease.
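The answer can be verified numerically: swapping the two hidden units together with the matching output weights only relabels the hidden units, so h_Θ(x) is unchanged. A sketch using the parameter values from the question (the input x is an arbitrary assumed value):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def forward(theta1, theta2, x):
    a2 = sigmoid(theta1 @ np.r_[1.0, x])      # hidden layer (bias input prepended)
    a3 = sigmoid(theta2 @ np.r_[1.0, a2])     # output layer (bias unit prepended)
    return a3

theta1 = np.array([[1.0, -1.5, 3.7],
                   [1.0,  5.1, 2.3]])
theta2 = np.array([1.0, 0.6, -0.8])

# Swap the two hidden units and the corresponding output weights.
theta1_swapped = theta1[[1, 0]]
theta2_swapped = np.array([1.0, -0.8, 0.6])

x = np.array([0.3, -0.7])                     # arbitrary example input
print(np.isclose(forward(theta1, theta2, x),
                 forward(theta1_swapped, theta2_swapped, x)))  # True
```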


4. Which of the following statements are true? Check all that apply.【BD】
A. Suppose you are training a logistic regression classifier using polynomial features and want to select what degree polynomial (denoted d in the lecture videos) to use. After training the classifier on the entire training set, you decide to use a subset of the training examples as a validation set. This will work just as well as having a validation set that is separate (disjoint) from the training set.
B. Suppose you are using linear regression to predict housing prices, and your dataset comes sorted in order of increasing sizes of houses. It is then important to randomly shuffle the dataset before splitting it into training, validation and test sets, so that we don't have all the smallest houses going into the training set, and all the largest houses going into the test set.
C. It is okay to use data from the test set to choose the regularization parameter λ, but not the model parameters (θ).
D. A typical split of a dataset into training, validation and test sets might be 60% training set, 20% validation set, and 20% test set.
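Options B and D together describe a standard recipe: shuffle first, then split 60/20/20. A minimal sketch (the `data` array is a stand-in for a dataset sorted by house size):

```python
import numpy as np

# Minimal sketch of a shuffled 60/20/20 train/validation/test split.
data = np.arange(100)                       # hypothetical sorted dataset
rng = np.random.default_rng(42)
shuffled = rng.permutation(data)            # shuffle *before* splitting

n = len(shuffled)
train = shuffled[: int(0.6 * n)]            # 60% training set
val = shuffled[int(0.6 * n): int(0.8 * n)]  # 20% validation set
test = shuffled[int(0.8 * n):]              # 20% test set
print(len(train), len(val), len(test))      # 60 20 20
```

Without the shuffle, slicing this sorted array would send all the smallest values to the training set and all the largest to the test set, which is exactly the failure mode option B warns about.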


5. Suppose you have a dataset with n = 10 features and m = 5000 examples. After training your logistic regression classifier with gradient descent, you find that it has underfit the training set and does not achieve the desired performance on the training or cross validation sets. Which of the following might be promising steps to take? Check all that apply.【AC】
A. Use an SVM with a Gaussian Kernel.
【Explanation】An SVM with a Gaussian kernel can fit more complex decision boundaries, which can correct the underfitting to some extent.
B. Use a different optimization method since using gradient descent to train logistic regression might result in a local minimum.
【Explanation】False. The logistic regression cost function is convex, so gradient descent converges to the global minimum; switching optimizers will not fix underfitting.
C. Create / add new polynomial features.
D. Increase λ.
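Option C can be illustrated numerically: on data with a circular decision boundary, logistic regression with only linear features underfits, while adding the squared features makes the classes separable. A sketch on synthetic data (the whole setup is an assumed toy example):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def fit_and_score(X, y, alpha=0.5, iters=3000):
    # Plain gradient descent on the (convex) logistic-regression cost,
    # returning accuracy on the training set itself.
    theta = np.zeros(X.shape[1])
    for _ in range(iters):
        theta -= alpha * X.T @ (sigmoid(X @ theta) - y) / len(y)
    return np.mean((sigmoid(X @ theta) >= 0.5) == y)

rng = np.random.default_rng(1)
pts = rng.normal(size=(200, 2))
y = (pts[:, 0] ** 2 + pts[:, 1] ** 2 > 1.0).astype(float)  # circular boundary

X_lin = np.c_[np.ones(200), pts]             # linear features only: underfits
X_poly = np.c_[X_lin, pts ** 2]              # add x1^2 and x2^2

acc_lin = fit_and_score(X_lin, y)
acc_poly = fit_and_score(X_poly, y)
print(acc_poly > acc_lin)  # polynomial features repair the underfit
```

In the squared-feature space the true boundary x1^2 + x2^2 = 1 is linear, which is the same reason a Gaussian-kernel SVM (option A) also helps: both enlarge the hypothesis space beyond linear boundaries.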
