[NN] Regularization Summary

Dropout:

  • Dropout is a regularization technique.
  • Use dropout only during training; don't use dropout (randomly eliminating nodes) at test time.
  • Apply dropout during both forward and backward propagation.
  • During training, divide the output of each dropout layer by keep_prob to keep the same expected value for the activations (see the sketch after this list). For example, if keep_prob is 0.5, then on average half the nodes are shut down, so the output is scaled by 0.5 since only the remaining half contribute to the solution. Dividing by 0.5 is equivalent to multiplying by 2, which restores the expected value. You can check that this works for values of keep_prob other than 0.5.
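Here is a minimal NumPy sketch of the inverted-dropout idea described above. The function names dropout_forward and dropout_backward and the mask-returning interface are illustrative assumptions, not the notebook's actual helpers:

```python
import numpy as np

def dropout_forward(A, keep_prob=0.5, rng=None):
    """Inverted dropout on activations A; returns dropped activations and the mask."""
    rng = np.random.default_rng() if rng is None else rng
    D = rng.random(A.shape) < keep_prob   # keep each unit with probability keep_prob
    A = (A * D) / keep_prob               # rescale so the expected value is unchanged
    return A, D                           # cache D to reuse in backprop

def dropout_backward(dA, D, keep_prob=0.5):
    """Shut down the same units and apply the same scaling to the gradients."""
    return (dA * D) / keep_prob

# Example: with keep_prob = 0.5 the surviving activations are doubled,
# so the mean of A_drop stays close to the mean of A. At test time,
# skip dropout entirely and use the activations as-is.
A = np.ones((4, 5))
A_drop, D = dropout_forward(A, keep_prob=0.5, rng=np.random.default_rng(1))
```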

What we want you to remember from this notebook:

  • Regularization will help you reduce overfitting.
  • Regularization will drive your weights to lower values.
  • L2 regularization and Dropout are two very effective regularization techniques (a sketch of the L2 cost term follows this list).
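As a reminder of how the L2 term enters the cost, here is a minimal sketch. It assumes a cross-entropy base cost, a list of per-layer weight matrices, and m training examples; l2_regularized_cost is a hypothetical helper name, not a function from the notebook:

```python
import numpy as np

def l2_regularized_cost(cross_entropy_cost, weights, lambd, m):
    # L2 penalty: (lambd / (2m)) * sum over layers of the squared weights.
    l2_term = (lambd / (2 * m)) * sum(np.sum(np.square(W)) for W in weights)
    return cross_entropy_cost + l2_term

# The matching gradient change is dW += (lambd / m) * W for every layer,
# which is exactly what drives the weights toward smaller values.
```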