04 Regularization

    Regularization

    for Linear Regression and Logistic Regression

    Definitions

    1. Under-fitting (high bias)
    2. Over-fitting (high variance): the hypothesis has too many features and fails to generalize to new examples.

    Addressing over-fitting

    1. Reduce the number of features.
      • Manually select which features to keep.
      • Use a model selection algorithm.
    2. Regularization
      • Keep all the features, but reduce the magnitude/values of the parameters \(\theta_j\).
      • Works well when we have a lot of features, each of which contributes a bit to predicting \(y\).

    Regularized Cost Function

    • \[\min_\theta\ \dfrac{1}{2m} \left[ \sum_{i=1}^m (h_\theta(x^{(i)}) - y^{(i)})^2 + \lambda \sum_{j=1}^n \theta_j^2 \right]\]
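    A minimal NumPy sketch of this cost (the function name regularized_cost and the variable names are my own, not from the original notes). Note that the penalty sum starts at \(j = 1\), so the bias term \(\theta_0\) is not regularized:

    ```python
    import numpy as np

    def regularized_cost(theta, X, y, lam):
        """Regularized squared-error cost for linear regression.

        X     : (m, n+1) design matrix whose first column is all ones (x_0 = 1)
        y     : (m,) target vector
        theta : (n+1,) parameter vector
        lam   : regularization strength lambda
        """
        m = len(y)
        errors = X @ theta - y                   # h_theta(x^(i)) - y^(i)
        penalty = lam * np.sum(theta[1:] ** 2)   # theta_0 is not penalized
        return (np.sum(errors ** 2) + penalty) / (2 * m)
    ```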

    Regularized Linear Regression

    1. Gradient Descent (see the code sketch after this list)
      \[
      \begin{align*} & \text{Repeat}\ \lbrace \newline & \quad \theta_0 := \theta_0 - \alpha \frac{1}{m} \sum_{i=1}^m (h_\theta(x^{(i)}) - y^{(i)})x_0^{(i)} \newline & \quad \theta_j := \theta_j - \alpha \left[ \left( \frac{1}{m} \sum_{i=1}^m (h_\theta(x^{(i)}) - y^{(i)})x_j^{(i)} \right) + \frac{\lambda}{m}\theta_j \right] & j \in \lbrace 1, 2, \dots, n \rbrace \newline & \rbrace \end{align*}
      \]

      • Equivalently (the factor \(1 - \alpha\frac{\lambda}{m}\) is slightly less than 1, so each update first shrinks \(\theta_j\) a little and then takes the usual gradient step):
        \[
        \theta_j := \theta_j\left(1 - \alpha\frac{\lambda}{m}\right) - \alpha\frac{1}{m} \sum_{i=1}^m (h_\theta(x^{(i)}) - y^{(i)})x_j^{(i)}
        \]
    2. Normal Equation (also covered in the sketch below)
      \[
      \begin{align*} & \theta = \left( X^TX + \lambda \cdot L \right)^{-1} X^Ty \newline & \text{where}\quad L = \begin{bmatrix} 0 & & & & \\ & 1 & & & \\ & & 1 & & \\ & & & \ddots & \\ & & & & 1 \end{bmatrix} \end{align*}
      \]

    • Even if \(X^TX\) is non-invertible, \(X^TX + \lambda \cdot L\) will be invertible (for \(\lambda > 0\)).
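    A minimal NumPy sketch of both update rules from the list above (the function names gradient_descent_step and normal_equation, and the variable names, are my own). The bias term \(\theta_0\) is excluded from the shrinkage, and \(L\) is the identity matrix with its top-left entry zeroed:

    ```python
    import numpy as np

    def gradient_descent_step(theta, X, y, alpha, lam):
        """One regularized gradient-descent update; theta_0 is not shrunk."""
        m = len(y)
        grad = X.T @ (X @ theta - y) / m   # gradient of the unregularized cost
        reg = (lam / m) * theta
        reg[0] = 0.0                       # exclude theta_0 from the penalty term
        return theta - alpha * (grad + reg)

    def normal_equation(X, y, lam):
        """Closed form: theta = (X^T X + lambda * L)^(-1) X^T y."""
        L = np.eye(X.shape[1])
        L[0, 0] = 0.0                      # identity with a 0 in the top-left corner
        return np.linalg.solve(X.T @ X + lam * L, X.T @ y)
    ```

    Using np.linalg.solve instead of explicitly inverting \(X^TX + \lambda \cdot L\) gives the same \(\theta\) and is numerically more stable.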

    Regularized Logistic Regression

    1. Cost Function

    \[
    J(\theta) = -\frac{1}{m} \sum_{i=1}^m \left[ y^{(i)}\log(h_\theta(x^{(i)})) + (1 - y^{(i)})\log(1 - h_\theta(x^{(i)})) \right] + \frac{\lambda}{2m}\sum_{j=1}^n \theta_j^2
    \]

    2. Gradient descent (see the code sketch after this list)
      \[
      \begin{align*} & \text{Repeat}\ \lbrace \newline & \quad \theta_0 := \theta_0 - \alpha \frac{1}{m} \sum_{i=1}^m (h_\theta(x^{(i)}) - y^{(i)})x_0^{(i)} \newline & \quad \theta_j := \theta_j - \alpha \left[ \left( \frac{1}{m} \sum_{i=1}^m (h_\theta(x^{(i)}) - y^{(i)})x_j^{(i)} \right) + \frac{\lambda}{m}\theta_j \right] & j \in \lbrace 1, 2, \dots, n \rbrace \newline & \rbrace \end{align*}
      \]
      • The update looks identical to regularized linear regression, but here \(h_\theta(x) = \dfrac{1}{1 + e^{-\theta^Tx}}\) (the sigmoid hypothesis).
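    A minimal NumPy sketch of the regularized logistic-regression cost and gradient (the names sigmoid and logistic_cost_and_grad are my own); the returned gradient plugs into the same update rule as above:

    ```python
    import numpy as np

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    def logistic_cost_and_grad(theta, X, y, lam):
        """Regularized logistic-regression cost J(theta) and its gradient."""
        m = len(y)
        h = sigmoid(X @ theta)                      # h_theta(x^(i))
        cost = -np.mean(y * np.log(h) + (1 - y) * np.log(1 - h))
        cost += (lam / (2 * m)) * np.sum(theta[1:] ** 2)
        grad = X.T @ (h - y) / m
        grad[1:] += (lam / m) * theta[1:]           # theta_0 is not regularized
        return cost, grad
    ```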
Original post: https://www.cnblogs.com/QQ-1615160629/p/04-Regularization.html