  • When to use L1 vs. L2 regularization

    How to decide which regularization (L1 or L2) to use?

    Is there collinearity among some features? L2 regularization can improve prediction quality in this case, as its alternative name, "ridge regression," suggests. That said, either form of regularization tends to improve out-of-sample prediction whether or not there is multicollinearity and whether or not there are irrelevant features, simply because of the shrinkage properties of the regularized estimators. L1 regularization does not help with multicollinearity: among a group of correlated features it tends to keep just one, roughly the one most correlated with the outcome, and zero out the rest. Ridge regression can still produce coefficient estimates even when you have more features than examples... but the probability that any coefficient is estimated as exactly 0 is 0.
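    The contrast is easy to see numerically. The following is a minimal sketch, assuming scikit-learn's Ridge and Lasso estimators and using made-up data and arbitrary regularization strengths, that fits both models on two nearly identical features: ridge spreads the weight across the pair, while lasso tends to keep one and zero out the other.

    ```python
    import numpy as np
    from sklearn.linear_model import Ridge, Lasso

    rng = np.random.default_rng(0)
    n = 200

    # Two highly collinear features: x2 is essentially a noisy copy of x1.
    x1 = rng.normal(size=n)
    x2 = x1 + 0.01 * rng.normal(size=n)
    X = np.column_stack([x1, x2])
    y = 3.0 * x1 + rng.normal(scale=0.5, size=n)

    # Ridge (L2) spreads the weight across the correlated pair.
    print(Ridge(alpha=1.0).fit(X, y).coef_)   # roughly [1.5, 1.5]

    # Lasso (L1) tends to keep one of the pair and set the other to exactly 0.
    print(Lasso(alpha=0.1).fit(X, y).coef_)   # roughly [2.9, 0.0] (or the reverse)
    ```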

    What are the pros & cons of each of L1 / L2 regularization?

    L1 regularization can't help with multicollinearity. L2 regularization can't help with feature selection. Elastic net regression can solve both problems. L1 and L2 regularization are taught for pedagogical reasons, but I'm not aware of any situation where you want to use regularized regressions but not try an elastic net as a more general solution, since it includes both as special cases.
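    As a sketch of that last point: scikit-learn's ElasticNet exposes an l1_ratio parameter that interpolates between the two penalties, so pure L1 and (nearly) pure L2 fall out as settings of a single estimator. The data and hyperparameter values below are illustrative assumptions, not recommendations.

    ```python
    from sklearn.datasets import make_regression
    from sklearn.linear_model import ElasticNet

    # Synthetic data: 100 samples, 20 features, only 5 of which are informative.
    X, y = make_regression(n_samples=100, n_features=20, n_informative=5,
                           noise=10.0, random_state=0)

    # l1_ratio=1.0 is pure L1 (the lasso); small l1_ratio approaches pure L2 (ridge).
    # Intermediate values combine shrinkage with feature selection.
    for l1_ratio in (1.0, 0.5, 0.05):
        coef = ElasticNet(alpha=0.5, l1_ratio=l1_ratio).fit(X, y).coef_
        print(f"l1_ratio={l1_ratio}: {(coef == 0).sum()} of {coef.size} coefficients are exactly 0")
    ```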

    In practice, if the dataset is not very large, L2 regularization tends to give better accuracy.

    Multicollinearity means that the explanatory variables in your model are highly correlated with one another.
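    A quick way to check for this before choosing a penalty is to look at the correlation matrix of the features, or at the condition number of the design matrix. The sketch below is an assumed NumPy example using the same kind of synthetic collinear pair as above.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    x1 = rng.normal(size=500)
    x2 = 2.0 * x1 + 0.05 * rng.normal(size=500)   # nearly a linear copy of x1
    x3 = rng.normal(size=500)                      # an unrelated feature
    X = np.column_stack([x1, x2, x3])

    # Pairwise correlations close to +/-1 flag collinear pairs of explanatory variables.
    print(np.corrcoef(X, rowvar=False).round(3))

    # A very large condition number of the design matrix is another symptom.
    print(np.linalg.cond(X))
    ```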

  • Original post: https://www.cnblogs.com/wuxiangli/p/7488866.html