  • When to use L1 vs. L2 regularization

    How to decide which regularization (L1 or L2) to use?

    Is there collinearity among some features? L2 regularization can improve prediction quality in this case, as implied by its alternative name, "ridge regression." However, it is true in general that either form of regularization will improve out-of-sample prediction, whether or not there is multicollinearity and whether or not there are irrelevant features, simply because of the shrinkage properties of the regularized estimators. L1 regularization can't help with multicollinearity; it will just pick the feature with the largest correlation to the outcome. Ridge regression can obtain coefficient estimates even when you have more features than examples... but the probability that any will be estimated precisely at 0 is 0.
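
    Below is a minimal, self-contained sketch (not from the original post) of the behaviour described above, using scikit-learn's Lasso and Ridge on two nearly identical features; the data, feature construction, and penalty strengths are invented for illustration.

    ```python
    # Sketch: L1 (Lasso) tends to keep one of two collinear features and zero the other,
    # while L2 (Ridge) shrinks both coefficients without setting either exactly to 0.
    import numpy as np
    from sklearn.linear_model import Lasso, Ridge

    rng = np.random.default_rng(0)
    n = 200
    x1 = rng.normal(size=n)
    x2 = x1 + 0.01 * rng.normal(size=n)          # x2 is almost a copy of x1 -> multicollinearity
    X = np.column_stack([x1, x2])
    y = 3.0 * x1 + rng.normal(scale=0.5, size=n)

    lasso = Lasso(alpha=0.1).fit(X, y)
    ridge = Ridge(alpha=1.0).fit(X, y)

    print("Lasso coefficients:", lasso.coef_)    # typically one coefficient is exactly 0
    print("Ridge coefficients:", ridge.coef_)    # both shrunk, neither exactly 0
    ```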

    What are the pros & cons of each of L1 / L2 regularization?

    L1 regularization can't help with multicollinearity. L2 regularization can't help with feature selection. Elastic net regression can solve both problems. L1 and L2 regularization are taught for pedagogical reasons, but I'm not aware of any situation where you want to use regularized regressions but not try an elastic net as a more general solution, since it includes both as special cases.
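
    As a more general alternative, the following self-contained sketch (again not from the original post) cross-validates an elastic net with scikit-learn's ElasticNetCV; the grid of l1_ratio values and the simulated data are illustrative assumptions.

    ```python
    # Sketch: elastic net mixes L1 and L2; l1_ratio=1 is pure Lasso, small values behave
    # more like Ridge, and cross-validation can pick both the mix and the penalty strength.
    import numpy as np
    from sklearn.linear_model import ElasticNetCV

    rng = np.random.default_rng(1)
    n = 200
    x1 = rng.normal(size=n)
    x2 = x1 + 0.01 * rng.normal(size=n)          # correlated pair
    x3 = rng.normal(size=n)                      # irrelevant feature
    X = np.column_stack([x1, x2, x3])
    y = 3.0 * x1 + rng.normal(scale=0.5, size=n)

    enet = ElasticNetCV(l1_ratio=[0.1, 0.5, 0.9, 1.0], cv=5).fit(X, y)
    print("chosen l1_ratio:", enet.l1_ratio_)
    print("chosen alpha:   ", enet.alpha_)
    print("coefficients:   ", enet.coef_)
    ```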

    In practice, if the dataset is not very large, L2 regularization usually gives better accuracy.

    Multicollinearity means that, when you build a model, the explanatory variables are highly correlated with one another.
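
    A quick way to spot this in practice is to inspect the correlation matrix of the explanatory variables; the sketch below (with made-up data) does exactly that with NumPy.

    ```python
    # Sketch: off-diagonal correlations close to +/-1 between columns of X signal multicollinearity.
    import numpy as np

    rng = np.random.default_rng(2)
    n = 200
    x1 = rng.normal(size=n)
    x2 = 2.0 * x1 + 0.05 * rng.normal(size=n)    # nearly a linear function of x1
    x3 = rng.normal(size=n)
    X = np.column_stack([x1, x2, x3])

    corr = np.corrcoef(X, rowvar=False)          # pairwise correlations between the columns
    print(np.round(corr, 2))
    ```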

  • Original post: https://www.cnblogs.com/wuxiangli/p/7488866.html