  • Features and Polynomial Regression

    We can improve our features and the form of our hypothesis function in a couple different ways.

    We can combine multiple features into one. For example, we can combine x1 and x2 into a new feature x3 by taking their product, x1·x2.
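    As a minimal NumPy sketch (the frontage/depth values are made up for illustration), combining two features into one product feature looks like:

```python
import numpy as np

# Two original features for three training examples (made-up values)
x1 = np.array([2.0, 3.0, 5.0])   # e.g. frontage of a lot
x2 = np.array([4.0, 1.0, 6.0])   # e.g. depth of a lot

# Combine them into a single new feature x3 = x1 * x2 (e.g. the area)
x3 = x1 * x2
```

    The model is then trained on x3 alone (or alongside the originals), exactly as with any other feature.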

    Polynomial Regression

    Our hypothesis function need not be linear (a straight line) if that does not fit the data well.

    We can change the behavior or curve of our hypothesis function by making it a quadratic, cubic or square root function (or any other form).

    For example, if our hypothesis function is hθ(x) = θ0 + θ1x1, then we can create additional features based on x1 to get the quadratic function hθ(x) = θ0 + θ1x1 + θ2x1² or the cubic function hθ(x) = θ0 + θ1x1 + θ2x1² + θ3x1³.

    In the cubic version, we have created new features x2 and x3, where x2 = x1² and x3 = x1³.

    To make it a square root function, we could do: hθ(x) = θ0 + θ1x1 + θ2·√x1
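    The key point is that polynomial regression is still linear regression: we add x1² (or √x1) as an extra column of the design matrix and solve for θ as usual. A sketch with made-up, roughly quadratic data, using a NumPy least-squares solve in place of gradient descent:

```python
import numpy as np

x1 = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y  = np.array([1.2, 3.9, 9.3, 16.8, 24.9])   # made-up data, close to y = x1²

# Design matrix for the quadratic hypothesis hθ(x) = θ0 + θ1·x1 + θ2·x1²
X = np.column_stack([np.ones_like(x1), x1, x1**2])

# Solve the normal-equations problem directly (instead of gradient descent)
theta, *_ = np.linalg.lstsq(X, y, rcond=None)

# For the square-root hypothesis, only the last column changes:
X_sqrt = np.column_stack([np.ones_like(x1), x1, np.sqrt(x1)])
```

    Swapping the basis columns is the entire change; the fitting procedure itself is untouched.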

    One important thing to keep in mind: if you choose your features this way, then feature scaling becomes very important.

    e.g. if x1 has range 1–1000, then the range of x1² becomes 1–1,000,000 and that of x1³ becomes 1–1,000,000,000.
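    A short sketch of why this matters and one common fix, mean normalization ((x − mean) / range), applied per feature; the sample values are made up:

```python
import numpy as np

x1 = np.array([1.0, 500.0, 1000.0])
x2 = x1**2        # range jumps to 1 .. 1,000,000
x3 = x1**3        # range jumps to 1 .. 1,000,000,000

# Mean normalization: subtract the mean, divide by the range (max - min)
def scale(col):
    return (col - col.mean()) / (col.max() - col.min())

# After scaling, every column lies in a comparable, roughly [-1, 1] range
scaled = np.column_stack([scale(x1), scale(x2), scale(x3)])
```

    Without this step, gradient descent takes tiny steps along the x1 direction and huge ones along x1³, and converges very slowly.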

  • Original post: https://www.cnblogs.com/ne-zha/p/7295333.html