  • Locally weighted regression

    Locally weighted regression: for each prediction, we need to find the theta that minimizes the weighted least-squares objective below.
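    The post does not reproduce the formula itself; the standard form of the objective (as in Andrew Ng's CS229 notes, with a bandwidth parameter tau that is not named in the post) is

    $$\min_{\theta} \sum_{i=1}^{m} w^{(i)} \left(y^{(i)} - \theta^{T} x^{(i)}\right)^{2}, \qquad w^{(i)} = \exp\!\left(-\frac{\lVert x^{(i)} - x \rVert^{2}}{2\tau^{2}}\right),$$

    where $x$ is the query input and each $w^{(i)}$ weights training sample $i$ by its distance to $x$.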

    Locally weighted regression is very costly: the algorithm has to fit a new theta vector every time it makes a prediction.

    Besides, locally weighted regression is a kind of non-parametric learning algorithm, which means the number of parameters is not fixed; it grows with the size of the training set.

    The weights are assigned in this way: training samples close to the query input receive high weights, while samples far from the query input receive low weights. Since the low-weighted samples contribute almost nothing, we can approximately ignore them and consider only the highly weighted ones, which turns the original problem into a local linear regression around the query point. Intuitively, this should lead to a good prediction; a sketch of the procedure follows.
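    Below is a minimal NumPy sketch of that procedure. It is not from the original post: the function name lwr_predict, the bandwidth parameter tau, and the assumption that X already contains an intercept column are choices made here for illustration.

```python
import numpy as np

def lwr_predict(X, y, x_query, tau=1.0):
    """Locally weighted regression: fit theta for one query point and predict.

    X       : (m, n) training inputs (assumed to include an intercept column)
    y       : (m,)   training targets
    x_query : (n,)   the input we want a prediction for
    tau     : bandwidth; smaller tau means only very close samples matter
    """
    # Gaussian weights: nearby samples get weight close to 1, distant ones close to 0.
    w = np.exp(-np.sum((X - x_query) ** 2, axis=1) / (2 * tau ** 2))

    # Weighted least squares: solve X^T W X theta = X^T W y.
    W = np.diag(w)
    theta = np.linalg.solve(X.T @ W @ X, X.T @ W @ y)

    # theta is discarded after this one prediction, which is why the
    # algorithm is expensive: it re-fits for every query.
    return x_query @ theta
```

    Note that the whole training set (X, y) has to be kept and scanned on every call, which is exactly the cost mentioned above.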

    As Andrew Ng mentioned, a KD-tree may help improve the efficiency of this algorithm. That is indeed the case, since KD-trees are very good at finding nearest neighbors.
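    One way that idea could look in code (an illustration of the general approach, not the lecture's implementation) is to restrict the weighted fit to the k nearest neighbours returned by scipy.spatial.cKDTree, since only nearby samples carry non-negligible weight; the names lwr_predict_knn and k are assumptions made here.

```python
import numpy as np
from scipy.spatial import cKDTree

def lwr_predict_knn(X, y, x_query, tau=1.0, k=50):
    """Approximate LWR using only the k nearest neighbours of the query."""
    tree = cKDTree(X)                  # in practice, build the tree once and reuse it
    _, idx = tree.query(x_query, k=k)  # indices of the k closest training samples
    X_loc, y_loc = X[idx], y[idx]

    # Same weighted least-squares fit as before, but on the local subset only.
    w = np.exp(-np.sum((X_loc - x_query) ** 2, axis=1) / (2 * tau ** 2))
    W = np.diag(w)
    theta = np.linalg.solve(X_loc.T @ W @ X_loc, X_loc.T @ W @ y_loc)
    return x_query @ theta
```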

    It is generally not a good idea to use regression algorithms to solve classification problems.

  • Original post: https://www.cnblogs.com/flytomylife/p/3087485.html