  • [ML L3] SVM Intro

    A support vector machine (SVM) is a supervised machine learning model that uses classification algorithms for two-group classification problems. After giving an SVM model sets of labeled training data for each category, it can categorize new text.

    So you’re working on a text classification problem. You’re refining your training data, and maybe you’ve even tried stuff out using Naive Bayes. But now you’re feeling confident in your dataset, and want to take it one step further. Enter Support Vector Machines (SVM): a fast and dependable classification algorithm that performs very well with a limited amount of data.

    How does it work?

    The basics of Support Vector Machines and how they work are best understood with a simple example. Let’s imagine we have two tags: red and blue, and our data has two features, x and y. We want a classifier that, given a pair of (x, y) coordinates, outputs whether it’s red or blue. We plot our already labeled training data on a plane:

    A support vector machine takes these data points and outputs the hyperplane (which in two dimensions is simply a line) that best separates the tags. This line is the decision boundary: anything that falls to one side of it we classify as blue, and anything that falls to the other side as red.

    [Figure: labeled red and blue training points plotted in the (x, y) plane]

    But, what exactly is the best hyperplane? For SVM, it’s the one that maximizes the margins from both tags. In other words: the hyperplane (remember it’s a line in this case) whose distance to the nearest element of each tag is the largest.

    [Figure: the maximum-margin hyperplane (a line in 2D) separating the two tags]
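
    To make this concrete, here is a minimal sketch with scikit-learn on a tiny made-up red/blue dataset (the data and variable names are illustrative, not from the article); a linear kernel recovers the maximum-margin line:

    import numpy as np
    from sklearn.svm import SVC

    # Made-up 2D training data: each row is an (x, y) point.
    X = np.array([[1, 2], [2, 3], [3, 3],      # "blue" points
                  [6, 5], [7, 8], [8, 6]])     # "red" points
    y = np.array([0, 0, 0, 1, 1, 1])           # 0 = blue, 1 = red

    # A linear kernel yields the maximum-margin separating line.
    clf = SVC(kernel="linear", C=1.0)
    clf.fit(X, y)

    # The decision boundary is w[0]*x + w[1]*y + b = 0 (a line in 2D).
    w, b = clf.coef_[0], clf.intercept_[0]
    print("boundary: %.2f*x + %.2f*y + %.2f = 0" % (w[0], w[1], b))
    print(clf.predict([[3, 4]]))  # classify a new (x, y) point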

    Non-linear data?

    Now this example was easy, since clearly the data was linearly separable — we could draw a straight line to separate red and blue. Sadly, usually things aren’t that simple. Take a look at this case:

    [Figure: data that cannot be separated by a straight line]

    We can introduce a new dimension, a third feature z computed from the existing ones:

    z = x^2 + y^2 


    [Figure: the same data lifted into three dimensions using the new z feature]

    That’s great! Note that since we are in three dimensions now, the hyperplane is a plane parallel to the x–y plane at a certain z (let’s say z = 1).

    What’s left is mapping it back to two dimensions:

    [Figure: the separating plane mapped back to two dimensions, where it becomes a circular boundary]

    In other words, we made a non-linear dataset linearly separable by introducing a new dimension; in practice, SVM kernels (such as the RBF kernel used in the code below) perform this kind of mapping implicitly.
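
    As a rough sketch of that idea (toy data and variable names of my own, not the article’s), you can either add z = x^2 + y^2 by hand and use a linear SVM, or keep (x, y) and let an RBF kernel handle the non-linearity implicitly:

    import numpy as np
    from sklearn.svm import SVC

    # Toy non-linear data: one class near the origin, the other on a ring.
    rng = np.random.RandomState(0)
    angles = rng.uniform(0, 2 * np.pi, 20)
    inner = rng.uniform(-1, 1, (20, 2))                     # blue, near the origin
    outer = np.c_[3 * np.cos(angles), 3 * np.sin(angles)]   # red, on a circle
    X = np.vstack([inner, outer])
    y = np.array([0] * 20 + [1] * 20)

    # Option 1: add z = x^2 + y^2 explicitly, then a linear SVM suffices.
    z = (X ** 2).sum(axis=1, keepdims=True)
    clf_linear = SVC(kernel="linear").fit(np.hstack([X, z]), y)

    # Option 2: keep (x, y) and let the RBF kernel do the lifting implicitly.
    clf_rbf = SVC(kernel="rbf", gamma="auto").fit(X, y)

    print(clf_linear.score(np.hstack([X, z]), y), clf_rbf.score(X, y))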

    [Ref]: https://monkeylearn.com/blog/introduction-to-support-vector-machines-svm/

    from sklearn.svm import SVC

    # features_train/labels_train and features_test/labels_test are the
    # pre-split text features and labels (prepared elsewhere).
    clf = SVC(gamma='auto', kernel="rbf", C=10000.0)
    clf.fit(features_train, labels_train)
    accuracy = clf.score(features_test, labels_test)
    
    ## 1% of the data
    ## kernel="linear"             accuracy = 0.88
    ## kernel="rbf"                accuracy = 0.61
    ## kernel="rbf", C=10.0        accuracy = 0.61
    ## kernel="rbf", C=100.0       accuracy = 0.61
    ## kernel="rbf", C=1000.0      accuracy = 0.82
    ## kernel="rbf", C=10000.0     accuracy = 0.89

    ## 35% of the data
    ## kernel="rbf", C=10000.0     accuracy = 0.96

    ## 50% of the data
    ## kernel="rbf", C=10000.0     accuracy = 0.987

    ## 100% of the data
    ## kernel="rbf", C=10000.0     accuracy > 0.99
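
    Rather than trying kernel and C values by hand as above, a minimal sketch with scikit-learn's GridSearchCV (assuming the same features_train / labels_train / features_test / labels_test variables) could automate the search:

    from sklearn.model_selection import GridSearchCV
    from sklearn.svm import SVC

    # Candidate hyperparameters taken from the manual experiments above.
    param_grid = {
        "kernel": ["linear", "rbf"],
        "C": [10.0, 100.0, 1000.0, 10000.0],
    }
    search = GridSearchCV(SVC(gamma="auto"), param_grid, cv=3)
    search.fit(features_train, labels_train)

    print(search.best_params_)                       # best kernel/C combination
    print(search.score(features_test, labels_test))  # accuracy on the test set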
  • Original post: https://www.cnblogs.com/Answer1215/p/13193715.html