[Coursera Machine Learning] Week 1

    1. Categories of machine learning problems:

    Supervised Learning: the "right answers" are given with the training samples

               Regression: predict a continuous-valued output

               Classification: predict a discrete-valued output

    Unsupervised Learning: learn the structure of a dataset without given correct answers

                Clustering: divide the dataset into groups

                Non-clustering: e.g. separating individual voices out of a mixed recording (the cocktail party problem)

    2. Model Representation:

    training set -> learning algorithm -> hypothesis h

    input x -> hypothesis h -> predicted output y
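
    For univariate linear regression, the hypothesis is a straight-line function of the input:

               h_theta(x) = theta_0 + theta_1 * x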

    3. Cost Function:

    m is the number of samples
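
    For linear regression the course uses the squared-error cost function, which measures how far the hypothesis is from the given answers:

               J(theta_0, theta_1) = (1 / (2m)) * sum_{i=1..m} (h_theta(x^(i)) - y^(i))^2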

    4. Gradient Descent (not only for linear regression)

    n is the number of features

    minimizes a function (e.g. the cost function)

    alpha is the learning rate

    all theta values must be updated simultaneously (compute every new value from the old values, then assign them together)
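
    The update rule, repeated until convergence for every parameter theta_j (j = 0..n):

               theta_j := theta_j - alpha * (d/d theta_j) J(theta)

    A minimal NumPy sketch of one simultaneous-update step for univariate linear regression (the function name, toy data, and alpha value below are illustrative, not from the course):

        import numpy as np

        def gradient_descent_step(theta0, theta1, x, y, alpha):
            # One step of batch gradient descent for h_theta(x) = theta0 + theta1 * x.
            # x and y are 1-D arrays of length m; alpha is the learning rate.
            m = len(y)
            errors = (theta0 + theta1 * x) - y           # h_theta(x^(i)) - y^(i)
            temp0 = theta0 - alpha * errors.sum() / m    # both temps use the OLD thetas
            temp1 = theta1 - alpha * (errors * x).sum() / m
            return temp0, temp1                          # simultaneous update

        # toy usage: fit y = 2x
        x = np.array([1.0, 2.0, 3.0, 4.0])
        y = np.array([2.0, 4.0, 6.0, 8.0])
        theta0, theta1 = 0.0, 0.0
        for _ in range(1000):
            theta0, theta1 = gradient_descent_step(theta0, theta1, x, y, alpha=0.05)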

    5. Normal Equation Formula
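
    The closed-form solution (X is the m x (n+1) design matrix with a leading column of ones, y is the vector of answers):

               theta = (X^T X)^(-1) X^T y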

    Comparison of gradient descent and the normal equation:

    the normal equation is faster when the number of features n is small, and it needs no learning rate and no iteration.

    gradient descent is faster when n is large, because the normal equation has to compute (X^T X)^(-1), which costs roughly O(n^3).
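
    A minimal NumPy sketch of the normal equation on the same toy data (np.linalg.solve is used instead of an explicit matrix inverse for numerical stability; the data is illustrative):

        import numpy as np

        # design matrix: the first column of ones corresponds to the intercept theta_0
        X = np.array([[1.0, 1.0],
                      [1.0, 2.0],
                      [1.0, 3.0],
                      [1.0, 4.0]])
        y = np.array([2.0, 4.0, 6.0, 8.0])

        # theta = (X^T X)^(-1) X^T y, solved as a linear system
        theta = np.linalg.solve(X.T @ X, X.T @ y)   # -> approximately [0.0, 2.0]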
