Various Optimization Algorithms For Training Neural Network

Source: https://towardsdatascience.com/optimizers-for-training-neural-network-59450d71caf6

Optimizers are algorithms that adjust a network's weights to minimize the loss function; a well-chosen optimizer reaches good results in fewer training iterations. The most common ones are covered below.

    Gradient Descent
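The equation image did not survive extraction; the standard batch gradient descent update, computed over the whole training set, is:

    \theta = \theta - \eta \cdot \nabla_\theta J(\theta)

where \eta is the learning rate and J(\theta) is the loss over the entire dataset. Each update requires a full pass over the data, which is slow on large datasets.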

    Stochastic Gradient Descent
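SGD performs the same update per training example (x^{(i)}, y^{(i)}) instead of per full pass:

    \theta = \theta - \eta \cdot \nabla_\theta J(\theta; x^{(i)}, y^{(i)})

Updates are much cheaper but noisier, so the loss fluctuates as it decreases.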

    Mini-Batch Gradient Descent
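Mini-batch gradient descent is the usual compromise: each update is computed over a small batch of n examples:

    \theta = \theta - \eta \cdot \nabla_\theta J(\theta; x^{(i:i+n)}, y^{(i:i+n)})

This lowers the variance of the updates while keeping each step cheap.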

    Momentum
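Momentum accumulates an exponentially decaying average of past gradients and moves along it:

    v_t = \gamma v_{t-1} + \eta \nabla_\theta J(\theta)
    \theta = \theta - v_t

The momentum coefficient \gamma is typically around 0.9; it damps oscillations across the walls of a loss valley and accelerates progress along consistent directions.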

    Nesterov Accelerated Gradient
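Nesterov Accelerated Gradient evaluates the gradient at the look-ahead position \theta - \gamma v_{t-1} rather than at the current parameters:

    v_t = \gamma v_{t-1} + \eta \nabla_\theta J(\theta - \gamma v_{t-1})
    \theta = \theta - v_t

Looking ahead lets NAG slow down before momentum carries it past a minimum, which is the difference illustrated in the figure below.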

     

Figure: NAG vs. momentum at local minima.

    Adagrad

     

The gradient of the loss function with respect to parameter \theta_i at time step t:
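In the standard Adagrad notation (the original equation image is missing):

    g_{t,i} = \nabla_\theta J(\theta_{t,i})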

     

The update for parameter i at time step t, with the learning rate scaled by that parameter's gradient history:
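Again restated from the standard formulation:

    \theta_{t+1,i} = \theta_{t,i} - \frac{\eta}{\sqrt{G_{t,ii} + \epsilon}} \cdot g_{t,i}

G_t is a diagonal matrix whose (i, i) entry is the sum of squares of all past gradients of \theta_i, and \epsilon is a small smoothing constant that prevents division by zero. Frequently updated parameters therefore receive smaller effective learning rates. The weakness is that the accumulated sum only grows, so the learning rate eventually shrinks toward zero.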

    AdaDelta

     

AdaDelta fixes Adagrad's ever-shrinking learning rate by replacing the full sum of squared gradients with an exponentially decaying average. The parameter update:
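A restatement of the standard AdaDelta equations, since the original images are missing:

    E[g^2]_t = \gamma E[g^2]_{t-1} + (1 - \gamma) g_t^2
    \Delta\theta_t = -\frac{RMS[\Delta\theta]_{t-1}}{RMS[g]_t} \, g_t
    \theta_{t+1} = \theta_t + \Delta\theta_t

where RMS[x]_t = \sqrt{E[x^2]_t + \epsilon}. The learning rate \eta cancels out of the update entirely: the step size is set by the RMS of recent updates relative to the RMS of recent gradients.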

    Adam

     

Adam (Adaptive Moment Estimation) keeps exponentially decaying averages of past gradients and past squared gradients, i.e. estimates of the first and second moments of the gradient:
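In the standard Adam notation (restated here because the images are missing):

    m_t = \beta_1 m_{t-1} + (1 - \beta_1) g_t
    v_t = \beta_2 v_{t-1} + (1 - \beta_2) g_t^2
    \hat{m}_t = \frac{m_t}{1 - \beta_1^t}, \qquad \hat{v}_t = \frac{v_t}{1 - \beta_2^t}

The hat terms correct the bias toward zero that m_t and v_t inherit from being initialized at zero.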

The parameters are then updated using the bias-corrected moment estimates:
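    \theta_{t+1} = \theta_t - \frac{\eta}{\sqrt{\hat{v}_t} + \epsilon} \, \hat{m}_t

Common defaults are \beta_1 = 0.9, \beta_2 = 0.999, and \epsilon = 10^{-8}. As a concrete illustration, here is a minimal NumPy sketch of a single Adam step; the function name adam_update and the toy problem are hypothetical, not from the original post:

```python
import numpy as np

def adam_update(theta, grad, m, v, t, lr=0.001,
                beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam step (hypothetical helper); returns updated (theta, m, v)."""
    m = beta1 * m + (1 - beta1) * grad       # first moment estimate
    v = beta2 * v + (1 - beta2) * grad**2    # second moment estimate
    m_hat = m / (1 - beta1**t)               # bias correction (t >= 1)
    v_hat = v / (1 - beta2**t)
    theta = theta - lr * m_hat / (np.sqrt(v_hat) + eps)
    return theta, m, v

# Toy usage: minimize f(x) = x^2 starting from x = 5.
theta = np.array([5.0])
m, v = np.zeros_like(theta), np.zeros_like(theta)
for t in range(1, 501):
    grad = 2 * theta                         # gradient of x^2
    theta, m, v = adam_update(theta, grad, m, v, t, lr=0.1)
print(theta)                                 # close to 0
```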

    Comparison between various optimizers

     

Figure: Comparison 1.

Figure: Comparison 2.

    Conclusions
