  • Various Optimization Algorithms For Training Neural Network

    from https://towardsdatascience.com/optimizers-for-training-neural-network-59450d71caf6

     

    Optimizers help the training process converge to good results faster.

    Gradient Descent
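    The equation image is missing from this copy; the standard (full-batch) gradient descent update it illustrates is, in LaTeX:

        \theta \leftarrow \theta - \alpha \, \nabla_\theta J(\theta)

    where \alpha is the learning rate and J(\theta) is the loss computed over the whole training set.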

    Stochastic Gradient Descent
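    Stochastic gradient descent uses the same rule but estimates the gradient from a single training example (x^{(i)}, y^{(i)}) per step:

        \theta \leftarrow \theta - \alpha \, \nabla_\theta J(\theta; x^{(i)}, y^{(i)})

    The updates are noisier than full-batch gradient descent but far cheaper per step.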

    Mini-Batch Gradient Descent
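    Mini-batch gradient descent averages the gradient over a small batch of examples before each update. A minimal NumPy sketch of the loop (the function and argument names are illustrative, not taken from the original post):

        import numpy as np

        def minibatch_gd(X, y, grad_fn, theta, lr=0.01, batch_size=32, epochs=10):
            """Plain mini-batch gradient descent.
            grad_fn(theta, X_batch, y_batch) must return dJ/dtheta averaged over the batch."""
            n = X.shape[0]
            for _ in range(epochs):
                perm = np.random.permutation(n)            # reshuffle the data each epoch
                for start in range(0, n, batch_size):
                    idx = perm[start:start + batch_size]
                    g = grad_fn(theta, X[idx], y[idx])     # gradient on this mini-batch
                    theta = theta - lr * g                 # descent step
            return theta

    With batch_size equal to the dataset size this reduces to full-batch gradient descent, and with batch_size = 1 it reduces to stochastic gradient descent.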

    Momentum
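    Momentum keeps an exponentially decaying accumulation of past gradients and moves along it, which damps oscillations and speeds up progress along consistent directions:

        v_t = \gamma v_{t-1} + \alpha \, \nabla_\theta J(\theta)
        \theta \leftarrow \theta - v_t

    with the momentum coefficient \gamma typically around 0.9.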

    Nesterov Accelerated Gradient
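    Nesterov accelerated gradient (NAG) evaluates the gradient at the look-ahead point \theta - \gamma v_{t-1} rather than at the current \theta, so the step is corrected before it can overshoot:

        v_t = \gamma v_{t-1} + \alpha \, \nabla_\theta J(\theta - \gamma v_{t-1})
        \theta \leftarrow \theta - v_t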

     

    Figure (from the original post): NAG vs. momentum at local minima

    Adagrad

     

    The derivative of the loss function with respect to a given parameter at time t.

     

    Update of parameter i at time/iteration t.
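    The corresponding equation images are missing from this copy; in the standard Adagrad formulation they are:

        g_{t,i} = \nabla_\theta J(\theta_{t,i})

        \theta_{t+1,i} = \theta_{t,i} - \frac{\eta}{\sqrt{G_{t,ii} + \epsilon}} \, g_{t,i}

    where G_{t,ii} accumulates the sum of squared past gradients of parameter \theta_i, so parameters that have received large gradients get smaller effective learning rates, and \epsilon prevents division by zero.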

    AdaDelta

     

    Update the parameters
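    The update image is missing here. AdaDelta replaces Adagrad's ever-growing sum of squared gradients with an exponentially decaying average; in the commonly shown form (the full AdaDelta method additionally replaces \eta with an RMS of past parameter updates):

        E[g^2]_t = \gamma E[g^2]_{t-1} + (1 - \gamma) g_t^2

        \theta_{t+1} = \theta_t - \frac{\eta}{\sqrt{E[g^2]_t + \epsilon}} \, g_t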

    Adam

     

    First and second moment estimates of the gradient
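    Adam maintains exponentially decaying averages of the gradient (first moment) and of its element-wise square (second moment):

        m_t = \beta_1 m_{t-1} + (1 - \beta_1) g_t
        v_t = \beta_2 v_{t-1} + (1 - \beta_2) g_t^2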

    Update the parameters
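    The moment estimates are bias-corrected and then used to scale the step:

        \hat{m}_t = \frac{m_t}{1 - \beta_1^t}, \qquad \hat{v}_t = \frac{v_t}{1 - \beta_2^t}

        \theta_{t+1} = \theta_t - \frac{\eta}{\sqrt{\hat{v}_t} + \epsilon} \, \hat{m}_t

    Typical defaults are \beta_1 = 0.9, \beta_2 = 0.999 and \epsilon = 10^{-8}.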

    Comparison between various optimizers

     

    Comparison 1

     

    Comparison 2
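    The comparison plots from the original post are not reproduced in this copy. As a rough stand-in, the sketch below runs SGD, momentum and Adam on the same toy quadratic loss so their behaviour can be compared side by side; the loss surface and hyperparameters are illustrative assumptions, not values from the article:

        import numpy as np

        def loss(theta):                       # elongated quadratic bowl
            return 0.5 * (10.0 * theta[0] ** 2 + theta[1] ** 2)

        def grad(theta):
            return np.array([10.0 * theta[0], theta[1]])

        def sgd(theta, g, state, t, lr=0.05):
            return theta - lr * g

        def momentum(theta, g, state, t, lr=0.05, gamma=0.9):
            v = gamma * state.get("v", 0.0) + lr * g       # decaying accumulation of gradients
            state["v"] = v
            return theta - v

        def adam(theta, g, state, t, lr=0.05, b1=0.9, b2=0.999, eps=1e-8):
            m = b1 * state.get("m", 0.0) + (1 - b1) * g            # first moment
            v = b2 * state.get("v", 0.0) + (1 - b2) * g ** 2       # second moment
            state["m"], state["v"] = m, v
            m_hat, v_hat = m / (1 - b1 ** t), v / (1 - b2 ** t)    # bias correction
            return theta - lr * m_hat / (np.sqrt(v_hat) + eps)

        def run(update, steps=100):
            theta, state = np.array([1.0, 1.0]), {}
            for t in range(1, steps + 1):
                theta = update(theta, grad(theta), state, t)
            return loss(theta)

        for name, update in [("SGD", sgd), ("Momentum", momentum), ("Adam", adam)]:
            print(f"{name:8s} final loss after 100 steps: {run(update):.3e}")

    Each optimizer gets the same number of steps from the same starting point; printing the final loss gives a crude numerical analogue of the convergence plots.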

    Conclusions
