Various Optimization Algorithms For Training Neural Network

From: https://towardsdatascience.com/optimizers-for-training-neural-network-59450d71caf6

Optimizers adjust a network's weights to minimize the loss, and a good choice of optimizer helps training converge to good results faster.

Gradient Descent
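Gradient descent computes the gradient of the loss over the entire training set and takes a step against it. In the usual notation (the symbols here follow convention, since the original equation images are not preserved):

$$\theta_{t+1} = \theta_t - \eta \, \nabla_\theta J(\theta_t)$$

where $\theta$ are the parameters, $\eta$ is the learning rate, and $J$ is the loss over the full dataset. One complete pass over the data is needed per update, which is slow on large datasets.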

Stochastic Gradient Descent
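Stochastic gradient descent instead estimates the gradient from a single randomly drawn example per step, trading noisier updates for far cheaper iterations (again in conventional notation):

$$\theta_{t+1} = \theta_t - \eta \, \nabla_\theta J(\theta_t; x^{(i)}, y^{(i)})$$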

Mini-Batch Gradient Descent
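Mini-batch gradient descent is the usual compromise: average the gradient over a small batch of examples per step. A minimal NumPy sketch of a mini-batch loop for linear regression (the toy data, learning rate, and batch size are illustrative choices, not from the article):

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 3))              # toy features
w_true = np.array([2.0, -1.0, 0.5])
y = X @ w_true + 0.1 * rng.normal(size=1000)

w = np.zeros(3)                             # parameters to learn
lr, batch_size = 0.1, 32

for epoch in range(20):
    idx = rng.permutation(len(X))           # reshuffle every epoch
    for start in range(0, len(X), batch_size):
        b = idx[start:start + batch_size]
        # gradient of mean squared error on this batch
        grad = 2 * X[b].T @ (X[b] @ w - y[b]) / len(b)
        w -= lr * grad                      # one descent step per batch

print(w)                                    # should be close to w_true
```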

Momentum
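Momentum accumulates an exponentially decaying average of past gradients and keeps moving in that direction, which damps oscillation across steep directions. In the common formulation, with $\gamma$ typically around 0.9:

$$v_t = \gamma v_{t-1} + \eta \, \nabla_\theta J(\theta_t), \qquad \theta_{t+1} = \theta_t - v_t$$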

Nesterov Accelerated Gradient
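Nesterov accelerated gradient (NAG) is the same idea, but the gradient is evaluated at the look-ahead point $\theta_t - \gamma v_{t-1}$, so the step can correct itself before overshooting:

$$v_t = \gamma v_{t-1} + \eta \, \nabla_\theta J(\theta_t - \gamma v_{t-1}), \qquad \theta_{t+1} = \theta_t - v_t$$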

     

[Figure: NAG vs. momentum at local minima]

Adagrad

     

Adagrad adapts the learning rate per parameter. First, the derivative of the loss function for a given parameter at time step t:
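Written out in standard Adagrad notation (reconstructed from the usual formulation, since the original equation images are not preserved):

$$g_{t,i} = \nabla_\theta J(\theta_{t,i})$$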

     

Update the parameter for a given index i at time/iteration t, dividing the learning rate by the root of the accumulated squared gradients:
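$$\theta_{t+1,i} = \theta_{t,i} - \frac{\eta}{\sqrt{G_{t,ii} + \epsilon}} \, g_{t,i}$$

where $G_t$ is a diagonal matrix whose entry $G_{t,ii}$ is the sum of the squared gradients of $\theta_i$ up to step $t$, and $\epsilon$ is a small smoothing term that avoids division by zero. Because $G_{t,ii}$ only grows, the effective learning rate keeps shrinking over training, which is Adagrad's main weakness.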

AdaDelta

     

Update the parameters, replacing the raw learning rate with a ratio of running averages:
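AdaDelta addresses Adagrad's ever-decaying learning rate by keeping exponentially decaying averages instead of a full sum. In the standard AdaDelta formulation (again reconstructed, since the figures are missing):

$$\Delta\theta_t = -\frac{RMS[\Delta\theta]_{t-1}}{RMS[g]_t}\, g_t, \qquad \theta_{t+1} = \theta_t + \Delta\theta_t$$

where $RMS[g]_t = \sqrt{E[g^2]_t + \epsilon}$ and $E[g^2]_t = \rho\, E[g^2]_{t-1} + (1-\rho)\, g_t^2$. Note that no global learning rate $\eta$ appears in the update.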

Adam

     

First and second moment estimates of the gradient (exponentially decaying averages of the gradient and the squared gradient):
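In the standard Adam notation (reconstructed from the usual formulation):

$$m_t = \beta_1 m_{t-1} + (1-\beta_1)\, g_t, \qquad v_t = \beta_2 v_{t-1} + (1-\beta_2)\, g_t^2$$

with bias-corrected estimates $\hat{m}_t = m_t / (1-\beta_1^t)$ and $\hat{v}_t = v_t / (1-\beta_2^t)$, which compensate for the zero initialization of $m$ and $v$.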

Update the parameters using the bias-corrected moment estimates:
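$$\theta_{t+1} = \theta_t - \frac{\eta}{\sqrt{\hat{v}_t} + \epsilon}\, \hat{m}_t$$

As a single self-contained step, a minimal Python sketch (the function name is my own; the defaults follow the values commonly cited for Adam):

```python
import numpy as np

def adam_step(theta, grad, m, v, t, lr=1e-3, b1=0.9, b2=0.999, eps=1e-8):
    """One Adam update; t is the 1-based step count. Returns (theta, m, v)."""
    m = b1 * m + (1 - b1) * grad        # first moment: running mean of gradients
    v = b2 * v + (1 - b2) * grad**2     # second moment: running mean of squared gradients
    m_hat = m / (1 - b1**t)             # bias correction for zero initialization
    v_hat = v / (1 - b2**t)
    theta = theta - lr * m_hat / (np.sqrt(v_hat) + eps)
    return theta, m, v
```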

Comparison between various optimizers

     

[Figure: Comparison 1]

[Figure: Comparison 2]
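The comparison figures are not reproduced here, but the qualitative differences can be seen with a small experiment. A minimal sketch (the objective, learning rates, and step count are my own illustrative choices) running plain gradient descent, momentum, and Adam on an ill-conditioned quadratic:

```python
import numpy as np

# Toy objective: f(x) = 0.5 * x^T A x with an ill-conditioned A,
# so the optimizers' different behaviors are visible.
A = np.diag([1.0, 50.0])
grad_f = lambda x: A @ x
x0 = np.array([2.0, 2.0])

def run(update, steps=200):
    x, state = x0.copy(), {}
    for t in range(1, steps + 1):
        x = update(x, grad_f(x), state, t)
    return 0.5 * x @ A @ x                  # final loss value

def gd(x, g, s, t, lr=0.01):
    return x - lr * g

def momentum(x, g, s, t, lr=0.01, gamma=0.9):
    s['v'] = gamma * s.get('v', 0) + lr * g
    return x - s['v']

def adam(x, g, s, t, lr=0.1, b1=0.9, b2=0.999, eps=1e-8):
    s['m'] = b1 * s.get('m', 0) + (1 - b1) * g
    s['v'] = b2 * s.get('v', 0) + (1 - b2) * g**2
    m_hat = s['m'] / (1 - b1**t)
    v_hat = s['v'] / (1 - b2**t)
    return x - lr * m_hat / (np.sqrt(v_hat) + eps)

for name, upd in [('GD', gd), ('Momentum', momentum), ('Adam', adam)]:
    print(f'{name:8s} final loss: {run(upd):.2e}')
```

On a surface like this, momentum damps the oscillation along the steep axis, while Adam's per-coordinate scaling lets it make progress along the flat axis quickly.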

Conclusions
