  • TensorFlow optimizers

    class tf.train.GradientDescentOptimizer

    tf.train.GradientDescentOptimizer.__init__(learning_rate, use_locking=False, name='GradientDescent')
    Args:

    • learning_rate: A Tensor or a floating point value. The learning rate to use.
    • use_locking: If True use locks for update operations.
    • name: Optional name prefix for the operations created when applying gradients. Defaults to "GradientDescent".
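As an illustration (a plain-Python sketch, not TensorFlow's actual implementation), the update rule these arguments configure can be written out on a one-dimensional quadratic:

```python
# Sketch of the vanilla gradient-descent update, w <- w - learning_rate * grad,
# on the 1-D quadratic f(w) = (w - 3)^2, whose gradient is 2 * (w - 3).
learning_rate = 0.1
w = 0.0
for _ in range(100):
    grad = 2.0 * (w - 3.0)
    w -= learning_rate * grad
print(w)  # approaches the minimum at w = 3
```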

    class tf.train.AdagradOptimizer

    tf.train.AdagradOptimizer.__init__(learning_rate, initial_accumulator_value=0.1, use_locking=False, name='Adagrad')
    Args:

    • learning_rate: A Tensor or a floating point value. The learning rate.
    • initial_accumulator_value: A floating point value. Starting value for the accumulators, must be positive.
    • use_locking: If True use locks for update operations.
    • name: Optional name prefix for the operations created when applying gradients. Defaults to "Adagrad".
      Raises:
    • ValueError: If the initial_accumulator_value is invalid.
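A scalar sketch of the Adagrad rule (not TensorFlow's implementation): squared gradients are summed into an accumulator, so coordinates with large past gradients take smaller steps.

```python
import math

# Adagrad on f(w) = (w - 3)^2: the accumulator only grows, shrinking
# the effective learning rate over time.
learning_rate = 0.5
accumulator = 0.1  # plays the role of initial_accumulator_value
w = 0.0
for _ in range(200):
    grad = 2.0 * (w - 3.0)
    accumulator += grad ** 2
    w -= learning_rate * grad / math.sqrt(accumulator)
print(w)  # approaches the minimum at w = 3
```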

    class tf.train.MomentumOptimizer

    tf.train.MomentumOptimizer.__init__(learning_rate, momentum, use_locking=False, name='Momentum')
    Args:

    • learning_rate: A Tensor or a floating point value. The learning rate.
    • momentum: A Tensor or a floating point value. The momentum.
    • use_locking: If True use locks for update operations.
    • name: Optional name prefix for the operations created when applying gradients. Defaults to "Momentum".
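The momentum update, in the commonly documented form accumulation = momentum * accumulation + gradient; variable -= learning_rate * accumulation, can be sketched in plain Python (an illustration, not TensorFlow's code):

```python
# Momentum on f(w) = (w - 3)^2: the velocity term smooths and accelerates
# progress along a consistent gradient direction.
learning_rate = 0.1
momentum = 0.9
velocity = 0.0
w = 0.0
for _ in range(300):
    grad = 2.0 * (w - 3.0)
    velocity = momentum * velocity + grad
    w -= learning_rate * velocity
print(w)  # approaches the minimum at w = 3
```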

    class tf.train.AdamOptimizer

    tf.train.AdamOptimizer.__init__(learning_rate=0.001, beta1=0.9, beta2=0.999, epsilon=1e-08, use_locking=False, name='Adam')
    Args:

    • learning_rate: A Tensor or a floating point value. The learning rate.
    • beta1: A float value or a constant float tensor. The exponential decay rate for the 1st moment estimates.
    • beta2: A float value or a constant float tensor. The exponential decay rate for the 2nd moment estimates.
    • epsilon: A small constant for numerical stability.
    • use_locking: If True use locks for update operations.
    • name: Optional name for the operations created when applying gradients. Defaults to "Adam".
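A scalar sketch of the Adam rule these arguments parameterize (an illustration, not TensorFlow's implementation): exponentially decayed first- and second-moment estimates, with bias correction, scale each step.

```python
import math

# Adam on f(w) = (w - 3)^2.
learning_rate, beta1, beta2, epsilon = 0.001, 0.9, 0.999, 1e-8
m = v = 0.0
w = 0.0
for t in range(1, 5001):
    grad = 2.0 * (w - 3.0)
    m = beta1 * m + (1.0 - beta1) * grad         # 1st moment estimate
    v = beta2 * v + (1.0 - beta2) * grad ** 2    # 2nd moment estimate
    m_hat = m / (1.0 - beta1 ** t)               # bias-corrected moments
    v_hat = v / (1.0 - beta2 ** t)
    w -= learning_rate * m_hat / (math.sqrt(v_hat) + epsilon)
print(w)  # approaches the minimum at w = 3
```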

    class tf.train.FtrlOptimizer

    tf.train.FtrlOptimizer.__init__(learning_rate, learning_rate_power=-0.5, initial_accumulator_value=0.1, l1_regularization_strength=0.0, l2_regularization_strength=0.0, use_locking=False, name='Ftrl')
    Args:

    • learning_rate: A float value or a constant float Tensor.
    • learning_rate_power: A float value, must be less than or equal to zero.
    • initial_accumulator_value: The starting value for accumulators. Only positive values are allowed.
    • l1_regularization_strength: A float value, must be greater than or equal to zero.
    • l2_regularization_strength: A float value, must be greater than or equal to zero.
    • use_locking: If True use locks for update operations.
    • name: Optional name prefix for the operations created when applying gradients. Defaults to "Ftrl".
      Raises:
    • ValueError: if one of the arguments is invalid.
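The optimizer is named after the FTRL-Proximal algorithm (McMahan et al.), which keeps per-coordinate accumulators z and n. Below is a simplified scalar sketch under the assumption learning_rate_power = -0.5, not TensorFlow's exact implementation; with l1 = l2 = 0 it reduces to an Adagrad-style adaptive step, while a nonzero l1 can drive weights to exactly zero.

```python
import math

# Simplified FTRL-Proximal update on f(w) = (w - 3)^2,
# assuming learning_rate_power = -0.5.
alpha = 0.5        # learning_rate
l1 = 0.0           # l1_regularization_strength
l2 = 0.0           # l2_regularization_strength
n = 0.1            # squared-gradient accumulator (initial_accumulator_value)
z = 0.0
w = 0.0
for _ in range(200):
    grad = 2.0 * (w - 3.0)
    sigma = (math.sqrt(n + grad ** 2) - math.sqrt(n)) / alpha
    z += grad - sigma * w
    n += grad ** 2
    if abs(z) <= l1:
        w = 0.0    # l1 threshold produces exact sparsity
    else:
        w = -(z - math.copysign(l1, z)) / (math.sqrt(n) / alpha + l2)
print(w)  # with no regularization, approaches the minimum at w = 3
```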

    class tf.train.RMSPropOptimizer

    tf.train.RMSPropOptimizer.__init__(learning_rate, decay, momentum=0.0, epsilon=1e-10, use_locking=False, name='RMSProp')
    Args:

    • learning_rate: A Tensor or a floating point value. The learning rate.
    • decay: Discounting factor for the history/coming gradient.
    • momentum: A scalar tensor.
    • epsilon: Small value to avoid a zero denominator.
    • use_locking: If True use locks for update operations.
    • name: Optional name prefix for the operations created when applying gradients. Defaults to "RMSProp".
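A scalar sketch of the RMSProp rule (an illustration, not TensorFlow's implementation): a decayed moving average of squared gradients normalizes the step size; momentum is left at its 0.0 default for clarity.

```python
import math

# RMSProp on f(w) = (w - 3)^2 with momentum = 0.
learning_rate, decay, epsilon = 0.01, 0.9, 1e-10
mean_square = 0.0
w = 0.0
for _ in range(2000):
    grad = 2.0 * (w - 3.0)
    mean_square = decay * mean_square + (1.0 - decay) * grad ** 2
    w -= learning_rate * grad / math.sqrt(mean_square + epsilon)
print(w)  # approaches the minimum at w = 3
```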
  • Original post: https://www.cnblogs.com/keyky/p/7899051.html