  • TensorFlow Optimizers

    class tf.train.GradientDescentOptimizer

    tf.train.GradientDescentOptimizer.__init__(learning_rate, use_locking=False, name='GradientDescent')
    Args:

    • learning_rate: A Tensor or a floating point value. The learning rate to use.
    • use_locking: If True use locks for update operations.
    • name: Optional name prefix for the operations created when applying gradients. Defaults to "GradientDescent".
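The rule this optimizer applies is plain gradient descent: each variable moves against its gradient, scaled by the learning rate. A minimal pure-Python sketch of that per-variable update (function and variable names here are illustrative, not part of the TensorFlow API):

```python
def gradient_descent_step(theta, grad, learning_rate):
    # theta <- theta - learning_rate * gradient
    return theta - learning_rate * grad

# Toy loss f(theta) = theta**2, so grad = 2 * theta.
theta = 10.0
for _ in range(3):
    theta = gradient_descent_step(theta, 2 * theta, learning_rate=0.1)
# Each step multiplies theta by (1 - 0.1 * 2) = 0.8.
```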

    class tf.train.AdagradOptimizer

    tf.train.AdagradOptimizer.__init__(learning_rate, initial_accumulator_value=0.1, use_locking=False, name='Adagrad')
    Args:

    • learning_rate: A Tensor or a floating point value. The learning rate.
    • initial_accumulator_value: A floating point value. Starting value for the accumulators, must be positive.
    • use_locking: If True use locks for update operations.
    • name: Optional name prefix for the operations created when applying gradients. Defaults to "Adagrad".
      Raises:
    • ValueError: If the initial_accumulator_value is invalid.
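Adagrad divides each variable's step by the square root of a running sum of its squared gradients, so frequently-updated coordinates take smaller steps over time; initial_accumulator_value seeds that sum, which is why it must be positive. A hedged pure-Python sketch of one per-variable step (names are illustrative, not the TensorFlow API):

```python
import math

def adagrad_step(theta, grad, accum, learning_rate):
    # Accumulate squared gradients; a larger history means smaller steps.
    accum += grad * grad
    theta -= learning_rate * grad / math.sqrt(accum)
    return theta, accum

theta, accum = 1.0, 0.1  # accum starts at initial_accumulator_value
theta, accum = adagrad_step(theta, 2.0 * theta, accum, learning_rate=0.5)
```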

    class tf.train.MomentumOptimizer

    tf.train.MomentumOptimizer.__init__(learning_rate, momentum, use_locking=False, name='Momentum')
    Args:

    • learning_rate: A Tensor or a floating point value. The learning rate.
    • momentum: A Tensor or a floating point value. The momentum.
    • use_locking: If True use locks for update operations.
    • name: Optional name prefix for the operations created when applying gradients. Defaults to "Momentum".
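Momentum keeps a velocity accumulator per variable: gradients pile into the velocity, and the variable follows it, which smooths noisy gradients and speeds up consistent directions. A pure-Python sketch of the commonly documented form of this rule (illustrative names, not the TensorFlow API):

```python
def momentum_step(theta, grad, velocity, learning_rate, momentum):
    # Gradients accumulate into velocity; the variable follows the velocity.
    velocity = momentum * velocity + grad
    theta -= learning_rate * velocity
    return theta, velocity

theta, velocity = 1.0, 0.0
theta, velocity = momentum_step(theta, 2.0, velocity,
                                learning_rate=0.1, momentum=0.9)
```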

    class tf.train.AdamOptimizer

    tf.train.AdamOptimizer.__init__(learning_rate=0.001, beta1=0.9, beta2=0.999, epsilon=1e-08, use_locking=False, name='Adam')
    Args:

    • learning_rate: A Tensor or a floating point value. The learning rate.
    • beta1: A float value or a constant float tensor. The exponential decay rate for the 1st moment estimates.
    • beta2: A float value or a constant float tensor. The exponential decay rate for the 2nd moment estimates.
    • epsilon: A small constant for numerical stability.
    • use_locking: If True use locks for update operations.
    • name: Optional name for the operations created when applying gradients. Defaults to "Adam".
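beta1 and beta2 control exponentially decaying averages of the gradient and the squared gradient, with a bias correction for their zero initialization. A pure-Python sketch of one Adam step as described in the Adam paper (Kingma & Ba); variable names are illustrative, not the TensorFlow API:

```python
import math

def adam_step(theta, grad, m, v, t,
              learning_rate=0.001, beta1=0.9, beta2=0.999, epsilon=1e-8):
    m = beta1 * m + (1 - beta1) * grad          # 1st moment estimate
    v = beta2 * v + (1 - beta2) * grad * grad   # 2nd moment estimate
    m_hat = m / (1 - beta1 ** t)                # bias correction, t >= 1
    v_hat = v / (1 - beta2 ** t)
    theta -= learning_rate * m_hat / (math.sqrt(v_hat) + epsilon)
    return theta, m, v

theta, m, v = 1.0, 0.0, 0.0
theta, m, v = adam_step(theta, 2.0, m, v, t=1)
# On the first step the bias-corrected ratio is ~1, so the move is ~learning_rate.
```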

    class tf.train.FtrlOptimizer

    tf.train.FtrlOptimizer.__init__(learning_rate, learning_rate_power=-0.5, initial_accumulator_value=0.1, l1_regularization_strength=0.0, l2_regularization_strength=0.0, use_locking=False, name='Ftrl')
    Args:

    • learning_rate: A float value or a constant float Tensor.
    • learning_rate_power: A float value, must be less than or equal to zero.
    • initial_accumulator_value: The starting value for accumulators. Only positive values are allowed.
    • l1_regularization_strength: A float value, must be greater than or equal to zero.
    • l2_regularization_strength: A float value, must be greater than or equal to zero.
    • use_locking: If True use locks for update operations.
    • name: Optional name prefix for the operations created when applying gradients. Defaults to "Ftrl".
      Raises:
    • ValueError: if one of the arguments is invalid.
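This optimizer implements FTRL-Proximal (McMahan et al.), which combines per-coordinate adaptive learning rates with L1/L2 regularization; the L1 term drives small weights exactly to zero, which is why it is popular for sparse linear models. A hedged per-coordinate sketch for the default learning_rate_power=-0.5 (the sqrt schedule); the beta term follows the paper's formulation, and TensorFlow's actual implementation may differ in detail:

```python
import math

def ftrl_update(w, g, z, n, alpha, l1, l2, beta=1.0):
    # z accumulates gradients minus a proximal correction; n sums g**2.
    sigma = (math.sqrt(n + g * g) - math.sqrt(n)) / alpha
    z += g - sigma * w
    n += g * g
    if abs(z) <= l1:
        w = 0.0  # L1 shrinks small weights to exactly zero
    else:
        w = -(z - math.copysign(l1, z)) / ((beta + math.sqrt(n)) / alpha + l2)
    return w, z, n

w, z, n = 0.0, 0.0, 0.0
w, z, n = ftrl_update(w, g=1.0, z=z, n=n, alpha=0.5, l1=0.1, l2=0.0)
```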

    class tf.train.RMSPropOptimizer

    tf.train.RMSPropOptimizer.__init__(learning_rate, decay, momentum=0.0, epsilon=1e-10, use_locking=False, name='RMSProp')
    Args:

    • learning_rate: A Tensor or a floating point value. The learning rate.
    • decay: Discounting factor for the history/coming gradient.
    • momentum: A scalar tensor.
    • epsilon: A small value to avoid a zero denominator.
    • use_locking: If True use locks for update operations.
    • name: Optional name prefix for the operations created when applying gradients. Defaults to "RMSProp".
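RMSProp keeps a decaying moving average of squared gradients and divides each step by its square root, so the effective step size adapts to the recent gradient magnitude; the momentum term then accumulates these scaled steps. A pure-Python sketch of the usual formulation (epsilon placement varies between implementations; names are illustrative, not the TensorFlow API):

```python
import math

def rmsprop_step(theta, grad, mean_sq, mom,
                 learning_rate, decay, momentum, epsilon=1e-10):
    # Moving average of squared gradients sets the per-step scale.
    mean_sq = decay * mean_sq + (1 - decay) * grad * grad
    mom = momentum * mom + learning_rate * grad / math.sqrt(mean_sq + epsilon)
    theta -= mom
    return theta, mean_sq, mom

theta, mean_sq, mom = 1.0, 0.0, 0.0
theta, mean_sq, mom = rmsprop_step(theta, 2.0, mean_sq, mom,
                                   learning_rate=0.1, decay=0.9, momentum=0.0)
```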
  • Original source: https://www.cnblogs.com/keyky/p/7899051.html