  • tensorflow API _ 4 (optimizer configuration)

    """Configures the optimizer used for training.

    Args:
    learning_rate: A scalar or `Tensor` learning rate.

    Returns:
    An instance of an optimizer.

    Raises:
    ValueError: if FLAGS.optimizer is not recognized.
    """
    if FLAGS.optimizer == 'adadelta':
    optimizer = tf.train.AdadeltaOptimizer(
    learning_rate,
    rho=FLAGS.adadelta_rho,
    epsilon=FLAGS.opt_epsilon)
    elif FLAGS.optimizer == 'adagrad':
    optimizer = tf.train.AdagradOptimizer(
    learning_rate,
    initial_accumulator_value=FLAGS.adagrad_initial_accumulator_value)
    elif FLAGS.optimizer == 'adam':
    optimizer = tf.train.AdamOptimizer(
    learning_rate,
    beta1=FLAGS.adam_beta1,
    beta2=FLAGS.adam_beta2,
    epsilon=FLAGS.opt_epsilon)
    elif FLAGS.optimizer == 'ftrl':
    optimizer = tf.train.FtrlOptimizer(
    learning_rate,
    learning_rate_power=FLAGS.ftrl_learning_rate_power,
    initial_accumulator_value=FLAGS.ftrl_initial_accumulator_value,
    l1_regularization_strength=FLAGS.ftrl_l1,
    l2_regularization_strength=FLAGS.ftrl_l2)
    elif FLAGS.optimizer == 'momentum':
    optimizer = tf.train.MomentumOptimizer(
    learning_rate,
    momentum=FLAGS.momentum,
    name='Momentum')
    elif FLAGS.optimizer == 'rmsprop':
    optimizer = tf.train.RMSPropOptimizer(
    learning_rate,
    decay=FLAGS.rmsprop_decay,
    momentum=FLAGS.rmsprop_momentum,
    epsilon=FLAGS.opt_epsilon)
    elif FLAGS.optimizer == 'sgd':
    optimizer = tf.train.GradientDescentOptimizer(learning_rate)
    else:
    raise ValueError('Optimizer [%s] was not recognized', FLAGS.optimizer)
    return optimizer
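
For context: everything except the learning rate comes from command-line flags, so the function only works once those flags have been declared. Below is a minimal sketch (TensorFlow 1.x) of how the flags might be defined and how the returned optimizer is typically wired into a training step. The flag names mirror the ones the function reads, but the defaults, the `main` function, and the one-variable loss are assumptions for illustration, not part of the original post.

import tensorflow as tf

# Declare the flags that _configure_optimizer reads. Only the RMSProp-related
# ones are shown; adadelta_rho, adam_beta1, ftrl_l1, etc. follow the same
# pattern. Defaults here are illustrative assumptions.
tf.app.flags.DEFINE_string(
    'optimizer', 'rmsprop',
    'One of "adadelta", "adagrad", "adam", "ftrl", "momentum", "sgd" '
    'or "rmsprop".')
tf.app.flags.DEFINE_float('opt_epsilon', 1.0, 'Epsilon term for the optimizer.')
tf.app.flags.DEFINE_float('rmsprop_decay', 0.9, 'Decay term of RMSProp.')
tf.app.flags.DEFINE_float('rmsprop_momentum', 0.9, 'Momentum of RMSProp.')

FLAGS = tf.app.flags.FLAGS

def main(_):
  learning_rate = 0.01  # a plain scalar works; a decayed `Tensor` would too
  optimizer = _configure_optimizer(learning_rate)

  # Hypothetical one-variable loss, just to show the optimizer in use.
  w = tf.Variable(5.0)
  loss = tf.square(w - 3.0)
  train_op = optimizer.minimize(loss)

  with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    for _ in range(100):
      sess.run(train_op)
    print(sess.run(w))  # w moves toward 3.0

if __name__ == '__main__':
  tf.app.run()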
  • Original post: https://www.cnblogs.com/Libo-Master/p/8926154.html