  • tf_upgrade_v2.exe experiment

    Before the experiment

    import tensorflow as tf
    import numpy as np
    #create data
    x_data=np.random.rand(100).astype(np.float32)  # training samples
    y_data=x_data*0.1+0.3  # targets; the true parameters and the functional form are hidden from the model. How do we know the samples follow a linear function? If we assumed a quadratic instead, could the parameters still be recovered? (See the sketch at the end of this post.)
    ###create tensorflow structure start###
    Weights = tf.Variable(tf.random_uniform([1],-1.0,1.0))  # random initial parameter value
    biases = tf.Variable(tf.zeros([1]))
    y=Weights*x_data+biases  # y computed from the random parameters starts far from the true y_data
    loss = tf.reduce_mean(tf.square(y-y_data))  # loss (mean squared error)
    optimizer = tf.train.GradientDescentOptimizer(0.5)
    ###create tensorflow structure end###
    train = optimizer.minimize(loss)  # training op
    init = tf.initialize_all_variables()  # variable initialization
    sess = tf.Session()
    sess.run(init)
    for step in range(201):
        sess.run(train)
        if step % 20 == 0:
            print(step, sess.run(Weights), sess.run(biases))
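
    The converted version below was produced by TensorFlow's upgrade script, which ships with TensorFlow 2.x as tf_upgrade_v2 (tf_upgrade_v2.exe on Windows). A typical invocation, with illustrative file names, is: tf_upgrade_v2 --infile before.py --outfile after.py --reportfile report.txt. The report file lists every change the script made.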

    After the experiment: Weights and biases start out as random values, but as the iterations proceed they approach the true values (0.1 and 0.3); what drives them there is the minimization of the loss.

    import tensorflow as tf
    import numpy as np
    #create data
    x_data=np.random.rand(100).astype(np.float32)
    y_data=x_data*0.1+0.3
    ###create tensorflow structure start###
    Weights = tf.Variable(tf.random.uniform([1],-1.0,1.0))
    biases = tf.Variable(tf.zeros([1]))
    y=Weights*x_data+biases
    loss = tf.reduce_mean(input_tensor=tf.square(y-y_data))
    optimizer = tf.compat.v1.train.GradientDescentOptimizer(0.5)
    ###create tensorflow structure end###
    train = optimizer.minimize(loss)
    init = tf.compat.v1.initialize_all_variables()
    sess = tf.compat.v1.Session()
    sess.run(init)
    for step in range(201):
        sess.run(train)
        if step % 20 == 0:
            print(step, sess.run(Weights), sess.run(biases))

    Comparing the code before and after shows exactly what the upgrade script changed: tf.random_uniform becomes tf.random.uniform, tf.reduce_mean gains an explicit input_tensor= keyword argument, and GradientDescentOptimizer, initialize_all_variables, and Session are all moved under the tf.compat.v1 namespace.
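
    One runtime caveat: the converted script still builds a graph and runs it through tf.compat.v1.Session, so under TensorFlow 2.x it generally also needs tf.compat.v1.disable_eager_execution() near the top before it will run; the upgrade script rewrites API calls but does not restructure the program. For contrast, a hand-written TF2-native version, which the tool does not generate, might look like the following sketch (it assumes the standard tf.GradientTape and tf.keras.optimizers.SGD APIs, with the same data and learning rate as above):

    import tensorflow as tf
    import numpy as np
    x_data = np.random.rand(100).astype(np.float32)
    y_data = x_data*0.1 + 0.3
    Weights = tf.Variable(tf.random.uniform([1], -1.0, 1.0))
    biases = tf.Variable(tf.zeros([1]))
    optimizer = tf.keras.optimizers.SGD(learning_rate=0.5)
    for step in range(201):
        with tf.GradientTape() as tape:  # records ops for automatic differentiation
            y = Weights*x_data + biases
            loss = tf.reduce_mean(tf.square(y - y_data))
        grads = tape.gradient(loss, [Weights, biases])
        optimizer.apply_gradients(zip(grads, [Weights, biases]))
        if step % 20 == 0:
            print(step, Weights.numpy(), biases.numpy())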

    Reference: https://blog.csdn.net/u012223913/article/details/79097297
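
    Returning to the question raised in the comment of the first listing: yes, if we hypothesize a quadratic instead, the same gradient-descent recipe can recover the coefficients, provided the model has one weight per assumed term. A minimal sketch in the same pre-upgrade TF 1.x style (the hidden coefficients 0.2, 0.1, 0.3 are illustrative; the correlated features x**2, x, and 1 make convergence slower than in the linear case, hence the extra iterations):

    import tensorflow as tf
    import numpy as np
    # hidden quadratic: y = 0.2*x^2 + 0.1*x + 0.3 (coefficients are illustrative)
    x_data = np.random.rand(100).astype(np.float32)
    y_data = 0.2*x_data**2 + 0.1*x_data + 0.3
    # one trainable weight per assumed term of the hypothesis
    a = tf.Variable(tf.random_uniform([1], -1.0, 1.0))
    b = tf.Variable(tf.random_uniform([1], -1.0, 1.0))
    c = tf.Variable(tf.zeros([1]))
    y = a*x_data**2 + b*x_data + c
    loss = tf.reduce_mean(tf.square(y - y_data))
    train = tf.train.GradientDescentOptimizer(0.5).minimize(loss)
    sess = tf.Session()
    sess.run(tf.initialize_all_variables())
    for step in range(5001):
        sess.run(train)
        if step % 500 == 0:
            print(step, sess.run(a), sess.run(b), sess.run(c))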

  • Original post: https://www.cnblogs.com/2008nmj/p/11849957.html