
    Building Neural Networks with TFLearn

    Building the network

    TFLearn lets you build the network by defining the layers.

    Input layer

    For the input layer, you just need to tell it how many units you have. For example,

    net = tflearn.input_data([None, 100])
    

    would create a network with 100 input units. The first element in the list, None in this case, sets the batch size. Setting it to None here leaves the batch size unspecified, so the network can accept batches of any size at training time.

    The number of inputs to your network needs to match the size of your data. For this example, we're using 10000-element vectors to encode our input data, so we need 10000 input units.
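
    So for this example, the input layer would be declared like this (a minimal sketch; the import is included only so the snippet stands on its own):

    import tflearn

    # Batch size left as None; each input vector has 10000 elements
    net = tflearn.input_data([None, 10000])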

    Adding layers

    To add new hidden layers, you use

    net = tflearn.fully_connected(net, n_units, activation='ReLU')
    

    This adds a fully connected layer where every unit in the previous layer is connected to every unit in this layer. The first argument net is the network you created in the tflearn.input_data call; it tells the network to use the output of the previous layer as the input to this layer. You can set the number of units in the layer with n_units, and set the activation function with the activation keyword. You can keep adding layers to your network by repeatedly calling net = tflearn.fully_connected(net, n_units), as in the sketch below.
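
    For instance, a sketch of stacking two hidden layers onto the net created above (the unit counts here are arbitrary, chosen only for illustration):

    # First hidden layer: 200 units with ReLU activation
    net = tflearn.fully_connected(net, 200, activation='ReLU')
    # Second hidden layer: 25 units, also ReLU
    net = tflearn.fully_connected(net, 25, activation='ReLU')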

    Output layer

    The last layer you add is used as the output layer. Therefore, you need to set the number of units to match the target data. In this case we are predicting two classes, positive or negative sentiment. You also need to set the activation function so it's appropriate for your model. Again, we're trying to predict if some input data belongs to one of two classes, so we should use softmax, which turns the layer's outputs into a probability distribution over the classes.

    net = tflearn.fully_connected(net, 2, activation='softmax')
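
    Note that with two softmax output units, the target data needs to be one-hot encoded to match. A small sketch using TFLearn's to_categorical helper (the labels array here is made up purely for illustration):

    import numpy as np
    from tflearn.data_utils import to_categorical

    # Hypothetical 0/1 sentiment labels, just for illustration
    labels = np.array([0, 1, 1, 0])
    trainY = to_categorical(labels, 2)   # e.g. 1 -> [0., 1.]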
    

    Training

    To set how you train the network, use

    net = tflearn.regression(net, optimizer='sgd', learning_rate=0.1, loss='categorical_crossentropy')
    

    Again, this is passing in the network you've been building. The keywords:

    • optimizer sets the training method, here stochastic gradient descent
    • learning_rate is the learning rate
    • loss determines how the network error is calculated. In this example, it's categorical cross-entropy.

    Finally, you put all this together to create the model with tflearn.DNN(net). So it ends up looking something like this:

    net = tflearn.input_data([None, 10])                          # Input
    net = tflearn.fully_connected(net, 5, activation='ReLU')      # Hidden
    net = tflearn.fully_connected(net, 2, activation='softmax')   # Output
    net = tflearn.regression(net, optimizer='sgd', learning_rate=0.1, loss='categorical_crossentropy')
    model = tflearn.DNN(net)
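
    Once the model exists, training and prediction follow TFLearn's DNN API via model.fit and model.predict. A rough sketch continuing the network above (the data here is random dummy data, used only so the snippet runs; real inputs would be your encoded vectors and labels):

    import numpy as np
    from tflearn.data_utils import to_categorical

    # Dummy data matching the 10 input units and 2 output units above
    trainX = np.random.rand(100, 10)
    trainY = to_categorical(np.random.randint(0, 2, 100), 2)

    # Train for 10 epochs, holding out 10% of the data for validation
    model.fit(trainX, trainY, validation_set=0.1, show_metric=True,
              batch_size=16, n_epoch=10)

    # Predict: returns the softmax probabilities for each class
    predictions = model.predict(trainX)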
    
    Original article: https://www.cnblogs.com/songdanzju/p/7441700.html