  • Setting up and learning Caffe -- win7-vs2013-gtx650tiboost-cuda8.0 -- CIFAR-10 training and testing, part 2: the full solver, cifar10_full_solver.prototxt

    First, a recap of the previous section.

    In short: we trained and tested the quick solver.

    Convert the data format:

    convert_cifar_data.exe data/cifar10 examples/cifar10 lmdb

    Compute the image mean:

    compute_image_mean.exe -backend=lmdb examples/cifar10/cifar10_train_lmdb examples/cifar10/mean.binaryproto

    Train the network (quick solver):

     caffe train --solver=examples/cifar10/cifar10_quick_solver.prototxt

    Continue training the network for another 1,000 iterations:

    caffe train --solver=examples/cifar10/cifar10_quick_solver_lr1.prototxt --snapshot=examples/cifar10/cifar10_quick_iter_4000.solverstate

    Test the model's accuracy:

    caffe test -model examples/cifar10/cifar10_quick_train_test.prototxt -weights examples/cifar10/cifar10_quick_iter_5000.caffemodel.h5 -iterations 100

    Classify a cat:

     classification.exe examples/cifar10/cifar10_quick.prototxt examples/cifar10/cifar10_quick_iter_5000.caffemodel.h5 examples/cifar10/mean.binaryproto data/cifar10/synset_words.txt examples/images/cat.jpg

    Classify the fish-bike image:

    classification.exe examples/cifar10/cifar10_quick.prototxt examples/cifar10/cifar10_quick_iter_5000.caffemodel.h5 examples/cifar10/mean.binaryproto data/cifar10/synset_words.txt examples/images/fish-bike.jpg

    =========================================== fancy divider =============================================

    Now let's try the full solver, cifar10_full_solver.prototxt.

    1. Training

     caffe train --solver=examples/cifar10/cifar10_full_solver.prototxt
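
    For reference, here is roughly what the solver file looks like. The values below are taken from the stock cifar10_full_solver.prototxt shipped with Caffe; check them against your local copy:

    net: "examples/cifar10/cifar10_full_train_test.prototxt"
    test_iter: 100         # 100 forward passes x batch_size 100 = the full 10,000 test images
    test_interval: 1000    # test every 1,000 training iterations
    base_lr: 0.001
    momentum: 0.9
    weight_decay: 0.004
    lr_policy: "fixed"
    display: 200
    max_iter: 60000        # the 60,000 total iterations mentioned below
    snapshot: 10000
    snapshot_format: HDF5  # hence the .caffemodel.h5 snapshot files
    snapshot_prefix: "examples/cifar10/cifar10_full"
    solver_mode: GPU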

    I expected it to finish in half an hour, but after 40 minutes it had only completed 10,000 of the 60,000 total iterations, so it ran for quite a while.

    Fortunately it did finish in the end. Next time I should estimate the amount of computation beforehand; otherwise a run like this can easily be wasted effort.
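
    One way to estimate the cost up front is Caffe's built-in benchmark, which times the forward and backward passes over a few iterations; multiplying the average per-iteration time by max_iter (60,000 here) gives a rough total. A sketch:

    caffe time -model examples/cifar10/cifar10_full_train_test.prototxt -iterations 50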

    2. Testing

    Test: the result of 10 iterations (10 batches of 100 images, i.e. only 1,000 of the 10,000 test images)

    caffe test -model examples/cifar10/cifar10_full_train_test.prototxt -weights examples/cifar10/cifar10_full_iter_60000.caffemodel.h5 -iterations 10

    seag@seag-G41MT-S2PT:~/wsCaffe/caffe$ caffe test -model examples/cifar10/cifar10_full_train_test.prototxt -weights examples/cifar10/cifar10_full_iter_60000.caffemodel.h5  -iterations 10 

    ------------------------------------------------------------------------------
    I0704 12:26:43.571471  5465 caffe.cpp:284] Use CPU.
    I0704 12:26:45.302641  5465 net.cpp:294] The NetState phase (1) differed from the phase (0) specified by a rule in layer cifar
    I0704 12:26:45.302878  5465 net.cpp:51] Initializing net from parameters:
    name: "CIFAR10_full"
    state {
      phase: TEST
      level: 0
      stage: ""
    }
    layer {
      name: "cifar"
      type: "Data"
      top: "data"
      top: "label"
      include {
        phase: TEST
      }
      transform_param {
        mean_file: "examples/cifar10/mean.binaryproto"
      }
      data_param {
        source: "examples/cifar10/cifar10_test_lmdb"
        batch_size: 100
        backend: LMDB
      }
    }
    layer {
      name: "conv1"
      type: "Convolution"
      bottom: "data"
      top: "conv1"
      param {
        lr_mult: 1
      }
      param {
        lr_mult: 2
      }
      convolution_param {
        num_output: 32
        pad: 2
        kernel_size: 5
        stride: 1
        weight_filler {
          type: "gaussian"
          std: 0.0001
        }
        bias_filler {
          type: "constant"
        }
      }
    }
    layer {
      name: "pool1"
      type: "Pooling"
      bottom: "conv1"
      top: "pool1"
      pooling_param {
        pool: MAX
        kernel_size: 3
        stride: 2
      }
    }
    layer {
      name: "relu1"
      type: "ReLU"
      bottom: "pool1"
      top: "pool1"
    }
    layer {
      name: "norm1"
      type: "LRN"
      bottom: "pool1"
      top: "norm1"
      lrn_param {
        local_size: 3
        alpha: 5e-05
        beta: 0.75
        norm_region: WITHIN_CHANNEL
      }
    }
    layer {
      name: "conv2"
      type: "Convolution"
      bottom: "norm1"
      top: "conv2"
      param {
        lr_mult: 1
      }
      param {
        lr_mult: 2
      }
      convolution_param {
        num_output: 32
        pad: 2
        kernel_size: 5
        stride: 1
        weight_filler {
          type: "gaussian"
          std: 0.01
        }
        bias_filler {
          type: "constant"
        }
      }
    }
    layer {
      name: "relu2"
      type: "ReLU"
      bottom: "conv2"
      top: "conv2"
    }
    layer {
      name: "pool2"
      type: "Pooling"
      bottom: "conv2"
      top: "pool2"
      pooling_param {
        pool: AVE
        kernel_size: 3
        stride: 2
      }
    }
    layer {
      name: "norm2"
      type: "LRN"
      bottom: "pool2"
      top: "norm2"
      lrn_param {
        local_size: 3
        alpha: 5e-05
        beta: 0.75
        norm_region: WITHIN_CHANNEL
      }
    }
    layer {
      name: "conv3"
      type: "Convolution"
      bottom: "norm2"
      top: "conv3"
      convolution_param {
        num_output: 64
        pad: 2
        kernel_size: 5
        stride: 1
        weight_filler {
          type: "gaussian"
          std: 0.01
        }
        bias_filler {
          type: "constant"
        }
      }
    }
    layer {
      name: "relu3"
      type: "ReLU"
      bottom: "conv3"
      top: "conv3"
    }
    layer {
      name: "pool3"
      type: "Pooling"
      bottom: "conv3"
      top: "pool3"
      pooling_param {
        pool: AVE
        kernel_size: 3
        stride: 2
      }
    }
    layer {
      name: "ip1"
      type: "InnerProduct"
      bottom: "pool3"
      top: "ip1"
      param {
        lr_mult: 1
        decay_mult: 250
      }
      param {
        lr_mult: 2
        decay_mult: 0
      }
      inner_product_param {
        num_output: 10
        weight_filler {
          type: "gaussian"
          std: 0.01
        }
        bias_filler {
          type: "constant"
        }
      }
    }
    layer {
      name: "accuracy"
      type: "Accuracy"
      bottom: "ip1"
      bottom: "label"
      top: "accuracy"
      include {
        phase: TEST
      }
    }
    layer {
      name: "loss"
      type: "SoftmaxWithLoss"
      bottom: "ip1"
      bottom: "label"
      top: "loss"
    }
    I0704 12:26:45.329610  5465 layer_factory.hpp:77] Creating layer cifar
    I0704 12:26:45.329771  5465 db_lmdb.cpp:35] Opened lmdb examples/cifar10/cifar10_test_lmdb
    I0704 12:26:45.329805  5465 net.cpp:84] Creating Layer cifar
    I0704 12:26:45.329823  5465 net.cpp:380] cifar -> data
    I0704 12:26:45.329852  5465 net.cpp:380] cifar -> label
    I0704 12:26:45.329874  5465 data_transformer.cpp:25] Loading mean file from: examples/cifar10/mean.binaryproto
    I0704 12:26:45.329970  5465 data_layer.cpp:45] output data size: 100,3,32,32
    I0704 12:26:45.342496  5465 net.cpp:122] Setting up cifar
    I0704 12:26:45.342577  5465 net.cpp:129] Top shape: 100 3 32 32 (307200)
    I0704 12:26:45.342600  5465 net.cpp:129] Top shape: 100 (100)
    I0704 12:26:45.342612  5465 net.cpp:137] Memory required for data: 1229200
    I0704 12:26:45.342634  5465 layer_factory.hpp:77] Creating layer label_cifar_1_split
    I0704 12:26:45.352715  5465 net.cpp:84] Creating Layer label_cifar_1_split
    I0704 12:26:45.352736  5465 net.cpp:406] label_cifar_1_split <- label
    I0704 12:26:45.352809  5465 net.cpp:380] label_cifar_1_split -> label_cifar_1_split_0
    I0704 12:26:45.352833  5465 net.cpp:380] label_cifar_1_split -> label_cifar_1_split_1
    I0704 12:26:45.352854  5465 net.cpp:122] Setting up label_cifar_1_split
    I0704 12:26:45.352866  5465 net.cpp:129] Top shape: 100 (100)
    I0704 12:26:45.352890  5465 net.cpp:129] Top shape: 100 (100)
    I0704 12:26:45.352898  5465 net.cpp:137] Memory required for data: 1230000
    I0704 12:26:45.352908  5465 layer_factory.hpp:77] Creating layer conv1
    I0704 12:26:45.352936  5465 net.cpp:84] Creating Layer conv1
    I0704 12:26:45.352947  5465 net.cpp:406] conv1 <- data
    I0704 12:26:45.352963  5465 net.cpp:380] conv1 -> conv1
    I0704 12:26:47.343575  5465 net.cpp:122] Setting up conv1
    I0704 12:26:47.343633  5465 net.cpp:129] Top shape: 100 32 32 32 (3276800)
    I0704 12:26:47.343646  5465 net.cpp:137] Memory required for data: 14337200
    I0704 12:26:47.343694  5465 layer_factory.hpp:77] Creating layer pool1
    I0704 12:26:47.343719  5465 net.cpp:84] Creating Layer pool1
    I0704 12:26:47.343730  5465 net.cpp:406] pool1 <- conv1
    I0704 12:26:47.343745  5465 net.cpp:380] pool1 -> pool1
    I0704 12:26:47.343776  5465 net.cpp:122] Setting up pool1
    I0704 12:26:47.343789  5465 net.cpp:129] Top shape: 100 32 16 16 (819200)
    I0704 12:26:47.343799  5465 net.cpp:137] Memory required for data: 17614000
    I0704 12:26:47.343809  5465 layer_factory.hpp:77] Creating layer relu1
    I0704 12:26:47.343827  5465 net.cpp:84] Creating Layer relu1
    I0704 12:26:47.343838  5465 net.cpp:406] relu1 <- pool1
    I0704 12:26:47.343852  5465 net.cpp:367] relu1 -> pool1 (in-place)
    I0704 12:26:47.344049  5465 net.cpp:122] Setting up relu1
    I0704 12:26:47.344064  5465 net.cpp:129] Top shape: 100 32 16 16 (819200)
    I0704 12:26:47.344074  5465 net.cpp:137] Memory required for data: 20890800
    I0704 12:26:47.344084  5465 layer_factory.hpp:77] Creating layer norm1
    I0704 12:26:47.344103  5465 net.cpp:84] Creating Layer norm1
    I0704 12:26:47.344115  5465 net.cpp:406] norm1 <- pool1
    I0704 12:26:47.344127  5465 net.cpp:380] norm1 -> norm1
    I0704 12:26:47.344894  5465 net.cpp:122] Setting up norm1
    I0704 12:26:47.344913  5465 net.cpp:129] Top shape: 100 32 16 16 (819200)
    I0704 12:26:47.344923  5465 net.cpp:137] Memory required for data: 24167600
    I0704 12:26:47.344933  5465 layer_factory.hpp:77] Creating layer conv2
    I0704 12:26:47.344955  5465 net.cpp:84] Creating Layer conv2
    I0704 12:26:47.344966  5465 net.cpp:406] conv2 <- norm1
    I0704 12:26:47.344981  5465 net.cpp:380] conv2 -> conv2
    I0704 12:26:47.346536  5465 net.cpp:122] Setting up conv2
    I0704 12:26:47.346554  5465 net.cpp:129] Top shape: 100 32 16 16 (819200)
    I0704 12:26:47.346565  5465 net.cpp:137] Memory required for data: 27444400
    I0704 12:26:47.346583  5465 layer_factory.hpp:77] Creating layer relu2
    I0704 12:26:47.346596  5465 net.cpp:84] Creating Layer relu2
    I0704 12:26:47.346607  5465 net.cpp:406] relu2 <- conv2
    I0704 12:26:47.346621  5465 net.cpp:367] relu2 -> conv2 (in-place)
    I0704 12:26:47.346993  5465 net.cpp:122] Setting up relu2
    I0704 12:26:47.347010  5465 net.cpp:129] Top shape: 100 32 16 16 (819200)
    I0704 12:26:47.347021  5465 net.cpp:137] Memory required for data: 30721200
    I0704 12:26:47.347031  5465 layer_factory.hpp:77] Creating layer pool2
    I0704 12:26:47.347048  5465 net.cpp:84] Creating Layer pool2
    I0704 12:26:47.347059  5465 net.cpp:406] pool2 <- conv2
    I0704 12:26:47.347071  5465 net.cpp:380] pool2 -> pool2
    I0704 12:26:47.347445  5465 net.cpp:122] Setting up pool2
    I0704 12:26:47.347462  5465 net.cpp:129] Top shape: 100 32 8 8 (204800)
    I0704 12:26:47.347472  5465 net.cpp:137] Memory required for data: 31540400
    I0704 12:26:47.347482  5465 layer_factory.hpp:77] Creating layer norm2
    I0704 12:26:47.347497  5465 net.cpp:84] Creating Layer norm2
    I0704 12:26:47.347507  5465 net.cpp:406] norm2 <- pool2
    I0704 12:26:47.347522  5465 net.cpp:380] norm2 -> norm2
    I0704 12:26:47.348095  5465 net.cpp:122] Setting up norm2
    I0704 12:26:47.348112  5465 net.cpp:129] Top shape: 100 32 8 8 (204800)
    I0704 12:26:47.348122  5465 net.cpp:137] Memory required for data: 32359600
    I0704 12:26:47.348132  5465 layer_factory.hpp:77] Creating layer conv3
    I0704 12:26:47.348186  5465 net.cpp:84] Creating Layer conv3
    I0704 12:26:47.348197  5465 net.cpp:406] conv3 <- norm2
    I0704 12:26:47.348212  5465 net.cpp:380] conv3 -> conv3
    I0704 12:26:47.358871  5465 net.cpp:122] Setting up conv3
    I0704 12:26:47.358929  5465 net.cpp:129] Top shape: 100 64 8 8 (409600)
    I0704 12:26:47.358940  5465 net.cpp:137] Memory required for data: 33998000
    I0704 12:26:47.358973  5465 layer_factory.hpp:77] Creating layer relu3
    I0704 12:26:47.358996  5465 net.cpp:84] Creating Layer relu3
    I0704 12:26:47.359007  5465 net.cpp:406] relu3 <- conv3
    I0704 12:26:47.359025  5465 net.cpp:367] relu3 -> conv3 (in-place)
    I0704 12:26:47.359354  5465 net.cpp:122] Setting up relu3
    I0704 12:26:47.359377  5465 net.cpp:129] Top shape: 100 64 8 8 (409600)
    I0704 12:26:47.359387  5465 net.cpp:137] Memory required for data: 35636400
    I0704 12:26:47.359397  5465 layer_factory.hpp:77] Creating layer pool3
    I0704 12:26:47.359411  5465 net.cpp:84] Creating Layer pool3
    I0704 12:26:47.359422  5465 net.cpp:406] pool3 <- conv3
    I0704 12:26:47.359439  5465 net.cpp:380] pool3 -> pool3
    I0704 12:26:47.359905  5465 net.cpp:122] Setting up pool3
    I0704 12:26:47.359925  5465 net.cpp:129] Top shape: 100 64 4 4 (102400)
    I0704 12:26:47.359935  5465 net.cpp:137] Memory required for data: 36046000
    I0704 12:26:47.359944  5465 layer_factory.hpp:77] Creating layer ip1
    I0704 12:26:47.359964  5465 net.cpp:84] Creating Layer ip1
    I0704 12:26:47.359975  5465 net.cpp:406] ip1 <- pool3
    I0704 12:26:47.359992  5465 net.cpp:380] ip1 -> ip1
    I0704 12:26:47.360213  5465 net.cpp:122] Setting up ip1
    I0704 12:26:47.360225  5465 net.cpp:129] Top shape: 100 10 (1000)
    I0704 12:26:47.360234  5465 net.cpp:137] Memory required for data: 36050000
    I0704 12:26:47.360249  5465 layer_factory.hpp:77] Creating layer ip1_ip1_0_split
    I0704 12:26:47.360265  5465 net.cpp:84] Creating Layer ip1_ip1_0_split
    I0704 12:26:47.360276  5465 net.cpp:406] ip1_ip1_0_split <- ip1
    I0704 12:26:47.360291  5465 net.cpp:380] ip1_ip1_0_split -> ip1_ip1_0_split_0
    I0704 12:26:47.360307  5465 net.cpp:380] ip1_ip1_0_split -> ip1_ip1_0_split_1
    I0704 12:26:47.360324  5465 net.cpp:122] Setting up ip1_ip1_0_split
    I0704 12:26:47.360337  5465 net.cpp:129] Top shape: 100 10 (1000)
    I0704 12:26:47.360347  5465 net.cpp:129] Top shape: 100 10 (1000)
    I0704 12:26:47.360355  5465 net.cpp:137] Memory required for data: 36058000
    I0704 12:26:47.360365  5465 layer_factory.hpp:77] Creating layer accuracy
    I0704 12:26:47.360388  5465 net.cpp:84] Creating Layer accuracy
    I0704 12:26:47.360399  5465 net.cpp:406] accuracy <- ip1_ip1_0_split_0
    I0704 12:26:47.360410  5465 net.cpp:406] accuracy <- label_cifar_1_split_0
    I0704 12:26:47.360426  5465 net.cpp:380] accuracy -> accuracy
    I0704 12:26:47.360442  5465 net.cpp:122] Setting up accuracy
    I0704 12:26:47.360455  5465 net.cpp:129] Top shape: (1)
    I0704 12:26:47.360463  5465 net.cpp:137] Memory required for data: 36058004
    I0704 12:26:47.360472  5465 layer_factory.hpp:77] Creating layer loss
    I0704 12:26:47.360492  5465 net.cpp:84] Creating Layer loss
    I0704 12:26:47.360503  5465 net.cpp:406] loss <- ip1_ip1_0_split_1
    I0704 12:26:47.360514  5465 net.cpp:406] loss <- label_cifar_1_split_1
    I0704 12:26:47.360528  5465 net.cpp:380] loss -> loss
    I0704 12:26:47.360553  5465 layer_factory.hpp:77] Creating layer loss
    I0704 12:26:47.360777  5465 net.cpp:122] Setting up loss
    I0704 12:26:47.360795  5465 net.cpp:129] Top shape: (1)
    I0704 12:26:47.360805  5465 net.cpp:132]     with loss weight 1
    I0704 12:26:47.360834  5465 net.cpp:137] Memory required for data: 36058008
    I0704 12:26:47.360846  5465 net.cpp:198] loss needs backward computation.
    I0704 12:26:47.360860  5465 net.cpp:200] accuracy does not need backward computation.
    I0704 12:26:47.360872  5465 net.cpp:198] ip1_ip1_0_split needs backward computation.
    I0704 12:26:47.360882  5465 net.cpp:198] ip1 needs backward computation.
    I0704 12:26:47.360891  5465 net.cpp:198] pool3 needs backward computation.
    I0704 12:26:47.360901  5465 net.cpp:198] relu3 needs backward computation.
    I0704 12:26:47.360911  5465 net.cpp:198] conv3 needs backward computation.
    I0704 12:26:47.360954  5465 net.cpp:198] norm2 needs backward computation.
    I0704 12:26:47.360965  5465 net.cpp:198] pool2 needs backward computation.
    I0704 12:26:47.360975  5465 net.cpp:198] relu2 needs backward computation.
    I0704 12:26:47.360985  5465 net.cpp:198] conv2 needs backward computation.
    I0704 12:26:47.360996  5465 net.cpp:198] norm1 needs backward computation.
    I0704 12:26:47.361006  5465 net.cpp:198] relu1 needs backward computation.
    I0704 12:26:47.361016  5465 net.cpp:198] pool1 needs backward computation.
    I0704 12:26:47.361026  5465 net.cpp:198] conv1 needs backward computation.
    I0704 12:26:47.361037  5465 net.cpp:200] label_cifar_1_split does not need backward computation.
    I0704 12:26:47.361047  5465 net.cpp:200] cifar does not need backward computation.
    I0704 12:26:47.361057  5465 net.cpp:242] This network produces output accuracy
    I0704 12:26:47.361068  5465 net.cpp:242] This network produces output loss
    I0704 12:26:47.361099  5465 net.cpp:255] Network initialization done.
    I0704 12:26:47.440634  5465 hdf5.cpp:32] Datatype class: H5T_FLOAT
    I0704 12:26:47.442189  5465 caffe.cpp:290] Running for 10 iterations.
    I0704 12:26:48.122941  5465 caffe.cpp:313] Batch 0, accuracy = 0.81
    I0704 12:26:48.123001  5465 caffe.cpp:313] Batch 0, loss = 0.669872
    I0704 12:26:48.722308  5465 caffe.cpp:313] Batch 1, accuracy = 0.81
    I0704 12:26:48.722373  5465 caffe.cpp:313] Batch 1, loss = 0.600901
    I0704 12:26:49.322628  5465 caffe.cpp:313] Batch 2, accuracy = 0.72
    I0704 12:26:49.322688  5465 caffe.cpp:313] Batch 2, loss = 0.712268
    I0704 12:26:49.923393  5465 caffe.cpp:313] Batch 3, accuracy = 0.79
    I0704 12:26:49.923454  5465 caffe.cpp:313] Batch 3, loss = 0.630273
    I0704 12:26:50.522907  5465 caffe.cpp:313] Batch 4, accuracy = 0.8
    I0704 12:26:50.522970  5465 caffe.cpp:313] Batch 4, loss = 0.519629
    I0704 12:26:51.123852  5465 caffe.cpp:313] Batch 5, accuracy = 0.84
    I0704 12:26:51.123913  5465 caffe.cpp:313] Batch 5, loss = 0.398108
    I0704 12:26:51.724124  5465 caffe.cpp:313] Batch 6, accuracy = 0.73
    I0704 12:26:51.724195  5465 caffe.cpp:313] Batch 6, loss = 0.712703
    I0704 12:26:52.333739  5465 caffe.cpp:313] Batch 7, accuracy = 0.76
    I0704 12:26:52.333806  5465 caffe.cpp:313] Batch 7, loss = 0.711946
    I0704 12:26:52.934726  5465 caffe.cpp:313] Batch 8, accuracy = 0.8
    I0704 12:26:52.934793  5465 caffe.cpp:313] Batch 8, loss = 0.67638
    I0704 12:26:53.534425  5465 caffe.cpp:313] Batch 9, accuracy = 0.73
    I0704 12:26:53.534492  5465 caffe.cpp:313] Batch 9, loss = 0.703718
    I0704 12:26:53.534503  5465 caffe.cpp:318] Loss: 0.63358
    I0704 12:26:53.534533  5465 caffe.cpp:330] accuracy = 0.779
    I0704 12:26:53.534556  5465 caffe.cpp:330] loss = 0.63358 (* 1 = 0.63358 loss)
    seag@seag-G41MT-S2PT:~/wsCaffe/caffe$
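
    Incidentally, the per-batch accuracies in a log like this can be extracted and averaged without reading them off by hand. A sketch (glog writes to stderr, hence the 2>&1):

    caffe test -model examples/cifar10/cifar10_full_train_test.prototxt -weights examples/cifar10/cifar10_full_iter_60000.caffemodel.h5 -iterations 10 2>&1 | awk '/Batch [0-9]+, accuracy/ {sum += $NF; n++} END {print "mean accuracy:", sum / n}'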

     -------------------------------------------------------------------------------------------------------------------------------------------------------

    Test: the result of 100 iterations (100 batches of 100 images, covering the full 10,000-image test set)

    seag@seag-G41MT-S2PT:~/wsCaffe/caffe$
    seag@seag-G41MT-S2PT:~/wsCaffe/caffe$ caffe test -model examples/cifar10/cifar10_full_train_test.prototxt -weights examples/cifar10/cifar10_full_iter_60000.caffemodel.h5  -iterations 100
    I0704 13:12:44.414038  6622 caffe.cpp:284] Use CPU.
    I0704 13:12:44.681273  6622 net.cpp:294] The NetState phase (1) differed from the phase (0) specified by a rule in layer cifar
    I0704 13:12:44.681501  6622 net.cpp:51] Initializing net from parameters:
    ..............................

    ..............................
    I0704 13:12:44.998669  6622 caffe.cpp:290] Running for 100 iterations.
    I0704 13:12:45.619782  6622 caffe.cpp:313] Batch 0, accuracy = 0.81
    I0704 13:12:45.619843  6622 caffe.cpp:313] Batch 0, loss = 0.669872
    I0704 13:12:46.217348  6622 caffe.cpp:313] Batch 1, accuracy = 0.81
    I0704 13:12:46.217408  6622 caffe.cpp:313] Batch 1, loss = 0.600901
    I0704 13:12:46.818264  6622 caffe.cpp:313] Batch 2, accuracy = 0.72
    I0704 13:12:46.818323  6622 caffe.cpp:313] Batch 2, loss = 0.712268
    I0704 13:12:47.415946  6622 caffe.cpp:313] Batch 3, accuracy = 0.79
    I0704 13:12:47.416007  6622 caffe.cpp:313] Batch 3, loss = 0.630273
    I0704 13:12:48.015729  6622 caffe.cpp:313] Batch 4, accuracy = 0.8
    I0704 13:12:48.015792  6622 caffe.cpp:313] Batch 4, loss = 0.519629
    I0704 13:12:48.621882  6622 caffe.cpp:313] Batch 5, accuracy = 0.84
    I0704 13:12:48.621939  6622 caffe.cpp:313] Batch 5, loss = 0.398108
    I0704 13:12:49.227767  6622 caffe.cpp:313] Batch 6, accuracy = 0.73
    I0704 13:12:49.227831  6622 caffe.cpp:313] Batch 6, loss = 0.712703
    I0704 13:12:49.833628  6622 caffe.cpp:313] Batch 7, accuracy = 0.76
    I0704 13:12:49.833700  6622 caffe.cpp:313] Batch 7, loss = 0.711946
    I0704 13:12:50.431000  6622 caffe.cpp:313] Batch 8, accuracy = 0.8
    I0704 13:12:50.431063  6622 caffe.cpp:313] Batch 8, loss = 0.67638
    I0704 13:12:51.036695  6622 caffe.cpp:313] Batch 9, accuracy = 0.73
    I0704 13:12:51.036754  6622 caffe.cpp:313] Batch 9, loss = 0.703718
    I0704 13:12:51.654356  6622 caffe.cpp:313] Batch 10, accuracy = 0.81
    I0704 13:12:51.654417  6622 caffe.cpp:313] Batch 10, loss = 0.628615
    I0704 13:12:52.271215  6622 caffe.cpp:313] Batch 11, accuracy = 0.77
    I0704 13:12:52.271272  6622 caffe.cpp:313] Batch 11, loss = 0.640812
    I0704 13:12:52.871871  6622 caffe.cpp:313] Batch 12, accuracy = 0.84
    I0704 13:12:52.871930  6622 caffe.cpp:313] Batch 12, loss = 0.48857
    I0704 13:12:53.500851  6622 caffe.cpp:313] Batch 13, accuracy = 0.82
    I0704 13:12:53.500921  6622 caffe.cpp:313] Batch 13, loss = 0.545125
    I0704 13:12:54.106851  6622 caffe.cpp:313] Batch 14, accuracy = 0.83
    I0704 13:12:54.106916  6622 caffe.cpp:313] Batch 14, loss = 0.454256
    I0704 13:12:54.712241  6622 caffe.cpp:313] Batch 15, accuracy = 0.8
    I0704 13:12:54.712308  6622 caffe.cpp:313] Batch 15, loss = 0.604123
    I0704 13:12:55.317904  6622 caffe.cpp:313] Batch 16, accuracy = 0.82
    I0704 13:12:55.317970  6622 caffe.cpp:313] Batch 16, loss = 0.602975
    I0704 13:12:55.915468  6622 caffe.cpp:313] Batch 17, accuracy = 0.78
    I0704 13:12:55.915534  6622 caffe.cpp:313] Batch 17, loss = 0.653725
    I0704 13:12:56.521464  6622 caffe.cpp:313] Batch 18, accuracy = 0.74
    I0704 13:12:56.521531  6622 caffe.cpp:313] Batch 18, loss = 0.775862
    I0704 13:12:57.127270  6622 caffe.cpp:313] Batch 19, accuracy = 0.7
    I0704 13:12:57.127341  6622 caffe.cpp:313] Batch 19, loss = 0.927896
    I0704 13:12:57.724392  6622 caffe.cpp:313] Batch 20, accuracy = 0.75
    I0704 13:12:57.724458  6622 caffe.cpp:313] Batch 20, loss = 0.631937
    I0704 13:12:58.330080  6622 caffe.cpp:313] Batch 21, accuracy = 0.75
    I0704 13:12:58.330200  6622 caffe.cpp:313] Batch 21, loss = 0.689163
    I0704 13:12:58.927534  6622 caffe.cpp:313] Batch 22, accuracy = 0.79
    I0704 13:12:58.927600  6622 caffe.cpp:313] Batch 22, loss = 0.718018
    I0704 13:12:59.533542  6622 caffe.cpp:313] Batch 23, accuracy = 0.76
    I0704 13:12:59.533630  6622 caffe.cpp:313] Batch 23, loss = 0.772289
    I0704 13:13:00.131110  6622 caffe.cpp:313] Batch 24, accuracy = 0.78
    I0704 13:13:00.131175  6622 caffe.cpp:313] Batch 24, loss = 0.616908
    I0704 13:13:00.727895  6622 caffe.cpp:313] Batch 25, accuracy = 0.69
    I0704 13:13:00.727962  6622 caffe.cpp:313] Batch 25, loss = 0.921352
    I0704 13:13:01.324116  6622 caffe.cpp:313] Batch 26, accuracy = 0.86
    I0704 13:13:01.324188  6622 caffe.cpp:313] Batch 26, loss = 0.43752
    I0704 13:13:01.920688  6622 caffe.cpp:313] Batch 27, accuracy = 0.76
    I0704 13:13:01.920760  6622 caffe.cpp:313] Batch 27, loss = 0.694387
    I0704 13:13:02.518180  6622 caffe.cpp:313] Batch 28, accuracy = 0.88
    I0704 13:13:02.518250  6622 caffe.cpp:313] Batch 28, loss = 0.503546
    I0704 13:13:03.123641  6622 caffe.cpp:313] Batch 29, accuracy = 0.76
    I0704 13:13:03.123706  6622 caffe.cpp:313] Batch 29, loss = 0.675348
    I0704 13:13:03.729601  6622 caffe.cpp:313] Batch 30, accuracy = 0.75
    I0704 13:13:03.729679  6622 caffe.cpp:313] Batch 30, loss = 0.641321
    I0704 13:13:04.326668  6622 caffe.cpp:313] Batch 31, accuracy = 0.79
    I0704 13:13:04.326732  6622 caffe.cpp:313] Batch 31, loss = 0.668134
    I0704 13:13:04.932490  6622 caffe.cpp:313] Batch 32, accuracy = 0.78
    I0704 13:13:04.932554  6622 caffe.cpp:313] Batch 32, loss = 0.587462
    I0704 13:13:05.529435  6622 caffe.cpp:313] Batch 33, accuracy = 0.69
    I0704 13:13:05.529508  6622 caffe.cpp:313] Batch 33, loss = 0.809499
    I0704 13:13:06.125875  6622 caffe.cpp:313] Batch 34, accuracy = 0.72
    I0704 13:13:06.125944  6622 caffe.cpp:313] Batch 34, loss = 0.841917
    I0704 13:13:06.722806  6622 caffe.cpp:313] Batch 35, accuracy = 0.81
    I0704 13:13:06.722877  6622 caffe.cpp:313] Batch 35, loss = 0.65033
    I0704 13:13:07.328688  6622 caffe.cpp:313] Batch 36, accuracy = 0.74
    I0704 13:13:07.328760  6622 caffe.cpp:313] Batch 36, loss = 0.73518
    I0704 13:13:07.926215  6622 caffe.cpp:313] Batch 37, accuracy = 0.77
    I0704 13:13:07.926282  6622 caffe.cpp:313] Batch 37, loss = 0.626204
    I0704 13:13:08.531821  6622 caffe.cpp:313] Batch 38, accuracy = 0.84
    I0704 13:13:08.531885  6622 caffe.cpp:313] Batch 38, loss = 0.50705
    I0704 13:13:09.128522  6622 caffe.cpp:313] Batch 39, accuracy = 0.86
    I0704 13:13:09.128587  6622 caffe.cpp:313] Batch 39, loss = 0.45618
    I0704 13:13:09.725127  6622 caffe.cpp:313] Batch 40, accuracy = 0.82
    I0704 13:13:09.725200  6622 caffe.cpp:313] Batch 40, loss = 0.594011
    I0704 13:13:10.321892  6622 caffe.cpp:313] Batch 41, accuracy = 0.83
    I0704 13:13:10.321964  6622 caffe.cpp:313] Batch 41, loss = 0.673196
    I0704 13:13:10.918488  6622 caffe.cpp:313] Batch 42, accuracy = 0.85
    I0704 13:13:10.918555  6622 caffe.cpp:313] Batch 42, loss = 0.45519
    I0704 13:13:11.524237  6622 caffe.cpp:313] Batch 43, accuracy = 0.79
    I0704 13:13:11.524307  6622 caffe.cpp:313] Batch 43, loss = 0.628985
    I0704 13:13:12.129947  6622 caffe.cpp:313] Batch 44, accuracy = 0.78
    I0704 13:13:12.130018  6622 caffe.cpp:313] Batch 44, loss = 0.704623
    I0704 13:13:12.735232  6622 caffe.cpp:313] Batch 45, accuracy = 0.71
    I0704 13:13:12.735304  6622 caffe.cpp:313] Batch 45, loss = 0.695174
    I0704 13:13:13.340771  6622 caffe.cpp:313] Batch 46, accuracy = 0.84
    I0704 13:13:13.340842  6622 caffe.cpp:313] Batch 46, loss = 0.552928
    I0704 13:13:13.947043  6622 caffe.cpp:313] Batch 47, accuracy = 0.76
    I0704 13:13:13.947109  6622 caffe.cpp:313] Batch 47, loss = 0.651739
    I0704 13:13:14.553036  6622 caffe.cpp:313] Batch 48, accuracy = 0.82
    I0704 13:13:14.553324  6622 caffe.cpp:313] Batch 48, loss = 0.441534
    I0704 13:13:15.180112  6622 caffe.cpp:313] Batch 49, accuracy = 0.76
    I0704 13:13:15.180176  6622 caffe.cpp:313] Batch 49, loss = 0.729064
    I0704 13:13:15.779243  6622 caffe.cpp:313] Batch 50, accuracy = 0.8
    I0704 13:13:15.779307  6622 caffe.cpp:313] Batch 50, loss = 0.584773
    I0704 13:13:16.378615  6622 caffe.cpp:313] Batch 51, accuracy = 0.79
    I0704 13:13:16.378679  6622 caffe.cpp:313] Batch 51, loss = 0.541237
    I0704 13:13:16.978050  6622 caffe.cpp:313] Batch 52, accuracy = 0.79
    I0704 13:13:16.978111  6622 caffe.cpp:313] Batch 52, loss = 0.656132
    I0704 13:13:17.577879  6622 caffe.cpp:313] Batch 53, accuracy = 0.79
    I0704 13:13:17.577941  6622 caffe.cpp:313] Batch 53, loss = 0.568454
    I0704 13:13:18.177491  6622 caffe.cpp:313] Batch 54, accuracy = 0.72
    I0704 13:13:18.177562  6622 caffe.cpp:313] Batch 54, loss = 0.797033
    I0704 13:13:18.776235  6622 caffe.cpp:313] Batch 55, accuracy = 0.82
    I0704 13:13:18.776298  6622 caffe.cpp:313] Batch 55, loss = 0.679914
    I0704 13:13:19.375723  6622 caffe.cpp:313] Batch 56, accuracy = 0.76
    I0704 13:13:19.375792  6622 caffe.cpp:313] Batch 56, loss = 0.681724
    I0704 13:13:19.974643  6622 caffe.cpp:313] Batch 57, accuracy = 0.86
    I0704 13:13:19.974711  6622 caffe.cpp:313] Batch 57, loss = 0.460299
    I0704 13:13:20.574430  6622 caffe.cpp:313] Batch 58, accuracy = 0.78
    I0704 13:13:20.574501  6622 caffe.cpp:313] Batch 58, loss = 0.694127
    I0704 13:13:21.174055  6622 caffe.cpp:313] Batch 59, accuracy = 0.74
    I0704 13:13:21.174118  6622 caffe.cpp:313] Batch 59, loss = 0.748216
    I0704 13:13:21.804786  6622 caffe.cpp:313] Batch 60, accuracy = 0.79
    I0704 13:13:21.804847  6622 caffe.cpp:313] Batch 60, loss = 0.566229
    I0704 13:13:22.410190  6622 caffe.cpp:313] Batch 61, accuracy = 0.8
    I0704 13:13:22.410251  6622 caffe.cpp:313] Batch 61, loss = 0.53662
    I0704 13:13:23.011648  6622 caffe.cpp:313] Batch 62, accuracy = 0.77
    I0704 13:13:23.011713  6622 caffe.cpp:313] Batch 62, loss = 0.630629
    I0704 13:13:23.618448  6622 caffe.cpp:313] Batch 63, accuracy = 0.8
    I0704 13:13:23.618508  6622 caffe.cpp:313] Batch 63, loss = 0.553974
    I0704 13:13:24.215498  6622 caffe.cpp:313] Batch 64, accuracy = 0.82
    I0704 13:13:24.215561  6622 caffe.cpp:313] Batch 64, loss = 0.640087
    I0704 13:13:24.812703  6622 caffe.cpp:313] Batch 65, accuracy = 0.78
    I0704 13:13:24.812767  6622 caffe.cpp:313] Batch 65, loss = 0.720206
    I0704 13:13:25.417578  6622 caffe.cpp:313] Batch 66, accuracy = 0.85
    I0704 13:13:25.417637  6622 caffe.cpp:313] Batch 66, loss = 0.469227
    I0704 13:13:26.015293  6622 caffe.cpp:313] Batch 67, accuracy = 0.74
    I0704 13:13:26.015355  6622 caffe.cpp:313] Batch 67, loss = 0.662439
    I0704 13:13:26.621019  6622 caffe.cpp:313] Batch 68, accuracy = 0.73
    I0704 13:13:26.621078  6622 caffe.cpp:313] Batch 68, loss = 0.832033
    I0704 13:13:27.226202  6622 caffe.cpp:313] Batch 69, accuracy = 0.74
    I0704 13:13:27.226261  6622 caffe.cpp:313] Batch 69, loss = 0.7851
    I0704 13:13:27.831529  6622 caffe.cpp:313] Batch 70, accuracy = 0.79
    I0704 13:13:27.831593  6622 caffe.cpp:313] Batch 70, loss = 0.594752
    I0704 13:13:28.437043  6622 caffe.cpp:313] Batch 71, accuracy = 0.81
    I0704 13:13:28.437101  6622 caffe.cpp:313] Batch 71, loss = 0.600966
    I0704 13:13:29.034024  6622 caffe.cpp:313] Batch 72, accuracy = 0.89
    I0704 13:13:29.034085  6622 caffe.cpp:313] Batch 72, loss = 0.544593
    I0704 13:13:29.639663  6622 caffe.cpp:313] Batch 73, accuracy = 0.81
    I0704 13:13:29.639719  6622 caffe.cpp:313] Batch 73, loss = 0.578454
    I0704 13:13:30.245527  6622 caffe.cpp:313] Batch 74, accuracy = 0.71
    I0704 13:13:30.245589  6622 caffe.cpp:313] Batch 74, loss = 0.85487
    I0704 13:13:30.851264  6622 caffe.cpp:313] Batch 75, accuracy = 0.75
    I0704 13:13:30.851325  6622 caffe.cpp:313] Batch 75, loss = 0.703642
    I0704 13:13:31.448464  6622 caffe.cpp:313] Batch 76, accuracy = 0.77
    I0704 13:13:31.448526  6622 caffe.cpp:313] Batch 76, loss = 0.702493
    I0704 13:13:32.053625  6622 caffe.cpp:313] Batch 77, accuracy = 0.77
    I0704 13:13:32.053694  6622 caffe.cpp:313] Batch 77, loss = 0.616773
    I0704 13:13:32.659436  6622 caffe.cpp:313] Batch 78, accuracy = 0.77
    I0704 13:13:32.659543  6622 caffe.cpp:313] Batch 78, loss = 0.624216
    I0704 13:13:33.256774  6622 caffe.cpp:313] Batch 79, accuracy = 0.76
    I0704 13:13:33.256835  6622 caffe.cpp:313] Batch 79, loss = 0.702435
    I0704 13:13:33.862572  6622 caffe.cpp:313] Batch 80, accuracy = 0.83
    I0704 13:13:33.862634  6622 caffe.cpp:313] Batch 80, loss = 0.479035
    I0704 13:13:34.460429  6622 caffe.cpp:313] Batch 81, accuracy = 0.77
    I0704 13:13:34.460491  6622 caffe.cpp:313] Batch 81, loss = 0.679997
    I0704 13:13:35.066623  6622 caffe.cpp:313] Batch 82, accuracy = 0.76
    I0704 13:13:35.066686  6622 caffe.cpp:313] Batch 82, loss = 0.668821
    I0704 13:13:35.672404  6622 caffe.cpp:313] Batch 83, accuracy = 0.79
    I0704 13:13:35.672462  6622 caffe.cpp:313] Batch 83, loss = 0.59493
    I0704 13:13:36.277426  6622 caffe.cpp:313] Batch 84, accuracy = 0.77
    I0704 13:13:36.277488  6622 caffe.cpp:313] Batch 84, loss = 0.636169
    I0704 13:13:36.883759  6622 caffe.cpp:313] Batch 85, accuracy = 0.79
    I0704 13:13:36.883818  6622 caffe.cpp:313] Batch 85, loss = 0.655568
    I0704 13:13:37.489516  6622 caffe.cpp:313] Batch 86, accuracy = 0.79
    I0704 13:13:37.489574  6622 caffe.cpp:313] Batch 86, loss = 0.637793
    I0704 13:13:38.086877  6622 caffe.cpp:313] Batch 87, accuracy = 0.84
    I0704 13:13:38.086935  6622 caffe.cpp:313] Batch 87, loss = 0.604765
    I0704 13:13:38.691869  6622 caffe.cpp:313] Batch 88, accuracy = 0.81
    I0704 13:13:38.691931  6622 caffe.cpp:313] Batch 88, loss = 0.525659
    I0704 13:13:39.297114  6622 caffe.cpp:313] Batch 89, accuracy = 0.76
    I0704 13:13:39.297176  6622 caffe.cpp:313] Batch 89, loss = 0.657071
    I0704 13:13:39.902731  6622 caffe.cpp:313] Batch 90, accuracy = 0.81
    I0704 13:13:39.902787  6622 caffe.cpp:313] Batch 90, loss = 0.5901
    I0704 13:13:40.499342  6622 caffe.cpp:313] Batch 91, accuracy = 0.85
    I0704 13:13:40.499403  6622 caffe.cpp:313] Batch 91, loss = 0.433673
    I0704 13:13:41.105139  6622 caffe.cpp:313] Batch 92, accuracy = 0.76
    I0704 13:13:41.105197  6622 caffe.cpp:313] Batch 92, loss = 0.68505
    I0704 13:13:41.711143  6622 caffe.cpp:313] Batch 93, accuracy = 0.84
    I0704 13:13:41.711205  6622 caffe.cpp:313] Batch 93, loss = 0.526908
    I0704 13:13:42.317083  6622 caffe.cpp:313] Batch 94, accuracy = 0.81
    I0704 13:13:42.317149  6622 caffe.cpp:313] Batch 94, loss = 0.54016
    I0704 13:13:42.923281  6622 caffe.cpp:313] Batch 95, accuracy = 0.81
    I0704 13:13:42.923348  6622 caffe.cpp:313] Batch 95, loss = 0.568361
    I0704 13:13:42.924571  6628 data_layer.cpp:73] Restarting data prefetching from start.
    I0704 13:13:43.529124  6622 caffe.cpp:313] Batch 96, accuracy = 0.82
    I0704 13:13:43.529189  6622 caffe.cpp:313] Batch 96, loss = 0.455237
    I0704 13:13:44.135169  6622 caffe.cpp:313] Batch 97, accuracy = 0.78
    I0704 13:13:44.135228  6622 caffe.cpp:313] Batch 97, loss = 0.753922
    I0704 13:13:44.740659  6622 caffe.cpp:313] Batch 98, accuracy = 0.73
    I0704 13:13:44.740911  6622 caffe.cpp:313] Batch 98, loss = 0.709454
    I0704 13:13:45.338565  6622 caffe.cpp:313] Batch 99, accuracy = 0.78
    I0704 13:13:45.338631  6622 caffe.cpp:313] Batch 99, loss = 0.737646
    I0704 13:13:45.338644  6622 caffe.cpp:318] Loss: 0.634399
    I0704 13:13:45.338677  6622 caffe.cpp:330] accuracy = 0.7859
    I0704 13:13:45.338701  6622 caffe.cpp:330] loss = 0.634399 (* 1 = 0.634399 loss)
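
    A note on the label file used by the classification commands below: it lists one class name per line, in label order 0-9. CIFAR-10 does not ship a synset_words.txt, so it has to be created by hand; assuming the standard class order, something like this works:

    printf '%s\n' airplane automobile bird cat deer dog frog horse ship truck > data/cifar10/synset_words.txt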

    Classify the cat:

    seag@seag-G41MT-S2PT:~/wsCaffe/caffe$ classification   examples/cifar10/cifar10_full.prototxt examples/cifar10/cifar10_full_iter_60000.caffemodel.h5 examples/cifar10/mean.binaryproto data/cifar10/synset_words.txt examples/images/cat.jpg
    ---------- Prediction for examples/images/cat.jpg ----------
    0.7481 - "    deer  "
    0.1352 - "    bird  "
    0.0476 - "    cat  "
    0.0162 - "    frog  "
    0.0143 - "    horse  "
    seag@seag-G41MT-S2PT:~/wsCaffe/caffe$

    The gray cat:

    seag@seag-G41MT-S2PT:~/wsCaffe/caffe$ classification   examples/cifar10/cifar10_full.prototxt examples/cifar10/cifar10_full_iter_60000.caffemodel.h5 examples/cifar10/mean.binaryproto data/cifar10/synset_words.txt examples/images/cat_gray.jpg
    ---------- Prediction for examples/images/cat_gray.jpg ----------
    0.2487 - "    bird  "
    0.2476 - "    horse  "
    0.1985 - "    dog  "
    0.1277 - "    cat  "
    0.0853 - "    deer  "
    seag@seag-G41MT-S2PT:~/wsCaffe/caffe$

    The fish-bike image:

    seag@seag-G41MT-S2PT:~/wsCaffe/caffe$ classification   examples/cifar10/cifar10_full.prototxt examples/cifar10/cifar10_full_iter_60000.caffemodel.h5 examples/cifar10/mean.binaryproto data/cifar10/synset_words.txt examples/images/fish-bike.jpg
    ---------- Prediction for examples/images/fish-bike.jpg ----------
    0.6517 - "    horse  "
    0.1291 - "    truck "
    0.0530 - "    deer  "
    0.0441 - "    cat  "
    0.0435 - "    frog  "
    seag@seag-G41MT-S2PT:~/wsCaffe/caffe$

    The results look quite poor. That is not entirely surprising: the net only reaches about 78% accuracy on CIFAR-10's own test set, and these photos are nothing like the 32x32 images it was trained on.
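
    Part of the explanation is what the network actually sees: the classification example resizes every input to the net's 32x32 input geometry, which throws away most of the detail in photos like these. You can inspect that yourself by downscaling an image first (a sketch, assuming ImageMagick is installed):

    convert examples/images/cat.jpg -resize '32x32!' /tmp/cat32.png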
