  • LSTM

    Training iter #1:   Batch Loss = 1.467979, Accuracy = 0.3059999942779541
    Training iter #20:   Batch Loss = 1.155874, Accuracy = 0.3400000035762787
    Training iter #40:   Batch Loss = 1.119148, Accuracy = 0.4169999957084656
    Training iter #60:   Batch Loss = 1.099285, Accuracy = 0.49399998784065247
    Training iter #90:   Batch Loss = 0.991692, Accuracy = 0.527999997138977
    Training iter #140:   Batch Loss = 0.940209, Accuracy = 0.5320000052452087
    Training iter #190:   Batch Loss = 0.907819, Accuracy = 0.5609999895095825
    Training iter #220:   Batch Loss = 0.869303, Accuracy = 0.578000009059906
    Training iter #410:   Batch Loss = 0.847943, Accuracy = 0.5979999899864197
    Training iter #490:   Batch Loss = 0.852090, Accuracy = 0.6129999756813049
    Training iter #540:   Batch Loss = 0.843135, Accuracy = 0.6389999985694885
    Training iter #1490:   Batch Loss = 0.836091, Accuracy = 0.6489999890327454
    Training iter #1750:   Batch Loss = 0.812885, Accuracy = 0.6510000228881836
    Training iter #1970:   Batch Loss = 0.791033, Accuracy = 0.6679999828338623
    Training iter #2110:   Batch Loss = 0.739782, Accuracy = 0.6800000071525574
    Training iter #2470:   Batch Loss = 0.714706, Accuracy = 0.6909999847412109
    Training iter #2670:   Batch Loss = 0.694483, Accuracy = 0.6919999718666077
    Training iter #5290:   Batch Loss = 0.681658, Accuracy = 0.7070000171661377
    Training iter #5820:   Batch Loss = 0.676098, Accuracy = 0.7200000286102295
    Training iter #6130:   Batch Loss = 0.642826, Accuracy = 0.734000027179718
    Training iter #31760:   Batch Loss = 0.575368, Accuracy = 0.7400000095367432
    Training iter #71890:   Batch Loss = 0.577792, Accuracy = 0.7570000290870667
    Training iter #72230:   Batch Loss = 0.552322, Accuracy = 0.7670000195503235
    Training iter #87790:   Batch Loss = 0.551816, Accuracy = 0.7789999842643738
    Training iter #89180:   Batch Loss = 0.542968, Accuracy = 0.7829999923706055
    Training iter #90970:   Batch Loss = 0.532735, Accuracy = 0.7990000247955322
    Training iter #121200:   Batch Loss = 0.525607, Accuracy = 0.8040000200271606
    Training iter #122180:   Batch Loss = 0.516407, Accuracy = 0.8109999895095825
    Training iter #149140:   Batch Loss = 0.528955, Accuracy = 0.8230000138282776
    Training iter #186710:   Batch Loss = 0.515277, Accuracy = 0.8349999785423279
    Training iter #186960:   Batch Loss = 0.512902, Accuracy = 0.8450000286102295
    Training iter #188530:   Batch Loss = 0.470322, Accuracy = 0.8510000109672546
    Training iter #190900:   Batch Loss = 0.467161, Accuracy = 0.86473999829229711
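    The iteration log above follows a common pattern: every so many training steps, the loss and accuracy of the current mini-batch are printed. The original post does not include the training code, so the sketch below is only an illustration of a loop that would produce log lines in this format; the framework choice (PyTorch), the SeqClassifier model, and names such as train_loader and log_every are assumptions, not taken from the source.

    import torch
    import torch.nn as nn

    # Hypothetical LSTM classifier: last hidden state -> fully connected layer.
    class SeqClassifier(nn.Module):
        def __init__(self, n_features, n_hidden, n_classes):
            super().__init__()
            self.lstm = nn.LSTM(n_features, n_hidden, batch_first=True)
            self.fc = nn.Linear(n_hidden, n_classes)

        def forward(self, x):                  # x: (batch, time, features)
            out, _ = self.lstm(x)
            return self.fc(out[:, -1, :])      # logits from the last time step

    def train(model, train_loader, n_iters, log_every=20):
        opt = torch.optim.Adam(model.parameters(), lr=1e-3)
        loss_fn = nn.CrossEntropyLoss()
        it = 0
        while it < n_iters:
            for x, y in train_loader:          # y: class indices, shape (batch,)
                it += 1
                logits = model(x)
                loss = loss_fn(logits, y)
                opt.zero_grad()
                loss.backward()
                opt.step()
                # Log the current mini-batch loss/accuracy at a fixed interval
                # (the original log's schedule looks coarser and irregular).
                if it == 1 or it % log_every == 0:
                    acc = (logits.argmax(dim=1) == y).float().mean().item()
                    print(f"Training iter #{it}:   Batch Loss = {loss.item():.6f}, "
                          f"Accuracy = {acc}")
                if it >= n_iters:
                    break

    Called as train(model, loader, n_iters=200000), this would print lines in the same format as the log above; the actual model and hyperparameters behind the logged run are not shown in the post.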
    
    Iter:    700, Train Loss:   0.11, Train Acc:  95.31%, Val Loss:   0.36, Test Acc:  90.28%, Time: 2:52:12 *
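    The summary line above reports a periodic evaluation pass rather than a single mini-batch: training loss and accuracy, validation loss, test accuracy, and elapsed wall-clock time, with the trailing "*" presumably marking a new best validation result. The sketch below shows one way such a line could be produced; evaluate(), log_eval(), and the argument names are hypothetical and not from the original post.

    import time
    from datetime import timedelta

    import torch

    def evaluate(model, loader, loss_fn):
        """Average loss and accuracy over a full data loader."""
        model.eval()
        total_loss, correct, total = 0.0, 0, 0
        with torch.no_grad():
            for x, y in loader:
                logits = model(x)
                total_loss += loss_fn(logits, y).item() * y.size(0)
                correct += (logits.argmax(dim=1) == y).sum().item()
                total += y.size(0)
        model.train()
        return total_loss / total, correct / total

    def log_eval(it, train_loss, train_acc, val_loss, test_acc, start_time, improved):
        elapsed = timedelta(seconds=int(time.time() - start_time))
        star = " *" if improved else ""        # "*" flags a new best validation score
        print(f"Iter: {it:>6}, Train Loss: {train_loss:>6.2f}, Train Acc: {train_acc:>7.2%}, "
              f"Val Loss: {val_loss:>6.2f}, Test Acc: {test_acc:>7.2%}, Time: {elapsed}{star}")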
  • Original post: https://www.cnblogs.com/herd/p/10783739.html