LSTM Accuracy
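The log below alternates a training-batch line with a held-out test-set line, printed every 10 iterations early on and every 500 later. A minimal sketch of a loop with that logging shape, using stand-in `train_step`/`evaluate` functions (hypothetical placeholders, not the original model):

```python
import random

def train_step(rng):
    # Stand-in for one optimizer step on a training batch (hypothetical):
    # returns (batch loss, batch accuracy).
    return rng.uniform(0.5, 1.2), rng.uniform(0.3, 0.8)

def evaluate(rng):
    # Stand-in for a full pass over the test set (hypothetical).
    return rng.uniform(0.5, 1.2), rng.uniform(0.3, 0.8)

def run(iters=30, log_every=10, seed=0):
    rng = random.Random(seed)
    lines = []
    for it in range(1, iters + 1):
        loss, acc = train_step(rng)
        # Log the first iteration and then every `log_every`-th one,
        # pairing each training line with a test-set evaluation.
        if it == 1 or it % log_every == 0:
            lines.append(f"Training iter #{it}: Batch Loss = {loss:.6f}, Accuracy = {acc}")
            t_loss, t_acc = evaluate(rng)
            lines.append(f"PERFORMANCE ON TEST SET: Batch Loss = {t_loss}, Accuracy = {t_acc}")
    return lines

for line in run():
    print(line)
```

The loss is printed with six decimals while the accuracy is printed as a raw Python float, which is why the accuracies in the log carry full float32 precision (e.g. `0.29866665601730347`).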

    Training iter #1: Batch Loss = 1.234543, Accuracy = 0.29866665601730347
    PERFORMANCE ON TEST SET: Batch Loss = 1.146768569946289, Accuracy = 0.370287150144577
    Training iter #10: Batch Loss = 1.137510, Accuracy = 0.3580000102519989
    PERFORMANCE ON TEST SET: Batch Loss = 1.1342425346374512, Accuracy = 0.35780274868011475
    Training iter #20: Batch Loss = 1.125727, Accuracy = 0.3700000047683716
    PERFORMANCE ON TEST SET: Batch Loss = 1.1261093616485596, Accuracy = 0.3747815191745758
    Training iter #30: Batch Loss = 1.120824, Accuracy = 0.35633334517478943
    PERFORMANCE ON TEST SET: Batch Loss = 1.116803765296936, Accuracy = 0.39450687170028687
    Training iter #40: Batch Loss = 1.099077, Accuracy = 0.3773333430290222
    PERFORMANCE ON TEST SET: Batch Loss = 1.0879905223846436, Accuracy = 0.41198500990867615
    Training iter #50: Batch Loss = 1.086642, Accuracy = 0.38499999046325684
    PERFORMANCE ON TEST SET: Batch Loss = 1.0070323944091797, Accuracy = 0.5106117129325867
    Training iter #60: Batch Loss = 0.987274, Accuracy = 0.47733333706855774
    PERFORMANCE ON TEST SET: Batch Loss = 0.9292565584182739, Accuracy = 0.5320848822593689
    Training iter #70: Batch Loss = 0.956025, Accuracy = 0.5013333559036255
    PERFORMANCE ON TEST SET: Batch Loss = 0.9105101227760315, Accuracy = 0.5305867791175842
    Training iter #80: Batch Loss = 0.873186, Accuracy = 0.5416666865348816
    PERFORMANCE ON TEST SET: Batch Loss = 0.8798474669456482, Accuracy = 0.558302104473114
    Training iter #90: Batch Loss = 0.861485, Accuracy = 0.5896666646003723
    PERFORMANCE ON TEST SET: Batch Loss = 0.8562622666358948, Accuracy = 0.5787765383720398
    Training iter #100: Batch Loss = 0.833264, Accuracy = 0.5973333120346069
    PERFORMANCE ON TEST SET: Batch Loss = 0.8457058072090149, Accuracy = 0.5900124907493591
    Training iter #110: Batch Loss = 0.831916, Accuracy = 0.596666693687439
    PERFORMANCE ON TEST SET: Batch Loss = 0.8394112586975098, Accuracy = 0.5980024933815002
    Training iter #120: Batch Loss = 0.832810, Accuracy = 0.5916666388511658
    PERFORMANCE ON TEST SET: Batch Loss = 0.8366561532020569, Accuracy = 0.5932584404945374
    Training iter #130: Batch Loss = 0.817408, Accuracy = 0.596666693687439
    PERFORMANCE ON TEST SET: Batch Loss = 0.8336422443389893, Accuracy = 0.5950062274932861
    Training iter #140: Batch Loss = 0.811242, Accuracy = 0.5943333506584167
    PERFORMANCE ON TEST SET: Batch Loss = 0.8296205401420593, Accuracy = 0.5952559113502502
    Training iter #150: Batch Loss = 0.817136, Accuracy = 0.5866666436195374
    PERFORMANCE ON TEST SET: Batch Loss = 0.8277447819709778, Accuracy = 0.5987515449523926
    Training iter #160: Batch Loss = 0.829761, Accuracy = 0.5723333358764648
    PERFORMANCE ON TEST SET: Batch Loss = 0.8267015814781189, Accuracy = 0.597503125667572
    Training iter #170: Batch Loss = 0.845276, Accuracy = 0.5716666579246521
    PERFORMANCE ON TEST SET: Batch Loss = 0.8275431394577026, Accuracy = 0.5980024933815002
    Training iter #180: Batch Loss = 0.833005, Accuracy = 0.5916666388511658
    PERFORMANCE ON TEST SET: Batch Loss = 0.8252542018890381, Accuracy = 0.5977528095245361
    Training iter #190: Batch Loss = 0.825047, Accuracy = 0.6000000238418579
    PERFORMANCE ON TEST SET: Batch Loss = 0.8231379985809326, Accuracy = 0.5985018610954285
    Training iter #200: Batch Loss = 0.819469, Accuracy = 0.6146666407585144
    PERFORMANCE ON TEST SET: Batch Loss = 0.8222453594207764, Accuracy = 0.5980024933815002
    Training iter #210: Batch Loss = 0.822358, Accuracy = 0.6176666617393494
    PERFORMANCE ON TEST SET: Batch Loss = 0.8218708038330078, Accuracy = 0.5987515449523926
    Training iter #220: Batch Loss = 0.829910, Accuracy = 0.6129999756813049
    PERFORMANCE ON TEST SET: Batch Loss = 0.8212188482284546, Accuracy = 0.5987515449523926
    Training iter #230: Batch Loss = 0.811814, Accuracy = 0.6303333044052124
    PERFORMANCE ON TEST SET: Batch Loss = 0.82057785987854, Accuracy = 0.597503125667572
    Training iter #240: Batch Loss = 0.800283, Accuracy = 0.6353333592414856
    PERFORMANCE ON TEST SET: Batch Loss = 0.8199676275253296, Accuracy = 0.5977528095245361
    Training iter #250: Batch Loss = 0.806871, Accuracy = 0.621666669845581
    PERFORMANCE ON TEST SET: Batch Loss = 0.8187385201454163, Accuracy = 0.5987515449523926
    Training iter #260: Batch Loss = 0.809453, Accuracy = 0.6186666488647461
    PERFORMANCE ON TEST SET: Batch Loss = 0.8178005814552307, Accuracy = 0.6000000238418579
    Training iter #270: Batch Loss = 0.821572, Accuracy = 0.606333315372467
    PERFORMANCE ON TEST SET: Batch Loss = 0.8175872564315796, Accuracy = 0.5990012288093567

    Training iter #500: Batch Loss = 0.825050, Accuracy = 0.6051999926567078
    PERFORMANCE ON TEST SET: Batch Loss = 0.8133077621459961, Accuracy = 0.6037453413009644
    Training iter #1000: Batch Loss = 0.814610, Accuracy = 0.607200026512146
    PERFORMANCE ON TEST SET: Batch Loss = 0.7948320508003235, Accuracy = 0.6152309775352478
    Training iter #1500: Batch Loss = 0.780438, Accuracy = 0.6136000156402588
    PERFORMANCE ON TEST SET: Batch Loss = 0.7731343507766724, Accuracy = 0.6307116150856018
    Training iter #2000: Batch Loss = 0.774651, Accuracy = 0.6151999831199646
    PERFORMANCE ON TEST SET: Batch Loss = 0.757510244846344, Accuracy = 0.6436953544616699
    =================================================
    0.00216
    0.0001
    Training iter #2500: Batch Loss = 0.755460, Accuracy = 0.635200023651123
    PERFORMANCE ON TEST SET: Batch Loss = 0.7465106844902039, Accuracy = 0.6461922526359558
    Training iter #3000: Batch Loss = 0.714036, Accuracy = 0.6571999788284302
    PERFORMANCE ON TEST SET: Batch Loss = 0.7433488965034485, Accuracy = 0.6454432010650635
    Training iter #3500: Batch Loss = 0.688170, Accuracy = 0.6859999895095825
    PERFORMANCE ON TEST SET: Batch Loss = 0.7374157309532166, Accuracy = 0.6454432010650635
    Training iter #4000: Batch Loss = 0.741185, Accuracy = 0.6416000127792358
    PERFORMANCE ON TEST SET: Batch Loss = 0.7342274188995361, Accuracy = 0.6489388346672058
    Training iter #4500: Batch Loss = 0.732755, Accuracy = 0.6444000005722046
    PERFORMANCE ON TEST SET: Batch Loss = 0.7318059206008911, Accuracy = 0.6539325714111328
    =================================================
    0.000216
    1e-05
    Training iter #5000: Batch Loss = 0.702451, Accuracy = 0.6567999720573425
    PERFORMANCE ON TEST SET: Batch Loss = 0.7260335683822632, Accuracy = 0.6534332036972046
    Training iter #5500: Batch Loss = 0.713943, Accuracy = 0.671999990940094
    PERFORMANCE ON TEST SET: Batch Loss = 0.7275610566139221, Accuracy = 0.6511859893798828
    Training iter #6000: Batch Loss = 0.738860, Accuracy = 0.656000018119812
    PERFORMANCE ON TEST SET: Batch Loss = 0.7375240325927734, Accuracy = 0.652184784412384
    Training iter #6500: Batch Loss = 0.738290, Accuracy = 0.6704000234603882
    PERFORMANCE ON TEST SET: Batch Loss = 0.7226717472076416, Accuracy = 0.6596754193305969
    Training iter #7000: Batch Loss = 0.717244, Accuracy = 0.671999990940094
    PERFORMANCE ON TEST SET: Batch Loss = 0.7188556790351868, Accuracy = 0.662671685218811
    =================================================
    2.16e-05
    1.0000000000000002e-06
    Training iter #7500: Batch Loss = 0.728312, Accuracy = 0.6592000126838684
    PERFORMANCE ON TEST SET: Batch Loss = 0.7187650799751282, Accuracy = 0.6641697883605957
    Training iter #8000: Batch Loss = 0.723907, Accuracy = 0.6628000140190125
    PERFORMANCE ON TEST SET: Batch Loss = 0.7304818630218506, Accuracy = 0.6631710529327393
    Training iter #8500: Batch Loss = 0.672251, Accuracy = 0.6904000043869019
    PERFORMANCE ON TEST SET: Batch Loss = 0.715739369392395, Accuracy = 0.6669163703918457
    Training iter #9000: Batch Loss = 0.647901, Accuracy = 0.7084000110626221
    PERFORMANCE ON TEST SET: Batch Loss = 0.719252347946167, Accuracy = 0.6679151058197021
    Training iter #9500: Batch Loss = 0.691495, Accuracy = 0.6859999895095825
    PERFORMANCE ON TEST SET: Batch Loss = 0.7128927707672119, Accuracy = 0.6694132089614868
    =================================================
    2.16e-06
    1.0000000000000002e-07
    Training iter #10000: Batch Loss = 0.695144, Accuracy = 0.6704000234603882
    PERFORMANCE ON TEST SET: Batch Loss = 0.7163955569267273, Accuracy = 0.6661673188209534
    Training iter #10500: Batch Loss = 0.697749, Accuracy = 0.66839998960495
    PERFORMANCE ON TEST SET: Batch Loss = 0.7181190252304077, Accuracy = 0.6631710529327393
    Training iter #11000: Batch Loss = 0.689020, Accuracy = 0.6895999908447266
    PERFORMANCE ON TEST SET: Batch Loss = 0.7101808190345764, Accuracy = 0.6749063730239868
    Training iter #11500: Batch Loss = 0.705837, Accuracy = 0.6851999759674072
    PERFORMANCE ON TEST SET: Batch Loss = 0.7095412015914917, Accuracy = 0.6784020066261292
    Training iter #12000: Batch Loss = 0.707522, Accuracy = 0.6872000098228455
    PERFORMANCE ON TEST SET: Batch Loss = 0.7098798751831055, Accuracy = 0.6759051084518433
    =================================================
    2.16e-07
    1.0000000000000002e-08
    Training iter #12500: Batch Loss = 0.697654, Accuracy = 0.6876000165939331
    PERFORMANCE ON TEST SET: Batch Loss = 0.7154601812362671, Accuracy = 0.6719101071357727
    Training iter #13000: Batch Loss = 0.704678, Accuracy = 0.6787999868392944
    PERFORMANCE ON TEST SET: Batch Loss = 0.707797110080719, Accuracy = 0.6821473240852356
    Training iter #13500: Batch Loss = 0.695376, Accuracy = 0.6844000220298767
    PERFORMANCE ON TEST SET: Batch Loss = 0.7044758796691895, Accuracy = 0.6818976402282715
    Training iter #14000: Batch Loss = 0.645093, Accuracy = 0.7131999731063843
    PERFORMANCE ON TEST SET: Batch Loss = 0.7041301131248474, Accuracy = 0.6836454272270203
    Training iter #14500: Batch Loss = 0.636284, Accuracy = 0.7107999920845032
    PERFORMANCE ON TEST SET: Batch Loss = 0.7164897918701172, Accuracy = 0.6789013743400574
    =================================================
    2.16e-08
    1.0000000000000003e-09
    Training iter #15000: Batch Loss = 0.688199, Accuracy = 0.6916000247001648
    PERFORMANCE ON TEST SET: Batch Loss = 0.7065286040306091, Accuracy = 0.6843944787979126
    Training iter #15500: Batch Loss = 0.669913, Accuracy = 0.6980000138282776
    PERFORMANCE ON TEST SET: Batch Loss = 0.7027671933174133, Accuracy = 0.680898904800415
    Training iter #16000: Batch Loss = 0.658352, Accuracy = 0.7027999758720398
    PERFORMANCE ON TEST SET: Batch Loss = 0.7021478414535522, Accuracy = 0.6896379590034485
    Training iter #16500: Batch Loss = 0.665983, Accuracy = 0.7103999853134155
    PERFORMANCE ON TEST SET: Batch Loss = 0.7007059454917908, Accuracy = 0.6951310634613037
    Training iter #17000: Batch Loss = 0.693359, Accuracy = 0.6988000273704529
    PERFORMANCE ON TEST SET: Batch Loss = 0.7025511264801025, Accuracy = 0.6961298584938049
    =================================================
    2.16e-09
    1.0000000000000003e-10
    Training iter #17500: Batch Loss = 0.679996, Accuracy = 0.6984000205993652
    PERFORMANCE ON TEST SET: Batch Loss = 0.71058189868927, Accuracy = 0.6853932738304138
    Training iter #18000: Batch Loss = 0.683246, Accuracy = 0.6940000057220459
    PERFORMANCE ON TEST SET: Batch Loss = 0.72711580991745, Accuracy = 0.6791510581970215
    Training iter #18500: Batch Loss = 0.679159, Accuracy = 0.6959999799728394
    PERFORMANCE ON TEST SET: Batch Loss = 0.7019215822219849, Accuracy = 0.6958801746368408
    Training iter #19000: Batch Loss = 0.671984, Accuracy = 0.6995999813079834
    PERFORMANCE ON TEST SET: Batch Loss = 0.7006646990776062, Accuracy = 0.698626697063446
    Training iter #19500: Batch Loss = 0.640010, Accuracy = 0.7200000286102295
    PERFORMANCE ON TEST SET: Batch Loss = 0.6955743432044983, Accuracy = 0.6983770132064819
    =================================================
    2.16e-10
    1.0000000000000003e-11
    Training iter #20000: Batch Loss = 0.611544, Accuracy = 0.7351999878883362
    PERFORMANCE ON TEST SET: Batch Loss = 0.6992015242576599, Accuracy = 0.7043695449829102
    Training iter #20500: Batch Loss = 0.667043, Accuracy = 0.7095999717712402
    PERFORMANCE ON TEST SET: Batch Loss = 0.696792721748352, Accuracy = 0.7061173319816589
    Training iter #21000: Batch Loss = 0.656293, Accuracy = 0.7164000272750854
    PERFORMANCE ON TEST SET: Batch Loss = 0.7079028487205505, Accuracy = 0.6966292262077332
    Training iter #21500: Batch Loss = 0.640921, Accuracy = 0.7139999866485596
    PERFORMANCE ON TEST SET: Batch Loss = 0.6939886808395386, Accuracy = 0.7046192288398743
    Training iter #22000: Batch Loss = 0.651124, Accuracy = 0.717199981212616
    PERFORMANCE ON TEST SET: Batch Loss = 0.6967673897743225, Accuracy = 0.706367015838623
    =================================================
    2.1600000000000002e-11
    1.0000000000000002e-12
    Training iter #22500: Batch Loss = 0.653100, Accuracy = 0.7203999757766724
    PERFORMANCE ON TEST SET: Batch Loss = 0.693270742893219, Accuracy = 0.7078651785850525
    Training iter #23000: Batch Loss = 0.657673, Accuracy = 0.717199981212616
    PERFORMANCE ON TEST SET: Batch Loss = 0.6962982416152954, Accuracy = 0.704119861125946
    Training iter #23500: Batch Loss = 0.700617, Accuracy = 0.6904000043869019
    PERFORMANCE ON TEST SET: Batch Loss = 0.7090349197387695, Accuracy = 0.6973782777786255
    Training iter #24000: Batch Loss = 0.651257, Accuracy = 0.7192000150680542
    PERFORMANCE ON TEST SET: Batch Loss = 0.6911227703094482, Accuracy = 0.7098626494407654
    Training iter #24500: Batch Loss = 0.652979, Accuracy = 0.72079998254776
    PERFORMANCE ON TEST SET: Batch Loss = 0.692528247833252, Accuracy = 0.7143570780754089
    =================================================
    2.16e-12
    1.0000000000000002e-13
    Training iter #25000: Batch Loss = 0.616090, Accuracy = 0.7436000108718872
    PERFORMANCE ON TEST SET: Batch Loss = 0.6948117613792419, Accuracy = 0.7096129655838013
    Training iter #25500: Batch Loss = 0.589063, Accuracy = 0.7516000270843506
    PERFORMANCE ON TEST SET: Batch Loss = 0.6928555965423584, Accuracy = 0.711860179901123
    Training iter #26000: Batch Loss = 0.638964, Accuracy = 0.7268000245094299
    PERFORMANCE ON TEST SET: Batch Loss = 0.6901557445526123, Accuracy = 0.7173532843589783
    Training iter #26500: Batch Loss = 0.641086, Accuracy = 0.7260000109672546
    PERFORMANCE ON TEST SET: Batch Loss = 0.6944947242736816, Accuracy = 0.7136079668998718
    Training iter #27000: Batch Loss = 0.615000, Accuracy = 0.7423999905586243
    PERFORMANCE ON TEST SET: Batch Loss = 0.6865741014480591, Accuracy = 0.7205992341041565
    =================================================
    2.1600000000000001e-13
    1.0000000000000002e-14
    Training iter #27500: Batch Loss = 0.611174, Accuracy = 0.7472000122070312
    PERFORMANCE ON TEST SET: Batch Loss = 0.6839150786399841, Accuracy = 0.7225967645645142
    Training iter #28000: Batch Loss = 0.638136, Accuracy = 0.7355999946594238
    PERFORMANCE ON TEST SET: Batch Loss = 0.6877487897872925, Accuracy = 0.7203495502471924
    Training iter #28500: Batch Loss = 0.627424, Accuracy = 0.7444000244140625
    PERFORMANCE ON TEST SET: Batch Loss = 0.6832399964332581, Accuracy = 0.72234708070755
    Training iter #29000: Batch Loss = 0.628430, Accuracy = 0.7391999959945679
    PERFORMANCE ON TEST SET: Batch Loss = 0.6910812258720398, Accuracy = 0.7183520793914795
    Training iter #29500: Batch Loss = 0.634209, Accuracy = 0.7383999824523926
    PERFORMANCE ON TEST SET: Batch Loss = 0.7009758353233337, Accuracy = 0.7163545489311218
    =================================================
    2.16e-14
    1e-15
    Training iter #30000: Batch Loss = 0.627723, Accuracy = 0.7444000244140625
    PERFORMANCE ON TEST SET: Batch Loss = 0.6863027811050415, Accuracy = 0.7285892367362976
    Training iter #30500: Batch Loss = 0.595734, Accuracy = 0.7620000243186951
    PERFORMANCE ON TEST SET: Batch Loss = 0.6786280274391174, Accuracy = 0.7288389801979065
    Training iter #31000: Batch Loss = 0.568211, Accuracy = 0.7675999999046326
    PERFORMANCE ON TEST SET: Batch Loss = 0.6821891069412231, Accuracy = 0.7275905013084412
    Training iter #31500: Batch Loss = 0.624895, Accuracy = 0.7376000285148621
    PERFORMANCE ON TEST SET: Batch Loss = 0.6780741810798645, Accuracy = 0.732833981513977
    Training iter #32000: Batch Loss = 0.612694, Accuracy = 0.746399998664856
    PERFORMANCE ON TEST SET: Batch Loss = 0.6757426261901855, Accuracy = 0.7363296151161194
    =================================================
    2.16e-15
    1.0000000000000001e-16
    Training iter #32500: Batch Loss = 0.595000, Accuracy = 0.7612000107765198
    PERFORMANCE ON TEST SET: Batch Loss = 0.6903315782546997, Accuracy = 0.7308364510536194
    Training iter #33000: Batch Loss = 0.589033, Accuracy = 0.7692000269889832
    PERFORMANCE ON TEST SET: Batch Loss = 0.6793196201324463, Accuracy = 0.7360798716545105
    Training iter #33500: Batch Loss = 0.618454, Accuracy = 0.7523999810218811
    PERFORMANCE ON TEST SET: Batch Loss = 0.6783280372619629, Accuracy = 0.7393258213996887
    Training iter #34000: Batch Loss = 0.599179, Accuracy = 0.7648000121116638
    PERFORMANCE ON TEST SET: Batch Loss = 0.6773130297660828, Accuracy = 0.7385767698287964
    Training iter #34500: Batch Loss = 0.624641, Accuracy = 0.7419999837875366
    PERFORMANCE ON TEST SET: Batch Loss = 0.6878896355628967, Accuracy = 0.7375780344009399
    =================================================
    2.16e-16
    1e-17
    Training iter #35000: Batch Loss = 0.613337, Accuracy = 0.7648000121116638
    PERFORMANCE ON TEST SET: Batch Loss = 0.6770281791687012, Accuracy = 0.7448189854621887
    Training iter #35500: Batch Loss = 0.605400, Accuracy = 0.7648000121116638
    PERFORMANCE ON TEST SET: Batch Loss = 0.6708619594573975, Accuracy = 0.7523096203804016
    Training iter #36000: Batch Loss = 0.590622, Accuracy = 0.7675999999046326
    PERFORMANCE ON TEST SET: Batch Loss = 0.6779251098632812, Accuracy = 0.7423220872879028
    Training iter #36500: Batch Loss = 0.564312, Accuracy = 0.7879999876022339
    PERFORMANCE ON TEST SET: Batch Loss = 0.6768258213996887, Accuracy = 0.7470661401748657
    Training iter #37000: Batch Loss = 0.592966, Accuracy = 0.770799994468689
    PERFORMANCE ON TEST SET: Batch Loss = 0.6917567849159241, Accuracy = 0.7365792989730835
    =================================================
    2.1600000000000002e-17
    1e-18
    Training iter #37500: Batch Loss = 0.639897, Accuracy = 0.741599977016449
    PERFORMANCE ON TEST SET: Batch Loss = 0.6790437698364258, Accuracy = 0.7465667724609375
    Training iter #38000: Batch Loss = 0.583993, Accuracy = 0.7731999754905701
    PERFORMANCE ON TEST SET: Batch Loss = 0.6851035356521606, Accuracy = 0.7440699338912964
    Training iter #38500: Batch Loss = 0.569888, Accuracy = 0.7832000255584717
    PERFORMANCE ON TEST SET: Batch Loss = 0.6740916967391968, Accuracy = 0.7520599365234375
    Training iter #39000: Batch Loss = 0.604226, Accuracy = 0.7635999917984009
    PERFORMANCE ON TEST SET: Batch Loss = 0.6679624915122986, Accuracy = 0.7595505714416504
    Training iter #39500: Batch Loss = 0.600842, Accuracy = 0.7748000025749207
    PERFORMANCE ON TEST SET: Batch Loss = 0.6737722754478455, Accuracy = 0.7555555701255798
    =================================================
    2.16e-18
    1.0000000000000001e-19
    Training iter #40000: Batch Loss = 0.588070, Accuracy = 0.774399995803833
    PERFORMANCE ON TEST SET: Batch Loss = 0.6626477241516113, Accuracy = 0.7570536732673645
    Training iter #40500: Batch Loss = 0.597579, Accuracy = 0.7731999754905701
    PERFORMANCE ON TEST SET: Batch Loss = 0.6663970351219177, Accuracy = 0.7573033571243286
    Training iter #41000: Batch Loss = 0.583809, Accuracy = 0.775600016117096
    PERFORMANCE ON TEST SET: Batch Loss = 0.6653241515159607, Accuracy = 0.7665418386459351
    Training iter #41500: Batch Loss = 0.568615, Accuracy = 0.7843999862670898
    PERFORMANCE ON TEST SET: Batch Loss = 0.6795607209205627, Accuracy = 0.748314619064331
    Training iter #42000: Batch Loss = 0.537671, Accuracy = 0.8040000200271606
    PERFORMANCE ON TEST SET: Batch Loss = 0.6718056797981262, Accuracy = 0.7525593042373657
    =================================================
    2.16e-19
    1.0000000000000001e-20
    Training iter #42500: Batch Loss = 0.567536, Accuracy = 0.7847999930381775
    PERFORMANCE ON TEST SET: Batch Loss = 0.65623539686203, Accuracy = 0.7660424709320068
    Training iter #43000: Batch Loss = 0.582168, Accuracy = 0.7724000215530396
    PERFORMANCE ON TEST SET: Batch Loss = 0.6674412488937378, Accuracy = 0.7610486745834351
    Training iter #43500: Batch Loss = 0.565976, Accuracy = 0.7924000024795532
    PERFORMANCE ON TEST SET: Batch Loss = 0.6825441718101501, Accuracy = 0.7553058862686157
    Training iter #44000: Batch Loss = 0.574812, Accuracy = 0.7820000052452087
    PERFORMANCE ON TEST SET: Batch Loss = 0.6686702370643616, Accuracy = 0.766292154788971
    Training iter #44500: Batch Loss = 0.583507, Accuracy = 0.7847999930381775
    PERFORMANCE ON TEST SET: Batch Loss = 0.6611884832382202, Accuracy = 0.7690386772155762
    =================================================
    2.16e-20
    1.0000000000000001e-21
    Training iter #45000: Batch Loss = 0.582836, Accuracy = 0.7811999917030334
    PERFORMANCE ON TEST SET: Batch Loss = 0.6542681455612183, Accuracy = 0.776779055595398
    Training iter #45500: Batch Loss = 0.607225, Accuracy = 0.7652000188827515
    PERFORMANCE ON TEST SET: Batch Loss = 0.6677001118659973, Accuracy = 0.7675405740737915
    Training iter #46000: Batch Loss = 0.616859, Accuracy = 0.7671999931335449
    PERFORMANCE ON TEST SET: Batch Loss = 0.6810206174850464, Accuracy = 0.7588015198707581
    Training iter #46500: Batch Loss = 0.570793, Accuracy = 0.7907999753952026
    PERFORMANCE ON TEST SET: Batch Loss = 0.6666198968887329, Accuracy = 0.776279628276825
    Training iter #47000: Batch Loss = 0.580349, Accuracy = 0.77920001745224
    PERFORMANCE ON TEST SET: Batch Loss = 0.6666780710220337, Accuracy = 0.7742821574211121
    =================================================
    2.16e-21
    1e-22
    Training iter #47500: Batch Loss = 0.535346, Accuracy = 0.8080000281333923
    PERFORMANCE ON TEST SET: Batch Loss = 0.6765244007110596, Accuracy = 0.7707865238189697
    Training iter #48000: Batch Loss = 0.562411, Accuracy = 0.7991999983787537
    PERFORMANCE ON TEST SET: Batch Loss = 0.6715932488441467, Accuracy = 0.7675405740737915
    Training iter #48500: Batch Loss = 0.552190, Accuracy = 0.8083999752998352
    PERFORMANCE ON TEST SET: Batch Loss = 0.6822556257247925, Accuracy = 0.7632958889007568
    Training iter #49000: Batch Loss = 0.568871, Accuracy = 0.7940000295639038
    PERFORMANCE ON TEST SET: Batch Loss = 0.6716148853302002, Accuracy = 0.776779055595398
    Training iter #49500: Batch Loss = 0.559952, Accuracy = 0.7919999957084656
    PERFORMANCE ON TEST SET: Batch Loss = 0.6995131373405457, Accuracy = 0.7538077235221863
    =================================================
    2.1600000000000003e-22
    1.0000000000000001e-23
    Training iter #50000: Batch Loss = 0.569105, Accuracy = 0.7996000051498413
    PERFORMANCE ON TEST SET: Batch Loss = 0.6660012602806091, Accuracy = 0.7780274748802185
    Training iter #50500: Batch Loss = 0.539616, Accuracy = 0.8208000063896179
    PERFORMANCE ON TEST SET: Batch Loss = 0.6588647961616516, Accuracy = 0.7815231084823608
    Training iter #51000: Batch Loss = 0.569443, Accuracy = 0.7991999983787537
    PERFORMANCE ON TEST SET: Batch Loss = 0.6573840975761414, Accuracy = 0.7857677936553955
    Training iter #51500: Batch Loss = 0.555185, Accuracy = 0.8087999820709229
    PERFORMANCE ON TEST SET: Batch Loss = 0.6696431636810303, Accuracy = 0.7742821574211121
    Training iter #52000: Batch Loss = 0.571105, Accuracy = 0.7940000295639038
    PERFORMANCE ON TEST SET: Batch Loss = 0.6589933633804321, Accuracy = 0.7897627949714661
    =================================================
    2.1600000000000003e-23
    1.0000000000000001e-24
    Training iter #52500: Batch Loss = 0.569271, Accuracy = 0.7964000105857849
    PERFORMANCE ON TEST SET: Batch Loss = 0.6855847239494324, Accuracy = 0.7722846269607544
    Training iter #53000: Batch Loss = 0.586872, Accuracy = 0.7784000039100647
    PERFORMANCE ON TEST SET: Batch Loss = 0.690345823764801, Accuracy = 0.7700374722480774
    Training iter #53500: Batch Loss = 0.529707, Accuracy = 0.8227999806404114
    PERFORMANCE ON TEST SET: Batch Loss = 0.6698615550994873, Accuracy = 0.7832708954811096
    Training iter #54000: Batch Loss = 0.569092, Accuracy = 0.7900000214576721
    PERFORMANCE ON TEST SET: Batch Loss = 0.6649947166442871, Accuracy = 0.7880150079727173
    Training iter #54500: Batch Loss = 0.566388, Accuracy = 0.7904000282287598
    PERFORMANCE ON TEST SET: Batch Loss = 0.6969018578529358, Accuracy = 0.7632958889007568
    =================================================
    2.1600000000000005e-24
    1.0000000000000002e-25
    Training iter #55000: Batch Loss = 0.569184, Accuracy = 0.7955999970436096
    PERFORMANCE ON TEST SET: Batch Loss = 0.6748400330543518, Accuracy = 0.7800249457359314
    Training iter #55500: Batch Loss = 0.546087, Accuracy = 0.8223999738693237
    PERFORMANCE ON TEST SET: Batch Loss = 0.653386116027832, Accuracy = 0.7967540621757507
    Training iter #56000: Batch Loss = 0.531786, Accuracy = 0.829200029373169
    PERFORMANCE ON TEST SET: Batch Loss = 0.6639797687530518, Accuracy = 0.7912608981132507
    Training iter #56500: Batch Loss = 0.584991, Accuracy = 0.7911999821662903
    PERFORMANCE ON TEST SET: Batch Loss = 0.6806936264038086, Accuracy = 0.7757802605628967
    Training iter #57000: Batch Loss = 0.591779, Accuracy = 0.7879999876022339
    PERFORMANCE ON TEST SET: Batch Loss = 0.668867826461792, Accuracy = 0.7877652645111084
    =================================================
    2.1600000000000003e-25
    1.0000000000000002e-26
    Training iter #57500: Batch Loss = 0.540489, Accuracy = 0.8240000009536743
    PERFORMANCE ON TEST SET: Batch Loss = 0.6516690254211426, Accuracy = 0.802746593952179
    Training iter #58000: Batch Loss = 0.534895, Accuracy = 0.8227999806404114
    PERFORMANCE ON TEST SET: Batch Loss = 0.6776871681213379, Accuracy = 0.7835205793380737
    Training iter #58500: Batch Loss = 0.509669, Accuracy = 0.8388000130653381
    PERFORMANCE ON TEST SET: Batch Loss = 0.6551820039749146, Accuracy = 0.7952559590339661
    Training iter #59000: Batch Loss = 0.590606, Accuracy = 0.77920001745224
    PERFORMANCE ON TEST SET: Batch Loss = 0.7006279826164246, Accuracy = 0.7627965211868286
    Training iter #59500: Batch Loss = 0.527984, Accuracy = 0.8256000280380249
    PERFORMANCE ON TEST SET: Batch Loss = 0.654113233089447, Accuracy = 0.8039950132369995
    =================================================
    2.1600000000000002e-26
    1.0000000000000002e-27
    Training iter #60000: Batch Loss = 0.549410, Accuracy = 0.8119999766349792
    PERFORMANCE ON TEST SET: Batch Loss = 0.6640188694000244, Accuracy = 0.7937577962875366
    Training iter #60500: Batch Loss = 0.567482, Accuracy = 0.7979999780654907
    PERFORMANCE ON TEST SET: Batch Loss = 0.6707872748374939, Accuracy = 0.7910112142562866
    Training iter #61000: Batch Loss = 0.554849, Accuracy = 0.8112000226974487
    PERFORMANCE ON TEST SET: Batch Loss = 0.6551629900932312, Accuracy = 0.8019974827766418
    Training iter #61500: Batch Loss = 0.622129, Accuracy = 0.7656000256538391
    PERFORMANCE ON TEST SET: Batch Loss = 0.6678410172462463, Accuracy = 0.7900124788284302
    Training iter #62000: Batch Loss = 0.515824, Accuracy = 0.8443999886512756
    PERFORMANCE ON TEST SET: Batch Loss = 0.6489068269729614, Accuracy = 0.8044943809509277
    =================================================
    2.1600000000000003e-27
    1.0000000000000002e-28
    Training iter #62500: Batch Loss = 0.519032, Accuracy = 0.840399980545044
    PERFORMANCE ON TEST SET: Batch Loss = 0.6713894605636597, Accuracy = 0.7935081124305725
    Training iter #63000: Batch Loss = 0.543645, Accuracy = 0.823199987411499
    PERFORMANCE ON TEST SET: Batch Loss = 0.6626157164573669, Accuracy = 0.8019974827766418
    Training iter #63500: Batch Loss = 0.540004, Accuracy = 0.8184000253677368
    PERFORMANCE ON TEST SET: Batch Loss = 0.6536903381347656, Accuracy = 0.8042446970939636
    Training iter #64000: Batch Loss = 0.503839, Accuracy = 0.8464000225067139
    PERFORMANCE ON TEST SET: Batch Loss = 0.6685147881507874, Accuracy = 0.7947565317153931
    Training iter #64500: Batch Loss = 0.525321, Accuracy = 0.8303999900817871
    PERFORMANCE ON TEST SET: Batch Loss = 0.6932826042175293, Accuracy = 0.7760299444198608
    =================================================
    2.16e-28
    1.0000000000000002e-29
    Training iter #65000: Batch Loss = 0.549937, Accuracy = 0.8191999793052673
    PERFORMANCE ON TEST SET: Batch Loss = 0.6539942026138306, Accuracy = 0.8092384338378906
    Training iter #65500: Batch Loss = 0.538498, Accuracy = 0.8203999996185303
    PERFORMANCE ON TEST SET: Batch Loss = 0.7288631796836853, Accuracy = 0.7578027248382568
    Training iter #66000: Batch Loss = 0.553156, Accuracy = 0.8127999901771545
    PERFORMANCE ON TEST SET: Batch Loss = 0.6806886196136475, Accuracy = 0.7827715277671814
    Training iter #66500: Batch Loss = 0.519954, Accuracy = 0.8420000076293945
    PERFORMANCE ON TEST SET: Batch Loss = 0.6504929065704346, Accuracy = 0.8072409629821777
    Training iter #67000: Batch Loss = 0.509730, Accuracy = 0.8503999710083008
    PERFORMANCE ON TEST SET: Batch Loss = 0.6454535722732544, Accuracy = 0.8109862804412842
    =================================================
    2.1600000000000002e-29
    1.0000000000000003e-30
    Training iter #67500: Batch Loss = 0.536366, Accuracy = 0.8295999765396118
    PERFORMANCE ON TEST SET: Batch Loss = 0.661019504070282, Accuracy = 0.802247166633606
    Training iter #68000: Batch Loss = 0.537860, Accuracy = 0.829200029373169
    PERFORMANCE ON TEST SET: Batch Loss = 0.6615339517593384, Accuracy = 0.8042446970939636
    Training iter #68500: Batch Loss = 0.539609, Accuracy = 0.8263999819755554
    PERFORMANCE ON TEST SET: Batch Loss = 0.675279974937439, Accuracy = 0.7947565317153931
    Training iter #69000: Batch Loss = 0.528023, Accuracy = 0.8375999927520752
    PERFORMANCE ON TEST SET: Batch Loss = 0.684288501739502, Accuracy = 0.7865168452262878
    Training iter #69500: Batch Loss = 0.525820, Accuracy = 0.8307999968528748
    PERFORMANCE ON TEST SET: Batch Loss = 0.6930934190750122, Accuracy = 0.7787765264511108
    =================================================
    2.16e-30
    1.0000000000000003e-31
    Training iter #70000: Batch Loss = 0.557976, Accuracy = 0.8148000240325928
    PERFORMANCE ON TEST SET: Batch Loss = 0.6530328392982483, Accuracy = 0.8089887499809265
    Training iter #70500: Batch Loss = 0.576441, Accuracy = 0.8068000078201294
    PERFORMANCE ON TEST SET: Batch Loss = 0.6769931316375732, Accuracy = 0.7947565317153931
    Training iter #71000: Batch Loss = 0.529608, Accuracy = 0.83160001039505
    PERFORMANCE ON TEST SET: Batch Loss = 0.6578515768051147, Accuracy = 0.8129837512969971
    Training iter #71500: Batch Loss = 0.509318, Accuracy = 0.8411999940872192
    PERFORMANCE ON TEST SET: Batch Loss = 0.6508171558380127, Accuracy = 0.8129837512969971
    Training iter #72000: Batch Loss = 0.524992, Accuracy = 0.8356000185012817
    PERFORMANCE ON TEST SET: Batch Loss = 0.6563501358032227, Accuracy = 0.8029962778091431
    =================================================
    2.1600000000000003e-31
    1.0000000000000003e-32
    Training iter #72500: Batch Loss = 0.520006, Accuracy = 0.8392000198364258
    PERFORMANCE ON TEST SET: Batch Loss = 0.6511105298995972, Accuracy = 0.8107365965843201
    Training iter #73000: Batch Loss = 0.505442, Accuracy = 0.853600025177002
    PERFORMANCE ON TEST SET: Batch Loss = 0.6530881524085999, Accuracy = 0.8092384338378906
    Training iter #73500: Batch Loss = 0.644485, Accuracy = 0.7735999822616577
    PERFORMANCE ON TEST SET: Batch Loss = 0.6734681129455566, Accuracy = 0.8042446970939636
    Training iter #74000: Batch Loss = 0.525250, Accuracy = 0.8384000062942505
    PERFORMANCE ON TEST SET: Batch Loss = 0.6551859378814697, Accuracy = 0.8109862804412842
    Training iter #74500: Batch Loss = 0.517598, Accuracy = 0.8399999737739563
    PERFORMANCE ON TEST SET: Batch Loss = 0.6593272686004639, Accuracy = 0.812734067440033
    =================================================
    2.1600000000000003e-32
    1.0000000000000004e-33
    Training iter #75000: Batch Loss = 0.544363, Accuracy = 0.8172000050544739
    PERFORMANCE ON TEST SET: Batch Loss = 0.6750629544258118, Accuracy = 0.8007490634918213
    Training iter #75500: Batch Loss = 0.516808, Accuracy = 0.8379999995231628
    PERFORMANCE ON TEST SET: Batch Loss = 0.6629161834716797, Accuracy = 0.8067415952682495
    Training iter #76000: Batch Loss = 0.536788, Accuracy = 0.828000009059906
    PERFORMANCE ON TEST SET: Batch Loss = 0.6684167981147766, Accuracy = 0.802746593952179
    Training iter #76500: Batch Loss = 0.551132, Accuracy = 0.8119999766349792
    PERFORMANCE ON TEST SET: Batch Loss = 0.7350499629974365, Accuracy = 0.7605493068695068
    Training iter #77000: Batch Loss = 0.518125, Accuracy = 0.8379999995231628
    PERFORMANCE ON TEST SET: Batch Loss = 0.6601676344871521, Accuracy = 0.8052434325218201
    =================================================
    2.1600000000000005e-33
    1.0000000000000004e-34
    Training iter #77500: Batch Loss = 0.527858, Accuracy = 0.8384000062942505
    PERFORMANCE ON TEST SET: Batch Loss = 0.6648092269897461, Accuracy = 0.8067415952682495
    Training iter #78000: Batch Loss = 0.524066, Accuracy = 0.83160001039505
    PERFORMANCE ON TEST SET: Batch Loss = 0.6546053886413574, Accuracy = 0.8117353320121765
    Training iter #78500: Batch Loss = 0.501700, Accuracy = 0.8528000116348267
    PERFORMANCE ON TEST SET: Batch Loss = 0.6609326004981995, Accuracy = 0.8059924840927124
    Training iter #79000: Batch Loss = 0.495579, Accuracy = 0.8583999872207642
    PERFORMANCE ON TEST SET: Batch Loss = 0.679245114326477, Accuracy = 0.8014981150627136
    Training iter #79500: Batch Loss = 0.528608, Accuracy = 0.8384000062942505
    PERFORMANCE ON TEST SET: Batch Loss = 0.6538097858428955, Accuracy = 0.8194756507873535
    =================================================
    2.1600000000000006e-34
    1.0000000000000004e-35
    Training iter #80000: Batch Loss = 0.499089, Accuracy = 0.8560000061988831
    PERFORMANCE ON TEST SET: Batch Loss = 0.6604946851730347, Accuracy = 0.8144819140434265
    Training iter #80500: Batch Loss = 0.490940, Accuracy = 0.8615999817848206
    PERFORMANCE ON TEST SET: Batch Loss = 0.6419547200202942, Accuracy = 0.8222222328186035
    Training iter #81000: Batch Loss = 0.481990, Accuracy = 0.8636000156402588
    PERFORMANCE ON TEST SET: Batch Loss = 0.6479967832565308, Accuracy = 0.8194756507873535
    Training iter #81500: Batch Loss = 0.539815, Accuracy = 0.8240000009536743
    PERFORMANCE ON TEST SET: Batch Loss = 0.7052075266838074, Accuracy = 0.781772792339325
    Training iter #82000: Batch Loss = 0.541493, Accuracy = 0.8203999996185303
    PERFORMANCE ON TEST SET: Batch Loss = 0.6794268488883972, Accuracy = 0.7980024814605713
    =================================================
    2.1600000000000006e-35
    1.0000000000000004e-36
    Training iter #82500: Batch Loss = 0.509468, Accuracy = 0.8388000130653381
    PERFORMANCE ON TEST SET: Batch Loss = 0.6556344628334045, Accuracy = 0.818227231502533
    Training iter #83000: Batch Loss = 0.535530, Accuracy = 0.8320000171661377
    PERFORMANCE ON TEST SET: Batch Loss = 0.6691509485244751, Accuracy = 0.8057428002357483
    Training iter #83500: Batch Loss = 0.500989, Accuracy = 0.8543999791145325
    PERFORMANCE ON TEST SET: Batch Loss = 0.6882047653198242, Accuracy = 0.7990012764930725
    Training iter #84000: Batch Loss = 0.487877, Accuracy = 0.8659999966621399
    PERFORMANCE ON TEST SET: Batch Loss = 0.6849247217178345, Accuracy = 0.7927590608596802
    Training iter #84500: Batch Loss = 0.507246, Accuracy = 0.8479999899864197
    PERFORMANCE ON TEST SET: Batch Loss = 0.6567555665969849, Accuracy = 0.8199750185012817
    =================================================
    2.1600000000000005e-36
    1.0000000000000005e-37
    Training iter #85000: Batch Loss = 0.522877, Accuracy = 0.8452000021934509
    PERFORMANCE ON TEST SET: Batch Loss = 0.6849155426025391, Accuracy = 0.8024968504905701
    Training iter #85500: Batch Loss = 0.515481, Accuracy = 0.8435999751091003
    PERFORMANCE ON TEST SET: Batch Loss = 0.6705456972122192, Accuracy = 0.8094881176948547
    Training iter #86000: Batch Loss = 0.555353, Accuracy = 0.8123999834060669
    PERFORMANCE ON TEST SET: Batch Loss = 0.7142224311828613, Accuracy = 0.7802746295928955
    Training iter #86500: Batch Loss = 0.495908, Accuracy = 0.8596000075340271
    PERFORMANCE ON TEST SET: Batch Loss = 0.6620432734489441, Accuracy = 0.8122346997261047
    Training iter #87000: Batch Loss = 0.564643, Accuracy = 0.8119999766349792
    PERFORMANCE ON TEST SET: Batch Loss = 0.6883030533790588, Accuracy = 0.7982521653175354
    =================================================
    2.1600000000000005e-37
    1.0000000000000005e-38
    Training iter #87500: Batch Loss = 0.521879, Accuracy = 0.8303999900817871
    PERFORMANCE ON TEST SET: Batch Loss = 0.6782575845718384, Accuracy = 0.8079900145530701
    Training iter #88000: Batch Loss = 0.499717, Accuracy = 0.852400004863739
    PERFORMANCE ON TEST SET: Batch Loss = 0.6524430513381958, Accuracy = 0.8254681825637817
    Training iter #88500: Batch Loss = 0.556648, Accuracy = 0.8119999766349792
    PERFORMANCE ON TEST SET: Batch Loss = 0.6907994747161865, Accuracy = 0.7975031137466431
    Training iter #89000: Batch Loss = 0.541843, Accuracy = 0.8324000239372253
    PERFORMANCE ON TEST SET: Batch Loss = 0.6895750761032104, Accuracy = 0.8014981150627136
    Training iter #89500: Batch Loss = 0.493911, Accuracy = 0.8664000034332275
    PERFORMANCE ON TEST SET: Batch Loss = 0.7751883864402771, Accuracy = 0.7543070912361145
    =================================================
    2.1600000000000006e-38
    1.0000000000000004e-39
    Training iter #90000: Batch Loss = 0.490797, Accuracy = 0.8628000020980835
    PERFORMANCE ON TEST SET: Batch Loss = 0.6737024784088135, Accuracy = 0.815480649471283
    Training iter #90500: Batch Loss = 0.551458, Accuracy = 0.8263999819755554
    PERFORMANCE ON TEST SET: Batch Loss = 0.6610298752784729, Accuracy = 0.8249688148498535
    Training iter #91000: Batch Loss = 0.519748, Accuracy = 0.8396000266075134
    PERFORMANCE ON TEST SET: Batch Loss = 0.6533381938934326, Accuracy = 0.8279650211334229
    Training iter #91500: Batch Loss = 0.497474, Accuracy = 0.86080002784729
    PERFORMANCE ON TEST SET: Batch Loss = 0.6440478563308716, Accuracy = 0.8277153372764587
    Training iter #92000: Batch Loss = 0.495979, Accuracy = 0.8543999791145325
    PERFORMANCE ON TEST SET: Batch Loss = 0.6800777316093445, Accuracy = 0.8084893822669983
    =================================================
    2.1600000000000007e-39
    1.0000000000000003e-40
    Training iter #92500: Batch Loss = 0.536917, Accuracy = 0.8327999711036682
    PERFORMANCE ON TEST SET: Batch Loss = 0.6647095680236816, Accuracy = 0.8194756507873535
    Training iter #93000: Batch Loss = 0.498767, Accuracy = 0.8575999736785889
    PERFORMANCE ON TEST SET: Batch Loss = 0.6972509622573853, Accuracy = 0.7997503280639648
    Training iter #93500: Batch Loss = 0.518971, Accuracy = 0.8348000049591064
    PERFORMANCE ON TEST SET: Batch Loss = 0.6616702079772949, Accuracy = 0.8187265992164612
    Training iter #94000: Batch Loss = 0.512432, Accuracy = 0.8464000225067139
    PERFORMANCE ON TEST SET: Batch Loss = 0.6552411913871765, Accuracy = 0.8252184987068176
    Training iter #94500: Batch Loss = 0.520650, Accuracy = 0.8424000144004822
    PERFORMANCE ON TEST SET: Batch Loss = 0.6727169752120972, Accuracy = 0.8112359642982483
    =================================================
    2.160000000000001e-40
    1.0000000000000004e-41
    Training iter #95000: Batch Loss = 0.494834, Accuracy = 0.8632000088691711
    PERFORMANCE ON TEST SET: Batch Loss = 0.6535991430282593, Accuracy = 0.8252184987068176
    Training iter #95500: Batch Loss = 0.478122, Accuracy = 0.8704000115394592
    PERFORMANCE ON TEST SET: Batch Loss = 0.6568895578384399, Accuracy = 0.8332085013389587
    Training iter #96000: Batch Loss = 0.515483, Accuracy = 0.8492000102996826
    PERFORMANCE ON TEST SET: Batch Loss = 0.6746082901954651, Accuracy = 0.8172284364700317
    Training iter #96500: Batch Loss = 0.529833, Accuracy = 0.8307999968528748
    PERFORMANCE ON TEST SET: Batch Loss = 0.6877593398094177, Accuracy = 0.8024968504905701
    Training iter #97000: Batch Loss = 0.698923, Accuracy = 0.7444000244140625
    PERFORMANCE ON TEST SET: Batch Loss = 0.7520477771759033, Accuracy = 0.7690386772155762
    =================================================
    2.1600000000000008e-41
    1.0000000000000004e-42
    Training iter #97500: Batch Loss = 0.552682, Accuracy = 0.8208000063896179
    PERFORMANCE ON TEST SET: Batch Loss = 0.6551406979560852, Accuracy = 0.8257178664207458
    Training iter #98000: Batch Loss = 0.531230, Accuracy = 0.840399980545044
    PERFORMANCE ON TEST SET: Batch Loss = 0.6653801202774048, Accuracy = 0.8277153372764587
    Training iter #98500: Batch Loss = 0.486381, Accuracy = 0.864799976348877
    PERFORMANCE ON TEST SET: Batch Loss = 0.6562387347221375, Accuracy = 0.8312109708786011
    Training iter #99000: Batch Loss = 0.476910, Accuracy = 0.8672000169754028
    PERFORMANCE ON TEST SET: Batch Loss = 0.649324893951416, Accuracy = 0.8349562883377075
    Training iter #99500: Batch Loss = 0.469605, Accuracy = 0.8776000142097473
    PERFORMANCE ON TEST SET: Batch Loss = 0.6442927718162537, Accuracy = 0.8342072367668152
    =================================================
    2.160000000000001e-42
    1.0000000000000003e-43
    Training iter #100000: Batch Loss = 0.512480, Accuracy = 0.848800003528595
    PERFORMANCE ON TEST SET: Batch Loss = 0.6763448119163513, Accuracy = 0.8142322301864624
    Training iter #100500: Batch Loss = 0.569198, Accuracy = 0.8131999969482422
    PERFORMANCE ON TEST SET: Batch Loss = 0.7648707628250122, Accuracy = 0.7622971534729004
    Training iter #101000: Batch Loss = 0.502843, Accuracy = 0.8568000197410583
    PERFORMANCE ON TEST SET: Batch Loss = 0.6656599640846252, Accuracy = 0.8222222328186035
    Training iter #101500: Batch Loss = 0.505157, Accuracy = 0.8579999804496765
    PERFORMANCE ON TEST SET: Batch Loss = 0.6529954075813293, Accuracy = 0.8339575529098511
    Training iter #102000: Batch Loss = 0.502496, Accuracy = 0.8564000129699707
    PERFORMANCE ON TEST SET: Batch Loss = 0.6551215052604675, Accuracy = 0.8324594497680664
    =================================================
    2.1600000000000007e-43
    1.0000000000000003e-44
    Training iter #102500: Batch Loss = 0.479375, Accuracy = 0.8704000115394592
    PERFORMANCE ON TEST SET: Batch Loss = 0.6519379615783691, Accuracy = 0.830961287021637
    Training iter #103000: Batch Loss = 0.478945, Accuracy = 0.8676000237464905
    PERFORMANCE ON TEST SET: Batch Loss = 0.6506968140602112, Accuracy = 0.8327091336250305
    Training iter #103500: Batch Loss = 0.520221, Accuracy = 0.8507999777793884
    PERFORMANCE ON TEST SET: Batch Loss = 0.7234860062599182, Accuracy = 0.7917603254318237
    Training iter #104000: Batch Loss = 0.481034, Accuracy = 0.8727999925613403
    PERFORMANCE ON TEST SET: Batch Loss = 0.6448845267295837, Accuracy = 0.8374531865119934
    Training iter #104500: Batch Loss = 0.467022, Accuracy = 0.8787999749183655
    PERFORMANCE ON TEST SET: Batch Loss = 0.654399037361145, Accuracy = 0.830961287021637
    =================================================
    2.1600000000000006e-44
    1.0000000000000003e-45
    Training iter #105000: Batch Loss = 0.490332, Accuracy = 0.8640000224113464
    PERFORMANCE ON TEST SET: Batch Loss = 0.6696444749832153, Accuracy = 0.8162297010421753
    Training iter #105500: Batch Loss = 0.516833, Accuracy = 0.8471999764442444
    PERFORMANCE ON TEST SET: Batch Loss = 0.6472347974777222, Accuracy = 0.8349562883377075
    Training iter #106000: Batch Loss = 0.479441, Accuracy = 0.8748000264167786
    PERFORMANCE ON TEST SET: Batch Loss = 0.6687641143798828, Accuracy = 0.8249688148498535
    Training iter #106500: Batch Loss = 0.482581, Accuracy = 0.868399977684021
    PERFORMANCE ON TEST SET: Batch Loss = 0.651081919670105, Accuracy = 0.8374531865119934
    Training iter #107000: Batch Loss = 0.494576, Accuracy = 0.8664000034332275
    PERFORMANCE ON TEST SET: Batch Loss = 0.6757979393005371, Accuracy = 0.8237203359603882
    =================================================
    2.1600000000000006e-45
    1.0000000000000002e-46
    Training iter #107500: Batch Loss = 0.508835, Accuracy = 0.8564000129699707
    PERFORMANCE ON TEST SET: Batch Loss = 0.701384425163269, Accuracy = 0.8019974827766418
    Training iter #108000: Batch Loss = 0.482804, Accuracy = 0.8708000183105469
    PERFORMANCE ON TEST SET: Batch Loss = 0.6488062143325806, Accuracy = 0.8329588174819946
    Training iter #108500: Batch Loss = 0.570780, Accuracy = 0.8059999942779541
    PERFORMANCE ON TEST SET: Batch Loss = 0.6672287583351135, Accuracy = 0.8167290687561035
    Training iter #109000: Batch Loss = 0.501609, Accuracy = 0.8600000143051147
    PERFORMANCE ON TEST SET: Batch Loss = 0.701431155204773, Accuracy = 0.8057428002357483
    Training iter #109500: Batch Loss = 0.473564, Accuracy = 0.878000020980835
    PERFORMANCE ON TEST SET: Batch Loss = 0.6431317925453186, Accuracy = 0.844194769859314
    =================================================
    2.1600000000000006e-46
    1.0000000000000002e-47
    Training iter #110000: Batch Loss = 0.491123, Accuracy = 0.8615999817848206
    PERFORMANCE ON TEST SET: Batch Loss = 0.7031930685043335, Accuracy = 0.7960050106048584
    Training iter #110500: Batch Loss = 0.466903, Accuracy = 0.8787999749183655
    PERFORMANCE ON TEST SET: Batch Loss = 0.6470354199409485, Accuracy = 0.8344569206237793
    Training iter #111000: Batch Loss = 0.474051, Accuracy = 0.8787999749183655
    PERFORMANCE ON TEST SET: Batch Loss = 0.6587664484977722, Accuracy = 0.8339575529098511
    Training iter #111500: Batch Loss = 0.465318, Accuracy = 0.88919997215271
    PERFORMANCE ON TEST SET: Batch Loss = 0.66572505235672, Accuracy = 0.8227216005325317
    Training iter #112000: Batch Loss = 0.479601, Accuracy = 0.8787999749183655
    PERFORMANCE ON TEST SET: Batch Loss = 0.6577253937721252, Accuracy = 0.8404494524002075
    =================================================
    2.1600000000000006e-47
    1.0000000000000003e-48
    Training iter #112500: Batch Loss = 0.488104, Accuracy = 0.8759999871253967
    PERFORMANCE ON TEST SET: Batch Loss = 0.6566717624664307, Accuracy = 0.8419475555419922
    Training iter #113000: Batch Loss = 0.484451, Accuracy = 0.8748000264167786
    PERFORMANCE ON TEST SET: Batch Loss = 0.6448106169700623, Accuracy = 0.8429462909698486
    Training iter #113500: Batch Loss = 0.496705, Accuracy = 0.8632000088691711
    PERFORMANCE ON TEST SET: Batch Loss = 0.6537712216377258, Accuracy = 0.8384519219398499
    Training iter #114000: Batch Loss = 0.466210, Accuracy = 0.8827999830245972
    PERFORMANCE ON TEST SET: Batch Loss = 0.6457043886184692, Accuracy = 0.8416978716850281
    Training iter #114500: Batch Loss = 0.493125, Accuracy = 0.8679999709129333
    PERFORMANCE ON TEST SET: Batch Loss = 0.653386116027832, Accuracy = 0.8392009735107422
    =================================================
    2.1600000000000006e-48
    1.0000000000000003e-49
    Training iter #115000: Batch Loss = 0.476083, Accuracy = 0.8755999803543091
    PERFORMANCE ON TEST SET: Batch Loss = 0.6873400211334229, Accuracy = 0.8119850158691406
    Training iter #115500: Batch Loss = 0.542773, Accuracy = 0.8199999928474426
    PERFORMANCE ON TEST SET: Batch Loss = 0.6588715314865112, Accuracy = 0.8244693875312805
    Training iter #116000: Batch Loss = 0.479472, Accuracy = 0.8744000196456909
    PERFORMANCE ON TEST SET: Batch Loss = 0.6470535397529602, Accuracy = 0.8359550833702087
    Training iter #116500: Batch Loss = 0.479374, Accuracy = 0.876800000667572
    PERFORMANCE ON TEST SET: Batch Loss = 0.6859444975852966, Accuracy = 0.8189762830734253
    Training iter #117000: Batch Loss = 0.469604, Accuracy = 0.8820000290870667
    PERFORMANCE ON TEST SET: Batch Loss = 0.7198197841644287, Accuracy = 0.8004993796348572
    =================================================
    2.1600000000000007e-49
    1.0000000000000004e-50
    Training iter #117500: Batch Loss = 0.476870, Accuracy = 0.8744000196456909
    PERFORMANCE ON TEST SET: Batch Loss = 0.6666823625564575, Accuracy = 0.8327091336250305
    Training iter #118000: Batch Loss = 0.477139, Accuracy = 0.8844000101089478
    PERFORMANCE ON TEST SET: Batch Loss = 0.6470620632171631, Accuracy = 0.8446941375732422
    Training iter #118500: Batch Loss = 0.469249, Accuracy = 0.8863999843597412
    PERFORMANCE ON TEST SET: Batch Loss = 0.6403672695159912, Accuracy = 0.8479400873184204
    Training iter #119000: Batch Loss = 0.479992, Accuracy = 0.8740000128746033
    PERFORMANCE ON TEST SET: Batch Loss = 0.6444868445396423, Accuracy = 0.8454431891441345
    Training iter #119500: Batch Loss = 0.492181, Accuracy = 0.8600000143051147
    PERFORMANCE ON TEST SET: Batch Loss = 0.6415693163871765, Accuracy = 0.8471910357475281
    =================================================
    2.1600000000000007e-50
    1.0000000000000003e-51
    Training iter #120000: Batch Loss = 0.525256, Accuracy = 0.8468000292778015
    PERFORMANCE ON TEST SET: Batch Loss = 0.7127851247787476, Accuracy = 0.8047440648078918
    Training iter #120500: Batch Loss = 0.561231, Accuracy = 0.823199987411499
    PERFORMANCE ON TEST SET: Batch Loss = 0.7282019853591919, Accuracy = 0.789513111114502
    Training iter #121000: Batch Loss = 0.465927, Accuracy = 0.8744000196456909
    PERFORMANCE ON TEST SET: Batch Loss = 0.6678003072738647, Accuracy = 0.8319600224494934
    Training iter #121500: Batch Loss = 0.451552, Accuracy = 0.8939999938011169
    PERFORMANCE ON TEST SET: Batch Loss = 0.6696289777755737, Accuracy = 0.8257178664207458
    Training iter #122000: Batch Loss = 0.475242, Accuracy = 0.8835999965667725
    PERFORMANCE ON TEST SET: Batch Loss = 0.6741013526916504, Accuracy = 0.8292135000228882
    =================================================
    2.1600000000000007e-51
    1.0000000000000004e-52
    Training iter #122500: Batch Loss = 0.456778, Accuracy = 0.8920000195503235
    PERFORMANCE ON TEST SET: Batch Loss = 0.6563262343406677, Accuracy = 0.8374531865119934
    Training iter #123000: Batch Loss = 0.485642, Accuracy = 0.8679999709129333
    PERFORMANCE ON TEST SET: Batch Loss = 0.6820767521858215, Accuracy = 0.815480649471283
    Training iter #123500: Batch Loss = 0.493647, Accuracy = 0.8691999912261963
    PERFORMANCE ON TEST SET: Batch Loss = 0.6520815491676331, Accuracy = 0.8444444537162781
    Training iter #124000: Batch Loss = 0.515505, Accuracy = 0.852400004863739
    PERFORMANCE ON TEST SET: Batch Loss = 0.6431238651275635, Accuracy = 0.8461922407150269
    Training iter #124500: Batch Loss = 0.466187, Accuracy = 0.8859999775886536
    PERFORMANCE ON TEST SET: Batch Loss = 0.6431993246078491, Accuracy = 0.8489388227462769
    =================================================
    2.1600000000000007e-52
    1.0000000000000004e-53
    Training iter #125000: Batch Loss = 0.454148, Accuracy = 0.8895999789237976
    PERFORMANCE ON TEST SET: Batch Loss = 0.6394115686416626, Accuracy = 0.8481897711753845
    Training iter #125500: Batch Loss = 0.491863, Accuracy = 0.871999979019165
    PERFORMANCE ON TEST SET: Batch Loss = 0.6544270515441895, Accuracy = 0.8426966071128845
    Training iter #126000: Batch Loss = 0.477812, Accuracy = 0.8787999749183655
    PERFORMANCE ON TEST SET: Batch Loss = 0.6974741816520691, Accuracy = 0.8114856481552124
    Training iter #126500: Batch Loss = 0.460281, Accuracy = 0.8823999762535095
    PERFORMANCE ON TEST SET: Batch Loss = 0.6434280872344971, Accuracy = 0.8529338240623474
    Training iter #127000: Batch Loss = 0.482153, Accuracy = 0.8632000088691711
    PERFORMANCE ON TEST SET: Batch Loss = 0.7269370555877686, Accuracy = 0.7825218439102173
    =================================================
    2.1600000000000007e-53
    1.0000000000000003e-54
    Training iter #127500: Batch Loss = 0.479683, Accuracy = 0.8763999938964844
    PERFORMANCE ON TEST SET: Batch Loss = 0.7010337114334106, Accuracy = 0.8114856481552124
    Training iter #128000: Batch Loss = 0.454984, Accuracy = 0.8931999802589417
    PERFORMANCE ON TEST SET: Batch Loss = 0.6523404717445374, Accuracy = 0.8459425568580627
    Training iter #128500: Batch Loss = 0.463736, Accuracy = 0.88919997215271
    PERFORMANCE ON TEST SET: Batch Loss = 0.6490777730941772, Accuracy = 0.8476904034614563
    Training iter #129000: Batch Loss = 0.501892, Accuracy = 0.8651999831199646
    PERFORMANCE ON TEST SET: Batch Loss = 0.6524984836578369, Accuracy = 0.8509363532066345
    Training iter #129500: Batch Loss = 0.580357, Accuracy = 0.8116000294685364
    PERFORMANCE ON TEST SET: Batch Loss = 0.6657751202583313, Accuracy = 0.8359550833702087
    =================================================
    2.1600000000000006e-54
    1.0000000000000004e-55
    Training iter #130000: Batch Loss = 0.477873, Accuracy = 0.8795999884605408
    PERFORMANCE ON TEST SET: Batch Loss = 0.6549442410469055, Accuracy = 0.8399500846862793
    Training iter #130500: Batch Loss = 0.499612, Accuracy = 0.8543999791145325
    PERFORMANCE ON TEST SET: Batch Loss = 0.6455782651901245, Accuracy = 0.8514357209205627
    Training iter #131000: Batch Loss = 0.483941, Accuracy = 0.8744000196456909
    PERFORMANCE ON TEST SET: Batch Loss = 0.6770440340042114, Accuracy = 0.8269662857055664
    Training iter #131500: Batch Loss = 0.507096, Accuracy = 0.8583999872207642
    PERFORMANCE ON TEST SET: Batch Loss = 0.8148543238639832, Accuracy = 0.7548065185546875
    Training iter #132000: Batch Loss = 0.479200, Accuracy = 0.8679999709129333
    PERFORMANCE ON TEST SET: Batch Loss = 0.6513737440109253, Accuracy = 0.8451935052871704
    =================================================
    2.1600000000000007e-55
    1.0000000000000004e-56
    Training iter #132500: Batch Loss = 0.440020, Accuracy = 0.9016000032424927
    PERFORMANCE ON TEST SET: Batch Loss = 0.6450732350349426, Accuracy = 0.8536828756332397
    Training iter #133000: Batch Loss = 0.460211, Accuracy = 0.8935999870300293
    PERFORMANCE ON TEST SET: Batch Loss = 0.640078067779541, Accuracy = 0.8556804060935974
    Training iter #133500: Batch Loss = 0.441472, Accuracy = 0.9035999774932861
    PERFORMANCE ON TEST SET: Batch Loss = 0.6409358978271484, Accuracy = 0.8584269881248474
    Training iter #134000: Batch Loss = 0.464890, Accuracy = 0.8844000101089478
    PERFORMANCE ON TEST SET: Batch Loss = 0.6512279510498047, Accuracy = 0.846441924571991
    Training iter #134500: Batch Loss = 0.484631, Accuracy = 0.8772000074386597
    PERFORMANCE ON TEST SET: Batch Loss = 0.656691312789917, Accuracy = 0.8499375581741333
    =================================================
    2.1600000000000006e-56
    1.0000000000000004e-57
    Training iter #135000: Batch Loss = 0.488942, Accuracy = 0.873199999332428
    PERFORMANCE ON TEST SET: Batch Loss = 0.650661826133728, Accuracy = 0.8489388227462769
    Training iter #135500: Batch Loss = 0.449469, Accuracy = 0.8999999761581421
    PERFORMANCE ON TEST SET: Batch Loss = 0.6376165747642517, Accuracy = 0.8576778769493103
    Training iter #136000: Batch Loss = 0.489980, Accuracy = 0.8640000224113464
    PERFORMANCE ON TEST SET: Batch Loss = 0.7159780263900757, Accuracy = 0.800000011920929
    Training iter #136500: Batch Loss = 0.472072, Accuracy = 0.8831999897956848
    PERFORMANCE ON TEST SET: Batch Loss = 0.6531903743743896, Accuracy = 0.8481897711753845
    Training iter #137000: Batch Loss = 0.479562, Accuracy = 0.8736000061035156
    PERFORMANCE ON TEST SET: Batch Loss = 0.6905461549758911, Accuracy = 0.8207240700721741
    =================================================
    2.1600000000000005e-57
    1.0000000000000004e-58
    Training iter #137500: Batch Loss = 0.482066, Accuracy = 0.8708000183105469
    PERFORMANCE ON TEST SET: Batch Loss = 0.6611344814300537, Accuracy = 0.8404494524002075
    Training iter #138000: Batch Loss = 0.455211, Accuracy = 0.8895999789237976
    PERFORMANCE ON TEST SET: Batch Loss = 0.6399320960044861, Accuracy = 0.8589263558387756
    Training iter #138500: Batch Loss = 0.551070, Accuracy = 0.8363999724388123
    PERFORMANCE ON TEST SET: Batch Loss = 0.7231252193450928, Accuracy = 0.7937577962875366
    Training iter #139000: Batch Loss = 0.565490, Accuracy = 0.8199999928474426
    PERFORMANCE ON TEST SET: Batch Loss = 0.7143517732620239, Accuracy = 0.7950062155723572
    Training iter #139500: Batch Loss = 0.540614, Accuracy = 0.8388000130653381
    PERFORMANCE ON TEST SET: Batch Loss = 0.7024973630905151, Accuracy = 0.812734067440033
    =================================================
    2.1600000000000006e-58
    1.0000000000000005e-59
    Training iter #140000: Batch Loss = 0.468363, Accuracy = 0.8916000127792358
    PERFORMANCE ON TEST SET: Batch Loss = 0.6664584875106812, Accuracy = 0.8451935052871704
    Training iter #140500: Batch Loss = 0.465285, Accuracy = 0.8944000005722046
    PERFORMANCE ON TEST SET: Batch Loss = 0.6533288359642029, Accuracy = 0.8514357209205627
    Training iter #141000: Batch Loss = 0.646088, Accuracy = 0.7936000227928162
    PERFORMANCE ON TEST SET: Batch Loss = 0.694471001625061, Accuracy = 0.828214704990387
    Training iter #141500: Batch Loss = 0.440477, Accuracy = 0.9028000235557556
    PERFORMANCE ON TEST SET: Batch Loss = 0.6407341361045837, Accuracy = 0.8616729378700256
    Training iter #142000: Batch Loss = 0.477439, Accuracy = 0.8808000087738037
    PERFORMANCE ON TEST SET: Batch Loss = 0.656014621257782, Accuracy = 0.8479400873184204
    =================================================
    2.1600000000000004e-59
    1.0000000000000005e-60
    Training iter #142500: Batch Loss = 0.588579, Accuracy = 0.8059999942779541
    PERFORMANCE ON TEST SET: Batch Loss = 0.828930139541626, Accuracy = 0.7518102526664734
    Training iter #143000: Batch Loss = 0.451714, Accuracy = 0.8935999870300293
    PERFORMANCE ON TEST SET: Batch Loss = 0.6545240879058838, Accuracy = 0.856928825378418
    Training iter #143500: Batch Loss = 0.434969, Accuracy = 0.9052000045776367
    PERFORMANCE ON TEST SET: Batch Loss = 0.6432766318321228, Accuracy = 0.8576778769493103
    Training iter #144000: Batch Loss = 0.456990, Accuracy = 0.8988000154495239
    PERFORMANCE ON TEST SET: Batch Loss = 0.6393046379089355, Accuracy = 0.8609238266944885
    Training iter #144500: Batch Loss = 0.445773, Accuracy = 0.9047999978065491
    PERFORMANCE ON TEST SET: Batch Loss = 0.6635194420814514, Accuracy = 0.8436954021453857
    =================================================
    2.1600000000000005e-60
    1.0000000000000006e-61
    Training iter #145000: Batch Loss = 0.525869, Accuracy = 0.8483999967575073
    PERFORMANCE ON TEST SET: Batch Loss = 0.6900901198387146, Accuracy = 0.8307116031646729
    Training iter #145500: Batch Loss = 0.455140, Accuracy = 0.8992000222206116
    PERFORMANCE ON TEST SET: Batch Loss = 0.6495354175567627, Accuracy = 0.8591760396957397
    Training iter #146000: Batch Loss = 0.456115, Accuracy = 0.901199996471405
    PERFORMANCE ON TEST SET: Batch Loss = 0.6434711217880249, Accuracy = 0.864669144153595
    Training iter #146500: Batch Loss = 0.466671, Accuracy = 0.8912000060081482
    PERFORMANCE ON TEST SET: Batch Loss = 0.6428562998771667, Accuracy = 0.8611735105514526
    Training iter #147000: Batch Loss = 0.442019, Accuracy = 0.9020000100135803
    PERFORMANCE ON TEST SET: Batch Loss = 0.6443480253219604, Accuracy = 0.8604244589805603
    =================================================
    2.1600000000000005e-61
    1.0000000000000005e-62
    Training iter #147500: Batch Loss = 0.461818, Accuracy = 0.8912000060081482
    PERFORMANCE ON TEST SET: Batch Loss = 0.6527995467185974, Accuracy = 0.8514357209205627
    Training iter #148000: Batch Loss = 0.469362, Accuracy = 0.8848000168800354
    PERFORMANCE ON TEST SET: Batch Loss = 0.6561824083328247, Accuracy = 0.8511860370635986
    Training iter #148500: Batch Loss = 0.451530, Accuracy = 0.8980000019073486
    PERFORMANCE ON TEST SET: Batch Loss = 0.6478580236434937, Accuracy = 0.8619226217269897
    Training iter #149000: Batch Loss = 0.657840, Accuracy = 0.7799999713897705
    PERFORMANCE ON TEST SET: Batch Loss = 0.7355902194976807, Accuracy = 0.7992509603500366
    Training iter #149500: Batch Loss = 0.467908, Accuracy = 0.892799973487854
    PERFORMANCE ON TEST SET: Batch Loss = 0.6881867051124573, Accuracy = 0.8334581851959229
    =================================================
    2.1600000000000006e-62
    1.0000000000000005e-63
    Training iter #150000: Batch Loss = 0.462619, Accuracy = 0.8907999992370605
    PERFORMANCE ON TEST SET: Batch Loss = 0.6892973184585571, Accuracy = 0.8297128677368164
    Training iter #150500: Batch Loss = 0.453285, Accuracy = 0.8956000208854675
    PERFORMANCE ON TEST SET: Batch Loss = 0.6634359359741211, Accuracy = 0.8499375581741333
    Training iter #151000: Batch Loss = 0.518039, Accuracy = 0.8611999750137329
    PERFORMANCE ON TEST SET: Batch Loss = 0.7498172521591187, Accuracy = 0.7952559590339661
    Training iter #151500: Batch Loss = 0.457643, Accuracy = 0.896399974822998
    PERFORMANCE ON TEST SET: Batch Loss = 0.6576567888259888, Accuracy = 0.8539325594902039
    Training iter #152000: Batch Loss = 0.474726, Accuracy = 0.8871999979019165
    PERFORMANCE ON TEST SET: Batch Loss = 0.6712970733642578, Accuracy = 0.8424469232559204
    =================================================
    2.1600000000000007e-63
    1.0000000000000005e-64
    Training iter #152500: Batch Loss = 0.638842, Accuracy = 0.7968000173568726
    PERFORMANCE ON TEST SET: Batch Loss = 0.908103883266449, Accuracy = 0.7205992341041565
    Training iter #153000: Batch Loss = 0.472641, Accuracy = 0.8804000020027161
    PERFORMANCE ON TEST SET: Batch Loss = 0.6544905304908752, Accuracy = 0.8556804060935974
    Training iter #153500: Batch Loss = 0.457782, Accuracy = 0.8944000005722046
    PERFORMANCE ON TEST SET: Batch Loss = 0.6499098539352417, Accuracy = 0.8616729378700256
    Training iter #154000: Batch Loss = 0.452795, Accuracy = 0.8960000276565552
    PERFORMANCE ON TEST SET: Batch Loss = 0.6647880673408508, Accuracy = 0.8529338240623474
    Training iter #154500: Batch Loss = 0.504899, Accuracy = 0.853600025177002
    PERFORMANCE ON TEST SET: Batch Loss = 0.6512835025787354, Accuracy = 0.8534331917762756
    =================================================
    2.1600000000000006e-64
    1.0000000000000006e-65
    Training iter #155000: Batch Loss = 0.467195, Accuracy = 0.8880000114440918
    PERFORMANCE ON TEST SET: Batch Loss = 0.697695255279541, Accuracy = 0.8257178664207458
    Training iter #155500: Batch Loss = 0.442499, Accuracy = 0.9039999842643738
    PERFORMANCE ON TEST SET: Batch Loss = 0.6711421012878418, Accuracy = 0.8451935052871704
    Training iter #156000: Batch Loss = 0.471930, Accuracy = 0.8831999897956848
    PERFORMANCE ON TEST SET: Batch Loss = 0.6700397729873657, Accuracy = 0.8494381904602051
    Training iter #156500: Batch Loss = 0.446784, Accuracy = 0.9031999707221985
    PERFORMANCE ON TEST SET: Batch Loss = 0.6555205583572388, Accuracy = 0.8639200925827026
    Training iter #157000: Batch Loss = 0.479552, Accuracy = 0.88919997215271
    PERFORMANCE ON TEST SET: Batch Loss = 0.6825295090675354, Accuracy = 0.838701605796814
    =================================================
    2.1600000000000004e-65
    1.0000000000000005e-66
    Training iter #157500: Batch Loss = 0.455164, Accuracy = 0.8980000019073486
    PERFORMANCE ON TEST SET: Batch Loss = 0.6763653755187988, Accuracy = 0.8456928730010986
    Training iter #158000: Batch Loss = 0.465705, Accuracy = 0.8871999979019165
    PERFORMANCE ON TEST SET: Batch Loss = 0.6522267460823059, Accuracy = 0.8616729378700256
    Training iter #158500: Batch Loss = 0.482691, Accuracy = 0.8723999857902527
    PERFORMANCE ON TEST SET: Batch Loss = 0.6772302389144897, Accuracy = 0.8456928730010986
    Training iter #159000: Batch Loss = 0.445655, Accuracy = 0.9100000262260437
    PERFORMANCE ON TEST SET: Batch Loss = 0.6560788154602051, Accuracy = 0.8606741428375244
    Training iter #159500: Batch Loss = 0.491023, Accuracy = 0.8679999709129333
    PERFORMANCE ON TEST SET: Batch Loss = 0.6625874042510986, Accuracy = 0.8504369258880615
    =================================================
    2.1600000000000004e-66
    1.0000000000000004e-67
    Training iter #160000: Batch Loss = 0.481118, Accuracy = 0.8700000047683716
    PERFORMANCE ON TEST SET: Batch Loss = 0.7244521379470825, Accuracy = 0.8034956455230713
    Training iter #160500: Batch Loss = 0.448162, Accuracy = 0.9056000113487244
    PERFORMANCE ON TEST SET: Batch Loss = 0.6457957625389099, Accuracy = 0.8634207248687744
    Training iter #161000: Batch Loss = 0.431727, Accuracy = 0.9156000018119812
    PERFORMANCE ON TEST SET: Batch Loss = 0.6513169407844543, Accuracy = 0.862421989440918
    Training iter #161500: Batch Loss = 0.434615, Accuracy = 0.9143999814987183
    PERFORMANCE ON TEST SET: Batch Loss = 0.651602029800415, Accuracy = 0.8616729378700256
    Training iter #162000: Batch Loss = 0.439828, Accuracy = 0.9075999855995178
    PERFORMANCE ON TEST SET: Batch Loss = 0.6553748846054077, Accuracy = 0.8606741428375244
    =================================================
    2.1600000000000005e-67
    1.0000000000000005e-68
    Training iter #162500: Batch Loss = 0.459780, Accuracy = 0.8960000276565552
    PERFORMANCE ON TEST SET: Batch Loss = 0.6468895673751831, Accuracy = 0.8689138293266296
    Training iter #163000: Batch Loss = 0.471913, Accuracy = 0.8848000168800354
    PERFORMANCE ON TEST SET: Batch Loss = 0.6524155139923096, Accuracy = 0.867415726184845
    Training iter #163500: Batch Loss = 0.440094, Accuracy = 0.9052000045776367
    PERFORMANCE ON TEST SET: Batch Loss = 0.6761960983276367, Accuracy = 0.8451935052871704
    Training iter #164000: Batch Loss = 0.453196, Accuracy = 0.8984000086784363
    PERFORMANCE ON TEST SET: Batch Loss = 0.6728078126907349, Accuracy = 0.8481897711753845
    Training iter #164500: Batch Loss = 0.529419, Accuracy = 0.8468000292778015
    PERFORMANCE ON TEST SET: Batch Loss = 0.6809371113777161, Accuracy = 0.8399500846862793
    =================================================
    2.1600000000000006e-68
    1.0000000000000005e-69
    Training iter #165000: Batch Loss = 0.448160, Accuracy = 0.8996000289916992
    PERFORMANCE ON TEST SET: Batch Loss = 0.6522221565246582, Accuracy = 0.8686641454696655
    Training iter #165500: Batch Loss = 0.438919, Accuracy = 0.906000018119812
    PERFORMANCE ON TEST SET: Batch Loss = 0.6541278958320618, Accuracy = 0.8651685118675232
    Training iter #166000: Batch Loss = 0.567108, Accuracy = 0.8339999914169312
    PERFORMANCE ON TEST SET: Batch Loss = 0.6838107705116272, Accuracy = 0.8314606547355652
    Training iter #166500: Batch Loss = 0.460067, Accuracy = 0.8888000249862671
    PERFORMANCE ON TEST SET: Batch Loss = 0.6765801906585693, Accuracy = 0.849188506603241
    Training iter #167000: Batch Loss = 0.439696, Accuracy = 0.9092000126838684
    PERFORMANCE ON TEST SET: Batch Loss = 0.6542661786079407, Accuracy = 0.864669144153595
    =================================================
    2.1600000000000006e-69
    1.0000000000000005e-70
    Training iter #167500: Batch Loss = 0.461074, Accuracy = 0.8952000141143799
    PERFORMANCE ON TEST SET: Batch Loss = 0.6837146282196045, Accuracy = 0.849188506603241
    Training iter #168000: Batch Loss = 0.511425, Accuracy = 0.8619999885559082
    PERFORMANCE ON TEST SET: Batch Loss = 0.699394702911377, Accuracy = 0.8267166018486023
    Training iter #168500: Batch Loss = 0.470075, Accuracy = 0.8916000127792358
    PERFORMANCE ON TEST SET: Batch Loss = 0.6881005167961121, Accuracy = 0.8461922407150269
    Training iter #169000: Batch Loss = 0.495037, Accuracy = 0.8676000237464905
    PERFORMANCE ON TEST SET: Batch Loss = 0.6596801280975342, Accuracy = 0.8626716732978821
    Training iter #169500: Batch Loss = 0.446016, Accuracy = 0.9052000045776367
    PERFORMANCE ON TEST SET: Batch Loss = 0.6824575662612915, Accuracy = 0.844194769859314
    =================================================
    2.1600000000000007e-70
    1.0000000000000005e-71
    Training iter #170000: Batch Loss = 0.471925, Accuracy = 0.8867999911308289
    PERFORMANCE ON TEST SET: Batch Loss = 0.728151798248291, Accuracy = 0.8094881176948547
    Training iter #170500: Batch Loss = 0.466938, Accuracy = 0.8871999979019165
    PERFORMANCE ON TEST SET: Batch Loss = 0.6920248866081238, Accuracy = 0.8379525542259216
    Training iter #171000: Batch Loss = 0.429535, Accuracy = 0.9156000018119812
    PERFORMANCE ON TEST SET: Batch Loss = 0.6518163084983826, Accuracy = 0.8689138293266296
    Training iter #171500: Batch Loss = 0.446460, Accuracy = 0.9056000113487244
    PERFORMANCE ON TEST SET: Batch Loss = 0.659939169883728, Accuracy = 0.8601747751235962
    Training iter #172000: Batch Loss = 0.438985, Accuracy = 0.9115999937057495
    PERFORMANCE ON TEST SET: Batch Loss = 0.6651192903518677, Accuracy = 0.8599250912666321
    =================================================
    2.160000000000001e-71
    1.0000000000000005e-72
    Training iter #172500: Batch Loss = 0.434041, Accuracy = 0.9151999950408936
    PERFORMANCE ON TEST SET: Batch Loss = 0.6638509631156921, Accuracy = 0.8611735105514526
    Training iter #173000: Batch Loss = 0.438589, Accuracy = 0.9075999855995178
    PERFORMANCE ON TEST SET: Batch Loss = 0.6578459739685059, Accuracy = 0.8656679391860962
    Training iter #173500: Batch Loss = 0.456433, Accuracy = 0.9007999897003174
    PERFORMANCE ON TEST SET: Batch Loss = 0.6505355834960938, Accuracy = 0.8704119920730591
    Training iter #174000: Batch Loss = 0.484924, Accuracy = 0.876800000667572
    PERFORMANCE ON TEST SET: Batch Loss = 0.7179298400878906, Accuracy = 0.8254681825637817
    Training iter #174500: Batch Loss = 0.449304, Accuracy = 0.8984000086784363
    PERFORMANCE ON TEST SET: Batch Loss = 0.6654706001281738, Accuracy = 0.8599250912666321
    =================================================
    2.160000000000001e-72
    1.0000000000000005e-73
    Training iter #175000: Batch Loss = 0.566629, Accuracy = 0.83160001039505
    PERFORMANCE ON TEST SET: Batch Loss = 0.7219547033309937, Accuracy = 0.8292135000228882
    Training iter #175500: Batch Loss = 0.462849, Accuracy = 0.8999999761581421
    PERFORMANCE ON TEST SET: Batch Loss = 0.6793650984764099, Accuracy = 0.8516854047775269
    Training iter #176000: Batch Loss = 0.530490, Accuracy = 0.8384000062942505
    PERFORMANCE ON TEST SET: Batch Loss = 0.6906334757804871, Accuracy = 0.8392009735107422
    Training iter #176500: Batch Loss = 0.445667, Accuracy = 0.902400016784668
    PERFORMANCE ON TEST SET: Batch Loss = 0.6560722589492798, Accuracy = 0.8689138293266296
    Training iter #177000: Batch Loss = 0.442621, Accuracy = 0.9079999923706055
    PERFORMANCE ON TEST SET: Batch Loss = 0.6767570376396179, Accuracy = 0.8459425568580627
    =================================================
    2.160000000000001e-73
    1.0000000000000005e-74
    Training iter #177500: Batch Loss = 0.458494, Accuracy = 0.8912000060081482
    PERFORMANCE ON TEST SET: Batch Loss = 0.7429006099700928, Accuracy = 0.8089887499809265
    Training iter #178000: Batch Loss = 0.446210, Accuracy = 0.9103999733924866
    PERFORMANCE ON TEST SET: Batch Loss = 0.6863257884979248, Accuracy = 0.8446941375732422
    Training iter #178500: Batch Loss = 0.456898, Accuracy = 0.8971999883651733
    PERFORMANCE ON TEST SET: Batch Loss = 0.680911123752594, Accuracy = 0.8626716732978821
    Training iter #179000: Batch Loss = 0.471006, Accuracy = 0.8920000195503235
    PERFORMANCE ON TEST SET: Batch Loss = 0.679452657699585, Accuracy = 0.8571785092353821
    Training iter #179500: Batch Loss = 0.511842, Accuracy = 0.8619999885559082
    PERFORMANCE ON TEST SET: Batch Loss = 0.6771259307861328, Accuracy = 0.8496878743171692
    =================================================
    2.160000000000001e-74
    1.0000000000000006e-75
    Training iter #180000: Batch Loss = 0.464928, Accuracy = 0.8863999843597412
    PERFORMANCE ON TEST SET: Batch Loss = 0.6706146001815796, Accuracy = 0.8566791415214539
    Training iter #180500: Batch Loss = 0.449098, Accuracy = 0.8999999761581421
    PERFORMANCE ON TEST SET: Batch Loss = 0.6587190628051758, Accuracy = 0.8716604113578796
    Training iter #181000: Batch Loss = 0.542786, Accuracy = 0.8348000049591064
    PERFORMANCE ON TEST SET: Batch Loss = 0.7321041226387024, Accuracy = 0.8114856481552124
    Training iter #181500: Batch Loss = 0.508144, Accuracy = 0.86080002784729
    PERFORMANCE ON TEST SET: Batch Loss = 0.7265983819961548, Accuracy = 0.8184769153594971
    Training iter #182000: Batch Loss = 0.430386, Accuracy = 0.9164000153541565
    PERFORMANCE ON TEST SET: Batch Loss = 0.6652533411979675, Accuracy = 0.8649188280105591
    =================================================
    2.160000000000001e-75
    1.0000000000000005e-76
    Training iter #182500: Batch Loss = 0.555001, Accuracy = 0.8339999914169312
    PERFORMANCE ON TEST SET: Batch Loss = 0.7038713097572327, Accuracy = 0.8339575529098511
    Training iter #183000: Batch Loss = 1.046999, Accuracy = 0.6687999963760376
    PERFORMANCE ON TEST SET: Batch Loss = 0.9533954858779907, Accuracy = 0.7268414497375488
    Training iter #183500: Batch Loss = 0.635192, Accuracy = 0.8047999739646912
    PERFORMANCE ON TEST SET: Batch Loss = 0.8406072854995728, Accuracy = 0.7573033571243286
    Training iter #184000: Batch Loss = 0.455981, Accuracy = 0.8931999802589417
    PERFORMANCE ON TEST SET: Batch Loss = 0.6765965223312378, Accuracy = 0.8626716732978821
    Training iter #184500: Batch Loss = 0.452805, Accuracy = 0.9016000032424927
    PERFORMANCE ON TEST SET: Batch Loss = 0.6597052216529846, Accuracy = 0.8739076256752014
    =================================================
    2.160000000000001e-76
    1.0000000000000005e-77
    Training iter #185000: Batch Loss = 0.453142, Accuracy = 0.8980000019073486
    PERFORMANCE ON TEST SET: Batch Loss = 0.6752660274505615, Accuracy = 0.8586766719818115
    Training iter #185500: Batch Loss = 0.476974, Accuracy = 0.8812000155448914
    PERFORMANCE ON TEST SET: Batch Loss = 0.7299720644950867, Accuracy = 0.8199750185012817
    Training iter #186000: Batch Loss = 0.640737, Accuracy = 0.7943999767303467
    PERFORMANCE ON TEST SET: Batch Loss = 0.8606889247894287, Accuracy = 0.7493133544921875
    Training iter #186500: Batch Loss = 0.473916, Accuracy = 0.8831999897956848
    PERFORMANCE ON TEST SET: Batch Loss = 0.6664865016937256, Accuracy = 0.8591760396957397
    Training iter #187000: Batch Loss = 0.455034, Accuracy = 0.8960000276565552
    PERFORMANCE ON TEST SET: Batch Loss = 0.6696993708610535, Accuracy = 0.8631710410118103
    =================================================
    2.160000000000001e-77
    1.0000000000000005e-78
    Training iter #187500: Batch Loss = 0.447950, Accuracy = 0.8960000276565552
    PERFORMANCE ON TEST SET: Batch Loss = 0.6794687509536743, Accuracy = 0.8466916084289551
    Training iter #188000: Batch Loss = 0.441052, Accuracy = 0.9067999720573425
    PERFORMANCE ON TEST SET: Batch Loss = 0.6544967889785767, Accuracy = 0.8686641454696655
    Training iter #188500: Batch Loss = 0.492892, Accuracy = 0.8672000169754028
    PERFORMANCE ON TEST SET: Batch Loss = 0.6752805113792419, Accuracy = 0.8521847724914551
    Training iter #189000: Batch Loss = 0.527179, Accuracy = 0.8547999858856201
    PERFORMANCE ON TEST SET: Batch Loss = 0.6865295767784119, Accuracy = 0.846941351890564
    Training iter #189500: Batch Loss = 0.567016, Accuracy = 0.8284000158309937
    PERFORMANCE ON TEST SET: Batch Loss = 0.7777532339096069, Accuracy = 0.8082396984100342
    =================================================
    2.160000000000001e-78
    1.0000000000000004e-79
    Training iter #190000: Batch Loss = 0.475600, Accuracy = 0.8827999830245972
    PERFORMANCE ON TEST SET: Batch Loss = 0.6880692839622498, Accuracy = 0.8516854047775269
    Training iter #190500: Batch Loss = 0.701985, Accuracy = 0.7739999890327454
    PERFORMANCE ON TEST SET: Batch Loss = 0.6858410239219666, Accuracy = 0.8421972393989563
    Training iter #191000: Batch Loss = 0.441243, Accuracy = 0.9100000262260437
    PERFORMANCE ON TEST SET: Batch Loss = 0.6572670936584473, Accuracy = 0.8756554126739502
    Training iter #191500: Batch Loss = 0.430261, Accuracy = 0.9128000140190125
    PERFORMANCE ON TEST SET: Batch Loss = 0.6595166921615601, Accuracy = 0.870162308216095
    Training iter #192000: Batch Loss = 0.460881, Accuracy = 0.9028000235557556
    PERFORMANCE ON TEST SET: Batch Loss = 0.659040629863739, Accuracy = 0.8744069933891296
    =================================================
    2.160000000000001e-79
    1.0000000000000005e-80
    Training iter #192500: Batch Loss = 0.527580, Accuracy = 0.8424000144004822
    PERFORMANCE ON TEST SET: Batch Loss = 0.7118729948997498, Accuracy = 0.841448187828064
    Training iter #193000: Batch Loss = 0.441622, Accuracy = 0.9020000100135803
    PERFORMANCE ON TEST SET: Batch Loss = 0.7223952412605286, Accuracy = 0.8234706521034241
    Training iter #193500: Batch Loss = 0.552847, Accuracy = 0.8348000049591064
    PERFORMANCE ON TEST SET: Batch Loss = 0.703448474407196, Accuracy = 0.8374531865119934
    Training iter #194000: Batch Loss = 0.477042, Accuracy = 0.8784000277519226
    PERFORMANCE ON TEST SET: Batch Loss = 0.7931079268455505, Accuracy = 0.7872658967971802
    Training iter #194500: Batch Loss = 0.425096, Accuracy = 0.9196000099182129
    PERFORMANCE ON TEST SET: Batch Loss = 0.6536878943443298, Accuracy = 0.8746566772460938
    =================================================
    2.160000000000001e-80
    1.0000000000000005e-81
    Training iter #195000: Batch Loss = 0.461556, Accuracy = 0.8899999856948853
    PERFORMANCE ON TEST SET: Batch Loss = 0.677040696144104, Accuracy = 0.8651685118675232
    Training iter #195500: Batch Loss = 0.459962, Accuracy = 0.9043999910354614
    PERFORMANCE ON TEST SET: Batch Loss = 0.7003235816955566, Accuracy = 0.8481897711753845
    Training iter #196000: Batch Loss = 0.433198, Accuracy = 0.9143999814987183
    PERFORMANCE ON TEST SET: Batch Loss = 0.6547677516937256, Accuracy = 0.8756554126739502
    Training iter #196500: Batch Loss = 0.462103, Accuracy = 0.8924000263214111
    PERFORMANCE ON TEST SET: Batch Loss = 0.6959550380706787, Accuracy = 0.8421972393989563
    Training iter #197000: Batch Loss = 0.984963, Accuracy = 0.6899999976158142
    PERFORMANCE ON TEST SET: Batch Loss = 1.2827763557434082, Accuracy = 0.6519351005554199
    =================================================
    2.160000000000001e-81
    1.0000000000000005e-82
    Training iter #197500: Batch Loss = 0.464795, Accuracy = 0.897599995136261
    PERFORMANCE ON TEST SET: Batch Loss = 0.6546644568443298, Accuracy = 0.8739076256752014
    Training iter #198000: Batch Loss = 0.480864, Accuracy = 0.8795999884605408
    PERFORMANCE ON TEST SET: Batch Loss = 0.6673099398612976, Accuracy = 0.864669144153595
    Training iter #198500: Batch Loss = 0.473612, Accuracy = 0.8812000155448914
    PERFORMANCE ON TEST SET: Batch Loss = 0.6816747784614563, Accuracy = 0.8494381904602051
    Training iter #199000: Batch Loss = 0.432354, Accuracy = 0.9124000072479248
    PERFORMANCE ON TEST SET: Batch Loss = 0.6576235890388489, Accuracy = 0.8704119920730591
    Training iter #199500: Batch Loss = 0.437663, Accuracy = 0.9100000262260437
    PERFORMANCE ON TEST SET: Batch Loss = 0.6587314605712891, Accuracy = 0.8679150938987732
    =================================================
    2.160000000000001e-82
    1.0000000000000006e-83
    Training iter #200000: Batch Loss = 0.427534, Accuracy = 0.9192000031471252
    PERFORMANCE ON TEST SET: Batch Loss = 0.6564434766769409, Accuracy = 0.8759050965309143
    Training iter #200500: Batch Loss = 0.427133, Accuracy = 0.9168000221252441
    PERFORMANCE ON TEST SET: Batch Loss = 0.667438805103302, Accuracy = 0.8741573095321655
    Training iter #201000: Batch Loss = 0.449629, Accuracy = 0.9088000059127808
    PERFORMANCE ON TEST SET: Batch Loss = 0.6679164171218872, Accuracy = 0.872908890247345
    Training iter #201500: Batch Loss = 0.444758, Accuracy = 0.9079999923706055
    PERFORMANCE ON TEST SET: Batch Loss = 0.6930480003356934, Accuracy = 0.8446941375732422
    Training iter #202000: Batch Loss = 0.433025, Accuracy = 0.9160000085830688
    PERFORMANCE ON TEST SET: Batch Loss = 0.6510576009750366, Accuracy = 0.8794007301330566
    =================================================
    2.160000000000001e-83
    1.0000000000000006e-84
    Training iter #202500: Batch Loss = 0.422975, Accuracy = 0.9232000112533569
    PERFORMANCE ON TEST SET: Batch Loss = 0.6560213565826416, Accuracy = 0.8759050965309143
    Training iter #203000: Batch Loss = 0.447004, Accuracy = 0.9120000004768372
    PERFORMANCE ON TEST SET: Batch Loss = 0.6559710502624512, Accuracy = 0.8774032592773438
    Training iter #203500: Batch Loss = 0.452953, Accuracy = 0.9007999897003174
    PERFORMANCE ON TEST SET: Batch Loss = 0.6580193638801575, Accuracy = 0.8739076256752014
    Training iter #204000: Batch Loss = 0.501055, Accuracy = 0.8623999953269958
    PERFORMANCE ON TEST SET: Batch Loss = 0.6700800657272339, Accuracy = 0.854681670665741
    Training iter #204500: Batch Loss = 0.430669, Accuracy = 0.9156000018119812
    PERFORMANCE ON TEST SET: Batch Loss = 0.6656785011291504, Accuracy = 0.8649188280105591
    =================================================
    2.160000000000001e-84
    1.0000000000000005e-85
    Training iter #205000: Batch Loss = 0.453549, Accuracy = 0.897599995136261
    PERFORMANCE ON TEST SET: Batch Loss = 0.7187780737876892, Accuracy = 0.8249688148498535
    Training iter #205500: Batch Loss = 0.676401, Accuracy = 0.7860000133514404
    PERFORMANCE ON TEST SET: Batch Loss = 0.8222514390945435, Accuracy = 0.7757802605628967
    Training iter #206000: Batch Loss = 0.482413, Accuracy = 0.8755999803543091
    PERFORMANCE ON TEST SET: Batch Loss = 0.6973458528518677, Accuracy = 0.8501872420310974
    Training iter #206500: Batch Loss = 0.442736, Accuracy = 0.9151999950408936
    PERFORMANCE ON TEST SET: Batch Loss = 0.661292552947998, Accuracy = 0.8781523108482361
    Training iter #207000: Batch Loss = 0.453770, Accuracy = 0.9020000100135803
    PERFORMANCE ON TEST SET: Batch Loss = 0.6577365398406982, Accuracy = 0.8726591467857361
    =================================================
    2.160000000000001e-85
    1.0000000000000006e-86
    Training iter #207500: Batch Loss = 0.434223, Accuracy = 0.9179999828338623
    PERFORMANCE ON TEST SET: Batch Loss = 0.6519043445587158, Accuracy = 0.8794007301330566
    Training iter #208000: Batch Loss = 0.422296, Accuracy = 0.9196000099182129
    PERFORMANCE ON TEST SET: Batch Loss = 0.6516822576522827, Accuracy = 0.8774032592773438
    Training iter #208500: Batch Loss = 0.592100, Accuracy = 0.807200014591217
    PERFORMANCE ON TEST SET: Batch Loss = 0.6915231347084045, Accuracy = 0.849188506603241
    Training iter #209000: Batch Loss = 0.456244, Accuracy = 0.901199996471405
    PERFORMANCE ON TEST SET: Batch Loss = 0.6868458986282349, Accuracy = 0.8509363532066345
    Training iter #209500: Batch Loss = 0.449433, Accuracy = 0.8988000154495239
    PERFORMANCE ON TEST SET: Batch Loss = 0.680070698261261, Accuracy = 0.8614231944084167
    =================================================
    2.160000000000001e-86
    1.0000000000000006e-87
    Training iter #210000: Batch Loss = 0.631037, Accuracy = 0.7947999835014343
    PERFORMANCE ON TEST SET: Batch Loss = 0.877704381942749, Accuracy = 0.7488139867782593
    Training iter #210500: Batch Loss = 0.428296, Accuracy = 0.9196000099182129
    PERFORMANCE ON TEST SET: Batch Loss = 0.6554760932922363, Accuracy = 0.8781523108482361
    Training iter #211000: Batch Loss = 0.445015, Accuracy = 0.9064000248908997
    PERFORMANCE ON TEST SET: Batch Loss = 0.7245768308639526, Accuracy = 0.8344569206237793
    Training iter #211500: Batch Loss = 0.460205, Accuracy = 0.8907999992370605
    PERFORMANCE ON TEST SET: Batch Loss = 0.7913858890533447, Accuracy = 0.7850187420845032
    Training iter #212000: Batch Loss = 0.470323, Accuracy = 0.8916000127792358
    PERFORMANCE ON TEST SET: Batch Loss = 0.6658244132995605, Accuracy = 0.8734082579612732
    =================================================
    2.160000000000001e-87
    1.0000000000000006e-88
    Training iter #212500: Batch Loss = 0.446808, Accuracy = 0.9092000126838684
    PERFORMANCE ON TEST SET: Batch Loss = 0.6557068824768066, Accuracy = 0.8791510462760925
    Training iter #213000: Batch Loss = 0.450146, Accuracy = 0.9071999788284302
    PERFORMANCE ON TEST SET: Batch Loss = 0.6630673408508301, Accuracy = 0.8651685118675232
    Training iter #213500: Batch Loss = 0.421255, Accuracy = 0.9232000112533569
    PERFORMANCE ON TEST SET: Batch Loss = 0.6581072807312012, Accuracy = 0.8734082579612732
    Training iter #214000: Batch Loss = 0.523791, Accuracy = 0.8551999926567078
    PERFORMANCE ON TEST SET: Batch Loss = 0.8025667667388916, Accuracy = 0.7872658967971802
    Training iter #214500: Batch Loss = 0.443107, Accuracy = 0.9088000059127808
    PERFORMANCE ON TEST SET: Batch Loss = 0.6712751388549805, Accuracy = 0.862421989440918
    =================================================
    2.1600000000000007e-88
    1.0000000000000006e-89
    Training iter #215000: Batch Loss = 0.423097, Accuracy = 0.9211999773979187
    PERFORMANCE ON TEST SET: Batch Loss = 0.6609467267990112, Accuracy = 0.8734082579612732
    Training iter #215500: Batch Loss = 0.421716, Accuracy = 0.925599992275238
    PERFORMANCE ON TEST SET: Batch Loss = 0.6596394181251526, Accuracy = 0.8726591467857361
    Training iter #216000: Batch Loss = 0.465291, Accuracy = 0.8903999924659729
    PERFORMANCE ON TEST SET: Batch Loss = 0.7164732813835144, Accuracy = 0.8302122354507446
    Training iter #216500: Batch Loss = 0.422338, Accuracy = 0.9272000193595886
    PERFORMANCE ON TEST SET: Batch Loss = 0.6617941856384277, Accuracy = 0.8761547803878784
    Training iter #217000: Batch Loss = 0.419426, Accuracy = 0.9272000193595886
    PERFORMANCE ON TEST SET: Batch Loss = 0.659917414188385, Accuracy = 0.8736579418182373
    =================================================
    2.1600000000000007e-89
    1.0000000000000006e-90
    Training iter #217500: Batch Loss = 1.045599, Accuracy = 0.6863999962806702
    PERFORMANCE ON TEST SET: Batch Loss = 1.024167776107788, Accuracy = 0.7133582830429077
    Training iter #218000: Batch Loss = 0.446399, Accuracy = 0.9064000248908997
    PERFORMANCE ON TEST SET: Batch Loss = 0.6529519557952881, Accuracy = 0.8774032592773438
    Training iter #218500: Batch Loss = 0.454823, Accuracy = 0.8999999761581421
    PERFORMANCE ON TEST SET: Batch Loss = 0.6685300469398499, Accuracy = 0.8661673069000244
    Training iter #219000: Batch Loss = 0.431516, Accuracy = 0.9147999882698059
    PERFORMANCE ON TEST SET: Batch Loss = 0.6563274264335632, Accuracy = 0.8776529431343079
    Training iter #219500: Batch Loss = 0.449226, Accuracy = 0.9056000113487244
    PERFORMANCE ON TEST SET: Batch Loss = 0.6596351265907288, Accuracy = 0.8759050965309143
    =================================================
    2.1600000000000007e-90
    1.0000000000000007e-91
    Training iter #220000: Batch Loss = 0.453645, Accuracy = 0.8992000222206116
    PERFORMANCE ON TEST SET: Batch Loss = 0.6700319051742554, Accuracy = 0.8656679391860962
    Training iter #220500: Batch Loss = 0.429573, Accuracy = 0.9147999882698059
    PERFORMANCE ON TEST SET: Batch Loss = 0.6577092409133911, Accuracy = 0.8771535754203796
    Training iter #221000: Batch Loss = 0.440485, Accuracy = 0.906000018119812
    PERFORMANCE ON TEST SET: Batch Loss = 0.6672118902206421, Accuracy = 0.8669163584709167
    Training iter #221500: Batch Loss = 0.512849, Accuracy = 0.8604000210762024
    PERFORMANCE ON TEST SET: Batch Loss = 0.6773036122322083, Accuracy = 0.8591760396957397
    Training iter #222000: Batch Loss = 0.435243, Accuracy = 0.9147999882698059
    PERFORMANCE ON TEST SET: Batch Loss = 0.6757386922836304, Accuracy = 0.872409462928772
    =================================================
    2.1600000000000008e-91
    1.0000000000000007e-92
    Training iter #222500: Batch Loss = 0.415057, Accuracy = 0.930400013923645
    PERFORMANCE ON TEST SET: Batch Loss = 0.6652294397354126, Accuracy = 0.8789013624191284
    Training iter #223000: Batch Loss = 0.440036, Accuracy = 0.9120000004768372
    PERFORMANCE ON TEST SET: Batch Loss = 0.6657921671867371, Accuracy = 0.8816479444503784
    Training iter #223500: Batch Loss = 0.448603, Accuracy = 0.9079999923706055
    PERFORMANCE ON TEST SET: Batch Loss = 0.66228848695755, Accuracy = 0.8749063611030579
    Training iter #224000: Batch Loss = 0.440350, Accuracy = 0.9107999801635742
    PERFORMANCE ON TEST SET: Batch Loss = 0.6655464172363281, Accuracy = 0.8741573095321655
    Training iter #224500: Batch Loss = 0.536187, Accuracy = 0.8460000157356262
    PERFORMANCE ON TEST SET: Batch Loss = 0.7488968372344971, Accuracy = 0.8114856481552124
    =================================================
    2.1600000000000008e-92
    1.0000000000000008e-93
    Training iter #225000: Batch Loss = 0.512510, Accuracy = 0.8628000020980835
    PERFORMANCE ON TEST SET: Batch Loss = 0.7457748651504517, Accuracy = 0.82322096824646
    Training iter #225500: Batch Loss = 0.428706, Accuracy = 0.9200000166893005
    PERFORMANCE ON TEST SET: Batch Loss = 0.6668128371238708, Accuracy = 0.8769038915634155
    Training iter #226000: Batch Loss = 0.418504, Accuracy = 0.9272000193595886
    PERFORMANCE ON TEST SET: Batch Loss = 0.6609745025634766, Accuracy = 0.8803995251655579
    Training iter #226500: Batch Loss = 0.515514, Accuracy = 0.857200026512146
    PERFORMANCE ON TEST SET: Batch Loss = 0.823901891708374, Accuracy = 0.7772784233093262
    Training iter #227000: Batch Loss = 0.427387, Accuracy = 0.9232000112533569
    PERFORMANCE ON TEST SET: Batch Loss = 0.6554425954818726, Accuracy = 0.8808988928794861
    =================================================
    2.160000000000001e-93
    1.0000000000000008e-94
    Training iter #227500: Batch Loss = 0.435394, Accuracy = 0.9136000275611877
    PERFORMANCE ON TEST SET: Batch Loss = 0.6677771210670471, Accuracy = 0.8746566772460938
    Training iter #228000: Batch Loss = 0.447356, Accuracy = 0.9043999910354614
    PERFORMANCE ON TEST SET: Batch Loss = 0.7423174381256104, Accuracy = 0.8234706521034241
    Training iter #228500: Batch Loss = 0.437913, Accuracy = 0.9143999814987183
    PERFORMANCE ON TEST SET: Batch Loss = 0.669055163860321, Accuracy = 0.8794007301330566
    Training iter #229000: Batch Loss = 0.440376, Accuracy = 0.9132000207901001
    PERFORMANCE ON TEST SET: Batch Loss = 0.6637049913406372, Accuracy = 0.8784019947052002
    Training iter #229500: Batch Loss = 0.431214, Accuracy = 0.9156000018119812
    PERFORMANCE ON TEST SET: Batch Loss = 0.6602550148963928, Accuracy = 0.8801498413085938
    =================================================
    2.160000000000001e-94
    1.0000000000000008e-95
    Training iter #230000: Batch Loss = 0.416389, Accuracy = 0.9247999787330627
    PERFORMANCE ON TEST SET: Batch Loss = 0.6593992710113525, Accuracy = 0.8794007301330566
    Training iter #230500: Batch Loss = 0.439393, Accuracy = 0.9103999733924866
    PERFORMANCE ON TEST SET: Batch Loss = 0.6641444563865662, Accuracy = 0.8784019947052002
    Training iter #231000: Batch Loss = 0.427707, Accuracy = 0.9251999855041504
    PERFORMANCE ON TEST SET: Batch Loss = 0.6718103289604187, Accuracy = 0.872908890247345
    Training iter #231500: Batch Loss = 0.444004, Accuracy = 0.901199996471405
    PERFORMANCE ON TEST SET: Batch Loss = 0.7143889665603638, Accuracy = 0.8424469232559204
    Training iter #232000: Batch Loss = 0.416112, Accuracy = 0.9276000261306763
    PERFORMANCE ON TEST SET: Batch Loss = 0.6586223840713501, Accuracy = 0.8803995251655579
    =================================================
    2.160000000000001e-95
    1.0000000000000007e-96
    Training iter #232500: Batch Loss = 0.475751, Accuracy = 0.8812000155448914
    PERFORMANCE ON TEST SET: Batch Loss = 0.6983845829963684, Accuracy = 0.8439450860023499
    Training iter #233000: Batch Loss = 0.478237, Accuracy = 0.8755999803543091
    PERFORMANCE ON TEST SET: Batch Loss = 0.6878312826156616, Accuracy = 0.8641697764396667
    Training iter #233500: Batch Loss = 0.472345, Accuracy = 0.8880000114440918
    PERFORMANCE ON TEST SET: Batch Loss = 0.7515136003494263, Accuracy = 0.8144819140434265
    Training iter #234000: Batch Loss = 0.492256, Accuracy = 0.8755999803543091
    PERFORMANCE ON TEST SET: Batch Loss = 0.7515119314193726, Accuracy = 0.8189762830734253
    Training iter #234500: Batch Loss = 0.429253, Accuracy = 0.9211999773979187
    PERFORMANCE ON TEST SET: Batch Loss = 0.6560791730880737, Accuracy = 0.8836454153060913
    =================================================
    2.160000000000001e-96
    1.0000000000000007e-97
    Training iter #235000: Batch Loss = 0.443710, Accuracy = 0.906000018119812
    PERFORMANCE ON TEST SET: Batch Loss = 0.6825094223022461, Accuracy = 0.8589263558387756
    Training iter #235500: Batch Loss = 0.429637, Accuracy = 0.9128000140190125
    PERFORMANCE ON TEST SET: Batch Loss = 0.6605832576751709, Accuracy = 0.8794007301330566
    Training iter #236000: Batch Loss = 0.450793, Accuracy = 0.9031999707221985
    PERFORMANCE ON TEST SET: Batch Loss = 0.669649064540863, Accuracy = 0.8769038915634155
    Training iter #236500: Batch Loss = 0.423588, Accuracy = 0.9223999977111816
    PERFORMANCE ON TEST SET: Batch Loss = 0.6629574298858643, Accuracy = 0.882896363735199
    Training iter #237000: Batch Loss = 0.470360, Accuracy = 0.8948000073432922
    PERFORMANCE ON TEST SET: Batch Loss = 0.734134316444397, Accuracy = 0.8284644484519958
    =================================================
    2.1600000000000012e-97
    1.0000000000000008e-98
    Training iter #237500: Batch Loss = 0.416948, Accuracy = 0.9259999990463257
    PERFORMANCE ON TEST SET: Batch Loss = 0.6792916059494019, Accuracy = 0.8656679391860962
    Training iter #238000: Batch Loss = 0.428165, Accuracy = 0.920799970626831
    PERFORMANCE ON TEST SET: Batch Loss = 0.6651625633239746, Accuracy = 0.8734082579612732
    Training iter #238500: Batch Loss = 0.416428, Accuracy = 0.9283999800682068
    PERFORMANCE ON TEST SET: Batch Loss = 0.6646286249160767, Accuracy = 0.8801498413085938
    Training iter #239000: Batch Loss = 0.452402, Accuracy = 0.8996000289916992
    PERFORMANCE ON TEST SET: Batch Loss = 0.7635267376899719, Accuracy = 0.8129837512969971
    Training iter #239500: Batch Loss = 0.435886, Accuracy = 0.91839998960495
    PERFORMANCE ON TEST SET: Batch Loss = 0.677570641040802, Accuracy = 0.8741573095321655
    =================================================
    2.1600000000000012e-98
    1.0000000000000008e-99
    Training iter #240000: Batch Loss = 0.542521, Accuracy = 0.8420000076293945
    PERFORMANCE ON TEST SET: Batch Loss = 0.681993842124939, Accuracy = 0.8656679391860962
    Training iter #240500: Batch Loss = 0.459181, Accuracy = 0.8907999992370605
    PERFORMANCE ON TEST SET: Batch Loss = 0.6923307776451111, Accuracy = 0.8521847724914551
    Training iter #241000: Batch Loss = 0.456419, Accuracy = 0.8971999883651733
    PERFORMANCE ON TEST SET: Batch Loss = 0.693612813949585, Accuracy = 0.8591760396957397
    Training iter #241500: Batch Loss = 0.437671, Accuracy = 0.9168000221252441
    PERFORMANCE ON TEST SET: Batch Loss = 0.6665165424346924, Accuracy = 0.882896363735199
    Training iter #242000: Batch Loss = 0.421537, Accuracy = 0.9243999719619751
    PERFORMANCE ON TEST SET: Batch Loss = 0.6643074750900269, Accuracy = 0.8836454153060913
    =================================================
    2.1600000000000013e-99
    1.0000000000000008e-100
    Training iter #242500: Batch Loss = 0.419429, Accuracy = 0.9251999855041504
    PERFORMANCE ON TEST SET: Batch Loss = 0.6628028154373169, Accuracy = 0.8813982605934143
    Training iter #243000: Batch Loss = 0.412187, Accuracy = 0.930400013923645
    PERFORMANCE ON TEST SET: Batch Loss = 0.6740400791168213, Accuracy = 0.8664169907569885
    Training iter #243500: Batch Loss = 0.537637, Accuracy = 0.8471999764442444
    PERFORMANCE ON TEST SET: Batch Loss = 0.6972838640213013, Accuracy = 0.8554307222366333
    Training iter #244000: Batch Loss = 0.430466, Accuracy = 0.9164000153541565
    PERFORMANCE ON TEST SET: Batch Loss = 0.6628497838973999, Accuracy = 0.8794007301330566
    Training iter #244500: Batch Loss = 0.497768, Accuracy = 0.8672000169754028
    PERFORMANCE ON TEST SET: Batch Loss = 0.7281547784805298, Accuracy = 0.8317103385925293
    =================================================
    2.1600000000000014e-100
    1.0000000000000008e-101
    Training iter #245000: Batch Loss = 0.447591, Accuracy = 0.9088000059127808
    PERFORMANCE ON TEST SET: Batch Loss = 0.6874423027038574, Accuracy = 0.8699126243591309
    Training iter #245500: Batch Loss = 0.433104, Accuracy = 0.9168000221252441
    PERFORMANCE ON TEST SET: Batch Loss = 0.6626171469688416, Accuracy = 0.882896363735199
    Training iter #246000: Batch Loss = 0.418419, Accuracy = 0.9276000261306763
    PERFORMANCE ON TEST SET: Batch Loss = 0.6624706387519836, Accuracy = 0.885642945766449
    Training iter #246500: Batch Loss = 0.539776, Accuracy = 0.8471999764442444
    PERFORMANCE ON TEST SET: Batch Loss = 0.6857320070266724, Accuracy = 0.8606741428375244
    Training iter #247000: Batch Loss = 0.436370, Accuracy = 0.9151999950408936
    PERFORMANCE ON TEST SET: Batch Loss = 0.6644392609596252, Accuracy = 0.8808988928794861
    =================================================
    2.1600000000000013e-101
    1.000000000000001e-102
    Training iter #247500: Batch Loss = 0.437093, Accuracy = 0.9111999869346619
    PERFORMANCE ON TEST SET: Batch Loss = 0.6644640564918518, Accuracy = 0.8823969960212708
    Training iter #248000: Batch Loss = 0.422352, Accuracy = 0.9243999719619751
    PERFORMANCE ON TEST SET: Batch Loss = 0.6721031069755554, Accuracy = 0.8794007301330566
    Training iter #248500: Batch Loss = 0.410538, Accuracy = 0.9323999881744385
    PERFORMANCE ON TEST SET: Batch Loss = 0.673288106918335, Accuracy = 0.8759050965309143
    Training iter #249000: Batch Loss = 0.432777, Accuracy = 0.9168000221252441
    PERFORMANCE ON TEST SET: Batch Loss = 0.6643295288085938, Accuracy = 0.8831460475921631
    Training iter #249500: Batch Loss = 0.413262, Accuracy = 0.9312000274658203
    PERFORMANCE ON TEST SET: Batch Loss = 0.6616725921630859, Accuracy = 0.8833957314491272
    =================================================
    2.1600000000000013e-102
    1.000000000000001e-103
    Training iter #250000: Batch Loss = 0.419613, Accuracy = 0.9279999732971191
    PERFORMANCE ON TEST SET: Batch Loss = 0.6787343621253967, Accuracy = 0.8711610436439514
    Training iter #250500: Batch Loss = 0.493413, Accuracy = 0.876800000667572
    PERFORMANCE ON TEST SET: Batch Loss = 0.6894918084144592, Accuracy = 0.8651685118675232
    Training iter #251000: Batch Loss = 0.450312, Accuracy = 0.9047999978065491
    PERFORMANCE ON TEST SET: Batch Loss = 0.6632218360900879, Accuracy = 0.8841448426246643
    Training iter #251500: Batch Loss = 0.417700, Accuracy = 0.9296000003814697
    PERFORMANCE ON TEST SET: Batch Loss = 0.6625159978866577, Accuracy = 0.8833957314491272
    Training iter #252000: Batch Loss = 0.415518, Accuracy = 0.925599992275238
    PERFORMANCE ON TEST SET: Batch Loss = 0.6703633069992065, Accuracy = 0.8799000978469849
    =================================================
    2.1600000000000014e-103
    1.000000000000001e-104
    Training iter #252500: Batch Loss = 0.436517, Accuracy = 0.9156000018119812
    PERFORMANCE ON TEST SET: Batch Loss = 0.6698272824287415, Accuracy = 0.880649209022522
    Training iter #253000: Batch Loss = 0.446137, Accuracy = 0.9031999707221985
    PERFORMANCE ON TEST SET: Batch Loss = 0.673366367816925, Accuracy = 0.8796504139900208
    Training iter #253500: Batch Loss = 0.454658, Accuracy = 0.8992000222206116
    PERFORMANCE ON TEST SET: Batch Loss = 0.728009819984436, Accuracy = 0.8352059721946716
    Training iter #254000: Batch Loss = 0.449424, Accuracy = 0.8996000289916992
    PERFORMANCE ON TEST SET: Batch Loss = 0.6839497089385986, Accuracy = 0.8684144616127014
    Training iter #254500: Batch Loss = 0.424600, Accuracy = 0.9240000247955322
    PERFORMANCE ON TEST SET: Batch Loss = 0.6624192595481873, Accuracy = 0.8851435780525208
    =================================================
    2.1600000000000015e-104
    1.000000000000001e-105
    Training iter #255000: Batch Loss = 0.414083, Accuracy = 0.9312000274658203
    PERFORMANCE ON TEST SET: Batch Loss = 0.6651142835617065, Accuracy = 0.882896363735199
    Training iter #255500: Batch Loss = 0.412039, Accuracy = 0.9344000220298767
    PERFORMANCE ON TEST SET: Batch Loss = 0.6651171445846558, Accuracy = 0.8848938941955566
    Training iter #256000: Batch Loss = 0.430105, Accuracy = 0.925599992275238
    PERFORMANCE ON TEST SET: Batch Loss = 0.683589518070221, Accuracy = 0.8711610436439514
    Training iter #256500: Batch Loss = 0.428764, Accuracy = 0.9251999855041504
    PERFORMANCE ON TEST SET: Batch Loss = 0.6719710230827332, Accuracy = 0.8796504139900208
    Training iter #257000: Batch Loss = 0.418507, Accuracy = 0.926800012588501
    PERFORMANCE ON TEST SET: Batch Loss = 0.662976861000061, Accuracy = 0.8868913650512695
    =================================================
    2.1600000000000013e-105
    1.0000000000000009e-106
    Training iter #257500: Batch Loss = 0.436399, Accuracy = 0.9115999937057495
    PERFORMANCE ON TEST SET: Batch Loss = 0.6624554395675659, Accuracy = 0.885642945766449
    Training iter #258000: Batch Loss = 0.428797, Accuracy = 0.9164000153541565
    PERFORMANCE ON TEST SET: Batch Loss = 0.6639539003372192, Accuracy = 0.8878901600837708
    Training iter #258500: Batch Loss = 0.430622, Accuracy = 0.9179999828338623
    PERFORMANCE ON TEST SET: Batch Loss = 0.6715014576911926, Accuracy = 0.8811485767364502
    Training iter #259000: Batch Loss = 0.416932, Accuracy = 0.9272000193595886
    PERFORMANCE ON TEST SET: Batch Loss = 0.6684032082557678, Accuracy = 0.8833957314491272
    Training iter #259500: Batch Loss = 0.418248, Accuracy = 0.9259999990463257
    PERFORMANCE ON TEST SET: Batch Loss = 0.6934674382209778, Accuracy = 0.8589263558387756
    =================================================
    2.1600000000000014e-106
    1.0000000000000009e-107
    Training iter #260000: Batch Loss = 0.426671, Accuracy = 0.9232000112533569
    PERFORMANCE ON TEST SET: Batch Loss = 0.6708556413650513, Accuracy = 0.8801498413085938
    Training iter #260500: Batch Loss = 0.476868, Accuracy = 0.8820000290870667
    PERFORMANCE ON TEST SET: Batch Loss = 0.6820647120475769, Accuracy = 0.872908890247345
    Training iter #261000: Batch Loss = 0.414683, Accuracy = 0.9319999814033508
    PERFORMANCE ON TEST SET: Batch Loss = 0.6655716300010681, Accuracy = 0.8843945264816284
    Training iter #261500: Batch Loss = 0.522489, Accuracy = 0.8575999736785889
    PERFORMANCE ON TEST SET: Batch Loss = 0.719650387763977, Accuracy = 0.8531835079193115
    Training iter #262000: Batch Loss = 0.438740, Accuracy = 0.91839998960495
    PERFORMANCE ON TEST SET: Batch Loss = 0.6725388169288635, Accuracy = 0.8796504139900208
    =================================================
    2.1600000000000016e-107
    1.000000000000001e-108
    Training iter #262500: Batch Loss = 0.541282, Accuracy = 0.853600025177002
    PERFORMANCE ON TEST SET: Batch Loss = 0.7099471092224121, Accuracy = 0.8516854047775269
    Training iter #263000: Batch Loss = 0.460369, Accuracy = 0.8944000005722046
    PERFORMANCE ON TEST SET: Batch Loss = 0.7136740684509277, Accuracy = 0.8474407196044922
    Training iter #263500: Batch Loss = 0.524880, Accuracy = 0.8560000061988831
    PERFORMANCE ON TEST SET: Batch Loss = 0.6869025230407715, Accuracy = 0.8686641454696655
    Training iter #264000: Batch Loss = 0.427552, Accuracy = 0.9187999963760376
    PERFORMANCE ON TEST SET: Batch Loss = 0.672839879989624, Accuracy = 0.8803995251655579
    Training iter #264500: Batch Loss = 0.847045, Accuracy = 0.7300000190734863
    PERFORMANCE ON TEST SET: Batch Loss = 0.8029698133468628, Accuracy = 0.8034956455230713
    =================================================
    2.1600000000000016e-108
    1.0000000000000009e-109
    Training iter #265000: Batch Loss = 0.413569, Accuracy = 0.9319999814033508
    PERFORMANCE ON TEST SET: Batch Loss = 0.6905099153518677, Accuracy = 0.856928825378418
    Training iter #265500: Batch Loss = 0.449710, Accuracy = 0.8999999761581421
    PERFORMANCE ON TEST SET: Batch Loss = 0.6917481422424316, Accuracy = 0.8604244589805603
    Training iter #266000: Batch Loss = 0.408374, Accuracy = 0.9344000220298767
    PERFORMANCE ON TEST SET: Batch Loss = 0.6637896299362183, Accuracy = 0.885642945766449
    Training iter #266500: Batch Loss = 0.426373, Accuracy = 0.91839998960495
    PERFORMANCE ON TEST SET: Batch Loss = 0.6707987785339355, Accuracy = 0.8796504139900208
    Training iter #267000: Batch Loss = 0.566013, Accuracy = 0.8371999859809875
    PERFORMANCE ON TEST SET: Batch Loss = 0.8129798769950867, Accuracy = 0.797253429889679
    =================================================
    2.1600000000000017e-109
    1.000000000000001e-110
    Training iter #267500: Batch Loss = 0.497329, Accuracy = 0.8672000169754028
    PERFORMANCE ON TEST SET: Batch Loss = 0.6749918460845947, Accuracy = 0.8791510462760925
    Training iter #268000: Batch Loss = 0.420996, Accuracy = 0.9279999732971191
    PERFORMANCE ON TEST SET: Batch Loss = 0.6595016717910767, Accuracy = 0.8881398439407349
    Training iter #268500: Batch Loss = 0.408550, Accuracy = 0.9300000071525574
    PERFORMANCE ON TEST SET: Batch Loss = 0.6693432331085205, Accuracy = 0.8833957314491272
    Training iter #269000: Batch Loss = 0.434490, Accuracy = 0.9136000275611877
    PERFORMANCE ON TEST SET: Batch Loss = 0.6625021696090698, Accuracy = 0.8821473121643066
    Training iter #269500: Batch Loss = 0.481391, Accuracy = 0.8795999884605408
    PERFORMANCE ON TEST SET: Batch Loss = 0.697509765625, Accuracy = 0.8556804060935974
    =================================================
    2.1600000000000017e-110
    1.000000000000001e-111
    Training iter #270000: Batch Loss = 0.420866, Accuracy = 0.9251999855041504
    PERFORMANCE ON TEST SET: Batch Loss = 0.6651742458343506, Accuracy = 0.8868913650512695
    Training iter #270500: Batch Loss = 0.407097, Accuracy = 0.9315999746322632
    PERFORMANCE ON TEST SET: Batch Loss = 0.6598714590072632, Accuracy = 0.8868913650512695
    Training iter #271000: Batch Loss = 0.443135, Accuracy = 0.901199996471405
    PERFORMANCE ON TEST SET: Batch Loss = 0.7083337306976318, Accuracy = 0.8511860370635986
    Training iter #271500: Batch Loss = 0.442108, Accuracy = 0.9083999991416931
    PERFORMANCE ON TEST SET: Batch Loss = 0.7465572357177734, Accuracy = 0.8297128677368164
    Training iter #272000: Batch Loss = 0.414374, Accuracy = 0.9296000003814697
    PERFORMANCE ON TEST SET: Batch Loss = 0.6612353920936584, Accuracy = 0.8866416811943054
    =================================================
    2.1600000000000015e-111
    1.000000000000001e-112
    Training iter #272500: Batch Loss = 0.534743, Accuracy = 0.8464000225067139
    PERFORMANCE ON TEST SET: Batch Loss = 0.6956489086151123, Accuracy = 0.8699126243591309
    Training iter #273000: Batch Loss = 0.429876, Accuracy = 0.9243999719619751
    PERFORMANCE ON TEST SET: Batch Loss = 0.674155592918396, Accuracy = 0.8833957314491272
    Training iter #273500: Batch Loss = 0.418691, Accuracy = 0.9247999787330627
    PERFORMANCE ON TEST SET: Batch Loss = 0.6612851023674011, Accuracy = 0.888389527797699
    Training iter #274000: Batch Loss = 0.421482, Accuracy = 0.9228000044822693
    PERFORMANCE ON TEST SET: Batch Loss = 0.71705162525177, Accuracy = 0.8411985039710999
    Training iter #274500: Batch Loss = 0.424823, Accuracy = 0.926800012588501
    PERFORMANCE ON TEST SET: Batch Loss = 0.6653280258178711, Accuracy = 0.8861423134803772
    =================================================
    2.1600000000000014e-112
    1.000000000000001e-113
    Training iter #275000: Batch Loss = 0.415769, Accuracy = 0.9300000071525574
    PERFORMANCE ON TEST SET: Batch Loss = 0.6823351979255676, Accuracy = 0.8754057288169861
    Training iter #275500: Batch Loss = 0.417844, Accuracy = 0.9300000071525574
    PERFORMANCE ON TEST SET: Batch Loss = 0.662121057510376, Accuracy = 0.8876404762268066
    Training iter #276000: Batch Loss = 0.423708, Accuracy = 0.9179999828338623
    PERFORMANCE ON TEST SET: Batch Loss = 0.6861047148704529, Accuracy = 0.8716604113578796
    Training iter #276500: Batch Loss = 0.419836, Accuracy = 0.925599992275238
    PERFORMANCE ON TEST SET: Batch Loss = 0.6688499450683594, Accuracy = 0.8761547803878784
    Training iter #277000: Batch Loss = 0.407301, Accuracy = 0.9363999962806702
    PERFORMANCE ON TEST SET: Batch Loss = 0.6636424660682678, Accuracy = 0.8878901600837708
    =================================================
    2.1600000000000014e-113
    1.000000000000001e-114
    Training iter #277500: Batch Loss = 0.465099, Accuracy = 0.8880000114440918
    PERFORMANCE ON TEST SET: Batch Loss = 0.6854004859924316, Accuracy = 0.8659176230430603
    Training iter #278000: Batch Loss = 0.419978, Accuracy = 0.9312000274658203
    PERFORMANCE ON TEST SET: Batch Loss = 0.6747820973396301, Accuracy = 0.8813982605934143
    Training iter #278500: Batch Loss = 0.447797, Accuracy = 0.9132000207901001
    PERFORMANCE ON TEST SET: Batch Loss = 0.7129371762275696, Accuracy = 0.846441924571991
    Training iter #279000: Batch Loss = 0.418806, Accuracy = 0.9259999990463257
    PERFORMANCE ON TEST SET: Batch Loss = 0.6649070978164673, Accuracy = 0.8916354775428772
    Training iter #279500: Batch Loss = 0.677553, Accuracy = 0.7919999957084656
    PERFORMANCE ON TEST SET: Batch Loss = 0.8220937252044678, Accuracy = 0.786766529083252
    =================================================
    2.1600000000000014e-114
    1.000000000000001e-115
    Training iter #280000: Batch Loss = 0.434375, Accuracy = 0.920799970626831
    PERFORMANCE ON TEST SET: Batch Loss = 0.6880176663398743, Accuracy = 0.8666666746139526
    Training iter #280500: Batch Loss = 0.577975, Accuracy = 0.8223999738693237
    PERFORMANCE ON TEST SET: Batch Loss = 0.8113977313041687, Accuracy = 0.7947565317153931
    Training iter #281000: Batch Loss = 0.467344, Accuracy = 0.8948000073432922
    PERFORMANCE ON TEST SET: Batch Loss = 0.7148627042770386, Accuracy = 0.8379525542259216
    Training iter #281500: Batch Loss = 0.401697, Accuracy = 0.9383999705314636
    PERFORMANCE ON TEST SET: Batch Loss = 0.6610703468322754, Accuracy = 0.8881398439407349
    Training iter #282000: Batch Loss = 0.468375, Accuracy = 0.885200023651123
    PERFORMANCE ON TEST SET: Batch Loss = 0.6815977096557617, Accuracy = 0.8731585741043091
    =================================================
    2.1600000000000013e-115
    1.000000000000001e-116
    Training iter #282500: Batch Loss = 0.476758, Accuracy = 0.8831999897956848
    PERFORMANCE ON TEST SET: Batch Loss = 0.669402003288269, Accuracy = 0.875156044960022
    Training iter #283000: Batch Loss = 0.409848, Accuracy = 0.9332000017166138
    PERFORMANCE ON TEST SET: Batch Loss = 0.6761956810951233, Accuracy = 0.8746566772460938
    Training iter #283500: Batch Loss = 0.552802, Accuracy = 0.8432000279426575
    PERFORMANCE ON TEST SET: Batch Loss = 0.7164466977119446, Accuracy = 0.8571785092353821
    Training iter #284000: Batch Loss = 0.442325, Accuracy = 0.9115999937057495
    PERFORMANCE ON TEST SET: Batch Loss = 0.6669802665710449, Accuracy = 0.8863919973373413
    Training iter #284500: Batch Loss = 0.464591, Accuracy = 0.8952000141143799
    PERFORMANCE ON TEST SET: Batch Loss = 0.6792879700660706, Accuracy = 0.8791510462760925
    =================================================
    2.1600000000000014e-116
    1.0000000000000009e-117
    Training iter #285000: Batch Loss = 0.416991, Accuracy = 0.9232000112533569
    PERFORMANCE ON TEST SET: Batch Loss = 0.6968370676040649, Accuracy = 0.8634207248687744
    Training iter #285500: Batch Loss = 0.412138, Accuracy = 0.9283999800682068
    PERFORMANCE ON TEST SET: Batch Loss = 0.6624937057495117, Accuracy = 0.8911360502243042
    Training iter #286000: Batch Loss = 0.494010, Accuracy = 0.8759999871253967
    PERFORMANCE ON TEST SET: Batch Loss = 0.745922863483429, Accuracy = 0.8262172341346741
    Training iter #286500: Batch Loss = 0.414151, Accuracy = 0.9372000098228455
    PERFORMANCE ON TEST SET: Batch Loss = 0.6717078685760498, Accuracy = 0.8843945264816284
    Training iter #287000: Batch Loss = 0.421063, Accuracy = 0.9247999787330627
    PERFORMANCE ON TEST SET: Batch Loss = 0.6741094589233398, Accuracy = 0.882896363735199
    =================================================
    2.1600000000000014e-117
    1.0000000000000009e-118
    Training iter #287500: Batch Loss = 0.463868, Accuracy = 0.8907999992370605
    PERFORMANCE ON TEST SET: Batch Loss = 0.686017632484436, Accuracy = 0.8636704087257385
    Training iter #288000: Batch Loss = 0.411334, Accuracy = 0.9344000220298767
    PERFORMANCE ON TEST SET: Batch Loss = 0.6641572713851929, Accuracy = 0.8754057288169861
    Training iter #288500: Batch Loss = 0.409333, Accuracy = 0.9332000017166138
    PERFORMANCE ON TEST SET: Batch Loss = 0.6614269614219666, Accuracy = 0.888389527797699
    Training iter #289000: Batch Loss = 0.409529, Accuracy = 0.9368000030517578
    PERFORMANCE ON TEST SET: Batch Loss = 0.6739975810050964, Accuracy = 0.8868913650512695
    Training iter #289500: Batch Loss = 0.433644, Accuracy = 0.920799970626831
    PERFORMANCE ON TEST SET: Batch Loss = 0.6853067874908447, Accuracy = 0.8744069933891296
    =================================================
    2.1600000000000015e-118
    1.0000000000000008e-119
    Training iter #290000: Batch Loss = 0.465793, Accuracy = 0.8960000276565552
    PERFORMANCE ON TEST SET: Batch Loss = 0.6841033697128296, Accuracy = 0.8734082579612732
    Training iter #290500: Batch Loss = 0.408629, Accuracy = 0.9315999746322632
    PERFORMANCE ON TEST SET: Batch Loss = 0.6592303514480591, Accuracy = 0.890636682510376
    Training iter #291000: Batch Loss = 0.411293, Accuracy = 0.9323999881744385
    PERFORMANCE ON TEST SET: Batch Loss = 0.6612773537635803, Accuracy = 0.8908863663673401
    Training iter #291500: Batch Loss = 0.490692, Accuracy = 0.8884000182151794
    PERFORMANCE ON TEST SET: Batch Loss = 0.8367215394973755, Accuracy = 0.7907615303993225
    Training iter #292000: Batch Loss = 0.410345, Accuracy = 0.9363999962806702
    PERFORMANCE ON TEST SET: Batch Loss = 0.6634836196899414, Accuracy = 0.8898876309394836
    =================================================
    2.1600000000000015e-119
    1.0000000000000008e-120
    Training iter #292500: Batch Loss = 0.410528, Accuracy = 0.9296000003814697
    PERFORMANCE ON TEST SET: Batch Loss = 0.6671398282051086, Accuracy = 0.8863919973373413
    Training iter #293000: Batch Loss = 0.475787, Accuracy = 0.8863999843597412
    PERFORMANCE ON TEST SET: Batch Loss = 0.6856344938278198, Accuracy = 0.8726591467857361
    Training iter #293500: Batch Loss = 0.527608, Accuracy = 0.8592000007629395
    PERFORMANCE ON TEST SET: Batch Loss = 0.793582558631897, Accuracy = 0.7980024814605713
    Training iter #294000: Batch Loss = 0.410510, Accuracy = 0.9323999881744385
    PERFORMANCE ON TEST SET: Batch Loss = 0.6760517954826355, Accuracy = 0.8799000978469849
    Training iter #294500: Batch Loss = 0.410862, Accuracy = 0.9359999895095825
    PERFORMANCE ON TEST SET: Batch Loss = 0.6916615962982178, Accuracy = 0.8741573095321655
    =================================================
    2.1600000000000015e-120
    1.0000000000000008e-121
    Training iter #295000: Batch Loss = 0.567473, Accuracy = 0.83160001039505
    PERFORMANCE ON TEST SET: Batch Loss = 0.7591492533683777, Accuracy = 0.8302122354507446
    Training iter #295500: Batch Loss = 0.519925, Accuracy = 0.8528000116348267
    PERFORMANCE ON TEST SET: Batch Loss = 0.6988794803619385, Accuracy = 0.8706616759300232
    Training iter #296000: Batch Loss = 0.412192, Accuracy = 0.926800012588501
    PERFORMANCE ON TEST SET: Batch Loss = 0.6636137366294861, Accuracy = 0.8901373147964478
    Training iter #296500: Batch Loss = 0.490567, Accuracy = 0.8755999803543091
    PERFORMANCE ON TEST SET: Batch Loss = 0.7132596373558044, Accuracy = 0.8521847724914551
    Training iter #297000: Batch Loss = 0.481847, Accuracy = 0.88919997215271
    PERFORMANCE ON TEST SET: Batch Loss = 0.7342300415039062, Accuracy = 0.8297128677368164
    =================================================
    2.1600000000000014e-121
    1.0000000000000009e-122
    Training iter #297500: Batch Loss = 0.417533, Accuracy = 0.9296000003814697
    PERFORMANCE ON TEST SET: Batch Loss = 0.6691902875900269, Accuracy = 0.888389527797699
    Training iter #298000: Batch Loss = 0.399787, Accuracy = 0.9380000233650208
    PERFORMANCE ON TEST SET: Batch Loss = 0.6608549952507019, Accuracy = 0.8881398439407349
    Training iter #298500: Batch Loss = 0.471071, Accuracy = 0.8863999843597412
    PERFORMANCE ON TEST SET: Batch Loss = 0.7181242108345032, Accuracy = 0.844194769859314
    Training iter #299000: Batch Loss = 0.404809, Accuracy = 0.9387999773025513
    PERFORMANCE ON TEST SET: Batch Loss = 0.6585898399353027, Accuracy = 0.8878901600837708
    Training iter #299500: Batch Loss = 0.412048, Accuracy = 0.9348000288009644
    PERFORMANCE ON TEST SET: Batch Loss = 0.6685185432434082, Accuracy = 0.8836454153060913
    =================================================
    2.1600000000000014e-122
    1.0000000000000009e-123
    Training iter #300000: Batch Loss = 0.424772, Accuracy = 0.9251999855041504
    PERFORMANCE ON TEST SET: Batch Loss = 0.7216092348098755, Accuracy = 0.8611735105514526
    Training iter #300500: Batch Loss = 0.463566, Accuracy = 0.8999999761581421
    PERFORMANCE ON TEST SET: Batch Loss = 0.6760550141334534, Accuracy = 0.882896363735199
    Training iter #301000: Batch Loss = 0.423626, Accuracy = 0.926800012588501
    PERFORMANCE ON TEST SET: Batch Loss = 0.6656198501586914, Accuracy = 0.885642945766449
    Training iter #301500: Batch Loss = 0.476427, Accuracy = 0.8867999911308289
    PERFORMANCE ON TEST SET: Batch Loss = 0.6992204189300537, Accuracy = 0.8579275608062744
    Training iter #302000: Batch Loss = 0.407673, Accuracy = 0.9327999949455261
    PERFORMANCE ON TEST SET: Batch Loss = 0.6580668091773987, Accuracy = 0.8926342129707336
    =================================================
    2.1600000000000015e-123
    1.000000000000001e-124
    Training iter #302500: Batch Loss = 0.944763, Accuracy = 0.7008000016212463
    PERFORMANCE ON TEST SET: Batch Loss = 0.9065970778465271, Accuracy = 0.755805253982544
    Training iter #303000: Batch Loss = 0.456934, Accuracy = 0.8960000276565552
    PERFORMANCE ON TEST SET: Batch Loss = 0.6804394721984863, Accuracy = 0.8848938941955566
    Training iter #303500: Batch Loss = 0.401241, Accuracy = 0.9380000233650208
    PERFORMANCE ON TEST SET: Batch Loss = 0.6616968512535095, Accuracy = 0.8888888955116272
    Training iter #304000: Batch Loss = 0.458833, Accuracy = 0.8960000276565552
    PERFORMANCE ON TEST SET: Batch Loss = 0.7242891788482666, Accuracy = 0.8379525542259216
    Training iter #304500: Batch Loss = 0.411513, Accuracy = 0.9312000274658203
    PERFORMANCE ON TEST SET: Batch Loss = 0.6589128971099854, Accuracy = 0.8896379470825195
    =================================================
    2.1600000000000015e-124
    1.0000000000000009e-125
    Training iter #305000: Batch Loss = 0.401089, Accuracy = 0.9419999718666077
    PERFORMANCE ON TEST SET: Batch Loss = 0.6641961336135864, Accuracy = 0.8868913650512695
    Training iter #305500: Batch Loss = 0.437368, Accuracy = 0.9100000262260437
    PERFORMANCE ON TEST SET: Batch Loss = 0.6791653037071228, Accuracy = 0.8851435780525208
    Training iter #306000: Batch Loss = 0.434795, Accuracy = 0.9211999773979187
    PERFORMANCE ON TEST SET: Batch Loss = 0.6732809543609619, Accuracy = 0.8896379470825195
    Training iter #306500: Batch Loss = 0.524733, Accuracy = 0.8564000129699707
    PERFORMANCE ON TEST SET: Batch Loss = 0.7102982997894287, Accuracy = 0.8556804060935974
    Training iter #307000: Batch Loss = 0.561105, Accuracy = 0.8371999859809875
    PERFORMANCE ON TEST SET: Batch Loss = 0.745026707649231, Accuracy = 0.8409488201141357
    =================================================
    2.1600000000000015e-125
    1.000000000000001e-126
    Training iter #307500: Batch Loss = 0.426098, Accuracy = 0.9156000018119812
    PERFORMANCE ON TEST SET: Batch Loss = 0.6839154362678528, Accuracy = 0.8711610436439514
    Training iter #308000: Batch Loss = 0.499138, Accuracy = 0.8751999735832214
    PERFORMANCE ON TEST SET: Batch Loss = 0.9731272459030151, Accuracy = 0.732833981513977
    Training iter #308500: Batch Loss = 0.412621, Accuracy = 0.9308000206947327
    PERFORMANCE ON TEST SET: Batch Loss = 0.6750996708869934, Accuracy = 0.8858926296234131
    Training iter #309000: Batch Loss = 0.469183, Accuracy = 0.8880000114440918
    PERFORMANCE ON TEST SET: Batch Loss = 0.759037435054779, Accuracy = 0.8239700198173523
    Training iter #309500: Batch Loss = 0.406912, Accuracy = 0.930400013923645
    PERFORMANCE ON TEST SET: Batch Loss = 0.6681926250457764, Accuracy = 0.8808988928794861
    =================================================
    2.1600000000000016e-126
    1.0000000000000008e-127
    Training iter #310000: Batch Loss = 0.420058, Accuracy = 0.9243999719619751
    PERFORMANCE ON TEST SET: Batch Loss = 0.6560477018356323, Accuracy = 0.8903869986534119
    Training iter #310500: Batch Loss = 0.434157, Accuracy = 0.9147999882698059
    PERFORMANCE ON TEST SET: Batch Loss = 0.7280665636062622, Accuracy = 0.8369538187980652
    Training iter #311000: Batch Loss = 0.402228, Accuracy = 0.9344000220298767
    PERFORMANCE ON TEST SET: Batch Loss = 0.6668216586112976, Accuracy = 0.8888888955116272
    Training iter #311500: Batch Loss = 0.446962, Accuracy = 0.9136000275611877
    PERFORMANCE ON TEST SET: Batch Loss = 0.7714661359786987, Accuracy = 0.8209737539291382
    Training iter #312000: Batch Loss = 0.412471, Accuracy = 0.9363999962806702
    PERFORMANCE ON TEST SET: Batch Loss = 0.6631233096122742, Accuracy = 0.8873907327651978
    =================================================
    2.1600000000000017e-127
    1.0000000000000008e-128
    Training iter #312500: Batch Loss = 0.465402, Accuracy = 0.8992000222206116
    PERFORMANCE ON TEST SET: Batch Loss = 0.7899659872055054, Accuracy = 0.8062421679496765
    Training iter #313000: Batch Loss = 0.436836, Accuracy = 0.9028000235557556
    PERFORMANCE ON TEST SET: Batch Loss = 0.6664047241210938, Accuracy = 0.8858926296234131
    Training iter #313500: Batch Loss = 0.642228, Accuracy = 0.8032000064849854
    PERFORMANCE ON TEST SET: Batch Loss = 0.8732491731643677, Accuracy = 0.7792758941650391
    Training iter #314000: Batch Loss = 0.464173, Accuracy = 0.8952000141143799
    PERFORMANCE ON TEST SET: Batch Loss = 0.6901714205741882, Accuracy = 0.872409462928772
    Training iter #314500: Batch Loss = 0.398688, Accuracy = 0.9423999786376953
    PERFORMANCE ON TEST SET: Batch Loss = 0.6515886783599854, Accuracy = 0.8921348452568054
    =================================================
    2.1600000000000018e-128
    1.0000000000000009e-129
    Training iter #315000: Batch Loss = 0.407146, Accuracy = 0.9315999746322632
    PERFORMANCE ON TEST SET: Batch Loss = 0.6639540195465088, Accuracy = 0.8803995251655579
    Training iter #315500: Batch Loss = 0.407311, Accuracy = 0.9344000220298767
    PERFORMANCE ON TEST SET: Batch Loss = 0.6608086824417114, Accuracy = 0.8833957314491272
    Training iter #316000: Batch Loss = 0.433048, Accuracy = 0.9151999950408936
    PERFORMANCE ON TEST SET: Batch Loss = 0.6712179183959961, Accuracy = 0.8853932619094849
    Training iter #316500: Batch Loss = 0.406025, Accuracy = 0.930400013923645
    PERFORMANCE ON TEST SET: Batch Loss = 0.6782166957855225, Accuracy = 0.8868913650512695
    Training iter #317000: Batch Loss = 0.421473, Accuracy = 0.9327999949455261
    PERFORMANCE ON TEST SET: Batch Loss = 0.6704660058021545, Accuracy = 0.8918851613998413
    =================================================
    2.160000000000002e-129
    1.0000000000000009e-130
    Training iter #317500: Batch Loss = 0.425819, Accuracy = 0.920799970626831
    PERFORMANCE ON TEST SET: Batch Loss = 0.6616286039352417, Accuracy = 0.885642945766449
    Training iter #318000: Batch Loss = 0.408110, Accuracy = 0.9323999881744385
    PERFORMANCE ON TEST SET: Batch Loss = 0.6544538736343384, Accuracy = 0.8948814272880554
    Training iter #318500: Batch Loss = 0.411688, Accuracy = 0.9308000206947327
    PERFORMANCE ON TEST SET: Batch Loss = 0.659514844417572, Accuracy = 0.8923845291137695
    Training iter #319000: Batch Loss = 0.418088, Accuracy = 0.9319999814033508
    PERFORMANCE ON TEST SET: Batch Loss = 0.6572040915489197, Accuracy = 0.8936329483985901
    Training iter #319500: Batch Loss = 0.429477, Accuracy = 0.9147999882698059
    PERFORMANCE ON TEST SET: Batch Loss = 0.664179265499115, Accuracy = 0.8891385793685913
    =================================================
    2.1600000000000018e-130
    1.0000000000000009e-131
    Training iter #320000: Batch Loss = 0.426642, Accuracy = 0.9128000140190125
    PERFORMANCE ON TEST SET: Batch Loss = 0.6941626071929932, Accuracy = 0.8551810383796692
    Training iter #320500: Batch Loss = 0.512637, Accuracy = 0.8619999885559082
    PERFORMANCE ON TEST SET: Batch Loss = 0.6835692524909973, Accuracy = 0.8716604113578796
    Training iter #321000: Batch Loss = 0.522246, Accuracy = 0.8615999817848206
    PERFORMANCE ON TEST SET: Batch Loss = 0.8519109487533569, Accuracy = 0.7805243730545044
    Training iter #321500: Batch Loss = 0.420284, Accuracy = 0.925599992275238
    PERFORMANCE ON TEST SET: Batch Loss = 0.6747654676437378, Accuracy = 0.8853932619094849
    Training iter #322000: Batch Loss = 0.398521, Accuracy = 0.9416000247001648
    PERFORMANCE ON TEST SET: Batch Loss = 0.6772291660308838, Accuracy = 0.8893882632255554
    =================================================
    2.1600000000000017e-131
    1.000000000000001e-132
    Training iter #322500: Batch Loss = 0.417221, Accuracy = 0.9336000084877014
    PERFORMANCE ON TEST SET: Batch Loss = 0.665117621421814, Accuracy = 0.8926342129707336
    Training iter #323000: Batch Loss = 0.410089, Accuracy = 0.9359999895095825
    PERFORMANCE ON TEST SET: Batch Loss = 0.6543824672698975, Accuracy = 0.8916354775428772
    Training iter #323500: Batch Loss = 0.413356, Accuracy = 0.9300000071525574
    PERFORMANCE ON TEST SET: Batch Loss = 0.6669098734855652, Accuracy = 0.8811485767364502
    Training iter #324000: Batch Loss = 0.411269, Accuracy = 0.9323999881744385
    PERFORMANCE ON TEST SET: Batch Loss = 0.6518366932868958, Accuracy = 0.8916354775428772
    Training iter #324500: Batch Loss = 0.448471, Accuracy = 0.9128000140190125
    PERFORMANCE ON TEST SET: Batch Loss = 0.6705363988876343, Accuracy = 0.882896363735199
    =================================================
    2.1600000000000018e-132
    1.0000000000000008e-133
    Training iter #325000: Batch Loss = 0.405774, Accuracy = 0.9376000165939331
    PERFORMANCE ON TEST SET: Batch Loss = 0.6581498980522156, Accuracy = 0.8946316838264465
    Training iter #325500: Batch Loss = 0.400359, Accuracy = 0.9395999908447266
    PERFORMANCE ON TEST SET: Batch Loss = 0.6606892347335815, Accuracy = 0.8901373147964478
    Training iter #326000: Batch Loss = 0.436593, Accuracy = 0.9111999869346619
    PERFORMANCE ON TEST SET: Batch Loss = 0.6648300886154175, Accuracy = 0.8791510462760925
    Training iter #326500: Batch Loss = 0.412337, Accuracy = 0.9315999746322632
    PERFORMANCE ON TEST SET: Batch Loss = 0.656998336315155, Accuracy = 0.8928838968276978
    Training iter #327000: Batch Loss = 0.403282, Accuracy = 0.9395999908447266
    PERFORMANCE ON TEST SET: Batch Loss = 0.6615923047065735, Accuracy = 0.8956304788589478
    =================================================
    2.1600000000000018e-133
    1.0000000000000009e-134
    Training iter #327500: Batch Loss = 0.607280, Accuracy = 0.8443999886512756
    PERFORMANCE ON TEST SET: Batch Loss = 1.063772201538086, Accuracy = 0.7111111283302307
    Training iter #328000: Batch Loss = 0.546332, Accuracy = 0.8471999764442444
    PERFORMANCE ON TEST SET: Batch Loss = 0.6904681921005249, Accuracy = 0.862421989440918
    Training iter #328500: Batch Loss = 0.410907, Accuracy = 0.9363999962806702
    PERFORMANCE ON TEST SET: Batch Loss = 0.654334306716919, Accuracy = 0.8911360502243042
    Training iter #329000: Batch Loss = 0.410995, Accuracy = 0.9336000084877014
    PERFORMANCE ON TEST SET: Batch Loss = 0.6609875559806824, Accuracy = 0.8923845291137695
    Training iter #329500: Batch Loss = 0.405526, Accuracy = 0.9308000206947327
    PERFORMANCE ON TEST SET: Batch Loss = 0.6680412888526917, Accuracy = 0.8831460475921631
    =================================================
    2.1600000000000017e-134
    1.000000000000001e-135
    Training iter #330000: Batch Loss = 0.421845, Accuracy = 0.9319999814033508
    PERFORMANCE ON TEST SET: Batch Loss = 0.6707466244697571, Accuracy = 0.8851435780525208
    Training iter #330500: Batch Loss = 0.474528, Accuracy = 0.8871999979019165
    PERFORMANCE ON TEST SET: Batch Loss = 0.7175737023353577, Accuracy = 0.8486891388893127
    Training iter #331000: Batch Loss = 0.403155, Accuracy = 0.9387999773025513
    PERFORMANCE ON TEST SET: Batch Loss = 0.6678333878517151, Accuracy = 0.8868913650512695
    Training iter #331500: Batch Loss = 0.407305, Accuracy = 0.9348000288009644
    PERFORMANCE ON TEST SET: Batch Loss = 0.6965939402580261, Accuracy = 0.8591760396957397
    Training iter #332000: Batch Loss = 0.452461, Accuracy = 0.9031999707221985
    PERFORMANCE ON TEST SET: Batch Loss = 0.751006007194519, Accuracy = 0.8289638161659241
    =================================================
    2.1600000000000017e-135
    1.000000000000001e-136
    Training iter #332500: Batch Loss = 0.403316, Accuracy = 0.9412000179290771
    PERFORMANCE ON TEST SET: Batch Loss = 0.6624614000320435, Accuracy = 0.8931335806846619
    Training iter #333000: Batch Loss = 0.399342, Accuracy = 0.9416000247001648
    PERFORMANCE ON TEST SET: Batch Loss = 0.6702125072479248, Accuracy = 0.8916354775428772
    Training iter #333500: Batch Loss = 0.436326, Accuracy = 0.9143999814987183
    PERFORMANCE ON TEST SET: Batch Loss = 0.6759463548660278, Accuracy = 0.8911360502243042
    Training iter #334000: Batch Loss = 0.421366, Accuracy = 0.9272000193595886
    PERFORMANCE ON TEST SET: Batch Loss = 0.6547027230262756, Accuracy = 0.8921348452568054
    Training iter #334500: Batch Loss = 0.435752, Accuracy = 0.9132000207901001
    PERFORMANCE ON TEST SET: Batch Loss = 0.657692015171051, Accuracy = 0.8921348452568054
    =================================================
    2.160000000000002e-136
    1.0000000000000009e-137
    Training iter #335000: Batch Loss = 0.446535, Accuracy = 0.9120000004768372
    PERFORMANCE ON TEST SET: Batch Loss = 0.7312787771224976, Accuracy = 0.8294631838798523
    Training iter #335500: Batch Loss = 0.416536, Accuracy = 0.9319999814033508
    PERFORMANCE ON TEST SET: Batch Loss = 0.6670130491256714, Accuracy = 0.8876404762268066
    Training iter #336000: Batch Loss = 0.459471, Accuracy = 0.8931999802589417
    PERFORMANCE ON TEST SET: Batch Loss = 0.7189436554908752, Accuracy = 0.846941351890564
    Training iter #336500: Batch Loss = 0.453868, Accuracy = 0.8980000019073486
    PERFORMANCE ON TEST SET: Batch Loss = 0.7049739360809326, Accuracy = 0.8571785092353821
    Training iter #337000: Batch Loss = 0.417366, Accuracy = 0.9251999855041504
    PERFORMANCE ON TEST SET: Batch Loss = 0.9003108739852905, Accuracy = 0.7670412063598633
    =================================================
    2.1600000000000017e-137
    1.000000000000001e-138
    Training iter #337500: Batch Loss = 0.477680, Accuracy = 0.8871999979019165
    PERFORMANCE ON TEST SET: Batch Loss = 0.673903226852417, Accuracy = 0.8721597790718079
    Training iter #338000: Batch Loss = 0.402760, Accuracy = 0.9412000179290771
    PERFORMANCE ON TEST SET: Batch Loss = 0.6712301969528198, Accuracy = 0.8853932619094849
    Training iter #338500: Batch Loss = 0.484467, Accuracy = 0.8948000073432922
    PERFORMANCE ON TEST SET: Batch Loss = 0.8408393263816833, Accuracy = 0.7872658967971802
    Training iter #339000: Batch Loss = 0.417938, Accuracy = 0.9359999895095825
    PERFORMANCE ON TEST SET: Batch Loss = 0.6692434549331665, Accuracy = 0.8908863663673401
    Training iter #339500: Batch Loss = 0.500497, Accuracy = 0.8723999857902527
    PERFORMANCE ON TEST SET: Batch Loss = 0.668770968914032, Accuracy = 0.8866416811943054
    =================================================
    2.1600000000000017e-138
    1.000000000000001e-139
    Training iter #340000: Batch Loss = 0.404506, Accuracy = 0.9359999895095825
    PERFORMANCE ON TEST SET: Batch Loss = 0.6527901887893677, Accuracy = 0.8943819999694824
    Training iter #340500: Batch Loss = 0.442155, Accuracy = 0.9083999991416931
    PERFORMANCE ON TEST SET: Batch Loss = 0.6644102334976196, Accuracy = 0.8898876309394836
    Training iter #341000: Batch Loss = 0.417077, Accuracy = 0.930400013923645
    PERFORMANCE ON TEST SET: Batch Loss = 0.6618393659591675, Accuracy = 0.8946316838264465
    Training iter #341500: Batch Loss = 0.425343, Accuracy = 0.9200000166893005
    PERFORMANCE ON TEST SET: Batch Loss = 0.7332384586334229, Accuracy = 0.8484394550323486
    Training iter #342000: Batch Loss = 0.401052, Accuracy = 0.9380000233650208
    PERFORMANCE ON TEST SET: Batch Loss = 0.6623610854148865, Accuracy = 0.8946316838264465
    =================================================
    2.1600000000000015e-139
    1.0000000000000009e-140
    Training iter #342500: Batch Loss = 0.412425, Accuracy = 0.9232000112533569
    PERFORMANCE ON TEST SET: Batch Loss = 0.7344158887863159, Accuracy = 0.8377028703689575
    Training iter #343000: Batch Loss = 0.402884, Accuracy = 0.9351999759674072
    PERFORMANCE ON TEST SET: Batch Loss = 0.6505411863327026, Accuracy = 0.8948814272880554
    Training iter #343500: Batch Loss = 0.425151, Accuracy = 0.91839998960495
    PERFORMANCE ON TEST SET: Batch Loss = 0.6973254680633545, Accuracy = 0.8621723055839539
    Training iter #344000: Batch Loss = 0.430613, Accuracy = 0.9172000288963318
    PERFORMANCE ON TEST SET: Batch Loss = 0.6822435855865479, Accuracy = 0.8811485767364502
    Training iter #344500: Batch Loss = 0.416000, Accuracy = 0.9359999895095825
    PERFORMANCE ON TEST SET: Batch Loss = 0.6728529930114746, Accuracy = 0.8908863663673401
    =================================================
    2.1600000000000016e-140
    1.0000000000000009e-141
    Training iter #345000: Batch Loss = 0.422070, Accuracy = 0.9251999855041504
    PERFORMANCE ON TEST SET: Batch Loss = 0.669331431388855, Accuracy = 0.8858926296234131
    Training iter #345500: Batch Loss = 0.403906, Accuracy = 0.9359999895095825
    PERFORMANCE ON TEST SET: Batch Loss = 0.655228853225708, Accuracy = 0.8926342129707336
    Training iter #346000: Batch Loss = 0.510229, Accuracy = 0.8644000291824341
    PERFORMANCE ON TEST SET: Batch Loss = 0.7139246463775635, Accuracy = 0.8531835079193115
    Training iter #346500: Batch Loss = 0.442948, Accuracy = 0.9092000126838684
    PERFORMANCE ON TEST SET: Batch Loss = 0.6775720119476318, Accuracy = 0.8881398439407349
    Training iter #347000: Batch Loss = 0.412877, Accuracy = 0.9287999868392944
    PERFORMANCE ON TEST SET: Batch Loss = 0.6744194626808167, Accuracy = 0.8833957314491272
    =================================================
    2.1600000000000016e-141
    1.000000000000001e-142
    Training iter #347500: Batch Loss = 0.502550, Accuracy = 0.8712000250816345
    PERFORMANCE ON TEST SET: Batch Loss = 0.7338355183601379, Accuracy = 0.8379525542259216
    Training iter #348000: Batch Loss = 0.393704, Accuracy = 0.9395999908447266
    PERFORMANCE ON TEST SET: Batch Loss = 0.6588130593299866, Accuracy = 0.8953807950019836
    Training iter #348500: Batch Loss = 0.504886, Accuracy = 0.8700000047683716
    PERFORMANCE ON TEST SET: Batch Loss = 0.7943391799926758, Accuracy = 0.8044943809509277
    Training iter #349000: Batch Loss = 0.394829, Accuracy = 0.9431999921798706
    PERFORMANCE ON TEST SET: Batch Loss = 0.6619476079940796, Accuracy = 0.8913857936859131
    Training iter #349500: Batch Loss = 0.401991, Accuracy = 0.9372000098228455
    PERFORMANCE ON TEST SET: Batch Loss = 0.6667883992195129, Accuracy = 0.8898876309394836
    =================================================
    2.1600000000000015e-142
    1.000000000000001e-143
    Training iter #350000: Batch Loss = 0.412939, Accuracy = 0.9368000030517578
    PERFORMANCE ON TEST SET: Batch Loss = 0.6591590046882629, Accuracy = 0.8948814272880554
    Training iter #350500: Batch Loss = 0.435880, Accuracy = 0.9147999882698059
    PERFORMANCE ON TEST SET: Batch Loss = 0.6602291464805603, Accuracy = 0.8941323161125183
    Training iter #351000: Batch Loss = 0.412213, Accuracy = 0.9319999814033508
    PERFORMANCE ON TEST SET: Batch Loss = 0.6686952710151672, Accuracy = 0.8841448426246643
    Training iter #351500: Batch Loss = 0.397364, Accuracy = 0.9395999908447266
    PERFORMANCE ON TEST SET: Batch Loss = 0.6550745368003845, Accuracy = 0.8951311111450195
    Training iter #352000: Batch Loss = 0.620615, Accuracy = 0.8163999915122986
    PERFORMANCE ON TEST SET: Batch Loss = 0.7598498463630676, Accuracy = 0.8327091336250305
    =================================================
    2.1600000000000016e-143
    1.000000000000001e-144
    Training iter #352500: Batch Loss = 0.400341, Accuracy = 0.9412000179290771
    PERFORMANCE ON TEST SET: Batch Loss = 0.6616067886352539, Accuracy = 0.8956304788589478
    Training iter #353000: Batch Loss = 0.449358, Accuracy = 0.9016000032424927
    PERFORMANCE ON TEST SET: Batch Loss = 0.8478084802627563, Accuracy = 0.7897627949714661
    Training iter #353500: Batch Loss = 0.463504, Accuracy = 0.88919997215271
    PERFORMANCE ON TEST SET: Batch Loss = 0.7295246124267578, Accuracy = 0.8372035026550293
    Training iter #354000: Batch Loss = 0.418760, Accuracy = 0.9251999855041504
    PERFORMANCE ON TEST SET: Batch Loss = 0.6584777235984802, Accuracy = 0.8946316838264465
    Training iter #354500: Batch Loss = 0.400051, Accuracy = 0.9383999705314636
    PERFORMANCE ON TEST SET: Batch Loss = 0.6575967073440552, Accuracy = 0.8963795304298401
    =================================================
    2.1600000000000017e-144
    1.000000000000001e-145
    Training iter #355000: Batch Loss = 0.581920, Accuracy = 0.8392000198364258
    PERFORMANCE ON TEST SET: Batch Loss = 0.798486590385437, Accuracy = 0.8109862804412842
    Training iter #355500: Batch Loss = 0.420931, Accuracy = 0.9332000017166138
    PERFORMANCE ON TEST SET: Batch Loss = 0.6864013671875, Accuracy = 0.8791510462760925
    Training iter #356000: Batch Loss = 0.543180, Accuracy = 0.8464000225067139
    PERFORMANCE ON TEST SET: Batch Loss = 0.6819358468055725, Accuracy = 0.8764045238494873
    Training iter #356500: Batch Loss = 0.402782, Accuracy = 0.9383999705314636
    PERFORMANCE ON TEST SET: Batch Loss = 0.6540027856826782, Accuracy = 0.896129846572876
    Training iter #357000: Batch Loss = 0.396039, Accuracy = 0.9391999840736389
    PERFORMANCE ON TEST SET: Batch Loss = 0.6492454409599304, Accuracy = 0.8971285820007324
    =================================================
    2.1600000000000017e-145
    1.000000000000001e-146
    Training iter #357500: Batch Loss = 0.428504, Accuracy = 0.920799970626831
    PERFORMANCE ON TEST SET: Batch Loss = 0.6644637584686279, Accuracy = 0.8891385793685913
    Training iter #358000: Batch Loss = 0.478049, Accuracy = 0.8823999762535095
    PERFORMANCE ON TEST SET: Batch Loss = 0.6651225090026855, Accuracy = 0.8868913650512695
    Training iter #358500: Batch Loss = 0.480904, Accuracy = 0.8784000277519226
    PERFORMANCE ON TEST SET: Batch Loss = 0.6874924302101135, Accuracy = 0.8746566772460938
    Training iter #359000: Batch Loss = 0.389772, Accuracy = 0.9419999718666077
    PERFORMANCE ON TEST SET: Batch Loss = 0.6585522890090942, Accuracy = 0.8913857936859131
    Training iter #359500: Batch Loss = 0.476475, Accuracy = 0.8831999897956848
    PERFORMANCE ON TEST SET: Batch Loss = 0.6920951008796692, Accuracy = 0.8644194602966309
    =================================================
    2.1600000000000017e-146
    1.0000000000000011e-147
    Training iter #360000: Batch Loss = 0.395243, Accuracy = 0.9440000057220459
    PERFORMANCE ON TEST SET: Batch Loss = 0.6564546227455139, Accuracy = 0.8976279497146606
    Training iter #360500: Batch Loss = 0.401210, Accuracy = 0.9372000098228455
    PERFORMANCE ON TEST SET: Batch Loss = 0.6575697064399719, Accuracy = 0.8956304788589478
    Training iter #361000: Batch Loss = 0.447277, Accuracy = 0.9075999855995178
    PERFORMANCE ON TEST SET: Batch Loss = 0.6791998744010925, Accuracy = 0.8838951587677002
    Training iter #361500: Batch Loss = 0.423048, Accuracy = 0.9259999990463257
    PERFORMANCE ON TEST SET: Batch Loss = 0.6658965945243835, Accuracy = 0.8908863663673401
    Training iter #362000: Batch Loss = 0.399481, Accuracy = 0.9399999976158142
    PERFORMANCE ON TEST SET: Batch Loss = 0.6551843881607056, Accuracy = 0.8978776335716248
    =================================================
    2.1600000000000018e-147
    1.0000000000000012e-148
    Training iter #362500: Batch Loss = 0.446960, Accuracy = 0.8984000086784363
    PERFORMANCE ON TEST SET: Batch Loss = 0.6692779660224915, Accuracy = 0.8891385793685913
    Training iter #363000: Batch Loss = 0.464802, Accuracy = 0.8935999870300293
    PERFORMANCE ON TEST SET: Batch Loss = 0.6656172871589661, Accuracy = 0.8878901600837708
    Training iter #363500: Batch Loss = 0.403767, Accuracy = 0.9395999908447266
    PERFORMANCE ON TEST SET: Batch Loss = 0.6558130383491516, Accuracy = 0.8963795304298401
    Training iter #364000: Batch Loss = 0.403836, Accuracy = 0.9363999962806702
    PERFORMANCE ON TEST SET: Batch Loss = 0.6613678336143494, Accuracy = 0.8946316838264465
    Training iter #364500: Batch Loss = 0.396141, Accuracy = 0.9383999705314636
    PERFORMANCE ON TEST SET: Batch Loss = 0.6722951531410217, Accuracy = 0.8868913650512695
    =================================================
    2.160000000000002e-148
    1.000000000000001e-149
    Training iter #365000: Batch Loss = 0.403515, Accuracy = 0.9336000084877014
    PERFORMANCE ON TEST SET: Batch Loss = 0.6542288064956665, Accuracy = 0.8958801627159119
    Training iter #365500: Batch Loss = 0.417029, Accuracy = 0.9272000193595886
    PERFORMANCE ON TEST SET: Batch Loss = 0.705484926700592, Accuracy = 0.8629213571548462
    Training iter #366000: Batch Loss = 0.400995, Accuracy = 0.9376000165939331
    PERFORMANCE ON TEST SET: Batch Loss = 0.6936370730400085, Accuracy = 0.8684144616127014
    Training iter #366500: Batch Loss = 0.425672, Accuracy = 0.9232000112533569
    PERFORMANCE ON TEST SET: Batch Loss = 0.6743773221969604, Accuracy = 0.8876404762268066
    Training iter #367000: Batch Loss = 0.408761, Accuracy = 0.9359999895095825
    PERFORMANCE ON TEST SET: Batch Loss = 0.6598272323608398, Accuracy = 0.8981273174285889
    =================================================
    2.160000000000002e-149
    1.0000000000000011e-150
    Training iter #367500: Batch Loss = 0.628436, Accuracy = 0.8227999806404114
    PERFORMANCE ON TEST SET: Batch Loss = 0.752671480178833, Accuracy = 0.8342072367668152
    Training iter #368000: Batch Loss = 0.392761, Accuracy = 0.9440000057220459
    PERFORMANCE ON TEST SET: Batch Loss = 0.6521959900856018, Accuracy = 0.9008738994598389
    Training iter #368500: Batch Loss = 0.440284, Accuracy = 0.9160000085830688
    PERFORMANCE ON TEST SET: Batch Loss = 0.6724187731742859, Accuracy = 0.8846442103385925
    Training iter #369000: Batch Loss = 0.403260, Accuracy = 0.9395999908447266
    PERFORMANCE ON TEST SET: Batch Loss = 0.6556224822998047, Accuracy = 0.8968788981437683
    Training iter #369500: Batch Loss = 0.412811, Accuracy = 0.9296000003814697
    PERFORMANCE ON TEST SET: Batch Loss = 0.6694576144218445, Accuracy = 0.8799000978469849
    =================================================
    2.160000000000002e-150
    1.0000000000000011e-151
    Training iter #370000: Batch Loss = 0.471508, Accuracy = 0.8888000249862671
    PERFORMANCE ON TEST SET: Batch Loss = 0.6742080450057983, Accuracy = 0.8848938941955566
    Training iter #370500: Batch Loss = 0.436720, Accuracy = 0.909600019454956
    PERFORMANCE ON TEST SET: Batch Loss = 0.6856828927993774, Accuracy = 0.8721597790718079
    Training iter #371000: Batch Loss = 0.428360, Accuracy = 0.9160000085830688
    PERFORMANCE ON TEST SET: Batch Loss = 0.6675173044204712, Accuracy = 0.8851435780525208
    Training iter #371500: Batch Loss = 0.398284, Accuracy = 0.9355999827384949
    PERFORMANCE ON TEST SET: Batch Loss = 0.6549580693244934, Accuracy = 0.8948814272880554
    Training iter #372000: Batch Loss = 0.407101, Accuracy = 0.9383999705314636
    PERFORMANCE ON TEST SET: Batch Loss = 0.6603451371192932, Accuracy = 0.8956304788589478
    =================================================
    2.160000000000002e-151
    1.0000000000000011e-152
    Training iter #372500: Batch Loss = 0.410305, Accuracy = 0.9372000098228455
    PERFORMANCE ON TEST SET: Batch Loss = 0.6562266945838928, Accuracy = 0.898377001285553
    Training iter #373000: Batch Loss = 0.402798, Accuracy = 0.9368000030517578
    PERFORMANCE ON TEST SET: Batch Loss = 0.6529724597930908, Accuracy = 0.8993757963180542
    Training iter #373500: Batch Loss = 0.412412, Accuracy = 0.9283999800682068
    PERFORMANCE ON TEST SET: Batch Loss = 0.6599070429801941, Accuracy = 0.8951311111450195
    Training iter #374000: Batch Loss = 0.408967, Accuracy = 0.9340000152587891
    PERFORMANCE ON TEST SET: Batch Loss = 0.6520810127258301, Accuracy = 0.8976279497146606
    Training iter #374500: Batch Loss = 0.401778, Accuracy = 0.9408000111579895
    PERFORMANCE ON TEST SET: Batch Loss = 0.6532557606697083, Accuracy = 0.8981273174285889
    =================================================
    2.160000000000002e-152
    1.0000000000000011e-153
    Training iter #375000: Batch Loss = 0.399922, Accuracy = 0.9391999840736389
    PERFORMANCE ON TEST SET: Batch Loss = 0.6566454768180847, Accuracy = 0.8951311111450195
    Training iter #375500: Batch Loss = 0.432371, Accuracy = 0.9147999882698059
    PERFORMANCE ON TEST SET: Batch Loss = 0.7558447122573853, Accuracy = 0.8302122354507446
    Training iter #376000: Batch Loss = 0.405011, Accuracy = 0.9327999949455261
    PERFORMANCE ON TEST SET: Batch Loss = 0.6521320343017578, Accuracy = 0.8951311111450195
    Training iter #376500: Batch Loss = 0.403969, Accuracy = 0.9359999895095825
    PERFORMANCE ON TEST SET: Batch Loss = 0.6570854187011719, Accuracy = 0.8958801627159119
    Training iter #377000: Batch Loss = 0.396092, Accuracy = 0.942799985408783
    PERFORMANCE ON TEST SET: Batch Loss = 0.6563953161239624, Accuracy = 0.8946316838264465
    =================================================
    2.1600000000000023e-153
    1.0000000000000011e-154
    Training iter #377500: Batch Loss = 0.406206, Accuracy = 0.9368000030517578
    PERFORMANCE ON TEST SET: Batch Loss = 0.6682388186454773, Accuracy = 0.8913857936859131
    Training iter #378000: Batch Loss = 0.409503, Accuracy = 0.9372000098228455
    PERFORMANCE ON TEST SET: Batch Loss = 0.6599743366241455, Accuracy = 0.896129846572876
    Training iter #378500: Batch Loss = 0.553777, Accuracy = 0.8539999723434448
    PERFORMANCE ON TEST SET: Batch Loss = 0.8702241778373718, Accuracy = 0.7807740569114685
    Training iter #379000: Batch Loss = 0.459729, Accuracy = 0.8935999870300293
    PERFORMANCE ON TEST SET: Batch Loss = 0.7142925262451172, Accuracy = 0.8651685118675232
    Training iter #379500: Batch Loss = 0.624770, Accuracy = 0.8203999996185303
    PERFORMANCE ON TEST SET: Batch Loss = 0.8427980542182922, Accuracy = 0.7885143756866455
    =================================================
    2.1600000000000022e-154
    1.000000000000001e-155
    Training iter #380000: Batch Loss = 0.411031, Accuracy = 0.9312000274658203
    PERFORMANCE ON TEST SET: Batch Loss = 0.6646547317504883, Accuracy = 0.8948814272880554
    Training iter #380500: Batch Loss = 0.401557, Accuracy = 0.9408000111579895
    PERFORMANCE ON TEST SET: Batch Loss = 0.6566802263259888, Accuracy = 0.8968788981437683
    Training iter #381000: Batch Loss = 0.388416, Accuracy = 0.9435999989509583
    PERFORMANCE ON TEST SET: Batch Loss = 0.656523585319519, Accuracy = 0.8986267447471619
    Training iter #381500: Batch Loss = 0.485864, Accuracy = 0.8827999830245972
    PERFORMANCE ON TEST SET: Batch Loss = 0.8732388019561768, Accuracy = 0.781772792339325
    Training iter #382000: Batch Loss = 0.479304, Accuracy = 0.8823999762535095
    PERFORMANCE ON TEST SET: Batch Loss = 0.7123364210128784, Accuracy = 0.8649188280105591
    =================================================
    2.160000000000002e-155
    1.0000000000000011e-156
    Training iter #382500: Batch Loss = 0.407583, Accuracy = 0.930400013923645
    PERFORMANCE ON TEST SET: Batch Loss = 0.663327693939209, Accuracy = 0.8938826322555542
    Training iter #383000: Batch Loss = 0.479276, Accuracy = 0.8884000182151794
    PERFORMANCE ON TEST SET: Batch Loss = 0.7282119393348694, Accuracy = 0.8474407196044922
    Training iter #383500: Batch Loss = 0.405841, Accuracy = 0.9404000043869019
    PERFORMANCE ON TEST SET: Batch Loss = 0.6591477394104004, Accuracy = 0.898876428604126
    Training iter #384000: Batch Loss = 0.407953, Accuracy = 0.9340000152587891
    PERFORMANCE ON TEST SET: Batch Loss = 0.6597676277160645, Accuracy = 0.8956304788589478
    Training iter #384500: Batch Loss = 0.436278, Accuracy = 0.9083999991416931
    PERFORMANCE ON TEST SET: Batch Loss = 0.7306891679763794, Accuracy = 0.8389512896537781
    =================================================
    2.160000000000002e-156
    1.000000000000001e-157
    Training iter #385000: Batch Loss = 0.408125, Accuracy = 0.9348000288009644
    PERFORMANCE ON TEST SET: Batch Loss = 0.6548990607261658, Accuracy = 0.9001248478889465
    Training iter #385500: Batch Loss = 0.399688, Accuracy = 0.9435999989509583
    PERFORMANCE ON TEST SET: Batch Loss = 0.659242570400238, Accuracy = 0.8981273174285889
    Training iter #386000: Batch Loss = 0.403606, Accuracy = 0.9376000165939331
    PERFORMANCE ON TEST SET: Batch Loss = 0.6557592153549194, Accuracy = 0.8971285820007324
    Training iter #386500: Batch Loss = 0.408037, Accuracy = 0.9323999881744385
    PERFORMANCE ON TEST SET: Batch Loss = 0.6609258651733398, Accuracy = 0.8928838968276978
    Training iter #387000: Batch Loss = 0.459022, Accuracy = 0.8999999761581421
    PERFORMANCE ON TEST SET: Batch Loss = 0.6837659478187561, Accuracy = 0.8816479444503784
    =================================================
    2.160000000000002e-157
    1.000000000000001e-158
    Training iter #387500: Batch Loss = 0.391431, Accuracy = 0.946399986743927
    PERFORMANCE ON TEST SET: Batch Loss = 0.6543532013893127, Accuracy = 0.898377001285553
    Training iter #388000: Batch Loss = 0.392136, Accuracy = 0.9440000057220459
    PERFORMANCE ON TEST SET: Batch Loss = 0.6533123850822449, Accuracy = 0.8971285820007324
    Training iter #388500: Batch Loss = 0.414813, Accuracy = 0.9312000274658203
    PERFORMANCE ON TEST SET: Batch Loss = 0.6853197813034058, Accuracy = 0.8716604113578796
    Training iter #389000: Batch Loss = 0.434070, Accuracy = 0.9196000099182129
    PERFORMANCE ON TEST SET: Batch Loss = 0.6715116500854492, Accuracy = 0.8901373147964478
    Training iter #389500: Batch Loss = 0.406644, Accuracy = 0.9348000288009644
    PERFORMANCE ON TEST SET: Batch Loss = 0.6588730812072754, Accuracy = 0.8998751640319824
    =================================================
    2.160000000000002e-158
    1.0000000000000011e-159
    Training iter #390000: Batch Loss = 0.497335, Accuracy = 0.8700000047683716
    PERFORMANCE ON TEST SET: Batch Loss = 0.6824019551277161, Accuracy = 0.8756554126739502
    Training iter #390500: Batch Loss = 0.536005, Accuracy = 0.8600000143051147
    PERFORMANCE ON TEST SET: Batch Loss = 0.8746196031570435, Accuracy = 0.7805243730545044
    Training iter #391000: Batch Loss = 0.418446, Accuracy = 0.9296000003814697
    PERFORMANCE ON TEST SET: Batch Loss = 0.6656875014305115, Accuracy = 0.8921348452568054
    Training iter #391500: Batch Loss = 0.397837, Accuracy = 0.9431999921798706
    PERFORMANCE ON TEST SET: Batch Loss = 0.6516233086585999, Accuracy = 0.8996254801750183
    Training iter #392000: Batch Loss = 0.390584, Accuracy = 0.9455999732017517
    PERFORMANCE ON TEST SET: Batch Loss = 0.6621941328048706, Accuracy = 0.8948814272880554
    =================================================
    2.160000000000002e-159
    1.0000000000000011e-160
    Training iter #392500: Batch Loss = 0.703197, Accuracy = 0.7835999727249146
    PERFORMANCE ON TEST SET: Batch Loss = 1.2153600454330444, Accuracy = 0.6784020066261292
    Training iter #393000: Batch Loss = 0.428220, Accuracy = 0.9192000031471252
    PERFORMANCE ON TEST SET: Batch Loss = 0.7268710136413574, Accuracy = 0.8359550833702087
    Training iter #393500: Batch Loss = 0.395165, Accuracy = 0.9431999921798706
    PERFORMANCE ON TEST SET: Batch Loss = 0.6521247625350952, Accuracy = 0.8971285820007324
    Training iter #394000: Batch Loss = 0.499098, Accuracy = 0.8755999803543091
    PERFORMANCE ON TEST SET: Batch Loss = 0.6874827146530151, Accuracy = 0.8816479444503784
    Training iter #394500: Batch Loss = 0.407057, Accuracy = 0.9380000233650208
    PERFORMANCE ON TEST SET: Batch Loss = 0.6564115285873413, Accuracy = 0.898876428604126
    =================================================
    2.1600000000000022e-160
    1.000000000000001e-161
    Training iter #395000: Batch Loss = 0.447956, Accuracy = 0.9107999801635742
    PERFORMANCE ON TEST SET: Batch Loss = 0.7795815467834473, Accuracy = 0.818227231502533
    Training iter #395500: Batch Loss = 0.393281, Accuracy = 0.942799985408783
    PERFORMANCE ON TEST SET: Batch Loss = 0.654313325881958, Accuracy = 0.9026217460632324
    Training iter #396000: Batch Loss = 0.413875, Accuracy = 0.9348000288009644
    PERFORMANCE ON TEST SET: Batch Loss = 0.6706721782684326, Accuracy = 0.8853932619094849
    Training iter #396500: Batch Loss = 0.402855, Accuracy = 0.9391999840736389
    PERFORMANCE ON TEST SET: Batch Loss = 0.6578770279884338, Accuracy = 0.8991261124610901
    Training iter #397000: Batch Loss = 0.482309, Accuracy = 0.8791999816894531
    PERFORMANCE ON TEST SET: Batch Loss = 0.784027099609375, Accuracy = 0.815480649471283
    =================================================
    2.1600000000000022e-161
    1.0000000000000011e-162
    Training iter #397500: Batch Loss = 0.397817, Accuracy = 0.942799985408783
    PERFORMANCE ON TEST SET: Batch Loss = 0.6573164463043213, Accuracy = 0.8971285820007324
    Training iter #398000: Batch Loss = 0.413124, Accuracy = 0.9259999990463257
    PERFORMANCE ON TEST SET: Batch Loss = 0.6549124121665955, Accuracy = 0.8963795304298401
    Training iter #398500: Batch Loss = 0.913380, Accuracy = 0.7228000164031982
    PERFORMANCE ON TEST SET: Batch Loss = 0.8681483268737793, Accuracy = 0.7737827897071838
    Training iter #399000: Batch Loss = 0.430217, Accuracy = 0.9139999747276306
    PERFORMANCE ON TEST SET: Batch Loss = 0.7268840074539185, Accuracy = 0.8554307222366333
    Training iter #399500: Batch Loss = 0.410065, Accuracy = 0.9332000017166138
    PERFORMANCE ON TEST SET: Batch Loss = 0.6727186441421509, Accuracy = 0.8846442103385925
    =================================================
    2.1600000000000023e-162
    1.000000000000001e-163
    Training iter #400000: Batch Loss = 0.410104, Accuracy = 0.9387999773025513
    PERFORMANCE ON TEST SET: Batch Loss = 0.6601330041885376, Accuracy = 0.8956304788589478
    Training iter #400500: Batch Loss = 0.849118, Accuracy = 0.7447999715805054
    PERFORMANCE ON TEST SET: Batch Loss = 0.7759813666343689, Accuracy = 0.836454451084137
    Training iter #401000: Batch Loss = 0.393487, Accuracy = 0.9431999921798706
    PERFORMANCE ON TEST SET: Batch Loss = 0.654350996017456, Accuracy = 0.9016229510307312
    Training iter #401500: Batch Loss = 0.445128, Accuracy = 0.9047999978065491
    PERFORMANCE ON TEST SET: Batch Loss = 0.6643182039260864, Accuracy = 0.888389527797699
    Training iter #402000: Batch Loss = 0.399608, Accuracy = 0.946399986743927
    PERFORMANCE ON TEST SET: Batch Loss = 0.6563077569007874, Accuracy = 0.9001248478889465
    =================================================
    2.1600000000000023e-163
    1.000000000000001e-164
    Training iter #402500: Batch Loss = 0.402466, Accuracy = 0.9383999705314636
    PERFORMANCE ON TEST SET: Batch Loss = 0.6864842176437378, Accuracy = 0.8776529431343079
    Training iter #403000: Batch Loss = 0.451220, Accuracy = 0.9035999774932861
    PERFORMANCE ON TEST SET: Batch Loss = 0.8048413991928101, Accuracy = 0.8084893822669983
    Training iter #403500: Batch Loss = 0.397130, Accuracy = 0.9359999895095825
    PERFORMANCE ON TEST SET: Batch Loss = 0.6500980854034424, Accuracy = 0.8996254801750183
    Training iter #404000: Batch Loss = 0.401667, Accuracy = 0.9376000165939331
    PERFORMANCE ON TEST SET: Batch Loss = 0.6918218731880188, Accuracy = 0.8691635727882385
    Training iter #404500: Batch Loss = 0.389950, Accuracy = 0.9444000124931335
    PERFORMANCE ON TEST SET: Batch Loss = 0.6513872146606445, Accuracy = 0.9006242156028748
    =================================================
    2.160000000000002e-164
    1.000000000000001e-165
    Training iter #405000: Batch Loss = 0.456087, Accuracy = 0.8992000222206116
    PERFORMANCE ON TEST SET: Batch Loss = 0.7281453609466553, Accuracy = 0.8644194602966309
    Training iter #405500: Batch Loss = 0.520001, Accuracy = 0.8611999750137329
    PERFORMANCE ON TEST SET: Batch Loss = 0.7315654158592224, Accuracy = 0.8434457182884216
    Training iter #406000: Batch Loss = 0.402319, Accuracy = 0.9416000247001648
    PERFORMANCE ON TEST SET: Batch Loss = 0.6512455940246582, Accuracy = 0.9016229510307312
    Training iter #406500: Batch Loss = 0.395326, Accuracy = 0.9395999908447266
    PERFORMANCE ON TEST SET: Batch Loss = 0.6535834074020386, Accuracy = 0.901123583316803
    Training iter #407000: Batch Loss = 0.456740, Accuracy = 0.9007999897003174
    PERFORMANCE ON TEST SET: Batch Loss = 0.6908678412437439, Accuracy = 0.8711610436439514
    =================================================
    2.1600000000000022e-165
    1.000000000000001e-166
    Training iter #407500: Batch Loss = 0.402519, Accuracy = 0.942799985408783
    PERFORMANCE ON TEST SET: Batch Loss = 0.6488118171691895, Accuracy = 0.9021223187446594
    Training iter #408000: Batch Loss = 0.474478, Accuracy = 0.8924000263214111
    PERFORMANCE ON TEST SET: Batch Loss = 0.9262385964393616, Accuracy = 0.7565543055534363
    Training iter #408500: Batch Loss = 0.395026, Accuracy = 0.9404000043869019
    PERFORMANCE ON TEST SET: Batch Loss = 0.6739718317985535, Accuracy = 0.8868913650512695
    Training iter #409000: Batch Loss = 0.399368, Accuracy = 0.9372000098228455
    PERFORMANCE ON TEST SET: Batch Loss = 0.6493815183639526, Accuracy = 0.9003745317459106
    Training iter #409500: Batch Loss = 0.425623, Accuracy = 0.9200000166893005
    PERFORMANCE ON TEST SET: Batch Loss = 0.6604335308074951, Accuracy = 0.8928838968276978
    =================================================
    2.1600000000000022e-166
    1.000000000000001e-167
    Training iter #410000: Batch Loss = 0.389005, Accuracy = 0.946399986743927
    PERFORMANCE ON TEST SET: Batch Loss = 0.650678813457489, Accuracy = 0.8993757963180542
    Training iter #410500: Batch Loss = 0.543117, Accuracy = 0.8695999979972839
    PERFORMANCE ON TEST SET: Batch Loss = 0.9152940511703491, Accuracy = 0.7727839946746826
    Training iter #411000: Batch Loss = 0.403480, Accuracy = 0.9440000057220459
    PERFORMANCE ON TEST SET: Batch Loss = 0.6565293669700623, Accuracy = 0.8998751640319824
    Training iter #411500: Batch Loss = 0.463821, Accuracy = 0.8980000019073486
    PERFORMANCE ON TEST SET: Batch Loss = 0.7094164490699768, Accuracy = 0.8629213571548462
    Training iter #412000: Batch Loss = 0.416371, Accuracy = 0.9236000180244446
    PERFORMANCE ON TEST SET: Batch Loss = 0.6733453273773193, Accuracy = 0.8926342129707336
    =================================================
    2.160000000000002e-167
    1.000000000000001e-168
    Training iter #412500: Batch Loss = 0.391195, Accuracy = 0.9452000260353088
    PERFORMANCE ON TEST SET: Batch Loss = 0.6520869135856628, Accuracy = 0.9013732671737671
    Training iter #413000: Batch Loss = 0.465783, Accuracy = 0.8967999815940857
    PERFORMANCE ON TEST SET: Batch Loss = 0.679153323173523, Accuracy = 0.8821473121643066
    Training iter #413500: Batch Loss = 0.395034, Accuracy = 0.9452000260353088
    PERFORMANCE ON TEST SET: Batch Loss = 0.6613425016403198, Accuracy = 0.8966292142868042
    Training iter #414000: Batch Loss = 0.420267, Accuracy = 0.9251999855041504
    PERFORMANCE ON TEST SET: Batch Loss = 0.6763785481452942, Accuracy = 0.8818976283073425
    Training iter #414500: Batch Loss = 0.397042, Accuracy = 0.9368000030517578
    PERFORMANCE ON TEST SET: Batch Loss = 0.650389552116394, Accuracy = 0.9031211137771606
    =================================================
    2.160000000000002e-168
    1.000000000000001e-169
    Training iter #415000: Batch Loss = 0.432829, Accuracy = 0.9136000275611877
    PERFORMANCE ON TEST SET: Batch Loss = 0.6771882772445679, Accuracy = 0.8799000978469849
    Training iter #415500: Batch Loss = 0.390746, Accuracy = 0.9495999813079834
    PERFORMANCE ON TEST SET: Batch Loss = 0.6467793583869934, Accuracy = 0.901123583316803
    Training iter #416000: Batch Loss = 0.406327, Accuracy = 0.9351999759674072
    PERFORMANCE ON TEST SET: Batch Loss = 0.6878892183303833, Accuracy = 0.8734082579612732
    Training iter #416500: Batch Loss = 0.414545, Accuracy = 0.9395999908447266
    PERFORMANCE ON TEST SET: Batch Loss = 0.6824337840080261, Accuracy = 0.8764045238494873
    Training iter #417000: Batch Loss = 0.409833, Accuracy = 0.9348000288009644
    PERFORMANCE ON TEST SET: Batch Loss = 0.6681452393531799, Accuracy = 0.890636682510376
    =================================================
    2.160000000000002e-169
    1.000000000000001e-170
    Training iter #417500: Batch Loss = 0.395945, Accuracy = 0.9404000043869019
    PERFORMANCE ON TEST SET: Batch Loss = 0.6517689824104309, Accuracy = 0.9036204814910889
    Training iter #418000: Batch Loss = 0.391992, Accuracy = 0.9431999921798706
    PERFORMANCE ON TEST SET: Batch Loss = 0.6566420197486877, Accuracy = 0.8996254801750183
    Training iter #418500: Batch Loss = 0.414925, Accuracy = 0.9368000030517578
    PERFORMANCE ON TEST SET: Batch Loss = 0.6668269634246826, Accuracy = 0.8863919973373413
    Training iter #419000: Batch Loss = 0.485321, Accuracy = 0.8844000101089478
    PERFORMANCE ON TEST SET: Batch Loss = 0.7515394687652588, Accuracy = 0.8536828756332397
    Training iter #419500: Batch Loss = 0.586547, Accuracy = 0.8259999752044678
    PERFORMANCE ON TEST SET: Batch Loss = 0.6708271503448486, Accuracy = 0.8791510462760925
    =================================================
    2.160000000000002e-170
    1.0000000000000011e-171
    Training iter #420000: Batch Loss = 0.422574, Accuracy = 0.9236000180244446
    PERFORMANCE ON TEST SET: Batch Loss = 0.761388897895813, Accuracy = 0.8299625515937805
    Training iter #420500: Batch Loss = 0.394998, Accuracy = 0.9455999732017517
    PERFORMANCE ON TEST SET: Batch Loss = 0.6499088406562805, Accuracy = 0.901123583316803
    Training iter #421000: Batch Loss = 0.401142, Accuracy = 0.9404000043869019
    PERFORMANCE ON TEST SET: Batch Loss = 0.6800752878189087, Accuracy = 0.8796504139900208
    Training iter #421500: Batch Loss = 0.406509, Accuracy = 0.9355999827384949
    PERFORMANCE ON TEST SET: Batch Loss = 0.6789281368255615, Accuracy = 0.8861423134803772
    Training iter #422000: Batch Loss = 0.433350, Accuracy = 0.9204000234603882
    PERFORMANCE ON TEST SET: Batch Loss = 0.7406772971153259, Accuracy = 0.8446941375732422
    =================================================
    2.160000000000002e-171
    1.0000000000000012e-172
    Training iter #422500: Batch Loss = 0.395345, Accuracy = 0.9423999786376953
    PERFORMANCE ON TEST SET: Batch Loss = 0.654452383518219, Accuracy = 0.9021223187446594
    Training iter #423000: Batch Loss = 0.458293, Accuracy = 0.8996000289916992
    PERFORMANCE ON TEST SET: Batch Loss = 0.7133464217185974, Accuracy = 0.8604244589805603
    Training iter #423500: Batch Loss = 0.388711, Accuracy = 0.9452000260353088
    PERFORMANCE ON TEST SET: Batch Loss = 0.6511023044586182, Accuracy = 0.9013732671737671
    Training iter #424000: Batch Loss = 0.689560, Accuracy = 0.7851999998092651
    PERFORMANCE ON TEST SET: Batch Loss = 0.8229663372039795, Accuracy = 0.8019974827766418
    Training iter #424500: Batch Loss = 0.398846, Accuracy = 0.9404000043869019
    PERFORMANCE ON TEST SET: Batch Loss = 0.6593992710113525, Accuracy = 0.9021223187446594
    =================================================
    2.160000000000002e-172
    1.0000000000000011e-173
    Training iter #425000: Batch Loss = 0.381108, Accuracy = 0.9516000151634216
    PERFORMANCE ON TEST SET: Batch Loss = 0.6550639271736145, Accuracy = 0.9001248478889465
    Training iter #425500: Batch Loss = 0.393735, Accuracy = 0.9399999976158142
    PERFORMANCE ON TEST SET: Batch Loss = 0.6536041498184204, Accuracy = 0.901123583316803
    Training iter #426000: Batch Loss = 0.403570, Accuracy = 0.9391999840736389
    PERFORMANCE ON TEST SET: Batch Loss = 0.6582568883895874, Accuracy = 0.8946316838264465
    Training iter #426500: Batch Loss = 0.402667, Accuracy = 0.9387999773025513
    PERFORMANCE ON TEST SET: Batch Loss = 0.6587048172950745, Accuracy = 0.8981273174285889
    Training iter #427000: Batch Loss = 0.455210, Accuracy = 0.8988000154495239
    PERFORMANCE ON TEST SET: Batch Loss = 0.6808685064315796, Accuracy = 0.8911360502243042
    =================================================
    2.160000000000002e-173
    1.0000000000000011e-174
    Training iter #427500: Batch Loss = 0.415640, Accuracy = 0.9340000152587891
    PERFORMANCE ON TEST SET: Batch Loss = 0.6618086099624634, Accuracy = 0.8998751640319824
    Training iter #428000: Batch Loss = 0.492601, Accuracy = 0.8831999897956848
    PERFORMANCE ON TEST SET: Batch Loss = 0.6975404024124146, Accuracy = 0.8784019947052002
    Training iter #428500: Batch Loss = 0.420222, Accuracy = 0.9232000112533569
    PERFORMANCE ON TEST SET: Batch Loss = 0.722888708114624, Accuracy = 0.8549313545227051
    Training iter #429000: Batch Loss = 0.497388, Accuracy = 0.8827999830245972
    PERFORMANCE ON TEST SET: Batch Loss = 0.752316951751709, Accuracy = 0.8377028703689575
    Training iter #429500: Batch Loss = 0.514746, Accuracy = 0.86080002784729
    PERFORMANCE ON TEST SET: Batch Loss = 0.7163670063018799, Accuracy = 0.8556804060935974
    =================================================
    2.1600000000000022e-174
    1.0000000000000011e-175
    Training iter #430000: Batch Loss = 0.396102, Accuracy = 0.946399986743927
    PERFORMANCE ON TEST SET: Batch Loss = 0.6530941128730774, Accuracy = 0.9016229510307312
    Training iter #430500: Batch Loss = 0.479668, Accuracy = 0.8899999856948853
    PERFORMANCE ON TEST SET: Batch Loss = 0.6948104500770569, Accuracy = 0.8736579418182373
    Training iter #431000: Batch Loss = 0.395195, Accuracy = 0.9387999773025513
    PERFORMANCE ON TEST SET: Batch Loss = 0.6542341709136963, Accuracy = 0.9008738994598389
    Training iter #431500: Batch Loss = 0.439371, Accuracy = 0.9103999733924866
    PERFORMANCE ON TEST SET: Batch Loss = 0.7309076189994812, Accuracy = 0.8486891388893127
    Training iter #432000: Batch Loss = 0.478775, Accuracy = 0.8859999775886536
    PERFORMANCE ON TEST SET: Batch Loss = 0.785204291343689, Accuracy = 0.8137328624725342
    =================================================
    2.160000000000002e-175
    1.000000000000001e-176
    Training iter #432500: Batch Loss = 0.384159, Accuracy = 0.9516000151634216
    PERFORMANCE ON TEST SET: Batch Loss = 0.6550430655479431, Accuracy = 0.9003745317459106
    Training iter #433000: Batch Loss = 0.449056, Accuracy = 0.909600019454956
    PERFORMANCE ON TEST SET: Batch Loss = 0.6964415311813354, Accuracy = 0.8754057288169861
    Training iter #433500: Batch Loss = 0.437228, Accuracy = 0.9147999882698059
    PERFORMANCE ON TEST SET: Batch Loss = 0.6884838342666626, Accuracy = 0.8749063611030579
    Training iter #434000: Batch Loss = 0.393612, Accuracy = 0.9431999921798706
    PERFORMANCE ON TEST SET: Batch Loss = 0.6540086269378662, Accuracy = 0.9056179523468018
    Training iter #434500: Batch Loss = 0.523096, Accuracy = 0.8600000143051147
    PERFORMANCE ON TEST SET: Batch Loss = 0.7137067317962646, Accuracy = 0.8566791415214539
    =================================================
    2.1600000000000023e-176
    1.000000000000001e-177
    Training iter #435000: Batch Loss = 0.403695, Accuracy = 0.9440000057220459
    PERFORMANCE ON TEST SET: Batch Loss = 0.6532198786735535, Accuracy = 0.9043695330619812
    Training iter #435500: Batch Loss = 0.486970, Accuracy = 0.8795999884605408
    PERFORMANCE ON TEST SET: Batch Loss = 0.6951895952224731, Accuracy = 0.877902626991272
    Training iter #436000: Batch Loss = 0.392154, Accuracy = 0.9452000260353088
    PERFORMANCE ON TEST SET: Batch Loss = 0.658667802810669, Accuracy = 0.8968788981437683
    Training iter #436500: Batch Loss = 0.393846, Accuracy = 0.9448000192642212
    PERFORMANCE ON TEST SET: Batch Loss = 0.6622649431228638, Accuracy = 0.8963795304298401
    Training iter #437000: Batch Loss = 0.390766, Accuracy = 0.9472000002861023
    PERFORMANCE ON TEST SET: Batch Loss = 0.650262176990509, Accuracy = 0.9021223187446594
    =================================================
    2.1600000000000023e-177
    1.000000000000001e-178
    Training iter #437500: Batch Loss = 0.482224, Accuracy = 0.8844000101089478
    PERFORMANCE ON TEST SET: Batch Loss = 0.7444829940795898, Accuracy = 0.8327091336250305
    Training iter #438000: Batch Loss = 0.388301, Accuracy = 0.9495999813079834
    PERFORMANCE ON TEST SET: Batch Loss = 0.6563501954078674, Accuracy = 0.9003745317459106
    Training iter #438500: Batch Loss = 0.442964, Accuracy = 0.9223999977111816
    PERFORMANCE ON TEST SET: Batch Loss = 0.7023587226867676, Accuracy = 0.8726591467857361
    Training iter #439000: Batch Loss = 0.408900, Accuracy = 0.9300000071525574
    PERFORMANCE ON TEST SET: Batch Loss = 0.6563712358474731, Accuracy = 0.8993757963180542
    Training iter #439500: Batch Loss = 0.451007, Accuracy = 0.9056000113487244
    PERFORMANCE ON TEST SET: Batch Loss = 0.663195788860321, Accuracy = 0.8966292142868042
    =================================================
    2.1600000000000024e-178
    1.000000000000001e-179
    Training iter #440000: Batch Loss = 0.388071, Accuracy = 0.9459999799728394
    PERFORMANCE ON TEST SET: Batch Loss = 0.652151882648468, Accuracy = 0.9021223187446594
    Training iter #440500: Batch Loss = 0.404251, Accuracy = 0.9435999989509583
    PERFORMANCE ON TEST SET: Batch Loss = 0.6552485823631287, Accuracy = 0.9043695330619812
    Training iter #441000: Batch Loss = 0.422455, Accuracy = 0.9243999719619751
    PERFORMANCE ON TEST SET: Batch Loss = 0.6779962182044983, Accuracy = 0.8853932619094849
    Training iter #441500: Batch Loss = 0.387094, Accuracy = 0.9491999745368958
    PERFORMANCE ON TEST SET: Batch Loss = 0.6559356451034546, Accuracy = 0.9018726348876953
    Training iter #442000: Batch Loss = 0.455709, Accuracy = 0.8939999938011169
    PERFORMANCE ON TEST SET: Batch Loss = 0.6807922720909119, Accuracy = 0.8791510462760925
    =================================================
    2.1600000000000025e-179
    1.000000000000001e-180
    Training iter #442500: Batch Loss = 0.394205, Accuracy = 0.9435999989509583
    PERFORMANCE ON TEST SET: Batch Loss = 0.6523817777633667, Accuracy = 0.903870165348053
    Training iter #443000: Batch Loss = 0.389483, Accuracy = 0.9495999813079834
    PERFORMANCE ON TEST SET: Batch Loss = 0.6540141105651855, Accuracy = 0.903870165348053
    Training iter #443500: Batch Loss = 0.430615, Accuracy = 0.9143999814987183
    PERFORMANCE ON TEST SET: Batch Loss = 0.6926206350326538, Accuracy = 0.8789013624191284
    Training iter #444000: Batch Loss = 0.401487, Accuracy = 0.9440000057220459
    PERFORMANCE ON TEST SET: Batch Loss = 0.6615456342697144, Accuracy = 0.898876428604126
    Training iter #444500: Batch Loss = 0.602181, Accuracy = 0.8379999995231628
    PERFORMANCE ON TEST SET: Batch Loss = 0.8181154727935791, Accuracy = 0.8097378015518188
    =================================================
    2.1600000000000025e-180
    1.0000000000000011e-181
    Training iter #445000: Batch Loss = 0.395325, Accuracy = 0.942799985408783
    PERFORMANCE ON TEST SET: Batch Loss = 0.6477407217025757, Accuracy = 0.9063670635223389
    Training iter #445500: Batch Loss = 0.387254, Accuracy = 0.9435999989509583
    PERFORMANCE ON TEST SET: Batch Loss = 0.6567201018333435, Accuracy = 0.8991261124610901
    Training iter #446000: Batch Loss = 0.406997, Accuracy = 0.9399999976158142
    PERFORMANCE ON TEST SET: Batch Loss = 0.6587563753128052, Accuracy = 0.8963795304298401
    Training iter #446500: Batch Loss = 0.394550, Accuracy = 0.9476000070571899
    PERFORMANCE ON TEST SET: Batch Loss = 0.6536465883255005, Accuracy = 0.9028714299201965
    Training iter #447000: Batch Loss = 0.383911, Accuracy = 0.9516000151634216
    PERFORMANCE ON TEST SET: Batch Loss = 0.6514645218849182, Accuracy = 0.903870165348053
    =================================================
    2.1600000000000024e-181
    1.000000000000001e-182
    Training iter #447500: Batch Loss = 0.400379, Accuracy = 0.9444000124931335
    PERFORMANCE ON TEST SET: Batch Loss = 0.7104480862617493, Accuracy = 0.862421989440918
    Training iter #448000: Batch Loss = 0.566025, Accuracy = 0.8443999886512756
    PERFORMANCE ON TEST SET: Batch Loss = 0.7226091623306274, Accuracy = 0.8509363532066345
    Training iter #448500: Batch Loss = 0.478768, Accuracy = 0.8804000020027161
    PERFORMANCE ON TEST SET: Batch Loss = 0.7047916054725647, Accuracy = 0.872409462928772
    Training iter #449000: Batch Loss = 0.383416, Accuracy = 0.9503999948501587
    PERFORMANCE ON TEST SET: Batch Loss = 0.654536783695221, Accuracy = 0.9033707976341248
    Training iter #449500: Batch Loss = 0.484935, Accuracy = 0.8812000155448914
    PERFORMANCE ON TEST SET: Batch Loss = 0.6779678463935852, Accuracy = 0.8876404762268066
    =================================================
    2.1600000000000024e-182
    1.000000000000001e-183
    Training iter #450000: Batch Loss = 0.399015, Accuracy = 0.9435999989509583
    PERFORMANCE ON TEST SET: Batch Loss = 0.652373731136322, Accuracy = 0.9056179523468018
    Training iter #450500: Batch Loss = 0.454131, Accuracy = 0.9071999788284302
    PERFORMANCE ON TEST SET: Batch Loss = 0.7650066614151001, Accuracy = 0.8219725489616394
    Training iter #451000: Batch Loss = 0.384584, Accuracy = 0.9459999799728394
    PERFORMANCE ON TEST SET: Batch Loss = 0.6529674530029297, Accuracy = 0.9036204814910889
    Training iter #451500: Batch Loss = 0.400379, Accuracy = 0.9455999732017517
    PERFORMANCE ON TEST SET: Batch Loss = 0.6531786918640137, Accuracy = 0.9046192169189453
    Training iter #452000: Batch Loss = 0.402301, Accuracy = 0.9435999989509583
    PERFORMANCE ON TEST SET: Batch Loss = 0.6668854355812073, Accuracy = 0.8936329483985901
    =================================================
    2.1600000000000026e-183
    1.0000000000000011e-184
    Training iter #452500: Batch Loss = 0.453209, Accuracy = 0.8956000208854675
    PERFORMANCE ON TEST SET: Batch Loss = 0.6721524596214294, Accuracy = 0.8789013624191284
    Training iter #453000: Batch Loss = 0.383806, Accuracy = 0.951200008392334
    PERFORMANCE ON TEST SET: Batch Loss = 0.6560006141662598, Accuracy = 0.9003745317459106
    Training iter #453500: Batch Loss = 0.392044, Accuracy = 0.9448000192642212
    PERFORMANCE ON TEST SET: Batch Loss = 0.6485969424247742, Accuracy = 0.903870165348053
    Training iter #454000: Batch Loss = 0.387670, Accuracy = 0.9524000287055969
    PERFORMANCE ON TEST SET: Batch Loss = 0.658571720123291, Accuracy = 0.9018726348876953
    Training iter #454500: Batch Loss = 0.381774, Accuracy = 0.9495999813079834
    PERFORMANCE ON TEST SET: Batch Loss = 0.6513739228248596, Accuracy = 0.9021223187446594
    =================================================
    2.1600000000000025e-184
    1.0000000000000011e-185
    Training iter #455000: Batch Loss = 0.418530, Accuracy = 0.9327999949455261
    PERFORMANCE ON TEST SET: Batch Loss = 0.6643845438957214, Accuracy = 0.8978776335716248
    Training iter #455500: Batch Loss = 0.397124, Accuracy = 0.9455999732017517
    PERFORMANCE ON TEST SET: Batch Loss = 0.6534666419029236, Accuracy = 0.906616747379303
    Training iter #456000: Batch Loss = 0.835986, Accuracy = 0.7544000148773193
    PERFORMANCE ON TEST SET: Batch Loss = 0.8358755707740784, Accuracy = 0.8072409629821777
    Training iter #456500: Batch Loss = 0.542386, Accuracy = 0.8507999777793884
    PERFORMANCE ON TEST SET: Batch Loss = 0.6912025809288025, Accuracy = 0.8696629405021667
    Training iter #457000: Batch Loss = 0.404676, Accuracy = 0.9431999921798706
    PERFORMANCE ON TEST SET: Batch Loss = 0.6613517999649048, Accuracy = 0.8946316838264465
    =================================================
    2.1600000000000025e-185
    1.0000000000000011e-186
    Training iter #457500: Batch Loss = 0.390994, Accuracy = 0.9472000002861023
    PERFORMANCE ON TEST SET: Batch Loss = 0.6457864046096802, Accuracy = 0.9028714299201965
    Training iter #458000: Batch Loss = 0.429525, Accuracy = 0.9175999760627747
    PERFORMANCE ON TEST SET: Batch Loss = 0.6890873312950134, Accuracy = 0.8821473121643066
    Training iter #458500: Batch Loss = 0.382746, Accuracy = 0.9520000219345093
    PERFORMANCE ON TEST SET: Batch Loss = 0.6600878834724426, Accuracy = 0.8963795304298401
    Training iter #459000: Batch Loss = 0.397976, Accuracy = 0.9412000179290771
    PERFORMANCE ON TEST SET: Batch Loss = 0.6538092494010925, Accuracy = 0.901123583316803
    Training iter #459500: Batch Loss = 0.386387, Accuracy = 0.9539999961853027
    PERFORMANCE ON TEST SET: Batch Loss = 0.6507462859153748, Accuracy = 0.9061173796653748
    =================================================
    2.1600000000000025e-186
    1.0000000000000012e-187
    Training iter #460000: Batch Loss = 0.382657, Accuracy = 0.9491999745368958
    PERFORMANCE ON TEST SET: Batch Loss = 0.6481125354766846, Accuracy = 0.9041198492050171
    Training iter #460500: Batch Loss = 0.398390, Accuracy = 0.9467999935150146
    PERFORMANCE ON TEST SET: Batch Loss = 0.6517134308815002, Accuracy = 0.903870165348053
    Training iter #461000: Batch Loss = 0.589462, Accuracy = 0.828000009059906
    PERFORMANCE ON TEST SET: Batch Loss = 0.7472832202911377, Accuracy = 0.8394506573677063
    Training iter #461500: Batch Loss = 0.388649, Accuracy = 0.9476000070571899
    PERFORMANCE ON TEST SET: Batch Loss = 0.647203803062439, Accuracy = 0.9048689007759094
    Training iter #462000: Batch Loss = 0.385906, Accuracy = 0.9467999935150146
    PERFORMANCE ON TEST SET: Batch Loss = 0.6472963094711304, Accuracy = 0.9033707976341248
    =================================================
    2.1600000000000026e-187
    1.0000000000000012e-188
    Training iter #462500: Batch Loss = 0.412148, Accuracy = 0.9368000030517578
    PERFORMANCE ON TEST SET: Batch Loss = 0.6628847122192383, Accuracy = 0.8963795304298401
    Training iter #463000: Batch Loss = 0.435974, Accuracy = 0.9147999882698059
    PERFORMANCE ON TEST SET: Batch Loss = 0.6904137134552002, Accuracy = 0.8796504139900208
    Training iter #463500: Batch Loss = 0.395747, Accuracy = 0.9419999718666077
    PERFORMANCE ON TEST SET: Batch Loss = 0.670428991317749, Accuracy = 0.8848938941955566
    Training iter #464000: Batch Loss = 0.394299, Accuracy = 0.9408000111579895
    PERFORMANCE ON TEST SET: Batch Loss = 0.682243824005127, Accuracy = 0.8803995251655579
    Training iter #464500: Batch Loss = 0.391314, Accuracy = 0.946399986743927
    PERFORMANCE ON TEST SET: Batch Loss = 0.6495338082313538, Accuracy = 0.9043695330619812
    =================================================
    2.1600000000000025e-188
    1.0000000000000013e-189
    Training iter #465000: Batch Loss = 0.417859, Accuracy = 0.9283999800682068
    PERFORMANCE ON TEST SET: Batch Loss = 0.6717542409896851, Accuracy = 0.8891385793685913
    Training iter #465500: Batch Loss = 0.395995, Accuracy = 0.9431999921798706
    PERFORMANCE ON TEST SET: Batch Loss = 0.6759626269340515, Accuracy = 0.8873907327651978
    Training iter #466000: Batch Loss = 0.429142, Accuracy = 0.9236000180244446
    PERFORMANCE ON TEST SET: Batch Loss = 0.6648253202438354, Accuracy = 0.8943819999694824
    Training iter #466500: Batch Loss = 0.395090, Accuracy = 0.9488000273704529
    PERFORMANCE ON TEST SET: Batch Loss = 0.6498546600341797, Accuracy = 0.9063670635223389
    Training iter #467000: Batch Loss = 0.397738, Accuracy = 0.9404000043869019
    PERFORMANCE ON TEST SET: Batch Loss = 0.6480165123939514, Accuracy = 0.9043695330619812
    =================================================
    2.1600000000000023e-189
    1.0000000000000013e-190
    Training iter #467500: Batch Loss = 0.432883, Accuracy = 0.9204000234603882
    PERFORMANCE ON TEST SET: Batch Loss = 0.6815518736839294, Accuracy = 0.8843945264816284
    Training iter #468000: Batch Loss = 0.400683, Accuracy = 0.9459999799728394
    PERFORMANCE ON TEST SET: Batch Loss = 0.6471503973007202, Accuracy = 0.9058676362037659
    Training iter #468500: Batch Loss = 0.389388, Accuracy = 0.9495999813079834
    PERFORMANCE ON TEST SET: Batch Loss = 0.6525349617004395, Accuracy = 0.9028714299201965
    Training iter #469000: Batch Loss = 0.448257, Accuracy = 0.9039999842643738
    PERFORMANCE ON TEST SET: Batch Loss = 0.8671472072601318, Accuracy = 0.7822721600532532
    Training iter #469500: Batch Loss = 0.491894, Accuracy = 0.8712000250816345
    PERFORMANCE ON TEST SET: Batch Loss = 0.6752232909202576, Accuracy = 0.8848938941955566
    =================================================
    2.1600000000000022e-190
    1.0000000000000013e-191
    Training iter #470000: Batch Loss = 0.390175, Accuracy = 0.9484000205993652
    PERFORMANCE ON TEST SET: Batch Loss = 0.6488295197486877, Accuracy = 0.9078651666641235
    Training iter #470500: Batch Loss = 0.591662, Accuracy = 0.8284000158309937
    PERFORMANCE ON TEST SET: Batch Loss = 0.7456328868865967, Accuracy = 0.8409488201141357
    Training iter #471000: Batch Loss = 0.385232, Accuracy = 0.949999988079071
    PERFORMANCE ON TEST SET: Batch Loss = 0.6467118263244629, Accuracy = 0.9056179523468018
    Training iter #471500: Batch Loss = 0.434287, Accuracy = 0.9215999841690063
    PERFORMANCE ON TEST SET: Batch Loss = 0.7009294629096985, Accuracy = 0.875156044960022
    Training iter #472000: Batch Loss = 0.397054, Accuracy = 0.9459999799728394
    PERFORMANCE ON TEST SET: Batch Loss = 0.6590760946273804, Accuracy = 0.9036204814910889
    =================================================
    2.160000000000002e-191
    1.0000000000000013e-192
    Training iter #472500: Batch Loss = 0.416952, Accuracy = 0.9272000193595886
    PERFORMANCE ON TEST SET: Batch Loss = 0.698207437992096, Accuracy = 0.8676654100418091
    Training iter #473000: Batch Loss = 0.420975, Accuracy = 0.925599992275238
    PERFORMANCE ON TEST SET: Batch Loss = 0.6563557386398315, Accuracy = 0.9016229510307312
    Training iter #473500: Batch Loss = 0.400189, Accuracy = 0.9455999732017517
    PERFORMANCE ON TEST SET: Batch Loss = 0.649795413017273, Accuracy = 0.9041198492050171
    Training iter #474000: Batch Loss = 0.413701, Accuracy = 0.9287999868392944
    PERFORMANCE ON TEST SET: Batch Loss = 0.7040033340454102, Accuracy = 0.8591760396957397
    Training iter #474500: Batch Loss = 0.389249, Accuracy = 0.9491999745368958
    PERFORMANCE ON TEST SET: Batch Loss = 0.6512916684150696, Accuracy = 0.9086142182350159
    =================================================
    2.1600000000000023e-192
    1.0000000000000013e-193
    Training iter #475000: Batch Loss = 0.381124, Accuracy = 0.9563999772071838
    PERFORMANCE ON TEST SET: Batch Loss = 0.6495962142944336, Accuracy = 0.9068664312362671
    Training iter #475500: Batch Loss = 0.417033, Accuracy = 0.9236000180244446
    PERFORMANCE ON TEST SET: Batch Loss = 0.7096537351608276, Accuracy = 0.8556804060935974
    Training iter #476000: Batch Loss = 0.383956, Accuracy = 0.9539999961853027
    PERFORMANCE ON TEST SET: Batch Loss = 0.6484196186065674, Accuracy = 0.9071161150932312
    Training iter #476500: Batch Loss = 0.386848, Accuracy = 0.9476000070571899
    PERFORMANCE ON TEST SET: Batch Loss = 0.661109447479248, Accuracy = 0.8981273174285889
    Training iter #477000: Batch Loss = 0.411298, Accuracy = 0.9336000084877014
    PERFORMANCE ON TEST SET: Batch Loss = 0.7040483355522156, Accuracy = 0.8659176230430603
    =================================================
    2.1600000000000024e-193
    1.0000000000000012e-194
    Training iter #477500: Batch Loss = 0.397381, Accuracy = 0.9455999732017517
    PERFORMANCE ON TEST SET: Batch Loss = 0.6509265303611755, Accuracy = 0.9071161150932312
    Training iter #478000: Batch Loss = 0.464721, Accuracy = 0.8980000019073486
    PERFORMANCE ON TEST SET: Batch Loss = 0.7508828639984131, Accuracy = 0.8302122354507446
    Training iter #478500: Batch Loss = 0.406086, Accuracy = 0.9351999759674072
    PERFORMANCE ON TEST SET: Batch Loss = 0.6545650362968445, Accuracy = 0.9051185846328735
    Training iter #479000: Batch Loss = 0.483080, Accuracy = 0.8888000249862671
    PERFORMANCE ON TEST SET: Batch Loss = 0.6776830554008484, Accuracy = 0.8791510462760925
    Training iter #479500: Batch Loss = 0.522223, Accuracy = 0.8628000020980835
    PERFORMANCE ON TEST SET: Batch Loss = 0.7533591985702515, Accuracy = 0.8392009735107422
    =================================================
    2.1600000000000024e-194
    1.0000000000000012e-195
    Training iter #480000: Batch Loss = 0.386896, Accuracy = 0.9491999745368958
    PERFORMANCE ON TEST SET: Batch Loss = 0.6507737040519714, Accuracy = 0.9056179523468018
    Training iter #480500: Batch Loss = 0.385741, Accuracy = 0.9484000205993652
    PERFORMANCE ON TEST SET: Batch Loss = 0.6703446507453918, Accuracy = 0.8908863663673401
    Training iter #481000: Batch Loss = 0.392820, Accuracy = 0.9484000205993652
    PERFORMANCE ON TEST SET: Batch Loss = 0.6472254395484924, Accuracy = 0.9071161150932312
    Training iter #481500: Batch Loss = 0.565258, Accuracy = 0.8503999710083008
    PERFORMANCE ON TEST SET: Batch Loss = 0.8978493213653564, Accuracy = 0.7655430436134338
    Training iter #482000: Batch Loss = 0.411077, Accuracy = 0.9323999881744385
    PERFORMANCE ON TEST SET: Batch Loss = 0.6770342588424683, Accuracy = 0.8863919973373413
    =================================================
    2.1600000000000025e-195
    1.0000000000000012e-196
    Training iter #482500: Batch Loss = 0.533492, Accuracy = 0.8636000156402588
    PERFORMANCE ON TEST SET: Batch Loss = 0.688012957572937, Accuracy = 0.885642945766449
    Training iter #483000: Batch Loss = 0.402260, Accuracy = 0.9416000247001648
    PERFORMANCE ON TEST SET: Batch Loss = 0.6521113514900208, Accuracy = 0.9033707976341248
    Training iter #483500: Batch Loss = 0.536280, Accuracy = 0.8628000020980835
    PERFORMANCE ON TEST SET: Batch Loss = 0.7621057629585266, Accuracy = 0.82871413230896
    Training iter #484000: Batch Loss = 0.386222, Accuracy = 0.946399986743927
    PERFORMANCE ON TEST SET: Batch Loss = 0.6691164374351501, Accuracy = 0.8896379470825195
    Training iter #484500: Batch Loss = 0.413356, Accuracy = 0.9355999827384949
    PERFORMANCE ON TEST SET: Batch Loss = 0.6483258008956909, Accuracy = 0.9053682684898376
    =================================================
    2.1600000000000025e-196
    1.0000000000000012e-197
    Training iter #485000: Batch Loss = 0.483565, Accuracy = 0.881600022315979
    PERFORMANCE ON TEST SET: Batch Loss = 0.7917454838752747, Accuracy = 0.8157303333282471
    Training iter #485500: Batch Loss = 0.389137, Accuracy = 0.9508000016212463
    PERFORMANCE ON TEST SET: Batch Loss = 0.6471685171127319, Accuracy = 0.906616747379303
    Training iter #486000: Batch Loss = 0.457240, Accuracy = 0.8971999883651733
    PERFORMANCE ON TEST SET: Batch Loss = 0.6883702874183655, Accuracy = 0.8714107275009155
    Training iter #486500: Batch Loss = 0.398730, Accuracy = 0.9408000111579895
    PERFORMANCE ON TEST SET: Batch Loss = 0.6682355403900146, Accuracy = 0.8881398439407349
    Training iter #487000: Batch Loss = 0.381264, Accuracy = 0.9552000164985657
    PERFORMANCE ON TEST SET: Batch Loss = 0.6432492136955261, Accuracy = 0.9091135859489441
    =================================================
    2.1600000000000025e-197
    1.0000000000000012e-198
    Training iter #487500: Batch Loss = 0.390119, Accuracy = 0.9459999799728394
    PERFORMANCE ON TEST SET: Batch Loss = 0.6497235298156738, Accuracy = 0.9048689007759094
    Training iter #488000: Batch Loss = 0.442433, Accuracy = 0.9172000288963318
    PERFORMANCE ON TEST SET: Batch Loss = 0.7571849822998047, Accuracy = 0.8449438214302063
    Training iter #488500: Batch Loss = 0.398277, Accuracy = 0.9484000205993652
    PERFORMANCE ON TEST SET: Batch Loss = 0.6541271209716797, Accuracy = 0.9006242156028748
    Training iter #489000: Batch Loss = 0.409166, Accuracy = 0.9348000288009644
    PERFORMANCE ON TEST SET: Batch Loss = 0.666235089302063, Accuracy = 0.8941323161125183
    Training iter #489500: Batch Loss = 0.410054, Accuracy = 0.9315999746322632
    PERFORMANCE ON TEST SET: Batch Loss = 0.6803504228591919, Accuracy = 0.8811485767364502
    =================================================
    2.1600000000000025e-198
    1.0000000000000013e-199
    Training iter #490000: Batch Loss = 0.416329, Accuracy = 0.9312000274658203
    PERFORMANCE ON TEST SET: Batch Loss = 0.6770508289337158, Accuracy = 0.8808988928794861
    Training iter #490500: Batch Loss = 0.472418, Accuracy = 0.8899999856948853
    PERFORMANCE ON TEST SET: Batch Loss = 0.7718534469604492, Accuracy = 0.8254681825637817
    Training iter #491000: Batch Loss = 0.393637, Accuracy = 0.9452000260353088
    PERFORMANCE ON TEST SET: Batch Loss = 0.6486157774925232, Accuracy = 0.9091135859489441
    Training iter #491500: Batch Loss = 0.425712, Accuracy = 0.9240000247955322
    PERFORMANCE ON TEST SET: Batch Loss = 0.7073253393173218, Accuracy = 0.8631710410118103
    Training iter #492000: Batch Loss = 0.394864, Accuracy = 0.9444000124931335
    PERFORMANCE ON TEST SET: Batch Loss = 0.6495417952537537, Accuracy = 0.9043695330619812
    =================================================
    2.1600000000000024e-199
    1.0000000000000013e-200
    Training iter #492500: Batch Loss = 0.383422, Accuracy = 0.954800009727478
    PERFORMANCE ON TEST SET: Batch Loss = 0.64836585521698, Accuracy = 0.906616747379303
    Training iter #493000: Batch Loss = 0.534488, Accuracy = 0.857200026512146
    PERFORMANCE ON TEST SET: Batch Loss = 0.8460085391998291, Accuracy = 0.7887640595436096
    Training iter #493500: Batch Loss = 0.505021, Accuracy = 0.8672000169754028
    PERFORMANCE ON TEST SET: Batch Loss = 0.7104465961456299, Accuracy = 0.8706616759300232
    Training iter #494000: Batch Loss = 0.408294, Accuracy = 0.9387999773025513
    PERFORMANCE ON TEST SET: Batch Loss = 0.6588533520698547, Accuracy = 0.8981273174285889
    Training iter #494500: Batch Loss = 0.391021, Accuracy = 0.946399986743927
    PERFORMANCE ON TEST SET: Batch Loss = 0.6486469507217407, Accuracy = 0.9071161150932312
    =================================================
    2.1600000000000024e-200
    1.0000000000000012e-201
    Training iter #495000: Batch Loss = 0.381243, Accuracy = 0.9488000273704529
    PERFORMANCE ON TEST SET: Batch Loss = 0.6472053527832031, Accuracy = 0.9096130132675171
    Training iter #495500: Batch Loss = 0.463866, Accuracy = 0.8992000222206116
    PERFORMANCE ON TEST SET: Batch Loss = 0.6674778461456299, Accuracy = 0.8833957314491272
    Training iter #496000: Batch Loss = 0.392287, Accuracy = 0.949999988079071
    PERFORMANCE ON TEST SET: Batch Loss = 0.648783802986145, Accuracy = 0.9068664312362671
    Training iter #496500: Batch Loss = 0.483623, Accuracy = 0.8880000114440918
    PERFORMANCE ON TEST SET: Batch Loss = 0.7596243619918823, Accuracy = 0.8359550833702087
    Training iter #497000: Batch Loss = 0.442939, Accuracy = 0.9016000032424927
    PERFORMANCE ON TEST SET: Batch Loss = 0.6504679918289185, Accuracy = 0.8968788981437683
    =================================================
    2.1600000000000024e-201
    1.0000000000000012e-202
    Training iter #497500: Batch Loss = 0.391906, Accuracy = 0.946399986743927
    PERFORMANCE ON TEST SET: Batch Loss = 0.6444524526596069, Accuracy = 0.9051185846328735
    Training iter #498000: Batch Loss = 0.491127, Accuracy = 0.8784000277519226
    PERFORMANCE ON TEST SET: Batch Loss = 0.6598376631736755, Accuracy = 0.8936329483985901
    Training iter #498500: Batch Loss = 0.385257, Accuracy = 0.9484000205993652
    PERFORMANCE ON TEST SET: Batch Loss = 0.6506040096282959, Accuracy = 0.9021223187446594
    Training iter #499000: Batch Loss = 0.389871, Accuracy = 0.9495999813079834
    PERFORMANCE ON TEST SET: Batch Loss = 0.6519860029220581, Accuracy = 0.9048689007759094
    Training iter #499500: Batch Loss = 0.423807, Accuracy = 0.9276000261306763
    PERFORMANCE ON TEST SET: Batch Loss = 0.6905568838119507, Accuracy = 0.8786516785621643
    =================================================
    2.1600000000000023e-202
    1.0000000000000012e-203
    Training iter #500000: Batch Loss = 0.390900, Accuracy = 0.9459999799728394
    PERFORMANCE ON TEST SET: Batch Loss = 0.6449395418167114, Accuracy = 0.9091135859489441
    Training iter #500500: Batch Loss = 0.381473, Accuracy = 0.949999988079071
    PERFORMANCE ON TEST SET: Batch Loss = 0.6453354954719543, Accuracy = 0.9093632698059082
    Training iter #501000: Batch Loss = 0.393602, Accuracy = 0.9488000273704529
    PERFORMANCE ON TEST SET: Batch Loss = 0.6561552286148071, Accuracy = 0.9033707976341248
    Training iter #501500: Batch Loss = 0.388715, Accuracy = 0.9508000016212463
    PERFORMANCE ON TEST SET: Batch Loss = 0.6498935222625732, Accuracy = 0.9053682684898376
    Training iter #502000: Batch Loss = 0.454416, Accuracy = 0.8988000154495239
    PERFORMANCE ON TEST SET: Batch Loss = 0.7877367734909058, Accuracy = 0.8197253346443176
    =================================================
    2.1600000000000022e-203
    1.0000000000000012e-204
    Training iter #502500: Batch Loss = 0.379239, Accuracy = 0.9571999907493591
    PERFORMANCE ON TEST SET: Batch Loss = 0.6441347599029541, Accuracy = 0.9076154828071594
    Training iter #503000: Batch Loss = 0.504068, Accuracy = 0.8751999735832214
    PERFORMANCE ON TEST SET: Batch Loss = 0.6798274517059326, Accuracy = 0.8808988928794861
    Training iter #503500: Batch Loss = 0.403485, Accuracy = 0.9312000274658203
    PERFORMANCE ON TEST SET: Batch Loss = 0.6555421948432922, Accuracy = 0.8958801627159119
    Training iter #504000: Batch Loss = 0.444729, Accuracy = 0.9132000207901001
    PERFORMANCE ON TEST SET: Batch Loss = 0.7031303644180298, Accuracy = 0.8549313545227051
    Training iter #504500: Batch Loss = 0.383811, Accuracy = 0.9556000232696533
    PERFORMANCE ON TEST SET: Batch Loss = 0.6474729776382446, Accuracy = 0.9058676362037659
    =================================================
    2.1600000000000022e-204
    1.0000000000000013e-205
    Training iter #505000: Batch Loss = 0.392984, Accuracy = 0.9520000219345093
    PERFORMANCE ON TEST SET: Batch Loss = 0.6424763202667236, Accuracy = 0.9081148505210876
    Training iter #505500: Batch Loss = 0.391894, Accuracy = 0.9476000070571899
    PERFORMANCE ON TEST SET: Batch Loss = 0.645749568939209, Accuracy = 0.9078651666641235
    Training iter #506000: Batch Loss = 0.383316, Accuracy = 0.9467999935150146
    PERFORMANCE ON TEST SET: Batch Loss = 0.6435657739639282, Accuracy = 0.9083645343780518
    Training iter #506500: Batch Loss = 0.391590, Accuracy = 0.9527999758720398
    PERFORMANCE ON TEST SET: Batch Loss = 0.6490523815155029, Accuracy = 0.9043695330619812
    Training iter #507000: Batch Loss = 0.583153, Accuracy = 0.8299999833106995
    PERFORMANCE ON TEST SET: Batch Loss = 0.8968158960342407, Accuracy = 0.7887640595436096
    =================================================
    2.1600000000000023e-205
    1.0000000000000013e-206
    Training iter #507500: Batch Loss = 0.390468, Accuracy = 0.9544000029563904
    PERFORMANCE ON TEST SET: Batch Loss = 0.6590965986251831, Accuracy = 0.8991261124610901
    Training iter #508000: Batch Loss = 0.441742, Accuracy = 0.906000018119812
    PERFORMANCE ON TEST SET: Batch Loss = 0.6491345167160034, Accuracy = 0.9046192169189453
    Training iter #508500: Batch Loss = 0.433257, Accuracy = 0.9196000099182129
    PERFORMANCE ON TEST SET: Batch Loss = 0.9488885998725891, Accuracy = 0.7545567750930786
    Training iter #509000: Batch Loss = 0.386007, Accuracy = 0.9472000002861023
    PERFORMANCE ON TEST SET: Batch Loss = 0.6538957357406616, Accuracy = 0.8968788981437683
    Training iter #509500: Batch Loss = 0.383254, Accuracy = 0.9527999758720398
    PERFORMANCE ON TEST SET: Batch Loss = 0.6419751644134521, Accuracy = 0.9081148505210876
    =================================================
    2.1600000000000023e-206
    1.0000000000000013e-207
    Training iter #510000: Batch Loss = 0.384238, Accuracy = 0.9552000164985657
    PERFORMANCE ON TEST SET: Batch Loss = 0.6509405374526978, Accuracy = 0.9006242156028748
    Training iter #510500: Batch Loss = 0.397872, Accuracy = 0.949999988079071
    PERFORMANCE ON TEST SET: Batch Loss = 0.6527944803237915, Accuracy = 0.9026217460632324
    Training iter #511000: Batch Loss = 0.391058, Accuracy = 0.9484000205993652
    PERFORMANCE ON TEST SET: Batch Loss = 0.6415990591049194, Accuracy = 0.90886390209198
    Training iter #511500: Batch Loss = 0.388334, Accuracy = 0.9480000138282776
    PERFORMANCE ON TEST SET: Batch Loss = 0.6466259956359863, Accuracy = 0.9081148505210876
    Training iter #512000: Batch Loss = 0.386538, Accuracy = 0.9495999813079834
    PERFORMANCE ON TEST SET: Batch Loss = 0.6421152353286743, Accuracy = 0.90886390209198
    =================================================
    2.160000000000002e-207
    1.0000000000000014e-208
    Training iter #512500: Batch Loss = 0.403226, Accuracy = 0.9399999976158142
    PERFORMANCE ON TEST SET: Batch Loss = 0.6705437302589417, Accuracy = 0.8833957314491272
    Training iter #513000: Batch Loss = 0.395607, Accuracy = 0.9452000260353088
    PERFORMANCE ON TEST SET: Batch Loss = 0.6597017645835876, Accuracy = 0.8938826322555542
    Training iter #513500: Batch Loss = 0.379337, Accuracy = 0.9567999839782715
    PERFORMANCE ON TEST SET: Batch Loss = 0.6466939449310303, Accuracy = 0.906616747379303
    Training iter #514000: Batch Loss = 0.387989, Accuracy = 0.9491999745368958
    PERFORMANCE ON TEST SET: Batch Loss = 0.6378386616706848, Accuracy = 0.9083645343780518
    Training iter #514500: Batch Loss = 0.384907, Accuracy = 0.9495999813079834
    PERFORMANCE ON TEST SET: Batch Loss = 0.6504995822906494, Accuracy = 0.9003745317459106
    =================================================
    2.160000000000002e-208
    1.0000000000000014e-209
    Training iter #515000: Batch Loss = 0.412809, Accuracy = 0.9327999949455261
    PERFORMANCE ON TEST SET: Batch Loss = 0.6864240169525146, Accuracy = 0.8761547803878784
    Training iter #515500: Batch Loss = 0.492676, Accuracy = 0.8920000195503235
    PERFORMANCE ON TEST SET: Batch Loss = 0.8941043615341187, Accuracy = 0.7910112142562866
    Training iter #516000: Batch Loss = 0.394467, Accuracy = 0.9495999813079834
    PERFORMANCE ON TEST SET: Batch Loss = 0.642357587814331, Accuracy = 0.9078651666641235
    Training iter #516500: Batch Loss = 0.393789, Accuracy = 0.9448000192642212
    PERFORMANCE ON TEST SET: Batch Loss = 0.6447442770004272, Accuracy = 0.9078651666641235
    Training iter #517000: Batch Loss = 0.466081, Accuracy = 0.8996000289916992
    PERFORMANCE ON TEST SET: Batch Loss = 0.7624592781066895, Accuracy = 0.82871413230896
    =================================================
    2.160000000000002e-209
    1.0000000000000014e-210
    Training iter #517500: Batch Loss = 0.388673, Accuracy = 0.9467999935150146
    PERFORMANCE ON TEST SET: Batch Loss = 0.6420062780380249, Accuracy = 0.9113608002662659
    Training iter #518000: Batch Loss = 0.405426, Accuracy = 0.9380000233650208
    PERFORMANCE ON TEST SET: Batch Loss = 0.663398027420044, Accuracy = 0.8878901600837708
    Training iter #518500: Batch Loss = 0.386715, Accuracy = 0.9563999772071838
    PERFORMANCE ON TEST SET: Batch Loss = 0.6456030011177063, Accuracy = 0.906616747379303
    Training iter #519000: Batch Loss = 0.380082, Accuracy = 0.954800009727478
    PERFORMANCE ON TEST SET: Batch Loss = 0.6408360600471497, Accuracy = 0.9078651666641235
    Training iter #519500: Batch Loss = 0.386788, Accuracy = 0.9488000273704529
    PERFORMANCE ON TEST SET: Batch Loss = 0.6384868621826172, Accuracy = 0.9076154828071594
    =================================================
    2.160000000000002e-210
    1.0000000000000014e-211
    Training iter #520000: Batch Loss = 0.436116, Accuracy = 0.9120000004768372
    PERFORMANCE ON TEST SET: Batch Loss = 0.7600433826446533, Accuracy = 0.8277153372764587
    Training iter #520500: Batch Loss = 0.383053, Accuracy = 0.9539999961853027
    PERFORMANCE ON TEST SET: Batch Loss = 0.6432564854621887, Accuracy = 0.9068664312362671
    Training iter #521000: Batch Loss = 0.383093, Accuracy = 0.9544000029563904
    PERFORMANCE ON TEST SET: Batch Loss = 0.6516400575637817, Accuracy = 0.9068664312362671
    Training iter #521500: Batch Loss = 0.479121, Accuracy = 0.8935999870300293
    PERFORMANCE ON TEST SET: Batch Loss = 0.7632263898849487, Accuracy = 0.8379525542259216
    Training iter #522000: Batch Loss = 0.393071, Accuracy = 0.9459999799728394
    PERFORMANCE ON TEST SET: Batch Loss = 0.6536895632743835, Accuracy = 0.9033707976341248
    =================================================
    2.160000000000002e-211
    1.0000000000000014e-212
    Training iter #522500: Batch Loss = 0.428695, Accuracy = 0.9223999977111816
    PERFORMANCE ON TEST SET: Batch Loss = 0.6710835695266724, Accuracy = 0.8896379470825195
    Training iter #523000: Batch Loss = 0.393854, Accuracy = 0.9488000273704529
    PERFORMANCE ON TEST SET: Batch Loss = 0.645514726638794, Accuracy = 0.9098626971244812
    Training iter #523500: Batch Loss = 0.397279, Accuracy = 0.9480000138282776
    PERFORMANCE ON TEST SET: Batch Loss = 0.6397188305854797, Accuracy = 0.9083645343780518
    Training iter #524000: Batch Loss = 0.496303, Accuracy = 0.8736000061035156
    PERFORMANCE ON TEST SET: Batch Loss = 0.7952905297279358, Accuracy = 0.8162297010421753
    Training iter #524500: Batch Loss = 0.381322, Accuracy = 0.9539999961853027
    PERFORMANCE ON TEST SET: Batch Loss = 0.6412672400474548, Accuracy = 0.9096130132675171
    =================================================
    2.160000000000002e-212
    1.0000000000000014e-213
    Training iter #525000: Batch Loss = 0.386359, Accuracy = 0.9476000070571899
    PERFORMANCE ON TEST SET: Batch Loss = 0.649854838848114, Accuracy = 0.9036204814910889
    Training iter #525500: Batch Loss = 0.484762, Accuracy = 0.88919997215271
    PERFORMANCE ON TEST SET: Batch Loss = 0.8708274960517883, Accuracy = 0.789513111114502
    Training iter #526000: Batch Loss = 0.417578, Accuracy = 0.9243999719619751
    PERFORMANCE ON TEST SET: Batch Loss = 0.6705085635185242, Accuracy = 0.8916354775428772
    Training iter #526500: Batch Loss = 0.449376, Accuracy = 0.9147999882698059
    PERFORMANCE ON TEST SET: Batch Loss = 0.7928562760353088, Accuracy = 0.8277153372764587
    Training iter #527000: Batch Loss = 0.405004, Accuracy = 0.9391999840736389
    PERFORMANCE ON TEST SET: Batch Loss = 0.6572004556655884, Accuracy = 0.9003745317459106
    =================================================
    2.160000000000002e-213
    1.0000000000000014e-214
    Training iter #527500: Batch Loss = 0.527908, Accuracy = 0.8604000210762024
    PERFORMANCE ON TEST SET: Batch Loss = 0.6896721720695496, Accuracy = 0.8848938941955566
    Training iter #528000: Batch Loss = 0.387323, Accuracy = 0.9444000124931335
    PERFORMANCE ON TEST SET: Batch Loss = 0.6456854343414307, Accuracy = 0.9093632698059082
    Training iter #528500: Batch Loss = 0.383370, Accuracy = 0.9503999948501587
    PERFORMANCE ON TEST SET: Batch Loss = 0.6450508236885071, Accuracy = 0.90886390209198
    Training iter #529000: Batch Loss = 0.393697, Accuracy = 0.9508000016212463
    PERFORMANCE ON TEST SET: Batch Loss = 0.6390023827552795, Accuracy = 0.9098626971244812
    Training iter #529500: Batch Loss = 0.601813, Accuracy = 0.8216000199317932
    PERFORMANCE ON TEST SET: Batch Loss = 0.7577953338623047, Accuracy = 0.8299625515937805
    =================================================
    2.160000000000002e-214
    1.0000000000000013e-215
    Training iter #530000: Batch Loss = 0.398712, Accuracy = 0.9351999759674072
    PERFORMANCE ON TEST SET: Batch Loss = 0.6676260232925415, Accuracy = 0.8868913650512695
    Training iter #530500: Batch Loss = 0.390713, Accuracy = 0.9476000070571899
    PERFORMANCE ON TEST SET: Batch Loss = 0.652153730392456, Accuracy = 0.9001248478889465
    Training iter #531000: Batch Loss = 0.451608, Accuracy = 0.9020000100135803
    PERFORMANCE ON TEST SET: Batch Loss = 0.7693264484405518, Accuracy = 0.8184769153594971
    Training iter #531500: Batch Loss = 0.610349, Accuracy = 0.828000009059906
    PERFORMANCE ON TEST SET: Batch Loss = 1.0437582731246948, Accuracy = 0.7315855026245117
    Training iter #532000: Batch Loss = 0.383015, Accuracy = 0.9535999894142151
    PERFORMANCE ON TEST SET: Batch Loss = 0.6521456241607666, Accuracy = 0.9048689007759094
    =================================================
    2.1600000000000023e-215
    1.0000000000000013e-216
    Training iter #532500: Batch Loss = 0.426711, Accuracy = 0.9251999855041504
    PERFORMANCE ON TEST SET: Batch Loss = 0.6582822203636169, Accuracy = 0.9041198492050171
    Training iter #533000: Batch Loss = 0.420378, Accuracy = 0.9223999977111816
    PERFORMANCE ON TEST SET: Batch Loss = 0.6648149490356445, Accuracy = 0.8893882632255554
    Training iter #533500: Batch Loss = 0.413536, Accuracy = 0.9296000003814697
    PERFORMANCE ON TEST SET: Batch Loss = 0.6598950624465942, Accuracy = 0.9003745317459106
    Training iter #534000: Batch Loss = 0.378716, Accuracy = 0.9563999772071838
    PERFORMANCE ON TEST SET: Batch Loss = 0.6435126066207886, Accuracy = 0.9101123809814453
    Training iter #534500: Batch Loss = 0.438807, Accuracy = 0.9164000153541565
    PERFORMANCE ON TEST SET: Batch Loss = 0.6557016372680664, Accuracy = 0.9018726348876953
    =================================================
    2.1600000000000024e-216
    1.0000000000000013e-217
    Training iter #535000: Batch Loss = 0.434570, Accuracy = 0.9196000099182129
    PERFORMANCE ON TEST SET: Batch Loss = 0.8266153335571289, Accuracy = 0.7980024814605713
    Training iter #535500: Batch Loss = 0.393435, Accuracy = 0.9440000057220459
    PERFORMANCE ON TEST SET: Batch Loss = 0.6958174705505371, Accuracy = 0.8704119920730591
    Training iter #536000: Batch Loss = 0.476039, Accuracy = 0.8848000168800354
    PERFORMANCE ON TEST SET: Batch Loss = 0.6623818278312683, Accuracy = 0.890636682510376
    Training iter #536500: Batch Loss = 0.383548, Accuracy = 0.9531999826431274
    PERFORMANCE ON TEST SET: Batch Loss = 0.6446171998977661, Accuracy = 0.9048689007759094
    Training iter #537000: Batch Loss = 0.498025, Accuracy = 0.8804000020027161
    PERFORMANCE ON TEST SET: Batch Loss = 0.6842034459114075, Accuracy = 0.877902626991272
    =================================================
    2.1600000000000024e-217
    1.0000000000000013e-218
    Training iter #537500: Batch Loss = 0.383278, Accuracy = 0.954800009727478
    PERFORMANCE ON TEST SET: Batch Loss = 0.6688207387924194, Accuracy = 0.8918851613998413
    Training iter #538000: Batch Loss = 0.393285, Accuracy = 0.9539999961853027
    PERFORMANCE ON TEST SET: Batch Loss = 0.6505889296531677, Accuracy = 0.9068664312362671
    Training iter #538500: Batch Loss = 0.399079, Accuracy = 0.9440000057220459
    PERFORMANCE ON TEST SET: Batch Loss = 0.6482716202735901, Accuracy = 0.9073657989501953
    Training iter #539000: Batch Loss = 0.387902, Accuracy = 0.9480000138282776
    PERFORMANCE ON TEST SET: Batch Loss = 0.645896315574646, Accuracy = 0.9108614325523376
    Training iter #539500: Batch Loss = 0.378369, Accuracy = 0.954800009727478
    PERFORMANCE ON TEST SET: Batch Loss = 0.6399832963943481, Accuracy = 0.9101123809814453
    =================================================
    2.1600000000000025e-218
    1.0000000000000013e-219
    Training iter #540000: Batch Loss = 0.399252, Accuracy = 0.9435999989509583
    PERFORMANCE ON TEST SET: Batch Loss = 0.6400887966156006, Accuracy = 0.9086142182350159
    Training iter #540500: Batch Loss = 0.464612, Accuracy = 0.8880000114440918
    PERFORMANCE ON TEST SET: Batch Loss = 0.6588971614837646, Accuracy = 0.8896379470825195
    Training iter #541000: Batch Loss = 0.373400, Accuracy = 0.9580000042915344
    PERFORMANCE ON TEST SET: Batch Loss = 0.6384410262107849, Accuracy = 0.9081148505210876
    Training iter #541500: Batch Loss = 0.384237, Accuracy = 0.9516000151634216
    PERFORMANCE ON TEST SET: Batch Loss = 0.6376152634620667, Accuracy = 0.9076154828071594
    Training iter #542000: Batch Loss = 0.389337, Accuracy = 0.9467999935150146
    PERFORMANCE ON TEST SET: Batch Loss = 0.6722091436386108, Accuracy = 0.8846442103385925
    =================================================
    2.1600000000000025e-219
    1.0000000000000014e-220
    Training iter #542500: Batch Loss = 0.383100, Accuracy = 0.9559999704360962
    PERFORMANCE ON TEST SET: Batch Loss = 0.6424197554588318, Accuracy = 0.91161048412323
    Training iter #543000: Batch Loss = 0.377828, Accuracy = 0.9571999907493591
    PERFORMANCE ON TEST SET: Batch Loss = 0.6441198587417603, Accuracy = 0.9056179523468018
    Training iter #543500: Batch Loss = 0.392255, Accuracy = 0.9524000287055969
    PERFORMANCE ON TEST SET: Batch Loss = 0.6433543562889099, Accuracy = 0.9093632698059082
    Training iter #544000: Batch Loss = 0.391516, Accuracy = 0.9480000138282776
    PERFORMANCE ON TEST SET: Batch Loss = 0.6421946287155151, Accuracy = 0.9093632698059082
    Training iter #544500: Batch Loss = 0.405327, Accuracy = 0.9380000233650208
    PERFORMANCE ON TEST SET: Batch Loss = 0.6453378200531006, Accuracy = 0.9036204814910889
    =================================================
    2.1600000000000023e-220
    1.0000000000000014e-221
    Training iter #545000: Batch Loss = 0.561759, Accuracy = 0.8443999886512756
    PERFORMANCE ON TEST SET: Batch Loss = 0.8250285387039185, Accuracy = 0.8107365965843201
    Training iter #545500: Batch Loss = 0.399283, Accuracy = 0.9452000260353088
    PERFORMANCE ON TEST SET: Batch Loss = 0.6421791315078735, Accuracy = 0.9098626971244812
    Training iter #546000: Batch Loss = 0.425970, Accuracy = 0.9211999773979187
    PERFORMANCE ON TEST SET: Batch Loss = 0.6616520881652832, Accuracy = 0.8973782658576965
    Training iter #546500: Batch Loss = 0.393012, Accuracy = 0.9387999773025513
    PERFORMANCE ON TEST SET: Batch Loss = 0.6471779346466064, Accuracy = 0.9021223187446594
    Training iter #547000: Batch Loss = 0.389888, Accuracy = 0.9484000205993652
    PERFORMANCE ON TEST SET: Batch Loss = 0.643451988697052, Accuracy = 0.9078651666641235
    =================================================
    2.1600000000000024e-221
    1.0000000000000014e-222
    Training iter #547500: Batch Loss = 0.394739, Accuracy = 0.9431999921798706
    PERFORMANCE ON TEST SET: Batch Loss = 0.644936740398407, Accuracy = 0.9093632698059082
    Training iter #548000: Batch Loss = 0.419534, Accuracy = 0.9240000247955322
    PERFORMANCE ON TEST SET: Batch Loss = 0.6764100790023804, Accuracy = 0.8903869986534119
    Training iter #548500: Batch Loss = 0.379391, Accuracy = 0.9531999826431274
    PERFORMANCE ON TEST SET: Batch Loss = 0.6624488830566406, Accuracy = 0.8898876309394836
    Training iter #549000: Batch Loss = 0.437372, Accuracy = 0.9196000099182129
    PERFORMANCE ON TEST SET: Batch Loss = 0.6904987096786499, Accuracy = 0.8803995251655579
    Training iter #549500: Batch Loss = 0.407485, Accuracy = 0.9380000233650208
    PERFORMANCE ON TEST SET: Batch Loss = 0.6436522603034973, Accuracy = 0.9053682684898376
    =================================================
    2.1600000000000024e-222
    1.0000000000000015e-223
    Training iter #550000: Batch Loss = 0.482285, Accuracy = 0.8903999924659729
    PERFORMANCE ON TEST SET: Batch Loss = 0.7228691577911377, Accuracy = 0.8494381904602051
    Training iter #550500: Batch Loss = 0.385497, Accuracy = 0.9508000016212463
    PERFORMANCE ON TEST SET: Batch Loss = 0.6398111581802368, Accuracy = 0.9058676362037659
    Training iter #551000: Batch Loss = 0.395341, Accuracy = 0.9480000138282776
    PERFORMANCE ON TEST SET: Batch Loss = 0.6377988457679749, Accuracy = 0.9111111164093018
    Training iter #551500: Batch Loss = 0.387355, Accuracy = 0.9527999758720398
    PERFORMANCE ON TEST SET: Batch Loss = 0.6376132369041443, Accuracy = 0.9081148505210876
    Training iter #552000: Batch Loss = 0.385094, Accuracy = 0.9480000138282776
    PERFORMANCE ON TEST SET: Batch Loss = 0.6386774778366089, Accuracy = 0.9081148505210876
    =================================================
    2.1600000000000023e-223
    1.0000000000000015e-224
    Training iter #552500: Batch Loss = 0.515587, Accuracy = 0.8691999912261963
    PERFORMANCE ON TEST SET: Batch Loss = 0.7717479467391968, Accuracy = 0.8149812817573547
    Training iter #553000: Batch Loss = 0.492829, Accuracy = 0.8744000196456909
    PERFORMANCE ON TEST SET: Batch Loss = 0.6991110444068909, Accuracy = 0.8631710410118103
    Training iter #553500: Batch Loss = 0.384249, Accuracy = 0.9531999826431274
    PERFORMANCE ON TEST SET: Batch Loss = 0.6446914076805115, Accuracy = 0.9076154828071594
    Training iter #554000: Batch Loss = 0.375026, Accuracy = 0.9580000042915344
    PERFORMANCE ON TEST SET: Batch Loss = 0.6406162977218628, Accuracy = 0.9106117486953735
    Training iter #554500: Batch Loss = 0.417467, Accuracy = 0.9291999936103821
    PERFORMANCE ON TEST SET: Batch Loss = 0.649074375629425, Accuracy = 0.906616747379303
    =================================================
    2.1600000000000023e-224
    1.0000000000000015e-225
    Training iter #555000: Batch Loss = 0.624883, Accuracy = 0.8155999779701233
    PERFORMANCE ON TEST SET: Batch Loss = 0.7133296132087708, Accuracy = 0.8606741428375244
    Training iter #555500: Batch Loss = 0.390966, Accuracy = 0.9459999799728394
    PERFORMANCE ON TEST SET: Batch Loss = 0.6402746438980103, Accuracy = 0.9091135859489441
    Training iter #556000: Batch Loss = 0.379408, Accuracy = 0.9556000232696533
    PERFORMANCE ON TEST SET: Batch Loss = 0.637862503528595, Accuracy = 0.9098626971244812
    Training iter #556500: Batch Loss = 0.442975, Accuracy = 0.9139999747276306
    PERFORMANCE ON TEST SET: Batch Loss = 0.6489783525466919, Accuracy = 0.9053682684898376
    Training iter #557000: Batch Loss = 0.555795, Accuracy = 0.8492000102996826
    PERFORMANCE ON TEST SET: Batch Loss = 0.8715997934341431, Accuracy = 0.7875155806541443
    =================================================
    2.1600000000000022e-225
    1.0000000000000014e-226
    Training iter #557500: Batch Loss = 0.377779, Accuracy = 0.9559999704360962
    PERFORMANCE ON TEST SET: Batch Loss = 0.6378093361854553, Accuracy = 0.9061173796653748
    Training iter #558000: Batch Loss = 0.381813, Accuracy = 0.9539999961853027
    PERFORMANCE ON TEST SET: Batch Loss = 0.6418684124946594, Accuracy = 0.9078651666641235
    Training iter #558500: Batch Loss = 0.383020, Accuracy = 0.9544000029563904
    PERFORMANCE ON TEST SET: Batch Loss = 0.6381209492683411, Accuracy = 0.9086142182350159
    Training iter #559000: Batch Loss = 0.407714, Accuracy = 0.9348000288009644
    PERFORMANCE ON TEST SET: Batch Loss = 0.6642545461654663, Accuracy = 0.8936329483985901
    Training iter #559500: Batch Loss = 0.446293, Accuracy = 0.9100000262260437
    PERFORMANCE ON TEST SET: Batch Loss = 0.8565565347671509, Accuracy = 0.7960050106048584
    =================================================
    2.1600000000000022e-226
    1.0000000000000015e-227
    Training iter #560000: Batch Loss = 0.412791, Accuracy = 0.9391999840736389
    PERFORMANCE ON TEST SET: Batch Loss = 0.6570398807525635, Accuracy = 0.8973782658576965
    Training iter #560500: Batch Loss = 0.391771, Accuracy = 0.9476000070571899
    PERFORMANCE ON TEST SET: Batch Loss = 0.6478167772293091, Accuracy = 0.9008738994598389
    Training iter #561000: Batch Loss = 0.387499, Accuracy = 0.9480000138282776
    PERFORMANCE ON TEST SET: Batch Loss = 0.6449779272079468, Accuracy = 0.9056179523468018
    Training iter #561500: Batch Loss = 0.496353, Accuracy = 0.8840000033378601
    PERFORMANCE ON TEST SET: Batch Loss = 0.7582913041114807, Accuracy = 0.8289638161659241
    Training iter #562000: Batch Loss = 0.414337, Accuracy = 0.9355999827384949
    PERFORMANCE ON TEST SET: Batch Loss = 0.6513003706932068, Accuracy = 0.9006242156028748
    =================================================
    2.160000000000002e-227
    1.0000000000000015e-228
    Training iter #562500: Batch Loss = 0.381951, Accuracy = 0.954800009727478
    PERFORMANCE ON TEST SET: Batch Loss = 0.6341557502746582, Accuracy = 0.9096130132675171
    Training iter #563000: Batch Loss = 0.381685, Accuracy = 0.9516000151634216
    PERFORMANCE ON TEST SET: Batch Loss = 0.6437355279922485, Accuracy = 0.9046192169189453
    Training iter #563500: Batch Loss = 0.398852, Accuracy = 0.9351999759674072
    PERFORMANCE ON TEST SET: Batch Loss = 0.6444562673568726, Accuracy = 0.901123583316803
    Training iter #564000: Batch Loss = 0.464114, Accuracy = 0.9039999842643738
    PERFORMANCE ON TEST SET: Batch Loss = 0.6863439679145813, Accuracy = 0.882896363735199
    Training iter #564500: Batch Loss = 0.394524, Accuracy = 0.9431999921798706
    PERFORMANCE ON TEST SET: Batch Loss = 0.6702896356582642, Accuracy = 0.8833957314491272
    =================================================
    2.1600000000000022e-228
    1.0000000000000015e-229
    Training iter #565000: Batch Loss = 0.414053, Accuracy = 0.9247999787330627
    PERFORMANCE ON TEST SET: Batch Loss = 0.6533999443054199, Accuracy = 0.8991261124610901
    Training iter #565500: Batch Loss = 0.390241, Accuracy = 0.9559999704360962
    PERFORMANCE ON TEST SET: Batch Loss = 0.6447340250015259, Accuracy = 0.9083645343780518
    Training iter #566000: Batch Loss = 0.384697, Accuracy = 0.9552000164985657
    PERFORMANCE ON TEST SET: Batch Loss = 0.6365546584129333, Accuracy = 0.9106117486953735
    Training iter #566500: Batch Loss = 0.407825, Accuracy = 0.9336000084877014
    PERFORMANCE ON TEST SET: Batch Loss = 0.6988105177879333, Accuracy = 0.8619226217269897
    Training iter #567000: Batch Loss = 0.380208, Accuracy = 0.9535999894142151
    PERFORMANCE ON TEST SET: Batch Loss = 0.6538739800453186, Accuracy = 0.903870165348053
    =================================================
    2.1600000000000023e-229
    1.0000000000000015e-230
    Training iter #567500: Batch Loss = 0.392338, Accuracy = 0.949999988079071
    PERFORMANCE ON TEST SET: Batch Loss = 0.6401328444480896, Accuracy = 0.9103620648384094
    Training iter #568000: Batch Loss = 0.412040, Accuracy = 0.9351999759674072
    PERFORMANCE ON TEST SET: Batch Loss = 0.6575745344161987, Accuracy = 0.8928838968276978
    Training iter #568500: Batch Loss = 0.407284, Accuracy = 0.9287999868392944
    PERFORMANCE ON TEST SET: Batch Loss = 0.730392336845398, Accuracy = 0.8456928730010986
    Training iter #569000: Batch Loss = 0.388866, Accuracy = 0.949999988079071
    PERFORMANCE ON TEST SET: Batch Loss = 0.6578293442726135, Accuracy = 0.8981273174285889
    Training iter #569500: Batch Loss = 0.553736, Accuracy = 0.8515999913215637
    PERFORMANCE ON TEST SET: Batch Loss = 0.6719254851341248, Accuracy = 0.8891385793685913
    =================================================
    2.1600000000000025e-230
    1.0000000000000016e-231
    Training iter #570000: Batch Loss = 0.576774, Accuracy = 0.8371999859809875
    PERFORMANCE ON TEST SET: Batch Loss = 0.8141326904296875, Accuracy = 0.8014981150627136
    Training iter #570500: Batch Loss = 0.414289, Accuracy = 0.9251999855041504
    PERFORMANCE ON TEST SET: Batch Loss = 0.6520464420318604, Accuracy = 0.9041198492050171
    Training iter #571000: Batch Loss = 0.396423, Accuracy = 0.9472000002861023
    PERFORMANCE ON TEST SET: Batch Loss = 0.6426906585693359, Accuracy = 0.90886390209198
    Training iter #571500: Batch Loss = 0.425368, Accuracy = 0.9192000031471252
    PERFORMANCE ON TEST SET: Batch Loss = 0.6472595930099487, Accuracy = 0.9053682684898376
    Training iter #572000: Batch Loss = 0.400414, Accuracy = 0.9399999976158142
    PERFORMANCE ON TEST SET: Batch Loss = 0.6375401616096497, Accuracy = 0.9076154828071594
    =================================================
    2.1600000000000025e-231
    1.0000000000000016e-232
    Training iter #572500: Batch Loss = 0.397744, Accuracy = 0.9404000043869019
    PERFORMANCE ON TEST SET: Batch Loss = 0.6739678382873535, Accuracy = 0.8871410489082336
    Training iter #573000: Batch Loss = 0.409275, Accuracy = 0.9391999840736389
    PERFORMANCE ON TEST SET: Batch Loss = 0.6615647077560425, Accuracy = 0.8978776335716248
    Training iter #573500: Batch Loss = 0.408121, Accuracy = 0.9351999759674072
    PERFORMANCE ON TEST SET: Batch Loss = 0.6645609140396118, Accuracy = 0.8896379470825195
    Training iter #574000: Batch Loss = 0.379819, Accuracy = 0.9539999961853027
    PERFORMANCE ON TEST SET: Batch Loss = 0.6462814211845398, Accuracy = 0.9036204814910889
    Training iter #574500: Batch Loss = 0.565058, Accuracy = 0.8503999710083008
    PERFORMANCE ON TEST SET: Batch Loss = 0.6999937891960144, Accuracy = 0.8769038915634155
    =================================================
    2.1600000000000025e-232
    1.0000000000000016e-233
    Training iter #575000: Batch Loss = 0.408918, Accuracy = 0.9308000206947327
    PERFORMANCE ON TEST SET: Batch Loss = 0.6519457101821899, Accuracy = 0.9013732671737671
    Training iter #575500: Batch Loss = 0.412180, Accuracy = 0.9332000017166138
    PERFORMANCE ON TEST SET: Batch Loss = 0.651665210723877, Accuracy = 0.9031211137771606
    Training iter #576000: Batch Loss = 0.413091, Accuracy = 0.9323999881744385
    PERFORMANCE ON TEST SET: Batch Loss = 0.651038408279419, Accuracy = 0.8996254801750183
    Training iter #576500: Batch Loss = 0.390000, Accuracy = 0.9520000219345093
    PERFORMANCE ON TEST SET: Batch Loss = 0.6424860954284668, Accuracy = 0.9086142182350159
    Training iter #577000: Batch Loss = 0.386467, Accuracy = 0.9539999961853027
    PERFORMANCE ON TEST SET: Batch Loss = 0.6373416185379028, Accuracy = 0.9106117486953735
    =================================================
    2.1600000000000024e-233
    1.0000000000000016e-234
    Training iter #577500: Batch Loss = 0.381792, Accuracy = 0.9531999826431274
    PERFORMANCE ON TEST SET: Batch Loss = 0.634151816368103, Accuracy = 0.90886390209198
    Training iter #578000: Batch Loss = 0.528549, Accuracy = 0.8715999722480774
    PERFORMANCE ON TEST SET: Batch Loss = 0.7793877124786377, Accuracy = 0.8137328624725342
    Training iter #578500: Batch Loss = 0.427627, Accuracy = 0.9264000058174133
    PERFORMANCE ON TEST SET: Batch Loss = 0.6536014080047607, Accuracy = 0.9033707976341248
    Training iter #579000: Batch Loss = 0.382825, Accuracy = 0.9524000287055969
    PERFORMANCE ON TEST SET: Batch Loss = 0.6332988739013672, Accuracy = 0.90886390209198
    Training iter #579500: Batch Loss = 0.408053, Accuracy = 0.9291999936103821
    PERFORMANCE ON TEST SET: Batch Loss = 0.7052181959152222, Accuracy = 0.8574281930923462
    =================================================
    2.1600000000000024e-234
    1.0000000000000017e-235
    Training iter #580000: Batch Loss = 0.496633, Accuracy = 0.8708000183105469
    PERFORMANCE ON TEST SET: Batch Loss = 0.7067747116088867, Accuracy = 0.8574281930923462
    Training iter #580500: Batch Loss = 0.476425, Accuracy = 0.897599995136261
    PERFORMANCE ON TEST SET: Batch Loss = 0.738219141960144, Accuracy = 0.8454431891441345
    Training iter #581000: Batch Loss = 0.414485, Accuracy = 0.9315999746322632
    PERFORMANCE ON TEST SET: Batch Loss = 0.6830652952194214, Accuracy = 0.8696629405021667
    Training iter #581500: Batch Loss = 0.379301, Accuracy = 0.9575999975204468
    PERFORMANCE ON TEST SET: Batch Loss = 0.6362630724906921, Accuracy = 0.9118601679801941
    Training iter #582000: Batch Loss = 0.387976, Accuracy = 0.9527999758720398
    PERFORMANCE ON TEST SET: Batch Loss = 0.6396046876907349, Accuracy = 0.9098626971244812
    =================================================
    2.1600000000000022e-235
    1.0000000000000018e-236
    Training iter #582500: Batch Loss = 0.386222, Accuracy = 0.9539999961853027
    PERFORMANCE ON TEST SET: Batch Loss = 0.6398065686225891, Accuracy = 0.9096130132675171
    Training iter #583000: Batch Loss = 0.469605, Accuracy = 0.8912000060081482
    PERFORMANCE ON TEST SET: Batch Loss = 0.6532176733016968, Accuracy = 0.8898876309394836
    Training iter #583500: Batch Loss = 0.506191, Accuracy = 0.8651999831199646
    PERFORMANCE ON TEST SET: Batch Loss = 0.6526502966880798, Accuracy = 0.893383264541626
    Training iter #584000: Batch Loss = 0.397759, Accuracy = 0.9524000287055969
    PERFORMANCE ON TEST SET: Batch Loss = 0.6404466032981873, Accuracy = 0.9096130132675171
    Training iter #584500: Batch Loss = 0.382637, Accuracy = 0.951200008392334
    PERFORMANCE ON TEST SET: Batch Loss = 0.6351636052131653, Accuracy = 0.9106117486953735
    =================================================
    2.160000000000002e-236
    1.0000000000000018e-237
    Training iter #585000: Batch Loss = 0.380075, Accuracy = 0.9539999961853027
    PERFORMANCE ON TEST SET: Batch Loss = 0.6347967982292175, Accuracy = 0.9123595356941223
    Training iter #585500: Batch Loss = 0.372415, Accuracy = 0.9575999975204468
    PERFORMANCE ON TEST SET: Batch Loss = 0.6351038217544556, Accuracy = 0.9131085872650146
    Training iter #586000: Batch Loss = 0.481479, Accuracy = 0.8844000101089478
    PERFORMANCE ON TEST SET: Batch Loss = 0.6626682877540588, Accuracy = 0.8843945264816284
    Training iter #586500: Batch Loss = 0.402294, Accuracy = 0.9359999895095825
    PERFORMANCE ON TEST SET: Batch Loss = 0.6533292531967163, Accuracy = 0.8998751640319824
    Training iter #587000: Batch Loss = 0.412968, Accuracy = 0.9332000017166138
    PERFORMANCE ON TEST SET: Batch Loss = 0.7568325996398926, Accuracy = 0.8332085013389587
    =================================================
    2.160000000000002e-237
    1.0000000000000017e-238
    Training iter #587500: Batch Loss = 0.459482, Accuracy = 0.9103999733924866
    PERFORMANCE ON TEST SET: Batch Loss = 0.6838992238044739, Accuracy = 0.8776529431343079
    Training iter #588000: Batch Loss = 0.445942, Accuracy = 0.9071999788284302
    PERFORMANCE ON TEST SET: Batch Loss = 0.6520695686340332, Accuracy = 0.9028714299201965
    Training iter #588500: Batch Loss = 0.382190, Accuracy = 0.9559999704360962
    PERFORMANCE ON TEST SET: Batch Loss = 0.6368198394775391, Accuracy = 0.90886390209198
    Training iter #589000: Batch Loss = 0.407693, Accuracy = 0.9315999746322632
    PERFORMANCE ON TEST SET: Batch Loss = 0.6980426907539368, Accuracy = 0.8631710410118103
    Training iter #589500: Batch Loss = 0.395950, Accuracy = 0.9531999826431274
    PERFORMANCE ON TEST SET: Batch Loss = 0.644866406917572, Accuracy = 0.9033707976341248
    =================================================
    2.160000000000002e-238
    1.0000000000000018e-239
    Training iter #590000: Batch Loss = 0.383394, Accuracy = 0.9535999894142151
    PERFORMANCE ON TEST SET: Batch Loss = 0.6330928206443787, Accuracy = 0.9121098518371582
    Training iter #590500: Batch Loss = 0.379437, Accuracy = 0.9539999961853027
    PERFORMANCE ON TEST SET: Batch Loss = 0.634909987449646, Accuracy = 0.9111111164093018
    Training iter #591000: Batch Loss = 0.377452, Accuracy = 0.9559999704360962
    PERFORMANCE ON TEST SET: Batch Loss = 0.6360899209976196, Accuracy = 0.9096130132675171
    Training iter #591500: Batch Loss = 0.423710, Accuracy = 0.9223999977111816
    PERFORMANCE ON TEST SET: Batch Loss = 0.663814902305603, Accuracy = 0.8913857936859131
    Training iter #592000: Batch Loss = 0.392651, Accuracy = 0.9448000192642212
    PERFORMANCE ON TEST SET: Batch Loss = 0.6419442892074585, Accuracy = 0.9056179523468018
    =================================================
    2.1600000000000023e-239
    1.0000000000000018e-240
    Training iter #592500: Batch Loss = 0.381449, Accuracy = 0.9531999826431274
    PERFORMANCE ON TEST SET: Batch Loss = 0.6597638130187988, Accuracy = 0.8973782658576965
    Training iter #593000: Batch Loss = 0.388010, Accuracy = 0.9527999758720398
    PERFORMANCE ON TEST SET: Batch Loss = 0.6505833864212036, Accuracy = 0.9021223187446594
    Training iter #593500: Batch Loss = 0.434992, Accuracy = 0.91839998960495
    PERFORMANCE ON TEST SET: Batch Loss = 0.6575559377670288, Accuracy = 0.898377001285553
    Training iter #594000: Batch Loss = 0.521109, Accuracy = 0.8691999912261963
    PERFORMANCE ON TEST SET: Batch Loss = 0.7817769646644592, Accuracy = 0.8234706521034241
    Training iter #594500: Batch Loss = 0.387351, Accuracy = 0.942799985408783
    PERFORMANCE ON TEST SET: Batch Loss = 0.6441864371299744, Accuracy = 0.9046192169189453
    =================================================
    2.1600000000000023e-240
    1.0000000000000018e-241
    Training iter #595000: Batch Loss = 0.398326, Accuracy = 0.9431999921798706
    PERFORMANCE ON TEST SET: Batch Loss = 0.6525965929031372, Accuracy = 0.898876428604126
    Training iter #595500: Batch Loss = 0.393418, Accuracy = 0.9459999799728394
    PERFORMANCE ON TEST SET: Batch Loss = 0.6373412013053894, Accuracy = 0.9078651666641235
    Training iter #596000: Batch Loss = 0.415737, Accuracy = 0.9232000112533569
    PERFORMANCE ON TEST SET: Batch Loss = 0.6439646482467651, Accuracy = 0.9031211137771606
    Training iter #596500: Batch Loss = 0.377448, Accuracy = 0.9559999704360962
    PERFORMANCE ON TEST SET: Batch Loss = 0.6364706754684448, Accuracy = 0.9093632698059082
    Training iter #597000: Batch Loss = 0.390014, Accuracy = 0.949999988079071
    PERFORMANCE ON TEST SET: Batch Loss = 0.6393697261810303, Accuracy = 0.9126092195510864
    =================================================
    2.1600000000000023e-241
    1.0000000000000018e-242
    Training iter #597500: Batch Loss = 0.381392, Accuracy = 0.9527999758720398
    PERFORMANCE ON TEST SET: Batch Loss = 0.6373152732849121, Accuracy = 0.9108614325523376
    Training iter #598000: Batch Loss = 0.384295, Accuracy = 0.9491999745368958
    PERFORMANCE ON TEST SET: Batch Loss = 0.6473925709724426, Accuracy = 0.9056179523468018
    Training iter #598500: Batch Loss = 0.381961, Accuracy = 0.9584000110626221
    PERFORMANCE ON TEST SET: Batch Loss = 0.6370945572853088, Accuracy = 0.9113608002662659
    Training iter #599000: Batch Loss = 0.415651, Accuracy = 0.9296000003814697
    PERFORMANCE ON TEST SET: Batch Loss = 0.6449398994445801, Accuracy = 0.9041198492050171
    Training iter #599500: Batch Loss = 0.386179, Accuracy = 0.9488000273704529
    PERFORMANCE ON TEST SET: Batch Loss = 0.6366487741470337, Accuracy = 0.9108614325523376
    =================================================
    2.160000000000002e-242
    1.0000000000000018e-243
    Training iter #600000: Batch Loss = 0.373802, Accuracy = 0.9535999894142151
    PERFORMANCE ON TEST SET: Batch Loss = 0.6374542117118835, Accuracy = 0.9133583307266235
    Training iter #600500: Batch Loss = 0.400689, Accuracy = 0.9431999921798706
    PERFORMANCE ON TEST SET: Batch Loss = 0.6683777570724487, Accuracy = 0.8863919973373413
    Training iter #601000: Batch Loss = 0.497285, Accuracy = 0.8712000250816345
    PERFORMANCE ON TEST SET: Batch Loss = 0.7496175765991211, Accuracy = 0.8384519219398499
    Training iter #601500: Batch Loss = 0.383826, Accuracy = 0.9516000151634216
    PERFORMANCE ON TEST SET: Batch Loss = 0.6412690877914429, Accuracy = 0.90886390209198
    Training iter #602000: Batch Loss = 0.374582, Accuracy = 0.9571999907493591
    PERFORMANCE ON TEST SET: Batch Loss = 0.6361427903175354, Accuracy = 0.9121098518371582
    =================================================
    2.160000000000002e-243
    1.0000000000000019e-244
    Training iter #602500: Batch Loss = 0.390170, Accuracy = 0.9484000205993652
    PERFORMANCE ON TEST SET: Batch Loss = 0.6487638354301453, Accuracy = 0.9036204814910889
    Training iter #603000: Batch Loss = 0.379641, Accuracy = 0.9556000232696533
    PERFORMANCE ON TEST SET: Batch Loss = 0.6414204239845276, Accuracy = 0.91161048412323
    Training iter #603500: Batch Loss = 0.380204, Accuracy = 0.9556000232696533
    PERFORMANCE ON TEST SET: Batch Loss = 0.6377789974212646, Accuracy = 0.9121098518371582
    Training iter #604000: Batch Loss = 0.384800, Accuracy = 0.9544000029563904
    PERFORMANCE ON TEST SET: Batch Loss = 0.6450526714324951, Accuracy = 0.9078651666641235
    Training iter #604500: Batch Loss = 0.482987, Accuracy = 0.8916000127792358
    PERFORMANCE ON TEST SET: Batch Loss = 0.7752990126609802, Accuracy = 0.8292135000228882
    =================================================
    2.1600000000000022e-244
    1.000000000000002e-245
    Training iter #605000: Batch Loss = 0.384086, Accuracy = 0.9524000287055969
    PERFORMANCE ON TEST SET: Batch Loss = 0.6522767543792725, Accuracy = 0.9026217460632324
    Training iter #605500: Batch Loss = 0.377806, Accuracy = 0.9495999813079834
    PERFORMANCE ON TEST SET: Batch Loss = 0.6632285118103027, Accuracy = 0.8968788981437683
    Training iter #606000: Batch Loss = 0.387239, Accuracy = 0.949999988079071
    PERFORMANCE ON TEST SET: Batch Loss = 0.6377910375595093, Accuracy = 0.91435706615448
    Training iter #606500: Batch Loss = 0.386914, Accuracy = 0.9524000287055969
    PERFORMANCE ON TEST SET: Batch Loss = 0.6632099747657776, Accuracy = 0.8963795304298401
    Training iter #607000: Batch Loss = 0.469592, Accuracy = 0.8948000073432922
    PERFORMANCE ON TEST SET: Batch Loss = 0.6544469594955444, Accuracy = 0.8973782658576965
    =================================================
    2.1600000000000024e-245
    1.000000000000002e-246
    Training iter #607500: Batch Loss = 0.372278, Accuracy = 0.9592000246047974
    PERFORMANCE ON TEST SET: Batch Loss = 0.6400057077407837, Accuracy = 0.9068664312362671
    Training iter #608000: Batch Loss = 0.386589, Accuracy = 0.9524000287055969
    PERFORMANCE ON TEST SET: Batch Loss = 0.6361128091812134, Accuracy = 0.91161048412323
    Training iter #608500: Batch Loss = 0.374261, Accuracy = 0.9607999920845032
    PERFORMANCE ON TEST SET: Batch Loss = 0.6369953751564026, Accuracy = 0.91161048412323
    Training iter #609000: Batch Loss = 0.446621, Accuracy = 0.9092000126838684
    PERFORMANCE ON TEST SET: Batch Loss = 0.6696959733963013, Accuracy = 0.8886392116546631
    Training iter #609500: Batch Loss = 0.378135, Accuracy = 0.9592000246047974
    PERFORMANCE ON TEST SET: Batch Loss = 0.6424646973609924, Accuracy = 0.9111111164093018
    =================================================
    2.1600000000000024e-246
    1.000000000000002e-247
    Training iter #610000: Batch Loss = 0.446544, Accuracy = 0.9128000140190125
    PERFORMANCE ON TEST SET: Batch Loss = 0.6813413500785828, Accuracy = 0.8826466798782349
    Training iter #610500: Batch Loss = 0.430760, Accuracy = 0.921999990940094
    PERFORMANCE ON TEST SET: Batch Loss = 0.7805667519569397, Accuracy = 0.8219725489616394
    Training iter #611000: Batch Loss = 0.454446, Accuracy = 0.9128000140190125
    PERFORMANCE ON TEST SET: Batch Loss = 0.7493962049484253, Accuracy = 0.8397004008293152
    Training iter #611500: Batch Loss = 0.410898, Accuracy = 0.9319999814033508
    PERFORMANCE ON TEST SET: Batch Loss = 0.7439053058624268, Accuracy = 0.8426966071128845
    Training iter #612000: Batch Loss = 0.383018, Accuracy = 0.9563999772071838
    PERFORMANCE ON TEST SET: Batch Loss = 0.6428235769271851, Accuracy = 0.9126092195510864
    =================================================
    2.1600000000000023e-247
    1.000000000000002e-248
    Training iter #612500: Batch Loss = 0.396871, Accuracy = 0.9444000124931335
    PERFORMANCE ON TEST SET: Batch Loss = 0.660332441329956, Accuracy = 0.893383264541626
    Training iter #613000: Batch Loss = 0.408348, Accuracy = 0.9336000084877014
    PERFORMANCE ON TEST SET: Batch Loss = 0.7002063989639282, Accuracy = 0.8694132566452026
    Training iter #613500: Batch Loss = 0.384024, Accuracy = 0.951200008392334
    PERFORMANCE ON TEST SET: Batch Loss = 0.6352013349533081, Accuracy = 0.9123595356941223
    Training iter #614000: Batch Loss = 0.390124, Accuracy = 0.9435999989509583
    PERFORMANCE ON TEST SET: Batch Loss = 0.6527667045593262, Accuracy = 0.9026217460632324
    Training iter #614500: Batch Loss = 0.375748, Accuracy = 0.9595999717712402
    PERFORMANCE ON TEST SET: Batch Loss = 0.6403869390487671, Accuracy = 0.9093632698059082
    =================================================
    2.1600000000000024e-248
    1.0000000000000019e-249
    Training iter #615000: Batch Loss = 0.378820, Accuracy = 0.9599999785423279
    PERFORMANCE ON TEST SET: Batch Loss = 0.6412919163703918, Accuracy = 0.9113608002662659
    Training iter #615500: Batch Loss = 0.388362, Accuracy = 0.9527999758720398
    PERFORMANCE ON TEST SET: Batch Loss = 0.6440529823303223, Accuracy = 0.9103620648384094
    Training iter #616000: Batch Loss = 0.401150, Accuracy = 0.9404000043869019
    PERFORMANCE ON TEST SET: Batch Loss = 0.649939775466919, Accuracy = 0.9106117486953735
    Training iter #616500: Batch Loss = 0.377302, Accuracy = 0.9539999961853027
    PERFORMANCE ON TEST SET: Batch Loss = 0.6360843181610107, Accuracy = 0.9136080145835876
    Training iter #617000: Batch Loss = 0.436826, Accuracy = 0.91839998960495
    PERFORMANCE ON TEST SET: Batch Loss = 0.6890445947647095, Accuracy = 0.8659176230430603
    =================================================
    2.1600000000000025e-249
    1.0000000000000019e-250
    Training iter #617500: Batch Loss = 0.390153, Accuracy = 0.9535999894142151
    PERFORMANCE ON TEST SET: Batch Loss = 0.6457579731941223, Accuracy = 0.9093632698059082
    Training iter #618000: Batch Loss = 0.382094, Accuracy = 0.954800009727478
    PERFORMANCE ON TEST SET: Batch Loss = 0.636884868144989, Accuracy = 0.9126092195510864
    Training iter #618500: Batch Loss = 0.371051, Accuracy = 0.9592000246047974
    PERFORMANCE ON TEST SET: Batch Loss = 0.6359298229217529, Accuracy = 0.9106117486953735
    Training iter #619000: Batch Loss = 0.388723, Accuracy = 0.9484000205993652
    PERFORMANCE ON TEST SET: Batch Loss = 0.6403155326843262, Accuracy = 0.9071161150932312
    Training iter #619500: Batch Loss = 0.372057, Accuracy = 0.9624000191688538
    PERFORMANCE ON TEST SET: Batch Loss = 0.6394070982933044, Accuracy = 0.9083645343780518
    =================================================
    2.1600000000000026e-250
    1.000000000000002e-251
    Training iter #620000: Batch Loss = 0.427397, Accuracy = 0.9120000004768372
    PERFORMANCE ON TEST SET: Batch Loss = 0.6534247994422913, Accuracy = 0.9016229510307312
    Training iter #620500: Batch Loss = 0.484727, Accuracy = 0.8888000249862671
    PERFORMANCE ON TEST SET: Batch Loss = 0.6855679154396057, Accuracy = 0.8786516785621643
    Training iter #621000: Batch Loss = 0.460184, Accuracy = 0.9031999707221985
    PERFORMANCE ON TEST SET: Batch Loss = 0.7384328842163086, Accuracy = 0.8556804060935974
    Training iter #621500: Batch Loss = 0.556705, Accuracy = 0.848800003528595
    PERFORMANCE ON TEST SET: Batch Loss = 0.6825697422027588, Accuracy = 0.8801498413085938
    Training iter #622000: Batch Loss = 0.463595, Accuracy = 0.8948000073432922
    PERFORMANCE ON TEST SET: Batch Loss = 0.6763477325439453, Accuracy = 0.8808988928794861
    =================================================
    2.1600000000000026e-251
    1.000000000000002e-252
    Training iter #622500: Batch Loss = 0.411355, Accuracy = 0.9372000098228455
    PERFORMANCE ON TEST SET: Batch Loss = 0.7641897797584534, Accuracy = 0.8372035026550293
    Training iter #623000: Batch Loss = 0.384967, Accuracy = 0.9539999961853027
    PERFORMANCE ON TEST SET: Batch Loss = 0.6594037413597107, Accuracy = 0.8971285820007324
    Training iter #623500: Batch Loss = 0.383932, Accuracy = 0.9552000164985657
    PERFORMANCE ON TEST SET: Batch Loss = 0.648093581199646, Accuracy = 0.9043695330619812
    Training iter #624000: Batch Loss = 0.391019, Accuracy = 0.9452000260353088
    PERFORMANCE ON TEST SET: Batch Loss = 0.6496680378913879, Accuracy = 0.903870165348053
    Training iter #624500: Batch Loss = 0.432996, Accuracy = 0.9147999882698059
    PERFORMANCE ON TEST SET: Batch Loss = 0.6721012592315674, Accuracy = 0.8843945264816284
    =================================================
    2.1600000000000027e-252
    1.000000000000002e-253
    Training iter #625000: Batch Loss = 0.373324, Accuracy = 0.9595999717712402
    PERFORMANCE ON TEST SET: Batch Loss = 0.6381663680076599, Accuracy = 0.9126092195510864
    Training iter #625500: Batch Loss = 0.383002, Accuracy = 0.9508000016212463
    PERFORMANCE ON TEST SET: Batch Loss = 0.6796436905860901, Accuracy = 0.8838951587677002
    Training iter #626000: Batch Loss = 0.380576, Accuracy = 0.9584000110626221
    PERFORMANCE ON TEST SET: Batch Loss = 0.6609401106834412, Accuracy = 0.8968788981437683
    Training iter #626500: Batch Loss = 0.586722, Accuracy = 0.8435999751091003
    PERFORMANCE ON TEST SET: Batch Loss = 0.8181837797164917, Accuracy = 0.8174781799316406
    Training iter #627000: Batch Loss = 0.511388, Accuracy = 0.8640000224113464
    PERFORMANCE ON TEST SET: Batch Loss = 0.6851674318313599, Accuracy = 0.8761547803878784
    =================================================
    2.1600000000000028e-253
    1.000000000000002e-254
    Training iter #627500: Batch Loss = 0.391998, Accuracy = 0.9419999718666077
    PERFORMANCE ON TEST SET: Batch Loss = 0.6531106233596802, Accuracy = 0.9041198492050171
    Training iter #628000: Batch Loss = 0.396156, Accuracy = 0.9484000205993652
    PERFORMANCE ON TEST SET: Batch Loss = 0.6714800596237183, Accuracy = 0.8901373147964478
    Training iter #628500: Batch Loss = 0.387319, Accuracy = 0.9535999894142151
    PERFORMANCE ON TEST SET: Batch Loss = 0.640151858329773, Accuracy = 0.9101123809814453
    Training iter #629000: Batch Loss = 1.148534, Accuracy = 0.6815999746322632
    PERFORMANCE ON TEST SET: Batch Loss = 0.9899801015853882, Accuracy = 0.7637952566146851
    Training iter #629500: Batch Loss = 0.378291, Accuracy = 0.9535999894142151
    PERFORMANCE ON TEST SET: Batch Loss = 0.6647533178329468, Accuracy = 0.8941323161125183
    =================================================
    2.1600000000000028e-254
    1.000000000000002e-255
    Training iter #630000: Batch Loss = 0.441759, Accuracy = 0.9124000072479248
    PERFORMANCE ON TEST SET: Batch Loss = 0.6936434507369995, Accuracy = 0.8679150938987732
    Training iter #630500: Batch Loss = 0.526853, Accuracy = 0.8651999831199646
    PERFORMANCE ON TEST SET: Batch Loss = 0.6968178749084473, Accuracy = 0.8636704087257385
    Training iter #631000: Batch Loss = 0.380710, Accuracy = 0.9544000029563904
    PERFORMANCE ON TEST SET: Batch Loss = 0.657697319984436, Accuracy = 0.8936329483985901
    Training iter #631500: Batch Loss = 0.390817, Accuracy = 0.9431999921798706
    PERFORMANCE ON TEST SET: Batch Loss = 0.6407968401908875, Accuracy = 0.9103620648384094
    Training iter #632000: Batch Loss = 0.387207, Accuracy = 0.954800009727478
    PERFORMANCE ON TEST SET: Batch Loss = 0.6388687491416931, Accuracy = 0.9118601679801941
    =================================================
    2.160000000000003e-255
    1.000000000000002e-256
    Training iter #632500: Batch Loss = 0.383579, Accuracy = 0.9516000151634216
    PERFORMANCE ON TEST SET: Batch Loss = 0.6416239738464355, Accuracy = 0.9121098518371582
    Training iter #633000: Batch Loss = 0.416919, Accuracy = 0.9264000058174133
    PERFORMANCE ON TEST SET: Batch Loss = 0.7241137623786926, Accuracy = 0.8576778769493103
    Training iter #633500: Batch Loss = 0.392921, Accuracy = 0.9431999921798706
    PERFORMANCE ON TEST SET: Batch Loss = 0.6697278022766113, Accuracy = 0.8863919973373413
    Training iter #634000: Batch Loss = 0.390950, Accuracy = 0.9531999826431274
    PERFORMANCE ON TEST SET: Batch Loss = 0.6549289226531982, Accuracy = 0.9021223187446594
    Training iter #634500: Batch Loss = 0.603971, Accuracy = 0.8248000144958496
    PERFORMANCE ON TEST SET: Batch Loss = 0.8536560535430908, Accuracy = 0.7920100092887878
    =================================================
    2.1600000000000028e-256
    1.000000000000002e-257
    Training iter #635000: Batch Loss = 0.417630, Accuracy = 0.9240000247955322
    PERFORMANCE ON TEST SET: Batch Loss = 0.6384139060974121, Accuracy = 0.9058676362037659
    Training iter #635500: Batch Loss = 0.458407, Accuracy = 0.9088000059127808
    PERFORMANCE ON TEST SET: Batch Loss = 0.86726313829422, Accuracy = 0.7870162129402161
    Training iter #636000: Batch Loss = 0.517393, Accuracy = 0.8636000156402588
    PERFORMANCE ON TEST SET: Batch Loss = 0.6713129878044128, Accuracy = 0.8886392116546631
    Training iter #636500: Batch Loss = 0.407318, Accuracy = 0.9323999881744385
    PERFORMANCE ON TEST SET: Batch Loss = 0.7471979856491089, Accuracy = 0.8461922407150269
    Training iter #637000: Batch Loss = 0.397228, Accuracy = 0.942799985408783
    PERFORMANCE ON TEST SET: Batch Loss = 0.6590220928192139, Accuracy = 0.901123583316803
    =================================================
    2.160000000000003e-257
    1.000000000000002e-258
    Training iter #637500: Batch Loss = 0.385698, Accuracy = 0.9552000164985657
    PERFORMANCE ON TEST SET: Batch Loss = 0.639858603477478, Accuracy = 0.9138576984405518
    Training iter #638000: Batch Loss = 0.379510, Accuracy = 0.9580000042915344
    PERFORMANCE ON TEST SET: Batch Loss = 0.6410931348800659, Accuracy = 0.9126092195510864
    Training iter #638500: Batch Loss = 0.377431, Accuracy = 0.9520000219345093
    PERFORMANCE ON TEST SET: Batch Loss = 0.6410843729972839, Accuracy = 0.91435706615448
    Training iter #639000: Batch Loss = 0.415205, Accuracy = 0.9300000071525574
    PERFORMANCE ON TEST SET: Batch Loss = 0.6493633389472961, Accuracy = 0.9051185846328735
    Training iter #639500: Batch Loss = 0.402948, Accuracy = 0.9459999799728394
    PERFORMANCE ON TEST SET: Batch Loss = 0.6446851491928101, Accuracy = 0.9078651666641235
    =================================================
    2.160000000000003e-258
    1.0000000000000021e-259
    Training iter #640000: Batch Loss = 0.390731, Accuracy = 0.9484000205993652
    PERFORMANCE ON TEST SET: Batch Loss = 0.6428706049919128, Accuracy = 0.9073657989501953
    Training iter #640500: Batch Loss = 0.407872, Accuracy = 0.9340000152587891
    PERFORMANCE ON TEST SET: Batch Loss = 0.7069318294525146, Accuracy = 0.8694132566452026
    Training iter #641000: Batch Loss = 0.446793, Accuracy = 0.9067999720573425
    PERFORMANCE ON TEST SET: Batch Loss = 0.6640342473983765, Accuracy = 0.8886392116546631
    Training iter #641500: Batch Loss = 0.403984, Accuracy = 0.9351999759674072
    PERFORMANCE ON TEST SET: Batch Loss = 0.6641814112663269, Accuracy = 0.8938826322555542
    Training iter #642000: Batch Loss = 0.457514, Accuracy = 0.906000018119812
    PERFORMANCE ON TEST SET: Batch Loss = 0.6572416424751282, Accuracy = 0.8973782658576965
    =================================================
    2.1600000000000032e-259
    1.0000000000000021e-260
    Training iter #642500: Batch Loss = 0.383029, Accuracy = 0.9520000219345093
    PERFORMANCE ON TEST SET: Batch Loss = 0.6502758264541626, Accuracy = 0.9081148505210876
    Training iter #643000: Batch Loss = 0.386337, Accuracy = 0.9575999975204468
    PERFORMANCE ON TEST SET: Batch Loss = 0.6437661051750183, Accuracy = 0.9126092195510864
    Training iter #643500: Batch Loss = 0.477108, Accuracy = 0.8899999856948853
    PERFORMANCE ON TEST SET: Batch Loss = 0.6680503487586975, Accuracy = 0.898377001285553
    Training iter #644000: Batch Loss = 0.380215, Accuracy = 0.9527999758720398
    PERFORMANCE ON TEST SET: Batch Loss = 0.6367723941802979, Accuracy = 0.9138576984405518
    Training iter #644500: Batch Loss = 0.382677, Accuracy = 0.954800009727478
    PERFORMANCE ON TEST SET: Batch Loss = 0.6449313163757324, Accuracy = 0.90886390209198
    =================================================
    2.160000000000003e-260
    1.0000000000000021e-261
    Training iter #645000: Batch Loss = 0.387801, Accuracy = 0.9531999826431274
    PERFORMANCE ON TEST SET: Batch Loss = 0.637798011302948, Accuracy = 0.9148564338684082
    Training iter #645500: Batch Loss = 0.409048, Accuracy = 0.9336000084877014
    PERFORMANCE ON TEST SET: Batch Loss = 0.6649718880653381, Accuracy = 0.8981273174285889
    Training iter #646000: Batch Loss = 0.521706, Accuracy = 0.8632000088691711
    PERFORMANCE ON TEST SET: Batch Loss = 0.6575288772583008, Accuracy = 0.8898876309394836
    Training iter #646500: Batch Loss = 0.457836, Accuracy = 0.896399974822998
    PERFORMANCE ON TEST SET: Batch Loss = 0.6804815530776978, Accuracy = 0.8791510462760925
    Training iter #647000: Batch Loss = 0.411477, Accuracy = 0.9315999746322632
    PERFORMANCE ON TEST SET: Batch Loss = 0.6559683084487915, Accuracy = 0.9023720622062683
    =================================================
    2.1600000000000032e-261
    1.0000000000000021e-262
    Training iter #647500: Batch Loss = 0.593344, Accuracy = 0.8348000049591064
    PERFORMANCE ON TEST SET: Batch Loss = 0.776218593120575, Accuracy = 0.82871413230896
    Training iter #648000: Batch Loss = 0.376032, Accuracy = 0.9559999704360962
    PERFORMANCE ON TEST SET: Batch Loss = 0.6436935067176819, Accuracy = 0.9103620648384094
    Training iter #648500: Batch Loss = 0.388266, Accuracy = 0.9535999894142151
    PERFORMANCE ON TEST SET: Batch Loss = 0.6437325477600098, Accuracy = 0.9106117486953735
    Training iter #649000: Batch Loss = 0.383492, Accuracy = 0.9520000219345093
    PERFORMANCE ON TEST SET: Batch Loss = 0.6507254242897034, Accuracy = 0.9076154828071594
    Training iter #649500: Batch Loss = 0.380448, Accuracy = 0.9539999961853027
    PERFORMANCE ON TEST SET: Batch Loss = 0.638702392578125, Accuracy = 0.9133583307266235
    =================================================
    2.160000000000003e-262
    1.0000000000000021e-263
    Training iter #650000: Batch Loss = 0.371377, Accuracy = 0.9628000259399414
    PERFORMANCE ON TEST SET: Batch Loss = 0.6387483477592468, Accuracy = 0.9151061177253723
    Training iter #650500: Batch Loss = 0.413850, Accuracy = 0.9359999895095825
    PERFORMANCE ON TEST SET: Batch Loss = 0.6453995704650879, Accuracy = 0.9093632698059082
    Training iter #651000: Batch Loss = 0.389058, Accuracy = 0.9531999826431274
    PERFORMANCE ON TEST SET: Batch Loss = 0.6494946479797363, Accuracy = 0.9053682684898376
    Training iter #651500: Batch Loss = 0.398458, Accuracy = 0.9391999840736389
    PERFORMANCE ON TEST SET: Batch Loss = 0.6720257997512817, Accuracy = 0.8823969960212708
    Training iter #652000: Batch Loss = 0.380706, Accuracy = 0.9559999704360962
    PERFORMANCE ON TEST SET: Batch Loss = 0.6380515098571777, Accuracy = 0.9101123809814453
    =================================================
    2.160000000000003e-263
    1.0000000000000022e-264
    Training iter #652500: Batch Loss = 0.382255, Accuracy = 0.9520000219345093
    PERFORMANCE ON TEST SET: Batch Loss = 0.6452522277832031, Accuracy = 0.91161048412323
    Training iter #653000: Batch Loss = 0.380037, Accuracy = 0.954800009727478
    PERFORMANCE ON TEST SET: Batch Loss = 0.6567201614379883, Accuracy = 0.9031211137771606
    Training iter #653500: Batch Loss = 0.368683, Accuracy = 0.9643999934196472
    PERFORMANCE ON TEST SET: Batch Loss = 0.6441012024879456, Accuracy = 0.91161048412323
    Training iter #654000: Batch Loss = 0.410023, Accuracy = 0.9419999718666077
    PERFORMANCE ON TEST SET: Batch Loss = 0.6838744878768921, Accuracy = 0.8761547803878784
    Training iter #654500: Batch Loss = 0.487250, Accuracy = 0.8827999830245972
    PERFORMANCE ON TEST SET: Batch Loss = 0.6741544008255005, Accuracy = 0.8843945264816284
    =================================================
    2.160000000000003e-264
    1.0000000000000022e-265
    Training iter #655000: Batch Loss = 0.453925, Accuracy = 0.9088000059127808
    PERFORMANCE ON TEST SET: Batch Loss = 0.7744479179382324, Accuracy = 0.8244693875312805
    Training iter #655500: Batch Loss = 0.372374, Accuracy = 0.9616000056266785
    PERFORMANCE ON TEST SET: Batch Loss = 0.6464146375656128, Accuracy = 0.9068664312362671
    Training iter #656000: Batch Loss = 0.390462, Accuracy = 0.9527999758720398
    PERFORMANCE ON TEST SET: Batch Loss = 0.6569157242774963, Accuracy = 0.898377001285553
    Training iter #656500: Batch Loss = 0.409458, Accuracy = 0.9327999949455261
    PERFORMANCE ON TEST SET: Batch Loss = 0.6519544124603271, Accuracy = 0.9006242156028748
    Training iter #657000: Batch Loss = 0.488084, Accuracy = 0.8820000290870667
    PERFORMANCE ON TEST SET: Batch Loss = 0.6697789430618286, Accuracy = 0.8913857936859131
    =================================================
    2.1600000000000033e-265
    1.0000000000000022e-266
    Training iter #657500: Batch Loss = 0.382888, Accuracy = 0.9508000016212463
    PERFORMANCE ON TEST SET: Batch Loss = 0.6585962176322937, Accuracy = 0.8973782658576965
    Training iter #658000: Batch Loss = 0.460082, Accuracy = 0.9016000032424927
    PERFORMANCE ON TEST SET: Batch Loss = 0.735458254814148, Accuracy = 0.8524344563484192
    Training iter #658500: Batch Loss = 0.414288, Accuracy = 0.9340000152587891
    PERFORMANCE ON TEST SET: Batch Loss = 0.7327561974525452, Accuracy = 0.8459425568580627
    Training iter #659000: Batch Loss = 0.459263, Accuracy = 0.8999999761581421
    PERFORMANCE ON TEST SET: Batch Loss = 0.6532295346260071, Accuracy = 0.8943819999694824
    Training iter #659500: Batch Loss = 0.543260, Accuracy = 0.8600000143051147
    PERFORMANCE ON TEST SET: Batch Loss = 0.7520738244056702, Accuracy = 0.849188506603241
    =================================================
    2.1600000000000033e-266
    1.0000000000000021e-267
    Training iter #660000: Batch Loss = 0.415069, Accuracy = 0.9312000274658203
    PERFORMANCE ON TEST SET: Batch Loss = 0.6548365950584412, Accuracy = 0.9026217460632324
    Training iter #660500: Batch Loss = 0.378388, Accuracy = 0.9567999839782715
    PERFORMANCE ON TEST SET: Batch Loss = 0.6414926648139954, Accuracy = 0.9148564338684082
    Training iter #661000: Batch Loss = 0.370347, Accuracy = 0.9628000259399414
    PERFORMANCE ON TEST SET: Batch Loss = 0.6429638862609863, Accuracy = 0.9126092195510864
    Training iter #661500: Batch Loss = 0.394193, Accuracy = 0.9520000219345093
    PERFORMANCE ON TEST SET: Batch Loss = 0.6507112979888916, Accuracy = 0.9061173796653748
    Training iter #662000: Batch Loss = 0.378146, Accuracy = 0.9575999975204468
    PERFORMANCE ON TEST SET: Batch Loss = 0.6409437656402588, Accuracy = 0.9133583307266235
    =================================================
    2.1600000000000035e-267
    1.0000000000000021e-268
    Training iter #662500: Batch Loss = 0.379849, Accuracy = 0.9516000151634216
    PERFORMANCE ON TEST SET: Batch Loss = 0.6724441647529602, Accuracy = 0.8878901600837708
    Training iter #663000: Batch Loss = 0.375936, Accuracy = 0.9588000178337097
    PERFORMANCE ON TEST SET: Batch Loss = 0.6391922235488892, Accuracy = 0.9136080145835876
    Training iter #663500: Batch Loss = 0.383725, Accuracy = 0.9503999948501587
    PERFORMANCE ON TEST SET: Batch Loss = 0.6437073349952698, Accuracy = 0.9131085872650146
    Training iter #664000: Batch Loss = 0.380214, Accuracy = 0.9556000232696533
    PERFORMANCE ON TEST SET: Batch Loss = 0.649381160736084, Accuracy = 0.9126092195510864
    Training iter #664500: Batch Loss = 0.431418, Accuracy = 0.9251999855041504
    PERFORMANCE ON TEST SET: Batch Loss = 0.8431695699691772, Accuracy = 0.7982521653175354
    =================================================
    2.1600000000000036e-268
    1.0000000000000021e-269
    Training iter #665000: Batch Loss = 0.385605, Accuracy = 0.9556000232696533
    PERFORMANCE ON TEST SET: Batch Loss = 0.6449909806251526, Accuracy = 0.9103620648384094
    Training iter #665500: Batch Loss = 0.380207, Accuracy = 0.9575999975204468
    PERFORMANCE ON TEST SET: Batch Loss = 0.6410624384880066, Accuracy = 0.9131085872650146
    Training iter #666000: Batch Loss = 0.379355, Accuracy = 0.954800009727478
    PERFORMANCE ON TEST SET: Batch Loss = 0.6405993700027466, Accuracy = 0.9151061177253723
    Training iter #666500: Batch Loss = 0.370928, Accuracy = 0.9616000056266785
    PERFORMANCE ON TEST SET: Batch Loss = 0.6426202058792114, Accuracy = 0.9128589034080505
    Training iter #667000: Batch Loss = 0.389853, Accuracy = 0.9524000287055969
    PERFORMANCE ON TEST SET: Batch Loss = 0.6417115330696106, Accuracy = 0.9126092195510864
    =================================================
    2.1600000000000038e-269
    1.0000000000000021e-270
    Training iter #667500: Batch Loss = 0.377069, Accuracy = 0.9611999988555908
    PERFORMANCE ON TEST SET: Batch Loss = 0.6418988704681396, Accuracy = 0.9118601679801941
    Training iter #668000: Batch Loss = 0.369533, Accuracy = 0.9603999853134155
    PERFORMANCE ON TEST SET: Batch Loss = 0.6395241618156433, Accuracy = 0.9113608002662659
    Training iter #668500: Batch Loss = 0.373165, Accuracy = 0.9607999920845032
    PERFORMANCE ON TEST SET: Batch Loss = 0.637636125087738, Accuracy = 0.9128589034080505
    Training iter #669000: Batch Loss = 0.377663, Accuracy = 0.9588000178337097
    PERFORMANCE ON TEST SET: Batch Loss = 0.634769082069397, Accuracy = 0.91161048412323
    Training iter #669500: Batch Loss = 0.511281, Accuracy = 0.876800000667572
    PERFORMANCE ON TEST SET: Batch Loss = 0.7657918334007263, Accuracy = 0.8249688148498535
    =================================================
    2.1600000000000037e-270
    1.0000000000000022e-271
    Training iter #670000: Batch Loss = 0.400299, Accuracy = 0.9308000206947327
    PERFORMANCE ON TEST SET: Batch Loss = 0.6661520600318909, Accuracy = 0.8981273174285889
    Training iter #670500: Batch Loss = 0.421414, Accuracy = 0.9332000017166138
    PERFORMANCE ON TEST SET: Batch Loss = 0.6674116253852844, Accuracy = 0.8943819999694824
    Training iter #671000: Batch Loss = 0.717405, Accuracy = 0.782800018787384
    PERFORMANCE ON TEST SET: Batch Loss = 0.7464597225189209, Accuracy = 0.8421972393989563
    Training iter #671500: Batch Loss = 0.384867, Accuracy = 0.9520000219345093
    PERFORMANCE ON TEST SET: Batch Loss = 0.6454986929893494, Accuracy = 0.9091135859489441
    Training iter #672000: Batch Loss = 0.370748, Accuracy = 0.9567999839782715
    PERFORMANCE ON TEST SET: Batch Loss = 0.638965368270874, Accuracy = 0.91435706615448
    =================================================
    2.1600000000000036e-271
    1.0000000000000022e-272
    Training iter #672500: Batch Loss = 0.384968, Accuracy = 0.9571999907493591
    PERFORMANCE ON TEST SET: Batch Loss = 0.6398500800132751, Accuracy = 0.9136080145835876
    Training iter #673000: Batch Loss = 0.376548, Accuracy = 0.9575999975204468
    PERFORMANCE ON TEST SET: Batch Loss = 0.6366418600082397, Accuracy = 0.9141073822975159
    Training iter #673500: Batch Loss = 0.372327, Accuracy = 0.9584000110626221
    PERFORMANCE ON TEST SET: Batch Loss = 0.6512470245361328, Accuracy = 0.9041198492050171
    Training iter #674000: Batch Loss = 0.432205, Accuracy = 0.9164000153541565
    PERFORMANCE ON TEST SET: Batch Loss = 0.7603892087936401, Accuracy = 0.8329588174819946
    Training iter #674500: Batch Loss = 0.418310, Accuracy = 0.9272000193595886
    PERFORMANCE ON TEST SET: Batch Loss = 0.6524858474731445, Accuracy = 0.9056179523468018
    =================================================
    2.1600000000000035e-272
    1.0000000000000021e-273
    Training iter #675000: Batch Loss = 0.393046, Accuracy = 0.946399986743927
    PERFORMANCE ON TEST SET: Batch Loss = 0.6917651295661926, Accuracy = 0.872409462928772
    Training iter #675500: Batch Loss = 0.560906, Accuracy = 0.8515999913215637
    PERFORMANCE ON TEST SET: Batch Loss = 0.8288483023643494, Accuracy = 0.8122346997261047
    Training iter #676000: Batch Loss = 0.382587, Accuracy = 0.9616000056266785
    PERFORMANCE ON TEST SET: Batch Loss = 0.645340621471405, Accuracy = 0.9146067500114441
    Training iter #676500: Batch Loss = 0.377835, Accuracy = 0.9588000178337097
    PERFORMANCE ON TEST SET: Batch Loss = 0.6359864473342896, Accuracy = 0.9153558015823364
    Training iter #677000: Batch Loss = 0.377256, Accuracy = 0.9539999961853027
    PERFORMANCE ON TEST SET: Batch Loss = 0.6396780014038086, Accuracy = 0.9151061177253723
    =================================================
    2.1600000000000035e-273
    1.0000000000000021e-274
    Training iter #677500: Batch Loss = 0.383553, Accuracy = 0.9535999894142151
    PERFORMANCE ON TEST SET: Batch Loss = 0.6663534045219421, Accuracy = 0.8966292142868042
    Training iter #678000: Batch Loss = 0.425844, Accuracy = 0.9296000003814697
    PERFORMANCE ON TEST SET: Batch Loss = 0.6614137887954712, Accuracy = 0.9001248478889465
    Training iter #678500: Batch Loss = 0.387313, Accuracy = 0.9495999813079834
    PERFORMANCE ON TEST SET: Batch Loss = 0.6651899814605713, Accuracy = 0.8938826322555542
    Training iter #679000: Batch Loss = 0.417393, Accuracy = 0.9200000166893005
    PERFORMANCE ON TEST SET: Batch Loss = 0.6450942754745483, Accuracy = 0.903870165348053
    Training iter #679500: Batch Loss = 0.398291, Accuracy = 0.9399999976158142
    PERFORMANCE ON TEST SET: Batch Loss = 0.6474828124046326, Accuracy = 0.9101123809814453
    =================================================
    2.1600000000000035e-274
    1.0000000000000022e-275
    Training iter #680000: Batch Loss = 0.380514, Accuracy = 0.9556000232696533
    PERFORMANCE ON TEST SET: Batch Loss = 0.64075767993927, Accuracy = 0.91435706615448
    Training iter #680500: Batch Loss = 0.373562, Accuracy = 0.9616000056266785
    PERFORMANCE ON TEST SET: Batch Loss = 0.6406874060630798, Accuracy = 0.9131085872650146
    Training iter #681000: Batch Loss = 0.372629, Accuracy = 0.9580000042915344
    PERFORMANCE ON TEST SET: Batch Loss = 0.6439042091369629, Accuracy = 0.9146067500114441
    Training iter #681500: Batch Loss = 0.385869, Accuracy = 0.9575999975204468
    PERFORMANCE ON TEST SET: Batch Loss = 0.6491464376449585, Accuracy = 0.90886390209198
    Training iter #682000: Batch Loss = 0.389434, Accuracy = 0.9484000205993652
    PERFORMANCE ON TEST SET: Batch Loss = 0.6379757523536682, Accuracy = 0.91435706615448
    =================================================
    2.1600000000000034e-275
    1.000000000000002e-276
    Training iter #682500: Batch Loss = 0.402565, Accuracy = 0.9404000043869019
    PERFORMANCE ON TEST SET: Batch Loss = 0.720383882522583, Accuracy = 0.8529338240623474
    Training iter #683000: Batch Loss = 0.377194, Accuracy = 0.9503999948501587
    PERFORMANCE ON TEST SET: Batch Loss = 0.6456549167633057, Accuracy = 0.9096130132675171
    Training iter #683500: Batch Loss = 0.385621, Accuracy = 0.9544000029563904
    PERFORMANCE ON TEST SET: Batch Loss = 0.6488852500915527, Accuracy = 0.9096130132675171
    Training iter #684000: Batch Loss = 0.379154, Accuracy = 0.9544000029563904
    PERFORMANCE ON TEST SET: Batch Loss = 0.6453949213027954, Accuracy = 0.9091135859489441
    Training iter #684500: Batch Loss = 0.467259, Accuracy = 0.8952000141143799
    PERFORMANCE ON TEST SET: Batch Loss = 0.8817577362060547, Accuracy = 0.7820224761962891
    =================================================
    2.1600000000000034e-276
    1.0000000000000021e-277
    Training iter #685000: Batch Loss = 0.384356, Accuracy = 0.9527999758720398
    PERFORMANCE ON TEST SET: Batch Loss = 0.6850208640098572, Accuracy = 0.882896363735199
    Training iter #685500: Batch Loss = 0.518915, Accuracy = 0.8636000156402588
    PERFORMANCE ON TEST SET: Batch Loss = 0.7272800803184509, Accuracy = 0.8581773042678833
    Training iter #686000: Batch Loss = 0.534277, Accuracy = 0.8640000224113464
    PERFORMANCE ON TEST SET: Batch Loss = 0.6978127360343933, Accuracy = 0.8833957314491272
    Training iter #686500: Batch Loss = 0.395918, Accuracy = 0.9435999989509583
    PERFORMANCE ON TEST SET: Batch Loss = 0.6914960145950317, Accuracy = 0.8776529431343079
    Training iter #687000: Batch Loss = 0.456638, Accuracy = 0.9052000045776367
    PERFORMANCE ON TEST SET: Batch Loss = 0.8232874274253845, Accuracy = 0.8132334351539612
    =================================================
    2.1600000000000034e-277
    1.0000000000000021e-278
    Training iter #687500: Batch Loss = 0.446959, Accuracy = 0.906000018119812
    PERFORMANCE ON TEST SET: Batch Loss = 0.6474595665931702, Accuracy = 0.9091135859489441
    Training iter #688000: Batch Loss = 0.390266, Accuracy = 0.9476000070571899
    PERFORMANCE ON TEST SET: Batch Loss = 0.6544313430786133, Accuracy = 0.9041198492050171
    Training iter #688500: Batch Loss = 0.373149, Accuracy = 0.9575999975204468
    PERFORMANCE ON TEST SET: Batch Loss = 0.6399070620536804, Accuracy = 0.9126092195510864
    Training iter #689000: Batch Loss = 0.386711, Accuracy = 0.9516000151634216
    PERFORMANCE ON TEST SET: Batch Loss = 0.6451526284217834, Accuracy = 0.91161048412323
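To turn a log like the one above into a curve (e.g. test accuracy vs. iteration) you can parse the paired `Training iter #N` / `PERFORMANCE ON TEST SET` lines. This is a minimal sketch assuming the exact line format printed above; the `LOG` sample is copied from the tail of the log, and `parse_test_accuracies` is a hypothetical helper, not part of the original training script:

```python
import re

# Sample lines copied verbatim from the end of the training log above.
LOG = """\
Training iter #689000: Batch Loss = 0.386711, Accuracy = 0.9516000151634216
PERFORMANCE ON TEST SET: Batch Loss = 0.6451526284217834, Accuracy = 0.91161048412323
"""

def parse_test_accuracies(text):
    """Return (iteration, test_accuracy) pairs from the log text.

    Assumes each 'PERFORMANCE ON TEST SET' line follows the
    'Training iter #N' line it belongs to, as in the log above.
    """
    pairs = []
    current_iter = None
    for line in text.splitlines():
        m = re.match(r"Training iter #(\d+):", line.strip())
        if m:
            current_iter = int(m.group(1))
            continue
        m = re.match(r"PERFORMANCE ON TEST SET:.*Accuracy = ([\d.]+)", line.strip())
        if m and current_iter is not None:
            pairs.append((current_iter, float(m.group(1))))
    return pairs

print(parse_test_accuracies(LOG))  # → [(689000, 0.91161048412323)]
```

The resulting pairs can be fed straight into any plotting library to visualize how test accuracy plateaus around 0.90–0.91 in the later iterations.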

  • Original post: https://www.cnblogs.com/herd/p/10738389.html