Training vs. test-set metrics, one row per logged iteration (loss and accuracy rounded to four decimals; the first test evaluation belongs to an iteration logged before this excerpt):

Iteration    Train loss   Train acc   Test loss   Test acc
(earlier)             -           -      0.6424     0.9051
584292           0.3570      0.9660      0.6445     0.9026
584296           0.3720      0.9516      0.6355     0.9136
584300           0.3798      0.9496      0.6288     0.9159
584304           0.3648      0.9644      0.6305     0.9111
584308           0.3625      0.9664      0.6334     0.9141
584312           0.3670      0.9608      0.6294     0.9159
584316           0.3587      0.9696      0.6266     0.9186
584320           0.3643      0.9652      0.6319     0.9181
584324           0.3626      0.9644      0.6292     0.9161
584328           0.3541      0.9672      0.6257     0.9181
584332           0.3585      0.9692      0.6303     0.9199
584336           0.3529      0.9652      0.6302     0.9161
584340           0.3524      0.9700      0.6252     0.9186
584344           0.3510      0.9668      0.6341     0.9141
584348           0.3517      0.9652      0.6320     0.9174
584352           0.3499      0.9680      0.6263     0.9184
584356           0.3520      0.9716      0.6315     0.9159
584360           0.3570      0.9656      0.6329     0.9164
584364           0.3609      0.9644      0.6280     0.9171
584368           0.3514      0.9692      0.6339     0.9144
584372           0.3546      0.9652      0.6331     0.9151
584376           0.3670      0.9596      0.6286     0.9184
584380           0.3487      0.9728      0.6320     0.9166
584384           0.3587      0.9624      0.6281     0.9208
584388           0.3586      0.9644      0.6257     0.9186
584392           0.3471      0.9724      0.6252     0.9194
584396           0.3549      0.9660      0.6278     0.9201
584400           0.3601      0.9656      0.6273     0.9206
584404           0.3510      0.9692      0.6248     0.9176
584408           0.3578      0.9672      0.6299     0.9218
584412           0.3641      0.9644      0.6254     0.9211
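For downstream analysis (plotting accuracy curves, finding the best checkpoint), raw log lines in this format can be parsed back into numeric series. A minimal sketch, assuming only that each line matches the exact `Training iter #...` / `PERFORMANCE ON TEST SET: ...` format shown in the embedded sample:

```python
import re

# Two train/test pairs copied verbatim from the training log.
LOG = """\
Training iter #584292: Batch Loss = 0.357018, Accuracy = 0.9660000205039978
PERFORMANCE ON TEST SET: Batch Loss = 0.6445194482803345, Accuracy = 0.9026217460632324
Training iter #584296: Batch Loss = 0.371959, Accuracy = 0.9516000151634216
PERFORMANCE ON TEST SET: Batch Loss = 0.6355495452880859, Accuracy = 0.9136080145835876
"""

TRAIN_RE = re.compile(r"Training iter #(\d+): Batch Loss = ([\d.]+), Accuracy = ([\d.]+)")
TEST_RE = re.compile(r"PERFORMANCE ON TEST SET: Batch Loss = ([\d.]+), Accuracy = ([\d.]+)")

train = []  # (iteration, loss, accuracy) per training batch
test = []   # (loss, accuracy) per test-set evaluation
for line in LOG.splitlines():
    m = TRAIN_RE.match(line)
    if m:
        train.append((int(m.group(1)), float(m.group(2)), float(m.group(3))))
        continue
    m = TEST_RE.match(line)
    if m:
        test.append((float(m.group(1)), float(m.group(2))))

# Pick the evaluation with the highest test accuracy.
best_loss, best_acc = max(test, key=lambda pair: pair[1])
print(f"best test accuracy: {best_acc:.4f} (loss {best_loss:.4f})")
```

The same two regular expressions work on the full log file (e.g. iterating over `open("train.log")` instead of `LOG.splitlines()`); the `test` series can then be fed directly to a plotting library to visualize the train/test gap visible in these numbers.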