  • [Machine Learning] Keras: MNIST handwritten digit recognition (the "Hello World" of deep learning)

    The "Hello World" of deep learning: MNIST handwritten digit recognition
    Learned from (once again, Hung-yi Lee's Machine Learning course): http://speech.ee.ntu.edu.tw/~tlkagk/courses_ML17_2.html

    import numpy as np 
    from keras.models import Sequential
    from keras.layers.core import Dense, Dropout, Activation
    from keras.layers import Conv2D, MaxPooling2D, Flatten
    from keras.optimizers import SGD, Adam
    from keras.utils import np_utils
    from keras.datasets import mnist
    
    # Load and preprocess the MNIST data
    def load_data():
        (x_train, y_train), (x_test, y_test) = mnist.load_data()
        number = 10000                               # keep only the first 10,000 training samples for a quick run
        x_train = x_train[0:number]
        y_train = y_train[0:number]
        x_train = x_train.reshape(number, 28*28)     # flatten each 28x28 image into a 784-dim vector
        x_test = x_test.reshape(x_test.shape[0], 28*28)
        x_train = x_train.astype('float32')
        x_test = x_test.astype('float32')
        # keras.utils.to_categorical(y, num_classes=None, dtype='float32')
        # converts integer labels y into one-hot numpy arrays; num_classes is the total number of classes
        y_train = np_utils.to_categorical(y_train, 10)
        y_test = np_utils.to_categorical(y_test, 10)
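        # For illustration (example values, not part of the original script):
        # np_utils.to_categorical([3], 10) returns [[0., 0., 0., 1., 0., 0., 0., 0., 0., 0.]]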
    
        x_train = x_train / 255    # scale pixel values from [0, 255] to [0, 1]
        x_test = x_test / 255
        return (x_train, y_train), (x_test, y_test)
    
    
    (x_train, y_train), (x_test, y_test) = load_data()
    
    # Build the network: two hidden ReLU layers and a 10-way softmax output
    model = Sequential()
    model.add(Dense(input_dim=28*28, units=689, activation='relu'))
    model.add(Dense(units=689, activation='relu'))
    model.add(Dense(units=10, activation='softmax'))
    model.compile(loss='categorical_crossentropy', optimizer=SGD(lr=0.1), metrics=['accuracy'])
    # Start training
    model.fit(x_train, y_train, batch_size=50, epochs=20)
    # Check how the trained model performs on the test data
    result = model.evaluate(x_test, y_test)
    print('Test Acc:', result[1])
    

    The amount of training here is small, so there is no need to switch to a GPU. Results:

    Using TensorFlow backend.
    Epoch 1/20
    10000/10000 [==============================] - 2s 166us/step - loss: 0.5926 - acc: 0.8301
    Epoch 2/20
    10000/10000 [==============================] - 1s 107us/step - loss: 0.2686 - acc: 0.9234
    Epoch 3/20
    10000/10000 [==============================] - 1s 107us/step - loss: 0.2015 - acc: 0.9407
    Epoch 4/20
    10000/10000 [==============================] - 1s 106us/step - loss: 0.1546 - acc: 0.9568
    Epoch 5/20
    10000/10000 [==============================] - 1s 107us/step - loss: 0.1238 - acc: 0.9645
    Epoch 6/20
    10000/10000 [==============================] - 1s 106us/step - loss: 0.0977 - acc: 0.9744
    Epoch 7/20
    10000/10000 [==============================] - 1s 107us/step - loss: 0.0808 - acc: 0.9790
    Epoch 8/20
    10000/10000 [==============================] - 1s 107us/step - loss: 0.0638 - acc: 0.9838
    Epoch 9/20
    10000/10000 [==============================] - 1s 107us/step - loss: 0.0514 - acc: 0.9875
    Epoch 10/20
    10000/10000 [==============================] - 1s 107us/step - loss: 0.0424 - acc: 0.9905
    Epoch 11/20
    10000/10000 [==============================] - 1s 107us/step - loss: 0.0331 - acc: 0.9936
    Epoch 12/20
    10000/10000 [==============================] - 1s 108us/step - loss: 0.0267 - acc: 0.9960
    Epoch 13/20
    10000/10000 [==============================] - 1s 107us/step - loss: 0.0209 - acc: 0.9972
    Epoch 14/20
    10000/10000 [==============================] - 1s 107us/step - loss: 0.0177 - acc: 0.9977
    Epoch 15/20
    10000/10000 [==============================] - 1s 107us/step - loss: 0.0144 - acc: 0.9986
    Epoch 16/20
    10000/10000 [==============================] - 1s 106us/step - loss: 0.0115 - acc: 0.9993
    Epoch 17/20
    10000/10000 [==============================] - 1s 107us/step - loss: 0.0102 - acc: 0.9994
    Epoch 18/20
    10000/10000 [==============================] - 1s 106us/step - loss: 0.0084 - acc: 0.9997
    Epoch 19/20
    10000/10000 [==============================] - 1s 106us/step - loss: 0.0071 - acc: 0.9998
    Epoch 20/20
    10000/10000 [==============================] - 1s 107us/step - loss: 0.0064 - acc: 0.9999
    10000/10000 [==============================] - 0s 38us/step
    Test Acc: 0.9573
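
    As a quick sanity check (not part of the original script), the trained model can also classify individual test images. A minimal sketch, assuming `model`, `x_test` and `y_test` from the script above are still in scope:

    # Predict class probabilities for the first five test images
    probs = model.predict(x_test[:5])              # shape: (5, 10)
    pred_labels = np.argmax(probs, axis=1)         # most probable digit for each image
    true_labels = np.argmax(y_test[:5], axis=1)    # undo the one-hot encoding
    print('Predicted:', pred_labels)
    print('True:     ', true_labels)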
    