Building a Neural Network with Keras, Simple Template (Part 2): Classifier (Handwritten Digit Recognition)



# -*- coding: utf-8 -*-
import numpy as np
np.random.seed(1337)  # for reproducibility
from keras.datasets import mnist
from keras.utils import np_utils
from keras.models import Sequential  # build the model layer by layer
from keras.layers import Dense, Activation  # fully connected layer
import matplotlib.pyplot as plt
from keras.optimizers import RMSprop

Download the MNIST handwritten-digit dataset. Each image is 28*28; the pixel values (0 to 255) are rescaled to [0, 1], and each label y is converted to a one-hot vector of length 10: for example, a label of 1 becomes a vector with a 1 in position 1 and 0 everywhere else.

# download MNIST to '~/.keras/datasets/' if it is the first time it is called
# x_train shape (60000, 28, 28), y_train shape (60000,)
(x_train, y_train), (x_test, y_test) = mnist.load_data()  # images of the digits 0-9

# data pre-processing
x_train = x_train.reshape(x_train.shape[0], -1) / 255  # flatten to 784 features and normalize to [0, 1]
x_test = x_test.reshape(x_test.shape[0], -1) / 255
y_train = np_utils.to_categorical(y_train, num_classes=10)  # one-hot encode: label 1 -> 1 in position 1, 0 elsewhere
y_test = np_utils.to_categorical(y_test, num_classes=10)
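As a quick illustration (not part of the original script) of what np_utils.to_categorical does, you can run it on a few labels by hand:

from keras.utils import np_utils
import numpy as np

labels = np.array([0, 1, 3])
print(np_utils.to_categorical(labels, num_classes=10))
# each row is a length-10 one-hot vector, e.g. label 3 -> a 1 in position 3, 0 everywhere else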

 

Build the network; Activation is the activation function. Because the first Dense layer outputs 32 units, the second Dense layer automatically takes 32 inputs, so its input size does not need to be set explicitly.

# Another way to build a neural net
model = Sequential([
        Dense(32, input_dim=784),  # 784 inputs, 32 outputs
        Activation('relu'),
        Dense(10),
        Activation('softmax')
        ])

# Another way to define an optimizer
rmsprop = RMSprop(lr=0.001, rho=0.9, epsilon=1e-08, decay=0.0)

# We add metrics to get more results you want to see
model.compile(
        optimizer=rmsprop,
        loss='categorical_crossentropy',
        metrics=['accuracy'],  # also compute accuracy while updating
        )
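To confirm that the second Dense layer really infers its input size from the first, a quick check (a minimal sketch, not in the original code) is to print the model structure; the parameter counts follow from 784*32+32 and 32*10+10:

model.summary()
# first Dense layer: 784*32 + 32 = 25120 parameters (weights + biases)
# second Dense layer: 32*10 + 10 = 330 parameters, so its input size of 32 was inferred automatically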

 

Training and testing

print("Training~~~~~~~~") #Another way to train the model model.fit(x_train,y_train, epochs=2, batch_size=32) #訓練2大批,每批32個 print("\nTesting~~~~~~~~~~") #Evalute the model with the metrics we define earlier loss,accuracy = model.evaluate(x_test,y_test) print('test loss:',loss) print('test accuracy:', accuracy)

 

Full code:

 

# -*- coding: utf-8 -*-
import numpy as np
np.random.seed(1337)  # for reproducibility
from keras.datasets import mnist
from keras.utils import np_utils
from keras.models import Sequential  # build the model layer by layer
from keras.layers import Dense, Activation  # fully connected layer
import matplotlib.pyplot as plt
from keras.optimizers import RMSprop

# download MNIST to '~/.keras/datasets/' if it is the first time it is called
# x_train shape (60000, 28, 28), y_train shape (60000,)
(x_train, y_train), (x_test, y_test) = mnist.load_data()  # images of the digits 0-9

# data pre-processing
x_train = x_train.reshape(x_train.shape[0], -1) / 255  # flatten to 784 features and normalize to [0, 1]
x_test = x_test.reshape(x_test.shape[0], -1) / 255
y_train = np_utils.to_categorical(y_train, num_classes=10)  # one-hot encode: label 1 -> 1 in position 1, 0 elsewhere
y_test = np_utils.to_categorical(y_test, num_classes=10)

# Another way to build a neural net
model = Sequential([
        Dense(32, input_dim=784),  # 784 inputs, 32 outputs
        Activation('relu'),
        Dense(10),
        Activation('softmax')
        ])

# Another way to define an optimizer
rmsprop = RMSprop(lr=0.001, rho=0.9, epsilon=1e-08, decay=0.0)

# We add metrics to get more results you want to see
model.compile(
        optimizer=rmsprop,
        loss='categorical_crossentropy',
        metrics=['accuracy'],  # also compute accuracy while updating
        )

print("Training~~~~~~~~")
# Another way to train the model
model.fit(x_train, y_train, epochs=2, batch_size=32)  # train for 2 epochs, 32 samples per batch

print("\nTesting~~~~~~~~~~")
# Evaluate the model with the metrics we defined earlier
loss, accuracy = model.evaluate(x_test, y_test)

print('test loss:', loss)
print('test accuracy:', accuracy)

 

 

 

The result is:

 

 

