Python neural network package: NeuroLab


The neurolab module is roughly the Python equivalent of MATLAB's Neural Network Toolbox (NNT).

Network types supported by neurolab:

  • Single-layer perceptron
  • Multilayer feed-forward perceptron
  • Competitive layer (Kohonen layer)
  • Learning Vector Quantization (LVQ)
  • Elman recurrent network
  • Hopfield recurrent network
  • Hamming recurrent network

Take the multilayer feed-forward network as an example: neurolab.net.newff(minmax, size, transf=None)

Parameters:

minmax: list of lists. The outer list has one entry per input neuron; each inner list must contain 2 elements, the minimum and maximum of that input's range.

size: list whose length equals the number of layers excluding the input layer; each element is the number of neurons in the corresponding layer.

transf: list of activation functions, one per layer (default TanSig).
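The transf argument is optional; to set it explicitly, pass one activation-function instance per layer from neurolab.trans. A minimal sketch (the input ranges and layer sizes below are made up for illustration):

import neurolab as nl

# 2 inputs in [0, 1]; a 4-neuron hidden layer and a 1-neuron output layer,
# with TanSig on the hidden layer and a linear (PureLin) output layer
net = nl.net.newff([[0, 1], [0, 1]], [4, 1],
                   [nl.trans.TanSig(), nl.trans.PureLin()])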


Example 2:

perceptron = nl.net.newp([[0, 2],[0, 2]], 1)
The length of the first argument (the minmax list) is the number of input nodes; each element holds two values, the minimum and maximum of that input. The second argument, 1, indicates that there is a single (output) neuron in this network.
error = perceptron.train(input_data, output, epochs=50, show=15, lr=0.01)
epochs is the number of training iterations, show is how often progress is printed to the terminal, and lr is the learning rate.
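Note that input_data and output are not defined in the snippet above. A minimal runnable sketch, assuming a made-up 2-input OR-style dataset scaled into the [0, 2] range (the data values are purely illustrative):

import numpy as np
import neurolab as nl

# Hypothetical training set: an OR-like problem with inputs in [0, 2]
input_data = np.array([[0, 0], [0, 2], [2, 0], [2, 2]])
output = np.array([[0], [1], [1], [1]])   # targets shaped (n_samples, n_outputs)

# Single-neuron perceptron with two inputs, each in the range [0, 2]
perceptron = nl.net.newp([[0, 2], [0, 2]], 1)

# Train with the default delta rule; 'error' holds the error recorded every 'show' epochs
error = perceptron.train(input_data, output, epochs=50, show=15, lr=0.01)

print(perceptron.sim(input_data))   # network output on the training inputs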

Example 3:

import numpy as np
import neurolab as nl

input = np.random.uniform(0, 0.1, (1000, 225))
output = input[:,:10] + input[:,10:20]
# 2 layers: 225 inputs, 50 neurons in the hidden layer and 10 in the output layer
# for 3 layers use something like: nl.net.newff([[0, .1]]*225, [50, 40, 10])
net = nl.net.newff([[0, .1]]*225, [50, 10])
net.trainf = nl.train.train_bfgs

e = net.train(input, output, show=1, epochs=100, goal=0.0001)

Example 4:

import neurolab as nl
import numpy as np
# Create train samples
x = np.linspace(-7, 7, 20)
y = np.sin(x) * 0.5

size = len(x)

inp = x.reshape(size,1)
tar = y.reshape(size,1)

# Create a network with 2 layers, randomly initialized
net = nl.net.newff([[-7, 7]],[5, 1])

# Train network
error = net.train(inp, tar, epochs=500, show=100, goal=0.02)

# Simulate network
out = net.sim(inp)

# Plot result
import pylab as pl
pl.subplot(211)
pl.plot(error)
pl.xlabel('Epoch number')
pl.ylabel('error (default SSE)')

x2 = np.linspace(-6.0,6.0,150)
y2 = net.sim(x2.reshape(x2.size,1)).reshape(x2.size)
print(len(y2))
y3 = out.reshape(size)
pl.subplot(212)
pl.plot(x2, y2, '-',x , y, '.', x, y3, 'p')
pl.legend(['net output (dense grid)', 'train target', 'net output (train points)'])
pl.show()

 

There is a lot more material on this package; I will keep adding to this post later.

Key references: the official site and related documentation.



