Code source: https://github.com/eriklindernoren/ML-From-Scratch
Concrete implementation of the Conv2D convolution layer (with stride and padding) in a CNN: https://www.cnblogs.com/xiximayou/p/12706576.html
Implementation of activation functions (sigmoid, softmax, tanh, ReLU, LeakyReLU, ELU, SELU, softplus): https://www.cnblogs.com/xiximayou/p/12713081.html
Loss function definitions (mean squared error, cross-entropy loss): https://www.cnblogs.com/xiximayou/p/12713198.html
Implementation of optimizers (SGD, Nesterov, Adagrad, Adadelta, RMSprop, Adam): https://www.cnblogs.com/xiximayou/p/12713594.html
Backward pass of the convolution layer: https://www.cnblogs.com/xiximayou/p/12713930.html
Fully connected layer implementation: https://www.cnblogs.com/xiximayou/p/12720017.html
Batch normalization layer implementation: https://www.cnblogs.com/xiximayou/p/12720211.html
Pooling layer implementation: https://www.cnblogs.com/xiximayou/p/12720324.html
Padding2D implementation: https://www.cnblogs.com/xiximayou/p/12720454.html
Flatten layer implementation: https://www.cnblogs.com/xiximayou/p/12720518.html
class UpSampling2D(Layer):
    """Nearest-neighbor upsampling of the input.

    Repeats the rows and columns of the data by size[0] and size[1] respectively.

    Parameters:
    -----------
    size: tuple
        (size_y, size_x) - The number of times each axis will be repeated.
    """
    def __init__(self, size=(2, 2), input_shape=None):
        self.prev_shape = None
        self.trainable = True
        self.size = size
        self.input_shape = input_shape

    def forward_pass(self, X, training=True):
        self.prev_shape = X.shape
        # Repeat each spatial axis as specified by size:
        # axis 2 is height, axis 3 is width (layout: batch, channels, H, W)
        X_new = X.repeat(self.size[0], axis=2).repeat(self.size[1], axis=3)
        return X_new

    def backward_pass(self, accum_grad):
        # Downsample the gradient back to the input's spatial shape
        # by keeping every size[0]-th row and size[1]-th column
        accum_grad = accum_grad[:, :, ::self.size[0], ::self.size[1]]
        return accum_grad

    def output_shape(self):
        channels, height, width = self.input_shape
        return channels, self.size[0] * height, self.size[1] * width
The core of the forward pass is the numpy.repeat() function; the backward pass simply undoes it with strided slicing.
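The repeat-and-slice mechanics can be checked in isolation with plain NumPy, without the Layer class. A minimal sketch, assuming the same (batch, channels, height, width) layout as the code above:

```python
import numpy as np

size = (2, 2)  # (size_y, size_x), as in UpSampling2D
X = np.arange(2 * 1 * 2 * 2).reshape(2, 1, 2, 2)

# Forward pass: repeat rows (axis=2), then columns (axis=3).
# Each input value becomes a size[0] x size[1] block.
X_up = X.repeat(size[0], axis=2).repeat(size[1], axis=3)
print(X_up.shape)  # (2, 1, 4, 4)

# Backward pass: strided slicing keeps one gradient value per
# repeated block, restoring the original spatial shape.
grad = np.ones_like(X_up)
grad_down = grad[:, :, ::size[0], ::size[1]]
print(grad_down.shape)  # (2, 1, 2, 2)
```

Note that this backward pass picks the top-left value of each block rather than summing over it; that matches the implementation above, which is a simplification of the exact gradient of repeat.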