[Implementing a Convolutional Neural Network in Python] Dropout Layer Implementation


Code source: https://github.com/eriklindernoren/ML-From-Scratch

Conv2D convolutional layer (with stride and padding) implementation: https://www.cnblogs.com/xiximayou/p/12706576.html

Activation function implementations (sigmoid, softmax, tanh, ReLU, LeakyReLU, ELU, SELU, softplus): https://www.cnblogs.com/xiximayou/p/12713081.html

Loss function definitions (mean squared error, cross-entropy loss): https://www.cnblogs.com/xiximayou/p/12713198.html

Optimizer implementations (SGD, Nesterov, Adagrad, Adadelta, RMSprop, Adam): https://www.cnblogs.com/xiximayou/p/12713594.html

Backward pass of the convolutional layer: https://www.cnblogs.com/xiximayou/p/12713930.html

Fully connected layer implementation: https://www.cnblogs.com/xiximayou/p/12720017.html

Batch normalization layer implementation: https://www.cnblogs.com/xiximayou/p/12720211.html

Pooling layer implementation: https://www.cnblogs.com/xiximayou/p/12720324.html

padding2D implementation: https://www.cnblogs.com/xiximayou/p/12720454.html

Flatten layer implementation: https://www.cnblogs.com/xiximayou/p/12720518.html

Upsampling layer (UpSampling2D) implementation: https://www.cnblogs.com/xiximayou/p/12720558.html

 

import numpy as np


# Dropout inherits from the Layer base class defined elsewhere in the repo.
class Dropout(Layer):
    """A layer that randomly sets a fraction p of the output units of the previous layer
    to zero.

    Parameters:
    -----------
    p: float
        The probability that unit x is set to zero.
    """
    def __init__(self, p=0.2):
        self.p = p
        self._mask = None
        self.input_shape = None
        self.n_units = None
        self.pass_through = True
        self.trainable = True

    def forward_pass(self, X, training=True):
        c = (1 - self.p)
        if training:
            # Boolean mask: each unit is kept with probability 1 - p.
            self._mask = np.random.uniform(size=X.shape) > self.p
            c = self._mask
        # Training: zero out the dropped units.
        # Inference: keep all units and scale by (1 - p) to preserve the expected output.
        return X * c

    def backward_pass(self, accum_grad):
        # Only units that were kept in the forward pass propagate their gradient back.
        return accum_grad * self._mask

    def output_shape(self):
        # Dropout does not change the shape of its input.
        return self.input_shape

The core is generating a mask that randomly deactivates neurons. During training, each unit is dropped (set to zero) with probability p, and the same mask is reused in the backward pass so that dropped units receive no gradient. At inference, no units are dropped; instead the output is scaled by (1 - p) so that its expected value matches what the network saw during training.
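A minimal standalone sketch of this masking behaviour, using only NumPy (independent of the Dropout class and the repo's Layer base class; the seed, drop probability, and array values here are illustrative assumptions):

import numpy as np

np.random.seed(0)          # for a reproducible illustration
p = 0.5                    # drop probability
X = np.ones((2, 4))        # dummy activations from the previous layer

# Training: draw a Boolean mask; each unit survives with probability 1 - p.
mask = np.random.uniform(size=X.shape) > p
out_train = X * mask       # dropped units become exactly zero

# Inference: keep every unit but scale by (1 - p) so the expected
# activation matches the training-time behaviour.
out_test = X * (1 - p)

# Backward pass: the same mask gates the incoming gradient, so units
# dropped in the forward pass receive no gradient.
accum_grad = np.ones_like(X)
grad_back = accum_grad * mask

print(out_train)
print(out_test)
print(grad_back)

This follows the classic (non-inverted) dropout convention: the (1 - p) scaling is applied at test time, rather than dividing by (1 - p) during training.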

