Densely Connected Convolutional Networks 論文閱讀


My graduation project is finally done. The traditional computer-vision work nearly broke me, but it's over at last, so I can finally get to some papers I've been putting off.

Densely Connected Convolutional Networks has actually been out for quite a while; it won the CVPR 2017 Best Paper award.

 

Before reading the paper, I recommend loading the full DenseNet network definition into http://ethereon.github.io/netscope/#/editor and visualizing it; that makes it much easier to follow. Overall, this paper is quite easy to understand.

 

 

The figure above shows a 5-layer dense block with growth rate k = 4.

 

The paper opens with several advantages of DenseNet:

1. "Our proposed DenseNet architecture explicitly differentiates between information that is added to the network and information that is preserved. DenseNet layers are very narrow (e.g., 12 filters per layer), adding only a small set of feature-maps to the 'collective knowledge' of the network and keep the remaining feature-maps unchanged—and the final classifier makes a decision based on all feature-maps in the network."

DenseNet has relatively few parameters, and the layers inside each block use few filters. With AlexNet, layers commonly use filter counts in the hundreds, whereas here the counts are small numbers like 12, 24, or 16, which is why the layers are described as very narrow.

2. "One big advantage of DenseNets is their improved flow of information and gradients throughout the network, which makes them easy to train."

As the figure shows, every layer in DenseNet is connected to the layers after it. (The first figure does not draw the connections between layers within each block; I think you need to combine the first figure with the second to get the complete picture, because in the second figure the input to each layer in a block is the concatenation of all preceding layers' outputs, exactly as Figure 1 depicts. This is clearest in the visualization tool, where you can also see the actual size of every layer.) These dense connections help information and gradients propagate through the whole network.

3. "We also observe that dense connections have a regularizing effect, which reduces overfitting on tasks with smaller training set sizes."

Dense connections also act as a regularizer, reducing the risk of overfitting when training on smaller datasets.

 

DenseNet concatenates the feature maps of all preceding layers, whereas ResNet sums them; the paper argues that this summation can impede the flow of information through the network.
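The difference between the two combination rules can be sketched in NumPy (the shapes here are made up for illustration, not taken from the paper):

```python
import numpy as np

# Two feature maps from consecutive layers, shape (channels, height, width).
x_prev = np.ones((4, 8, 8))   # output of an earlier layer, 4 channels
x_new = np.ones((4, 8, 8))    # output of the current layer, 4 channels

# ResNet-style shortcut: element-wise sum, channel count stays the same,
# so earlier features are mixed into the sum rather than kept separate.
resnet_out = x_prev + x_new
print(resnet_out.shape)       # (4, 8, 8)

# DenseNet-style connection: concatenate along the channel axis,
# so all earlier feature maps are preserved unchanged.
densenet_out = np.concatenate([x_prev, x_new], axis=0)
print(densenet_out.shape)     # (8, 8, 8)
```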

 

Transition layers:

These are the layers that connect two dense blocks. Each consists of a BN layer, a 1x1 convolution, and a 2x2 average pooling layer, as shown below.
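The pooling step of a transition layer can be sketched in NumPy; this is an illustrative toy, not the paper's code, and the input values are arbitrary:

```python
import numpy as np

def avg_pool_2x2(x):
    """2x2 average pooling with stride 2 over (channels, H, W),
    as used in a DenseNet transition layer to halve spatial size."""
    c, h, w = x.shape
    # Split each spatial axis into (blocks, 2) and average within each 2x2 block.
    return x.reshape(c, h // 2, 2, w // 2, 2).mean(axis=(2, 4))

x = np.arange(16, dtype=float).reshape(1, 4, 4)
y = avg_pool_2x2(x)
print(y.shape)   # (1, 2, 2)
print(y[0])      # [[ 2.5  4.5] [10.5 12.5]]
```

The 1x1 convolution that precedes this pooling only changes the channel count (see the Compression section), so the transition layer shrinks both spatial size and, optionally, the number of feature maps.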

 

Growth rate:

This is the number of feature maps each layer produces (not, as one might guess from the figure, the number of layers per block). In Figure 1, every layer adds k = 4 feature maps to the block's running concatenation, so the growth rate is k = 4; if the block's input has k0 feature maps, the l-th layer receives k0 + k(l - 1) input feature maps.
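The channel bookkeeping inside one dense block is simple enough to write down directly (pure-Python sketch; the k0 = 6 block input below is a made-up value for illustration):

```python
def input_channels(k0, k, layer):
    """Feature maps seen by layer `layer` (1-indexed) of a dense block
    whose input has k0 channels, with growth rate k: the block input
    plus k maps from each of the (layer - 1) earlier layers."""
    return k0 + k * (layer - 1)

# Growth rate k = 4 as in Figure 1; k0 = 6 is hypothetical.
for l in range(1, 6):
    print(l, input_channels(6, 4, l))
# Layer 5 already sees 6 + 4*4 = 22 feature maps.
```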

Bottleneck layers:

It has been noted in [36, 11] that a 1X1 convolution can be introduced as bottleneck layer before each 3X3 convolution to reduce the number of input feature-maps, and thus to improve computational efficiency.

This means each layer places a 1x1 convolution before its 3x3 convolution to reduce the number of input feature maps; in the figure below, 512 input maps are reduced to 128.
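In the DenseNet-B design, each layer is BN-ReLU-Conv(1x1)-BN-ReLU-Conv(3x3), and the 1x1 bottleneck outputs 4k feature maps. A quick channel-arithmetic sketch (k = 32 is the growth rate the paper uses for its ImageNet models; the 512-channel input is the figure's example):

```python
k = 32                   # growth rate
concat_in = 512          # channels accumulated from earlier layers
bottleneck_out = 4 * k   # 1x1 conv output: the 3x3 conv never sees all 512
layer_out = k            # 3x3 conv output appended to the block

print(concat_in, "->", bottleneck_out, "->", layer_out)  # 512 -> 128 -> 32
```

With 4k = 128, the bottleneck explains exactly the 512-to-128 reduction shown in the figure.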

 

Compression:

"If a dense block contains m feature-maps, we let the following transition layer generate [θm] output feature-maps, where 0 < θ ≤ 1 is referred to as the compression factor."

This lets the transition layer compress the number of feature maps a block outputs.
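The compression rule is just a floor of θm; a minimal sketch (the m = 256 input is a made-up example):

```python
import math

def transition_channels(m, theta):
    """Feature maps produced by a transition layer with compression
    factor theta (0 < theta <= 1): floor(theta * m)."""
    assert 0 < theta <= 1
    return math.floor(theta * m)

# DenseNet-C uses theta = 0.5: a block emitting 256 maps is halved to 128.
print(transition_channels(256, 0.5))  # 128
print(transition_channels(256, 1.0))  # 256 (theta = 1 means no compression)
```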

 

