Adding the simplest all-pass layer (AllPassLayer) to Caffe


Following Zhao Yongke's blog, we implement a new Layer here named AllPassLayer. As the name suggests, it is an all-pass Layer: "all-pass" is borrowed from the all-pass filter in signal processing, which passes a signal from input to output without distortion.

Although this Layer does nothing useful by itself, adding your own processing on top of it is very simple. It is also convenient for experimentation: the all-pass layer's Forward/Backward functions are so simple that no calculus or differentiation background is required. You can insert this layer into any existing network without affecting training or prediction accuracy.

 

First, structure your implementation like any regular Layer class: split it into a declaration and an implementation, placed in a .hpp file and in .cpp/.cu files respectively. Give the Layer a name that does not clash with any existing implementation. Put the .hpp file under $CAFFE_ROOT/include/caffe/layers/, and the .cpp and .cu under $CAFFE_ROOT/src/caffe/layers/. Then running make under $CAFFE_ROOT automatically picks these files up in the build, with no manual changes to the build configuration.
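Assuming a standard Caffe checkout, the resulting file layout would look like this (paths follow the conventions above):

```text
$CAFFE_ROOT/
├── include/caffe/layers/all_pass_layer.hpp   (declaration)
├── src/caffe/layers/all_pass_layer.cpp       (CPU implementation)
└── src/caffe/layers/all_pass_layer.cu        (GPU implementation, if any)
```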

Second, add a new LayerParameter field in $CAFFE_ROOT/src/caffe/proto/caffe.proto, so that the new Layer can be described in train.prototxt, test.prototxt, or deploy.prototxt. This makes it easy to modify the network structure or swap in another Layer with the same functionality.

Finally, and most easily overlooked: register the new Layer's factory function with the Layer factory, otherwise you may see an error like the following at runtime:

First, the header file (all_pass_layer.hpp):

#ifndef CAFFE_ALL_PASS_LAYER_HPP_  
#define CAFFE_ALL_PASS_LAYER_HPP_  
  
#include <vector>  
  
#include "caffe/blob.hpp"  
#include "caffe/layer.hpp"  
#include "caffe/proto/caffe.pb.h"  
  
#include "caffe/layers/neuron_layer.hpp"  
  
namespace caffe {  
template <typename Dtype>  
class AllPassLayer : public NeuronLayer<Dtype> {  
 public:  
  explicit AllPassLayer(const LayerParameter& param)  
      : NeuronLayer<Dtype>(param) {}  
  
  virtual inline const char* type() const { return "AllPass"; }  
  
 protected:  
  
  virtual void Forward_cpu(const vector<Blob<Dtype>*>& bottom,  
      const vector<Blob<Dtype>*>& top);  
  virtual void Forward_gpu(const vector<Blob<Dtype>*>& bottom,  
      const vector<Blob<Dtype>*>& top);  
  virtual void Backward_cpu(const vector<Blob<Dtype>*>& top,  
      const vector<bool>& propagate_down, const vector<Blob<Dtype>*>& bottom);  
  virtual void Backward_gpu(const vector<Blob<Dtype>*>& top,  
      const vector<bool>& propagate_down, const vector<Blob<Dtype>*>& bottom);  
};  
  
}  // namespace caffe  

 

Now the source files (all_pass_layer.cpp and all_pass_layer.cu; the .cu file is normally where the CUDA implementation goes, but since I haven't written GPU kernels I simply copy-pasted, so the two files have identical content):

#include <algorithm>
#include <vector>
#include <iostream>

#include "caffe/layers/all_pass_layer.hpp"

using namespace std;
#define DEBUG_AP(str) cout << str << endl;

namespace caffe {

template <typename Dtype>
void AllPassLayer<Dtype>::Forward_cpu(const vector<Blob<Dtype>*>& bottom,
    const vector<Blob<Dtype>*>& top) {
  const Dtype* bottom_data = bottom[0]->cpu_data();    // cpu_data(): read-only access to the CPU data
  Dtype* top_data = top[0]->mutable_cpu_data();        // mutable_cpu_data(): read-write access to the CPU data
  const int count = bottom[0]->count();                // total number of elements in the Blob
  for (int i = 0; i < count; ++i) {
    top_data[i] = bottom_data[i];                      // simply pass the input through: all-pass
  }
  DEBUG_AP("Here is All Pass Layer, forwarding.");
  DEBUG_AP(this->layer_param_.all_pass_param().key()); // read the preset value from the prototxt
}

template <typename Dtype>
void AllPassLayer<Dtype>::Backward_cpu(const vector<Blob<Dtype>*>& top,
    const vector<bool>& propagate_down,
    const vector<Blob<Dtype>*>& bottom) {
  if (propagate_down[0]) {  // propagate_down[i] indicates whether to backpropagate the gradient to bottom[i]
    const Dtype* top_diff = top[0]->cpu_diff();
    Dtype* bottom_diff = bottom[0]->mutable_cpu_diff();
    const int count = bottom[0]->count();
    for (int i = 0; i < count; ++i) {
      bottom_diff[i] = top_diff[i];                    // identity layer: the gradient passes through unchanged
    }
  }
  DEBUG_AP("Here is All Pass Layer, backwarding.");
  DEBUG_AP(this->layer_param_.all_pass_param().key());
}

#ifndef CPU_ONLY
#define CPU_ONLY  // no GPU kernels are provided, so force CPU-only mode for this layer
#endif

INSTANTIATE_CLASS(AllPassLayer);
REGISTER_LAYER_CLASS(AllPass);
}  // namespace caffe
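The semantics of Forward_cpu/Backward_cpu above amount to an elementwise identity. As a sanity check, here is a minimal Python sketch of the same behavior (plain lists, no Caffe dependency; the function names are illustrative only, not part of Caffe's API):

```python
def all_pass_forward(bottom_data):
    # top_data[i] = bottom_data[i]: pass the input straight through
    return list(bottom_data)

def all_pass_backward(top_diff, propagate_down=True):
    # The layer is the identity, so its Jacobian is the identity matrix:
    # the gradient w.r.t. the input equals the gradient w.r.t. the output.
    if not propagate_down:
        return None
    return list(top_diff)

bottom = [1.0, -2.5, 3.7]
top = all_pass_forward(bottom)
assert top == bottom            # forward pass leaves the data unchanged

top_diff = [0.1, 0.2, 0.3]
bottom_diff = all_pass_backward(top_diff)
assert bottom_diff == top_diff  # backward pass leaves the gradient unchanged
```

This mirrors why inserting the layer anywhere in a network leaves training and prediction unaffected.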

 

For time reasons I did not implement the GPU-mode forward/backward, so the example in this post supports CPU_ONLY mode only.

Edit caffe.proto, find the LayerParameter message, and add one entry:

message LayerParameter {  
  optional string name = 1; // the layer name  
  optional string type = 2; // the layer type  
  repeated string bottom = 3; // the name of each bottom blob  
  repeated string top = 4; // the name of each top blob  
  
  // The train / test phase for computation.  
  optional Phase phase = 10;  
  
  // The amount of weight to assign each top blob in the objective.  
  // Each layer assigns a default value, usually of either 0 or 1,  
  // to each top blob.  
  repeated float loss_weight = 5;  
  
  // Specifies training parameters (multipliers on global learning constants,  
  // and the name and other settings used for weight sharing).  
  repeated ParamSpec param = 6;  
  
  // The blobs containing the numeric parameters of the layer.  
  repeated BlobProto blobs = 7;  
  
  // Specifies on which bottoms the backpropagation should be skipped.  
  // The size must be either 0 or equal to the number of bottoms.  
  repeated bool propagate_down = 11;  
  
  // Rules controlling whether and when a layer is included in the network,  
  // based on the current NetState.  You may specify a non-zero number of rules  
  // to include OR exclude, but not both.  If no include or exclude rules are  
  // specified, the layer is always included.  If the current NetState meets  
  // ANY (i.e., one or more) of the specified rules, the layer is  
  // included/excluded.  
  repeated NetStateRule include = 8;  
  repeated NetStateRule exclude = 9;  
  
  // Parameters for data pre-processing.  
  optional TransformationParameter transform_param = 100;  
  
  // Parameters shared by loss layers.  
  optional LossParameter loss_param = 101;  
  
  // Layer type-specific parameters.  
  //  
  // Note: certain layers may have more than one computational engine  
  // for their implementation. These layers include an Engine type and  
  // engine parameter for selecting the implementation.  
  // The default for the engine is set by the ENGINE switch at compile-time.  
  optional AccuracyParameter accuracy_param = 102;  
  optional ArgMaxParameter argmax_param = 103;  
  optional BatchNormParameter batch_norm_param = 139;  
  optional BiasParameter bias_param = 141;  
  optional ConcatParameter concat_param = 104;  
  optional ContrastiveLossParameter contrastive_loss_param = 105;  
  optional ConvolutionParameter convolution_param = 106;  
  optional CropParameter crop_param = 144;  
  optional DataParameter data_param = 107;  
  optional DropoutParameter dropout_param = 108;  
  optional DummyDataParameter dummy_data_param = 109;  
  optional EltwiseParameter eltwise_param = 110;  
  optional ELUParameter elu_param = 140;  
  optional EmbedParameter embed_param = 137;  
  optional ExpParameter exp_param = 111;  
  optional FlattenParameter flatten_param = 135;  
  optional HDF5DataParameter hdf5_data_param = 112;  
  optional HDF5OutputParameter hdf5_output_param = 113;  
  optional HingeLossParameter hinge_loss_param = 114;  
  optional ImageDataParameter image_data_param = 115;  
  optional InfogainLossParameter infogain_loss_param = 116;  
  optional InnerProductParameter inner_product_param = 117;  
  optional InputParameter input_param = 143;  
  optional LogParameter log_param = 134;  
  optional LRNParameter lrn_param = 118;  
  optional MemoryDataParameter memory_data_param = 119;  
  optional MVNParameter mvn_param = 120;  
  optional PoolingParameter pooling_param = 121;  
  optional PowerParameter power_param = 122;  
  optional PReLUParameter prelu_param = 131;  
  optional PythonParameter python_param = 130;  
  optional ReductionParameter reduction_param = 136;  
  optional ReLUParameter relu_param = 123;  
  optional ReshapeParameter reshape_param = 133;  
  optional ScaleParameter scale_param = 142;  
  optional SigmoidParameter sigmoid_param = 124;  
  optional SoftmaxParameter softmax_param = 125;  
  optional SPPParameter spp_param = 132;  
  optional SliceParameter slice_param = 126;  
  optional TanHParameter tanh_param = 127;  
  optional ThresholdParameter threshold_param = 128;  
  optional TileParameter tile_param = 138;  
  optional WindowDataParameter window_data_param = 129;  
  optional AllPassParameter all_pass_param = 155;  
}  

Make sure the new field number does not collide with any existing Layer's number.

 

Still in caffe.proto, add the AllPassParameter declaration (anywhere in the file). I defined a single parameter, which can be used to read a preset value from the prototxt.

message AllPassParameter {  
  optional float key = 1 [default = 0];  
}  

The call `this->layer_param_.all_pass_param().key()` in the .cpp above is what reads this preset value from the prototxt.

 

Run make clean under $CAFFE_ROOT, then make all again. To compile successfully on the first try, keep your code clean and stay alert to common mistakes.

 

Everything is now in place; all that's missing is the prototxt.

 

That's easy: we write the simplest possible deploy.prototxt, with no data layer and no softmax layer, just for fun.

name: "AllPassTest"  
layer {  
  name: "data"  
  type: "Input"  
  top: "data"  
  input_param { shape: { dim: 10 dim: 3 dim: 227 dim: 227 } }  
}  
layer {  
  name: "ap"  
  type: "AllPass"  
  bottom: "data"  
  top: "conv1"  
  all_pass_param {  
    key: 12.88  
  }  
}  

 

Note: the value after type: here should be the class name you declared in the .hpp, minus the trailing "Layer" (AllPassLayer becomes AllPass).
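The naming rule is mechanical: strip the trailing "Layer" suffix from the class name. A one-line sketch (the function name is made up for illustration):

```python
def layer_type_from_class(cls_name):
    # "AllPassLayer" -> "AllPass": REGISTER_LAYER_CLASS(AllPass) registers
    # the type string without the "Layer" suffix.
    suffix = "Layer"
    return cls_name[:-len(suffix)] if cls_name.endswith(suffix) else cls_name

assert layer_type_from_class("AllPassLayer") == "AllPass"
```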

Above, the preset value of the key parameter is set to 12.88. Yes, that made you think of Liu Xiang, didn't it (12.88 s is his 110 m hurdles record).

 

To verify that the Layer can be created correctly and execute forward and backward, we run the caffe time command against the prototxt we just wrote:

$ ./build/tools/caffe.bin time -model deploy.prototxt  
I1002 02:03:41.667682 1954701312 caffe.cpp:312] Use CPU.  
I1002 02:03:41.671360 1954701312 net.cpp:49] Initializing net from parameters:  
name: "AllPassTest"  
state {  
  phase: TRAIN  
}  
layer {  
  name: "data"  
  type: "Input"  
  top: "data"  
  input_param {  
    shape {  
      dim: 10  
      dim: 3  
      dim: 227  
      dim: 227  
    }  
  }  
}  
layer {  
  name: "ap"  
  type: "AllPass"  
  bottom: "data"  
  top: "conv1"  
  all_pass_param {  
    key: 12.88  
  }  
}  
I1002 02:03:41.671463 1954701312 layer_factory.hpp:77] Creating layer data  
I1002 02:03:41.671484 1954701312 net.cpp:91] Creating Layer data  
I1002 02:03:41.671499 1954701312 net.cpp:399] data -> data  
I1002 02:03:41.671555 1954701312 net.cpp:141] Setting up data  
I1002 02:03:41.671566 1954701312 net.cpp:148] Top shape: 10 3 227 227 (1545870)  
I1002 02:03:41.671592 1954701312 net.cpp:156] Memory required for data: 6183480  
I1002 02:03:41.671605 1954701312 layer_factory.hpp:77] Creating layer ap  
I1002 02:03:41.671620 1954701312 net.cpp:91] Creating Layer ap  
I1002 02:03:41.671630 1954701312 net.cpp:425] ap <- data  
I1002 02:03:41.671644 1954701312 net.cpp:399] ap -> conv1  
I1002 02:03:41.671663 1954701312 net.cpp:141] Setting up ap  
I1002 02:03:41.671674 1954701312 net.cpp:148] Top shape: 10 3 227 227 (1545870)  
I1002 02:03:41.671685 1954701312 net.cpp:156] Memory required for data: 12366960  
I1002 02:03:41.671695 1954701312 net.cpp:219] ap does not need backward computation.  
I1002 02:03:41.671705 1954701312 net.cpp:219] data does not need backward computation.  
I1002 02:03:41.671710 1954701312 net.cpp:261] This network produces output conv1  
I1002 02:03:41.671720 1954701312 net.cpp:274] Network initialization done.  
I1002 02:03:41.671746 1954701312 caffe.cpp:320] Performing Forward  
Here is All Pass Layer, forwarding.  
12.88  
I1002 02:03:41.679689 1954701312 caffe.cpp:325] Initial loss: 0  
I1002 02:03:41.679714 1954701312 caffe.cpp:326] Performing Backward  
I1002 02:03:41.679738 1954701312 caffe.cpp:334] *** Benchmark begins ***  
I1002 02:03:41.679746 1954701312 caffe.cpp:335] Testing for 50 iterations.  
Here is All Pass Layer, forwarding.  
12.88  
Here is All Pass Layer, backwarding.  
12.88  
I1002 02:03:41.681139 1954701312 caffe.cpp:363] Iteration: 1 forward-backward time: 1 ms.  
Here is All Pass Layer, forwarding.  
12.88  
Here is All Pass Layer, backwarding.  
12.88  
I1002 02:03:41.682394 1954701312 caffe.cpp:363] Iteration: 2 forward-backward time: 1 ms.  
[... iterations 3 through 49 omitted; each prints the same forwarding/backwarding lines with key 12.88, at 1-2 ms per iteration ...]  
Here is All Pass Layer, forwarding.  
12.88  
Here is All Pass Layer, backwarding.  
12.88  
I1002 02:03:41.751124 1954701312 caffe.cpp:363] Iteration: 50 forward-backward time: 1 ms.  
I1002 02:03:41.751147 1954701312 caffe.cpp:366] Average time per layer:  
I1002 02:03:41.751157 1954701312 caffe.cpp:369]       data  forward: 0.00108 ms.  
I1002 02:03:41.751183 1954701312 caffe.cpp:372]       data  backward: 0.001 ms.  
I1002 02:03:41.751194 1954701312 caffe.cpp:369]         ap  forward: 1.37884 ms.  
I1002 02:03:41.751205 1954701312 caffe.cpp:372]         ap  backward: 0.01156 ms.  
I1002 02:03:41.751220 1954701312 caffe.cpp:377] Average Forward pass: 1.38646 ms.  
I1002 02:03:41.751231 1954701312 caffe.cpp:379] Average Backward pass: 0.0144 ms.  
I1002 02:03:41.751240 1954701312 caffe.cpp:381] Average Forward-Backward: 1.42 ms.  
I1002 02:03:41.751250 1954701312 caffe.cpp:383] Total Time: 71 ms.  
I1002 02:03:41.751260 1954701312 caffe.cpp:384] *** Benchmark ends ***  

As the log shows, the Layer is created normally, loads its preset parameter, and executes the forward and backward functions.

 

In practice, an algorithmic Layer should also come with test cases to guarantee correctness. Since we chose the trivially simple all-pass Layer, this step can be skipped; I'll be a bit lazy here and save you some reading time.
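If one did want a test case, the standard approach is to compare the analytic gradient against finite differences (this is what Caffe's gradient checker does conceptually). A dependency-free sketch for the all-pass function, with made-up names for illustration:

```python
def f(x):
    # the all-pass layer: elementwise identity
    return list(x)

def analytic_grad(x, top_diff):
    # identity Jacobian: gradient passes through unchanged
    return list(top_diff)

def numeric_grad(x, top_diff, eps=1e-6):
    # d( sum_j top_diff[j] * f(x)[j] ) / dx[i] via central differences
    g = []
    for i in range(len(x)):
        xp = list(x); xp[i] += eps
        xm = list(x); xm[i] -= eps
        fp = sum(d * y for d, y in zip(top_diff, f(xp)))
        fm = sum(d * y for d, y in zip(top_diff, f(xm)))
        g.append((fp - fm) / (2 * eps))
    return g

x = [0.5, -1.2, 2.0]
top_diff = [1.0, 0.25, -0.5]
analytic = analytic_grad(x, top_diff)
numeric = numeric_grad(x, top_diff)
assert all(abs(a - n) < 1e-4 for a, n in zip(analytic, numeric))
```

For the identity layer the two gradients agree to within floating-point noise, confirming that Backward_cpu is consistent with Forward_cpu.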

Addendum: after make all, the Python interface also needs to be rebuilt: run make pycaffe, then proceed as before.

 

Reference: http://blog.csdn.net/kkk584520/article/details/52721838