Implementing a dimension-specific softmax layer with the TensorRT API


I'm only writing this up now that the problem is solved, so my mood is a bit better; but yesterday it was genuinely baffling, and it ground me down until I had no fight left.
It's just a softmax, after all. It should be simple: exponentiate each element with e and normalize by the sum, exactly as the formula says. The official API implements it; for example, in the sample I found,

 // Add activation layer using the ReLU algorithm.
    IActivationLayer* relu1 = network->addActivation(*ip1->getOutput(0), ActivationType::kRELU);
    assert(relu1);

    // Add second fully connected layer with 20 outputs.
    IFullyConnectedLayer* ip2 = network->addFullyConnected(
        *relu1->getOutput(0), mParams.outputSize, mWeightMap["ip2filter"], mWeightMap["ip2bias"]);
    assert(ip2);

    // Add softmax layer to determine the probability.
    ISoftMaxLayer* prob = network->addSoftMax(*ip2->getOutput(0));
    assert(prob);
    prob->getOutput(0)->setName(mParams.outputTensorNames[0].c_str());
    network->markOutput(*prob->getOutput(0));

This is the official simple classification sample,
https://github.com/NVIDIA/TensorRT/blob/master/samples/opensource/sampleMNISTAPI/sampleMNISTAPI.cpp
which contains ISoftMaxLayer* prob = network->addSoftMax(*ip2->getOutput(0));.
What I need to implement is slightly different: I need the softmax taken along a specific dimension, whereas the call above is global. The PyTorch code I need to reproduce is:

        # arm_conf: [1, 12750]  ->  viewed as [1, 6375, 2]
        arm_conf_view = arm_conf.view(arm_conf.size(0), -1, 2)
        softmax_1 = nn.Softmax(dim=-1)
        m2 = softmax_1(arm_conf_view)  # [1, 6375, 2]

Part of the resulting m2 looks like this:

tensor([[[0.9575, 0.0425],
         [0.9326, 0.0674],
         [0.9131, 0.0869],
         ...,
         [0.8707, 0.1293],
         [0.8746, 0.1254],
         [0.8783, 0.1217]]], grad_fn=<SoftmaxBackward>)

As you can see, the softmax is taken along the last dimension: each row sums to 1.
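To have something to check the TensorRT output against later, here is a minimal CPU reference sketch (a helper of my own with made-up names, not part of the original code): softmax over the last dimension of an [N, 2] view stored as a flat array of 2*N floats.

#include <algorithm>
#include <cmath>
#include <vector>

// Reference softmax over consecutive pairs of a flat array: out[2k] and out[2k+1] sum to 1.
std::vector<float> softmaxPairs(const std::vector<float>& flat) {
    std::vector<float> out(flat.size());
    for (size_t i = 0; i + 1 < flat.size(); i += 2) {
        float m  = std::max(flat[i], flat[i + 1]);   // subtract the max for numerical stability
        float e0 = std::exp(flat[i] - m);
        float e1 = std::exp(flat[i + 1] - m);
        out[i]     = e0 / (e0 + e1);
        out[i + 1] = e1 / (e0 + e1);                 // each pair sums to 1
    }
    return out;
}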
On the TensorRT side I likewise end up with a one-dimensional array arm_conf[12750], so all I need is to apply the same softmax to it; but what I need is not a global softmax. So I went to check whether anyone had done this in the
https://github.com/wang-xinyu/tensorrtx repository, and a search turned up one case:
https://github.com/wang-xinyu/tensorrtx/blob/18fa419ae35bfcbd27248b3eb9329f415f604366/retinafaceAntiCov/retinafaceAntiCov.cpp

ILayer* reshapeSoftmax(INetworkDefinition *network, ITensor& input, int c) {
    auto re1 = network->addShuffle(input);
    assert(re1);
    re1->setReshapeDimensions(Dims3(c / 2, -1, 0));

    auto sm = network->addSoftMax(*re1->getOutput(0));
    assert(sm);

    auto re2 = network->addShuffle(*sm->getOutput(0));
    assert(re2);
    re2->setReshapeDimensions(Dims3(c, -1, 0));

    return re2;
}

This looked like exactly what I needed: my data is a flat array of 12750 values, which likewise has to be turned into a [6375, 2] shape first and then back into [12750] before it is passed on. So I quickly wrote similar code:

ILayer* reshapeSoftmax_ok(INetworkDefinition *network, ITensor& input, int ch) {
    // The input comes in as a one-dimensional [12750] tensor.
    // First reshape it to [XX, ch].
    auto re1 = network->addShuffle(input);
    assert(re1);
    re1->setReshapeDimensions(Dims3(1, -1, ch)); // [1, 6375, 2]
//     re1->setReshapeDimensions(Dims2(-1, ch)); // [6375, 2]

    Dims dim0 = re1->getOutput(0)->getDimensions();
    std::cout << "debug  re1 dim==" << dim0.d[0] << " " << dim0.d[1] << " " << dim0.d[2] << std::endl;

    auto sm = network->addSoftMax(*re1->getOutput(0));
    sm->setAxes(2);   // meant as "softmax over dimension 2"
    assert(sm);

    // Reshape back to one dimension so the output keeps the shape it came in with.
    Dims dim_;
    dim_.nbDims = 1;
    dim_.d[0] = -1;
    auto re2 = network->addShuffle(*sm->getOutput(0));
    assert(re2);
    re2->setReshapeDimensions(dim_);

    return re2;
}

The only addition here is sm->setAxes(2);, because I need the softmax over dimension 2. But the results were wrong! No matter what I tried they were wrong, with both the three-dimensional and the two-dimensional reshape!
I tried sm->setAxes(0);, sm->setAxes(1); and sm->setAxes(2); and none of them worked: some produced what looked like a global softmax, and some produced all ones.
In the meantime I also looked at the documentation for help:

    //!
    //! \brief Add a SoftMax layer to the network.
    //!
    //! \see ISoftMaxLayer
    //! \warning Int32 tensors are not valid input tensors.
    //!
    //! \return The new SoftMax layer, or nullptr if it could not be created.
    //!
    virtual ISoftMaxLayer* addSoftMax(ITensor& input) TRTNOEXCEPT = 0;
//!
//! \class ISoftMaxLayer
//!
//! \brief A Softmax layer in a network definition.
//!
//! This layer applies a per-channel softmax to its input.
//!
//! The output size is the same as the input size.
//!
//! \warning Do not inherit from this class, as doing so will break forward-compatibility of the API and ABI.
//!
class ISoftMaxLayer : public ILayer
{
protected:
    virtual ~ISoftMaxLayer() {}
public:
    //!
    //! \brief Set the axis along which softmax is computed. Currently, only one axis can be set.
    //!
    //! The axis is specified by setting the bit corresponding to the axis to 1.
    //! Let's say we have an NCHW tensor as input (three non-batch dimensions).
    //!
    //! In implicit mode :
    //! Bit 0 corresponds to the C dimension boolean.
    //! Bit 1 corresponds to the H dimension boolean.
    //! Bit 2 corresponds to the W dimension boolean.
    //! By default, softmax is performed on the axis which is the number of axes minus three. It is 0 if
    //! there are fewer than 3 non-batch axes. For example, if the input is NCHW, the default axis is C. If the input
    //! is NHW, then the default axis is H.
    //!
    //! In explicit mode :
    //! Bit 0 corresponds to the N dimension boolean.
    //! Bit 1 corresponds to the C dimension boolean.
    //! Bit 2 corresponds to the H dimension boolean.
    //! Bit 3 corresponds to the W dimension boolean.
    //! By default, softmax is performed on the axis which is the number of axes minus three. It is 0 if
    //! there are fewer than 3 axes. For example, if the input is NCHW, the default axis is C. If the input
    //! is NHW, then the default axis is N.
    //!
    //! For example, to perform softmax on axis R of a NPQRCHW input, set bit 2 with implicit batch mode,
    //! set bit 3 with explicit batch mode.
    //!
    //! \param axes The axis along which softmax is computed.
    //!        Here axes is a bitmap. For example, when doing softmax along axis 0, bit 0 is set to 1, axes = 1 << axis = 1.
    //!
    virtual void setAxes(uint32_t axes) TRTNOEXCEPT = 0;

    //!
    //! \brief Get the axis along which softmax occurs.
    //!
    //! \see setAxes()
    //!
    virtual uint32_t getAxes() const TRTNOEXCEPT = 0;
};

There are only two functions for setting the axis, and it still wasn't right. I stayed late that night running all kinds of experiments, always with the same results as before. Then I started suspecting that my input data differed, or that the data after the TensorRT reshape differed:

    // The input comes in as a one-dimensional [12750] tensor.
    // First reshape it to [XX, ch].
    auto re1 = network->addShuffle(input);
    assert(re1);
    re1->setReshapeDimensions(Dims3(1, -1, ch)); // [1, 6375, 2]
//     re1->setReshapeDimensions(Dims2(-1, ch)); // [6375, 2]
    return re1; /////////////////////////////////////////

Returning early right after the reshape and comparing against PyTorch showed the data was identical!
Running one experiment here is a real pain: you first have to serialize and build the engine, then deserialize it and run inference before you get a single result.
I stayed late until past 10 p.m. without solving it, couldn't solve it, and went home gloomy.
One other idea in the meantime was to implement the softmax with a custom CUDA kernel, but that felt like far too much trouble.
I asked the experts in the chat group and nobody answered; then I privately messaged Brother Qiu (球哥) from the group, who also said the code looked correct and offered to run an experiment for me the next day.
In the end I summed it up for him like this:
"When you run the experiment tomorrow, take, say, an input tensor of shape [8]; inside the function reshape it to 4×2 and apply softmax. You should get a 4×2 tensor, 4 rows and 2 columns, with each row summing to 1; then reshape it back to a tensor of shape [8] and return it."
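For reference, spelled out with the TensorRT API that test would look roughly like the sketch below; the names are mine and the setAxes value is the one that eventually turned out to be correct (see further down), so treat it as a sketch rather than the actual experiment code.

// [8] -> [4, 2] -> softmax over the last dimension -> back to [8].
ILayer* softmaxTest8(INetworkDefinition* network, ITensor& input /* flat [8] tensor */) {
    auto re1 = network->addShuffle(input);
    assert(re1);
    re1->setReshapeDimensions(Dims2(4, 2));        // [8] -> [4, 2]

    auto sm = network->addSoftMax(*re1->getOutput(0));
    assert(sm);
    sm->setAxes(1 << 1);                           // bit 1 selects dimension 1, the axis of size 2

    Dims flat;                                     // reshape back to [8]
    flat.nbDims = 1;
    flat.d[0] = 8;
    auto re2 = network->addShuffle(*sm->getOutput(0));
    assert(re2);
    re2->setReshapeDimensions(flat);
    return re2;                                    // every consecutive pair of outputs sums to 1
}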

The next morning I came back at it, checked whether anything else in the pipeline had a problem (it didn't), and ran a few more experiments, with exactly the same symptoms as before. No way forward.
At noon Brother Qiu contacted me and had me confirm that the input matched the data in the PyTorch code. I was so sleepy that I lay down over my phone for a bit, since any message would wake me, and sure enough, around 1 p.m.:

It works!!!
I realized it came down to how the bit operations are written. Brother Qiu said yes, exactly.
He sent me a picture:

Note the English sentence at the top of the picture:
reduceAxes: the least significant bit corresponds to the first explicit dimension

In other words, if the dimension mask is 0010, the bitmask is 0100.
For example, with a [6375, 2] tensor, to softmax over dimension 1 the mask is [0, 1], the corresponding bitmap is [1, 0], so you write 1<<1.
With a [1, 6375, 2] tensor, to softmax over dimension 2 the mask is [0, 0, 1], the corresponding bitmap is [1, 0, 0], so you write 1<<2.
The bitmap is the mask read backwards!
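Written as code, the rule is just a left shift by the dimension index (a small illustration of my own, not a snippet from any header):

// axes is a bitmask over dimensions, not a dimension index:
// bit i selects dimension i of the tensor that feeds the softmax layer.
//   [6375, 2],    softmax over dim 1  ->  axes = 1 << 1  (binary 10)
//   [1, 6375, 2], softmax over dim 2  ->  axes = 1 << 2  (binary 100)
sm->setAxes(1u << 2);   // the [1, 6375, 2] case used in this post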
With this in mind, going back to the official documentation,

  //! In explicit mode :
    //! Bit 0 corresponds to the N dimension boolean.
    //! Bit 1 corresponds to the C dimension boolean.
    //! Bit 2 corresponds to the H dimension boolean.
    //! Bit 3 corresponds to the W dimension boolean.
    //! By default, softmax is performed on the axis which is the number of axes minus three. It is 0 if
    //! there are fewer than 3 axes. For example, if the input is NCHW, the default axis is C. If the input
    //! is NHW, then the default axis is N.
    //!
    //! For example, to perform softmax on axis R of a NPQRCHW input, set bit 2 with implicit batch mode,
    //! set bit 3 with explicit batch mode.
    //!
    //! \param axes The axis along which softmax is computed.
    //!        Here axes is a bitmap. For example, when doing softmax along axis 0, bit 0 is set to 1, axes = 1 << axis = 1.

I still can't make sense of it. What is "axes = 1 << axis = 1" even supposed to mean? Couldn't they just write a proper example?
Thanks, Brother Qiu!!

The correct code is as follows:

ILayer* reshapeSoftmax(INetworkDefinition *network, ITensor& input, int ch) {
    // The input comes in as a one-dimensional [12750] tensor.
    // First reshape it to [XX, ch].
    auto re1 = network->addShuffle(input);
    assert(re1);
    re1->setReshapeDimensions(Dims3(1, -1, ch)); // [1, 6375, 2]
//     re1->setReshapeDimensions(Dims2(-1, ch)); // [6375, 2]

    Dims dim0 = re1->getOutput(0)->getDimensions();
    std::cout << "debug  re1 dim==" << dim0.d[0] << " " << dim0.d[1] << " " << dim0.d[2] << std::endl;

//    return re1; /////////////////////////////////////////

    auto sm = network->addSoftMax(*re1->getOutput(0));
    sm->setAxes(1 << 2);  // axes is a bitmask: bit 2 selects dimension 2, the axis of size ch
    assert(sm);

    // Reshape back to one dimension so the output keeps the shape it came in with.
    Dims dim_;
    dim_.nbDims = 1;
    dim_.d[0] = -1;
    auto re2 = network->addShuffle(*sm->getOutput(0));
    assert(re2);
    re2->setReshapeDimensions(dim_);
    return re2;
}
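For completeness, a minimal sketch of how this gets wired into the network (the identifiers below are placeholders of my own, not the project's real names): armConf is assumed to be the flat [12750] confidence tensor produced earlier in the network.

void attachArmConfSoftmax(INetworkDefinition* network, ITensor& armConf) {
    ILayer* sm = reshapeSoftmax(network, armConf, 2);    // softmax over each pair of scores
    sm->getOutput(0)->setName("arm_conf_softmax");       // placeholder output name
    network->markOutput(*sm->getOutput(0));
}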

Talking it over with Brother Qiu: if you never call setAxes, the default follows the rule quoted above (the number of axes minus three, or 0 if there are fewer than three), so it is C for an NCHW input and H for an NHW input.
Looking back at the code from that repository, the reason the author reshapes is probably that he discovered the softmax only runs along one particular axis, so you have to reshape until the dimension you care about lands on that axis to get the correct result.
He probably didn't know that you can pick the dimension directly with ->setAxes(1 << 2);.
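If that reading is right, his version presumably works because after re1->setReshapeDimensions(Dims3(c / 2, -1, 0)) the scores he wants to normalize sit on the first dimension, which is the default softmax axis for a tensor with three non-batch dimensions. Written out explicitly, the default would amount to this (my own guess at the equivalent, not code from his repo):

sm->setAxes(1 << 0);   // softmax over dimension 0, i.e. the c/2 axis after the reshape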

It feels like Brother Qiu has studied TensorRT in real depth; he also makes use of some of the more advanced APIs.

