Dimensions of the torch.nn.LSTM() module explained


lstm = nn.LSTM(input_size, hidden_size, num_layers)

Inputs:
    x        (seq_len, batch, input_size)
    h0       (num_layers × num_directions, batch, hidden_size)
    c0       (num_layers × num_directions, batch, hidden_size)

Outputs:
    output   (seq_len, batch, num_directions × hidden_size)
    hn       (num_layers × num_directions, batch, hidden_size)
    cn       (num_layers × num_directions, batch, hidden_size)

An example:
Running an LSTM over sentences.

Suppose we have 100 sentences (sequences), each containing 7 words, with batch_size=64 and embedding_size=300.

The parameters are then:
input_size = embedding_size = 300
batch = batch_size = 64
seq_len = 7

In addition, set hidden_size=100 and num_layers=1.

import torch
import torch.nn as nn

lstm = nn.LSTM(300, 100, 1)    # input_size=300, hidden_size=100, num_layers=1
x = torch.randn(7, 64, 300)    # (seq_len, batch, input_size)
h0 = torch.randn(1, 64, 100)   # (num_layers * num_directions, batch, hidden_size)
c0 = torch.randn(1, 64, 100)   # (num_layers * num_directions, batch, hidden_size)
output, (hn, cn) = lstm(x, (h0, c0))

>>> output.shape
torch.Size([7, 64, 100])
>>> hn.shape
torch.Size([1, 64, 100])
>>> cn.shape
torch.Size([1, 64, 100])
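The num_directions factor in the shape table only matters for bidirectional LSTMs. As a sketch of the same example with bidirectional=True (so num_directions=2), note how the leading dimension of h0/c0/hn/cn doubles and the last dimension of output doubles, while seq_len and batch are unchanged:

```python
import torch
import torch.nn as nn

# Same sizes as above, but bidirectional: num_directions = 2
lstm = nn.LSTM(input_size=300, hidden_size=100, num_layers=1, bidirectional=True)

x = torch.randn(7, 64, 300)    # (seq_len, batch, input_size) -- unchanged
h0 = torch.randn(2, 64, 100)   # (num_layers * num_directions, batch, hidden_size) = (1*2, 64, 100)
c0 = torch.randn(2, 64, 100)

output, (hn, cn) = lstm(x, (h0, c0))
print(output.shape)  # torch.Size([7, 64, 200]) -- last dim is num_directions * hidden_size
print(hn.shape)      # torch.Size([2, 64, 100])
print(cn.shape)      # torch.Size([2, 64, 100])
```

The forward and backward passes are concatenated along the last dimension of output, which is why it becomes 2 × hidden_size = 200 while the hidden/cell states keep hidden_size=100 per direction.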
---------------------
Author: huxuedan01
Source: CSDN
Original article: https://blog.csdn.net/m0_37586991/article/details/88561746

