@keras_export('keras.layers.TimeDistributed')
class TimeDistributed(Wrapper):
"""This wrapper allows to apply a layer to every temporal slice of an input.
這個包裝類可以將一個layer應用到input的每個時序切片。
The input should be at least 3D, and the dimension of index one
will be considered to be the temporal dimension.
Consider a batch of 32 samples,
where each sample is a sequence of 10 vectors of 16 dimensions;
that is, each sample has 10 timesteps, and each timestep is a 16-dimensional vector.
The batch input shape of the layer is then `(32, 10, 16)`,
and the `input_shape`, not including the samples dimension, is `(10, 16)`.
You can then use `TimeDistributed` to apply a `Dense` layer
to each of the 10 timesteps, independently:
```python
# as the first layer in a model
model = Sequential()
# the 8 in Dense(8) is the Dense layer's output size
model.add(TimeDistributed(Dense(8), input_shape=(10, 16)))
# now model.output_shape == (None, 10, 8), i.e. (10, 16) -> (10, 8)
```

The output will then have shape `(32, 10, 8)`.

In subsequent layers, there is no need for the `input_shape`:

```python
model.add(TimeDistributed(Dense(32)))
# now model.output_shape == (None, 10, 32), i.e. (10, 8) -> (10, 32)
```

The output will then have shape `(32, 10, 32)`.

`TimeDistributed` can be used with arbitrary layers, not just `Dense`,
for instance with a `Conv2D` layer:

```python
model = Sequential()
model.add(TimeDistributed(Conv2D(64, (3, 3)),
                          input_shape=(10, 299, 299, 3)))
# 10 timesteps, each a 299x299x3 image tensor
```
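To make the "apply the same layer to every timestep" idea concrete, here is a minimal plain-Python sketch (hypothetical helper names, no Keras involved) of the shape transformation `TimeDistributed` performs:

```python
# Conceptual sketch of TimeDistributed: apply the same layer function
# independently to every timestep slice of a batch (assumption: batches
# are nested lists of shape (samples, timesteps, features)).

def time_distributed(layer_fn, batch):
    # Apply layer_fn to each timestep of each sample, preserving
    # the (samples, timesteps) leading structure.
    return [[layer_fn(step) for step in sample] for sample in batch]

def toy_dense_8(vec):
    # Stand-in for Dense(8): maps any input vector to an 8-dim vector
    # (fixed output for illustration; a real Dense layer uses weights).
    return [float(i) for i in range(8)]

batch = [[[0.0] * 16 for _ in range(10)] for _ in range(32)]  # (32, 10, 16)
out = time_distributed(toy_dense_8, batch)
# out now has shape (32, 10, 8): the same toy layer was applied
# to each of the 10 timesteps of every sample
```

This mirrors the `Dense(8)` example above: the wrapped layer only ever sees a single timestep's features, and the temporal dimension is carried through unchanged.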
Arguments:
layer: a layer instance.
Call arguments:
inputs: Input tensor.
training: Python boolean indicating whether the layer should behave in
training mode or in inference mode. This argument is passed to the
wrapped layer (only if the layer supports this argument).
mask: Binary tensor of shape `(samples, timesteps)` indicating whether
a given timestep should be masked. This argument is passed to the
wrapped layer (only if the layer supports this argument).
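To make the shape of the `mask` argument concrete, here is a toy plain-Python illustration (a hypothetical helper, not Keras's actual mask-propagation machinery) of how a `(samples, timesteps)` boolean mask lines up with a batch and marks which timesteps are valid:

```python
def masked_timesteps(batch, mask):
    # batch: (samples, timesteps, features) as nested lists
    # mask:  (samples, timesteps) booleans, one flag per timestep
    # Returns, per sample, only the timesteps whose mask entry is True.
    return [[step for step, keep in zip(sample, sample_mask) if keep]
            for sample, sample_mask in zip(batch, mask)]

batch = [[[1.0, 2.0], [3.0, 4.0], [5.0, 6.0]]]  # 1 sample, 3 timesteps, 2 features
mask = [[True, False, True]]                    # the middle timestep is masked out
kept = masked_timesteps(batch, mask)
# kept == [[[1.0, 2.0], [5.0, 6.0]]]
```

In Keras itself the mask is forwarded to the wrapped layer rather than used to drop entries, but the alignment shown here is the same: one boolean per (sample, timestep) pair.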
Raises:
ValueError: If not initialized with a `Layer` instance.
"""