@keras_export('keras.layers.TimeDistributed')
class TimeDistributed(Wrapper):
"""This wrapper allows to apply a layer to every temporal slice of an input.
The input should be at least 3D, and the dimension of index one
will be considered to be the temporal dimension.
Consider a batch of 32 samples,
where each sample is a sequence of 10 vectors of 16 dimensions
(that is, 10 timesteps, each a 16-dimensional vector).
The batch input shape of the layer is then `(32, 10, 16)`,
and the `input_shape`, not including the samples dimension, is `(10, 16)`.
You can then use `TimeDistributed` to apply a `Dense` layer
to each of the 10 timesteps, independently:
```python
# as the first layer in a model
model = Sequential()
model.add(TimeDistributed(Dense(8), input_shape=(10, 16)))  # the 8 in Dense(8) is the Dense layer's output size
# now model.output_shape == (None, 10, 8), i.e. each (10, 16) sample is mapped to (10, 8)
```

The output will then have shape `(32, 10, 8)`.
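Worth noting (an aside added here, not part of the original docstring): the wrapper applies the same `Dense` instance to every timestep, so no per-timestep weights are created; the kernel shape depends only on the feature dimension.

```python
# a minimal sketch, assuming the Sequential model built above
td = model.layers[0]            # the TimeDistributed(Dense(8)) wrapper
print(td.layer.kernel.shape)    # (16, 8): one kernel shared across all 10 timesteps
```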
In subsequent layers, there is no need for the `input_shape`:

```python
model.add(TimeDistributed(Dense(32)))
# now model.output_shape == (None, 10, 32), i.e. (10, 8) is mapped to (10, 32)
```
The output will then have shape `(32, 10, 32)`.

`TimeDistributed` can be used with arbitrary layers, not just `Dense`,
for instance with a `Conv2D` layer:
```python
model = Sequential()
model.add(TimeDistributed(Conv2D(64, (3, 3)),
                          input_shape=(10, 299, 299, 3)))  # 10 timesteps, each a 299x299x3 image tensor
```
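As a sanity check on the example above (an added note, assuming `Conv2D`'s default `padding='valid'` and `strides=(1, 1)`), each 299x299 frame shrinks to 297x297:

```python
# a minimal sketch; the shape assumes the default padding='valid' and strides=(1, 1)
print(model.output_shape)  # (None, 10, 297, 297, 64)
```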
Arguments:
layer: a layer instance.
Call arguments:
inputs: Input tensor.
training: Python boolean indicating whether the layer should behave in
training mode or in inference mode. This argument is passed to the
wrapped layer (only if the layer supports this argument).
mask: Binary tensor of shape `(samples, timesteps)` indicating whether
  a given timestep should be masked. This argument is passed to the
  wrapped layer (only if the layer supports this argument); see the
  sketch after this argument list.
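For example (a sketch added here, not from the original docstring; layer sizes are arbitrary), a mask produced by `Embedding(..., mask_zero=True)` propagates through the wrapper, and a `training` flag passed to the wrapper is forwarded to a wrapped `Dropout`, whose `call` accepts it:

```python
import numpy as np
from tensorflow.keras import Input, Model
from tensorflow.keras.layers import Dense, Dropout, Embedding, TimeDistributed

# a minimal sketch; vocabulary size, embedding width and Dense width are arbitrary
inputs = Input(shape=(10,), dtype='int32')
# mask_zero=True makes Embedding emit a (samples, timesteps) mask for padded steps
x = Embedding(input_dim=100, output_dim=16, mask_zero=True)(inputs)
# Dropout's call() accepts `training`, so the wrapper forwards the flag to it;
# Dense's call() does not accept `mask`, so the mask simply propagates onward
x = TimeDistributed(Dropout(0.5))(x, training=True)
outputs = TimeDistributed(Dense(8))(x)
model = Model(inputs, outputs)
print(model(np.array([[1, 2, 3, 0, 0, 0, 0, 0, 0, 0]], dtype='int32')).shape)  # (1, 10, 8)
```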
Raises:
ValueError: If not initialized with a `Layer` instance.
"""