TimeDistributedDense64
TimeDistributedDense64 is a neural network layer designed for processing sequence data. It applies a dense transformation with 64 output units to each timestep of the input independently, preserving the time structure of the sequence. For an input tensor with shape (batch_size, timesteps, input_dim), the output has shape (batch_size, timesteps, 64). Conceptually, it is equivalent to wrapping a Dense(64) layer with a TimeDistributed wrapper, so the same weights are used across all timesteps.
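The equivalence can be illustrated with a minimal sketch, assuming the TensorFlow 2.x Keras API. The name TimeDistributedDense64 denotes the concept; in Keras it is expressed as TimeDistributed(Dense(64)). The batch size, timestep count, and input dimension below are illustrative:

```python
# Minimal sketch of the shape behaviour described above (TensorFlow 2.x assumed).
import numpy as np
from tensorflow.keras.layers import Dense, TimeDistributed

batch_size, timesteps, input_dim = 8, 10, 32

# The same Dense(64) weights are reused at every timestep.
layer = TimeDistributed(Dense(64))

x = np.random.rand(batch_size, timesteps, input_dim).astype("float32")
y = layer(x)
print(y.shape)  # (8, 10, 64): time structure preserved, features projected to 64
```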
The layer uses a weight matrix of shape (input_dim, 64) and a bias vector of length 64.
By default, the layer's activation is linear (the identity), but an activation function (such as ReLU, tanh, or sigmoid) can be applied to each timestep's output to introduce nonlinearity.
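As a hedged illustration of the underlying arithmetic, the following NumPy sketch applies the same (input_dim, 64) weight matrix W and length-64 bias b at every timestep. The ReLU is illustrative (the default activation is linear), and the shapes and random seed are assumptions:

```python
# Per-timestep computation: y[:, t, :] = activation(x[:, t, :] @ W + b),
# with W and b shared across all timesteps t.
import numpy as np

batch_size, timesteps, input_dim, units = 4, 5, 3, 64
rng = np.random.default_rng(0)

x = rng.standard_normal((batch_size, timesteps, input_dim))
W = rng.standard_normal((input_dim, units))  # weight matrix of shape (input_dim, 64)
b = np.zeros(units)                          # bias vector of length 64

# Batched matmul applies the same W at every timestep; ReLU shown as an example.
y = np.maximum(x @ W + b, 0.0)
print(y.shape)  # (4, 5, 64)
```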
TimeDistributedDense64 is commonly used to transform per-timestep features before feeding them into recurrent layers or into sequence-to-sequence models, as in the sketch below.
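For example, the per-timestep projection can precede an LSTM. This sketch uses the Keras functional API; the layer sizes and the sigmoid task head are chosen purely for illustration:

```python
# Illustrative pipeline (TensorFlow 2.x assumed): project each timestep to 64
# features, then feed the sequence into a recurrent layer.
import tensorflow as tf
from tensorflow.keras.layers import Dense, LSTM, TimeDistributed

inputs = tf.keras.Input(shape=(None, 32))                   # (timesteps, input_dim), variable length
h = TimeDistributed(Dense(64, activation="relu"))(inputs)   # the TimeDistributedDense64 step
h = LSTM(128)(h)                                            # consumes the 64-dim per-timestep features
outputs = Dense(1, activation="sigmoid")(h)                 # hypothetical task head

model = tf.keras.Model(inputs, outputs)
model.summary()
```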
It is a specific instance of the TimeDistributed layer concept and relates to Dense layers and recurrent layers.