I have a Sequential model in PyTorch:
model = nn.Sequential(
    nn.Embedding(alphabet_size, 64),
    nn.LSTM(64, ...),
    nn.Flatten(),
    nn.Linear(...),
    nn.Softmax()
)
I would like to fix the batch size for the Embedding layer:
nn.Embedding(alphabet_size, 64)
as in Keras:
Embedding(alphabet_size, 64, batch_input_shape=(batch_size, time_steps))
How can I do this in PyTorch?
An example that answers your question is already available in this related question:
PyTorch sequential model and batch_size
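For reference, here is a minimal sketch of the same architecture in PyTorch. The key point is that PyTorch layers do not take a batch size at construction time at all: the batch size is simply the first dimension of the tensor you pass in, much like Keras infers it when batch_input_shape is omitted. The concrete sizes (embed_dim, hidden_dim, time_steps, num_classes) and the LSTMLayer wrapper below are my own assumptions for illustration; the wrapper is needed because nn.LSTM returns a tuple, while nn.Sequential only passes a single tensor between layers.

import torch
import torch.nn as nn

# Hypothetical sizes for illustration -- adjust to your data.
alphabet_size = 30
embed_dim = 64
hidden_dim = 128
time_steps = 10
num_classes = 5

class LSTMLayer(nn.Module):
    # Wraps nn.LSTM so that only the output tensor (not the hidden-state
    # tuple) is passed on to the next layer in nn.Sequential.
    def __init__(self, input_size, hidden_size):
        super().__init__()
        self.lstm = nn.LSTM(input_size, hidden_size, batch_first=True)

    def forward(self, x):
        output, _ = self.lstm(x)  # discard (h_n, c_n)
        return output

model = nn.Sequential(
    nn.Embedding(alphabet_size, embed_dim),           # (batch, time) -> (batch, time, embed_dim)
    LSTMLayer(embed_dim, hidden_dim),                  # -> (batch, time, hidden_dim)
    nn.Flatten(),                                      # -> (batch, time * hidden_dim)
    nn.Linear(time_steps * hidden_dim, num_classes),   # -> (batch, num_classes)
    nn.Softmax(dim=1),
)

# The batch size is never declared in the model; it is whatever the
# first dimension of the input happens to be.
x = torch.randint(0, alphabet_size, (32, time_steps))
print(model(x).shape)  # torch.Size([32, 5])

If you really need the model to reject other batch sizes, you would have to add an explicit check in a custom forward method, since no built-in PyTorch layer enforces a fixed batch size.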