How to specify the batch_size in a PyTorch Sequential model?


I have a Sequential model in PyTorch:

model = nn.Sequential(
    nn.Embedding(alphabet_size, 64),
    nn.LSTM(64, ...),
    nn.Flatten(),
    nn.Linear(...),
    nn.Softmax()
)

I would like to fix the batch size for the Embedding layer:

nn.Embedding(alphabet_size, 64)

as in Keras:

Embedding(alphabet_size, 64, batch_input_shape=(batch_size, time_steps))

How can I do this in PyTorch?

1 Answer

Answered by ML85:

The example linked below already answers your question:

PyTorch sequential model and batch_size
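
In short: PyTorch layers take no batch size argument the way Keras's batch_input_shape does. The batch dimension is implicit; it comes from the shape of the tensor you feed in (typically set via a DataLoader's batch_size). Below is a minimal sketch under that assumption, using placeholder values for alphabet_size, time_steps, and batch_size, plus a small hypothetical wrapper module (here named ExtractLSTMOutput) that is needed because nn.LSTM returns a tuple and therefore cannot sit directly inside nn.Sequential:

import torch
import torch.nn as nn

alphabet_size = 30   # placeholder vocabulary size (assumption)
time_steps = 10      # placeholder sequence length (assumption)
batch_size = 32      # desired batch size (assumption)
hidden_size = 64

class ExtractLSTMOutput(nn.Module):
    # nn.LSTM returns (output, (h_n, c_n)); keep only the output tensor
    # so the LSTM can be chained inside nn.Sequential.
    def forward(self, x):
        output, _ = x
        return output

model = nn.Sequential(
    nn.Embedding(alphabet_size, 64),
    nn.LSTM(64, hidden_size, batch_first=True),
    ExtractLSTMOutput(),
    nn.Flatten(),
    nn.Linear(time_steps * hidden_size, alphabet_size),
    nn.Softmax(dim=-1),
)

# The batch size comes from the input tensor, not from any layer:
tokens = torch.randint(0, alphabet_size, (batch_size, time_steps))
probs = model(tokens)
print(probs.shape)  # torch.Size([32, 30])

If you need every batch to contain exactly batch_size samples, a torch.utils.data.DataLoader constructed with batch_size=batch_size and drop_last=True enforces that by discarding the final, smaller batch.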