I have a series of tensors of shape [B, n, C], where B and C are fixed and n varies between tensors (say, n in [1, 5]). I would like to concatenate all of these tensors into a single one of shape [B, k, C], where k is the sum of all the values of n. In NumPy, this would be:
import numpy as np

batch_size = 2
num_channels = 4

values = []
for i in range(0, 5):
    tensor = np.ones([batch_size, i + 1, num_channels]) * i
    values.append(tensor)

print(np.concatenate(values, axis=1))
With the result:
[[[0. 0. 0. 0.]
[1. 1. 1. 1.]
[1. 1. 1. 1.]
[2. 2. 2. 2.]
[2. 2. 2. 2.]
[2. 2. 2. 2.]
[3. 3. 3. 3.]
[3. 3. 3. 3.]
[3. 3. 3. 3.]
[3. 3. 3. 3.]
[4. 4. 4. 4.]
[4. 4. 4. 4.]
[4. 4. 4. 4.]
[4. 4. 4. 4.]
[4. 4. 4. 4.]]
[[0. 0. 0. 0.]
[1. 1. 1. 1.]
[1. 1. 1. 1.]
[2. 2. 2. 2.]
[2. 2. 2. 2.]
[2. 2. 2. 2.]
[3. 3. 3. 3.]
[3. 3. 3. 3.]
[3. 3. 3. 3.]
[3. 3. 3. 3.]
[4. 4. 4. 4.]
[4. 4. 4. 4.]
[4. 4. 4. 4.]
[4. 4. 4. 4.]
[4. 4. 4. 4.]]]
How can I achieve this in Eigen?
I found a solution using slicing:
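Roughly, the idea is to allocate the output tensor up front and assign each input into the matching slice along axis 1, using Eigen's unsupported Tensor module. A minimal sketch (variable names and the exact loop structure here are illustrative, not the only way to write it):

#include <unsupported/Eigen/CXX11/Tensor>
#include <iostream>
#include <vector>

int main() {
  const Eigen::Index batch_size = 2;
  const Eigen::Index num_channels = 4;

  // Build the inputs, mirroring the NumPy example: tensor i has shape
  // [batch_size, i + 1, num_channels] and is filled with the value i.
  std::vector<Eigen::Tensor<float, 3>> values;
  Eigen::Index k = 0;
  for (int i = 0; i < 5; ++i) {
    Eigen::Tensor<float, 3> t(batch_size, i + 1, num_channels);
    t.setConstant(static_cast<float>(i));
    values.push_back(t);
    k += i + 1;
  }

  // Allocate the concatenated tensor and copy each input into its slice
  // along axis 1, advancing the offset by that input's extent on that axis.
  Eigen::Tensor<float, 3> result(batch_size, k, num_channels);
  Eigen::Index offset = 0;
  for (const auto& t : values) {
    Eigen::array<Eigen::Index, 3> offsets = {0, offset, 0};
    Eigen::array<Eigen::Index, 3> extents = {batch_size, t.dimension(1), num_channels};
    result.slice(offsets, extents) = t;
    offset += t.dimension(1);
  }

  std::cout << result << std::endl;
  return 0;
}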
The resulting values are equivalent to Python's output above.
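One caveat: Eigen::Tensor is column-major by default and its operator<< formatting does not match NumPy's nested-bracket print, so the dump will look different even though the value at each (b, j, c) index is the same. If NumPy's memory order matters for your use case, declare the tensors as Eigen::Tensor<float, 3, Eigen::RowMajor> instead.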