Is it possible to delete an element from a pytorch tensor referentially?


I'm trying to delete the element at a given index from a PyTorch tensor, but every method I've tried so far creates an extra copy.

This makes sense under the hood; the only way I can see the memory address remaining the same is if the tensor were implemented like a linked list.

That being said, is it possible to delete an element from a PyTorch tensor referentially? That is, can an element be deleted such that the resulting tensor and the original tensor share the same memory address?

Methods I've already tried (all of which create a copy / a new memory address), in pseudocode — see the check sketched after the list:

  1. new_data = torch.cat((data[:i], data[i+1:]))
  2. new_data = data[torch.LongTensor(indices_to_keep)], where indices_to_keep = [index for index in range(data.shape[0]) if index != i]
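
For reference, here is a minimal sketch of how to confirm the copy, assuming a 1-D tensor named data and an index i as in the pseudocode above (the example values are arbitrary). Tensor.data_ptr() reports the address of the tensor's underlying buffer, and concatenation writes into freshly allocated storage:

import torch

data = torch.arange(5)
i = 2

# approach 1: torch.cat allocates a new output buffer
new_data = torch.cat((data[:i], data[i+1:]))

data.data_ptr() == new_data.data_ptr()
> False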

1 Answer

Ivan answered:

As far as I understand, you don't need a special function to achieve this behavior. If you assign an existing tensor to a new variable, both variable names in that scope point to the same tensor.

import torch

x = torch.arange(5)
x
> tensor([0, 1, 2, 3, 4])

y = x  # assign x to y; no copy is made

# changing y[0] also changes x[0], since both names refer to the same tensor
y[0] = -1

x
> tensor([-1,  1,  2,  3,  4])
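
As a quick sanity check (a small sketch continuing from the snippet above), both names refer to the very same object and the same underlying storage:

y is x
> True

x.data_ptr() == y.data_ptr()
> True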