In PyTorch, I want to save the output of every epoch for later calculation, but this leads to an OUT OF MEMORY error after several epochs. The code is like below:

    L=[]
    optimizer.zero_grad()
    for i, (input, target) in enumerate(train_loader):
        output = model(input)
        L.append(output)
    # update my model to minimize a loss function; list L will be used here

I know the reason is that PyTorch keeps the full computation graph for every stored output. But the loss function can only be calculated after obtaining all of the prediction results.
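To make the memory growth concrete, here is a minimal sketch (the tiny model and data are hypothetical, for illustration only) showing that each stored output keeps its autograd graph alive until `backward()` is called, after which the list must be cleared between epochs:

```python
import torch
import torch.nn as nn

# Hypothetical tiny model and data, just to illustrate the pattern.
model = nn.Linear(10, 1)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
batches = [torch.randn(4, 10) for _ in range(3)]

L = []
optimizer.zero_grad()
for x in batches:
    output = model(x)
    L.append(output)  # each stored output still references its graph

# Computing the loss over all stored outputs and calling backward()
# once releases the graphs; clearing L afterwards prevents the list
# (and the graphs it pins) from growing across epochs.
loss = torch.cat(L).mean()
loss.backward()
optimizer.step()
L.clear()
```

If `L` is not cleared (or the graphs are otherwise released) at the end of each epoch, every epoch adds another full set of graphs, which is what eventually exhausts memory.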

Is there a way I can train my model without running out of memory?


1 Answer

ezekiel On

Are you training on a GPU?

If so, you could move the output to main memory, like:

    L.append(output.detach().cpu())
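As a minimal sketch of that pattern (with a hypothetical model and data), the loop below stores detached CPU copies each epoch. Note that `detach()` severs the autograd graph, so the stored tensors are fine for later analysis or metrics, but they can no longer be used to backpropagate a loss:

```python
import torch
import torch.nn as nn

model = nn.Linear(10, 1)  # hypothetical model, for illustration
batches = [torch.randn(4, 10) for _ in range(3)]

L = []
for x in batches:
    output = model(x)
    # detach() drops the graph; cpu() moves the copy off the GPU
    # (a no-op if the tensor is already on the CPU)
    L.append(output.detach().cpu())

# The stored tensors carry no graph, so they only cost their own storage.
assert all(not t.requires_grad for t in L)
```

Because the graphs are dropped, memory no longer accumulates across epochs; if the stored outputs must contribute to the training loss, they have to stay attached instead, and the loss/backward pass has to happen before the graphs are discarded.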