The loss.grad is not None, but the parameters' .grad in the model are None when training


I'm training an adversarial net against a DQN. The loss uses the Q values of the original obs and the attacked obs. When training the net, I find that the grad_value of every parameter is None. How can I resolve it?

for epoch in range(n_epoch):
    for i, (obs, q_value) in enumerate(train_loader):
        obs = obs.float().squeeze(1).to(device)
        self.data.obs = obs.cpu().detach().numpy()
        q_value = self.policy(self.data).logits.detach().to(device)
        noise = (torch.randn(obs.size()) * std + mean).to(device)
        noisy_obs = self.obs_attacks(Coder, obs, noise)
        self.data.obs = noisy_obs.cpu().detach().numpy()
        q_hx = self.policy(self.data).logits
        loss = -model_loss(q_value, q_hx)
        optimizer.zero_grad()
        loss.retain_grad()
        loss.backward()
        optimizer.step()
        print(loss.grad)
        for name, parms in Coder.named_parameters():
            print('-->name:', name, '-->grad_requirs:', parms.requires_grad, ' -->grad_value:', parms.grad)
        losses.append(loss.item())

        if (i + 1) % 10 == 0:
            print(f'Epoch [{epoch+1}/{n_epoch}], Loss: {loss.item():.6f}')

And the output is:

-->name: generator.0.weight -->grad_requirs: True  -->grad_value: None
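To check my understanding of autograd (this is a standalone sketch, not my actual model): converting a tensor with `.detach().numpy()` and feeding the result back into a module cuts the autograd graph, so parameters upstream of the detach point never receive gradients and their `.grad` stays None.

```python
import torch

lin = torch.nn.Linear(2, 2)
x = torch.randn(1, 2)

# Normal path: gradient flows back to the layer's parameters.
y = lin(x)
y.sum().backward()
print(lin.weight.grad is None)  # False

lin.weight.grad = None  # reset for the second experiment

# Round-tripping through .detach().numpy() breaks the graph:
z = torch.from_numpy(lin(x).detach().numpy())
print(z.requires_grad)          # False -- nothing to backprop through
print(lin.weight.grad is None)  # True -- the layer never gets a gradient
```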

How can I resolve it?
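As an aside, I suspect `loss.grad` is only non-None here because of `loss.retain_grad()`: retaining the grad of a non-leaf tensor stores d(loss)/d(loss) = 1, so it says nothing about whether gradients reach the parameters. A minimal check (standalone, not my actual training loop):

```python
import torch

w = torch.nn.Parameter(torch.ones(3))
loss = (w * 2).sum()

# loss is a non-leaf tensor; by default its .grad is not stored.
loss.retain_grad()
loss.backward()

print(loss.grad)  # tensor(1.) -- d(loss)/d(loss), always 1
print(w.grad)     # tensor([2., 2., 2.]) -- the gradient that matters
```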


There are 0 answers