The InfoGraphS example in DGL


I am trying to understand the evaluate function in the InfoGraphS example that's part of the DGL examples folder. Here is the function with some of my additions:

    def evaluate(model, loader, num, device):
        # Accumulators for absolute error, relative error, and debugging sums.
        error = 0
        perc_error = 0
        model_values = 0
        target_values = 0
        for graphs, targets in loader:
            graphs = graphs.to(device)
            nfeat, efeat = graphs.ndata['pos'], graphs.edata['edge_attr']
            targets = targets.to(device)

            # Run the model once per batch instead of recomputing it for each print.
            preds = model(graphs, nfeat, efeat)
            error += (preds - targets).abs().sum().item()
            print(':: model', preds)
            print(':: targets', targets)
            print('difference  :', preds - targets)
            # Relative error; the original commented-out line was missing the division.
            perc_error += ((preds - targets) / targets).abs().sum().item()
            print('perc error  :', perc_error)
            model_values += preds.sum().item()
            target_values += targets.sum().item()

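The percentage-error accumulation is meant to divide the absolute difference by the targets. A minimal sketch with made-up tensors (`preds` and `targets` here are illustrative values, not the real model output):

```python
import torch

# Hypothetical predictions and targets, just to show the arithmetic.
preds = torch.tensor([-0.72, -0.57, -0.38])
targets = torch.tensor([0.80, 0.60, 0.40])

# Summed absolute error, as accumulated into `error` in evaluate().
mae = (preds - targets).abs().sum().item()

# Summed relative error: same difference, but divided element-wise by
# the targets before taking the absolute value.
rel = ((preds - targets) / targets).abs().sum().item()

print(mae)  # 3.47
print(rel)  # 5.8
```

Note that with targets on the order of 1e-9, as in the output below, dividing by them makes the relative error astronomically large, which may or may not be what you want.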
And here is an example of the output for one epoch:

Epoch: 0, Sup_Loss: 22.369363, Unsup_loss: 2634.0174, Consis_loss: 3737.3324
:: model tensor([-0.7245, -0.5676, -0.3841, -0.7138, -0.5115, -0.5954, -0.6301, -0.3808,
        -0.4842, -0.6984, -0.7026, -0.6204, -0.4985, -0.3325, -0.5307, -0.5177,
        -0.3542, -0.4297, -0.3765, -0.4596], grad_fn=<ViewBackward>)
:: targets  tensor([3.0166e-09, 4.1832e-12, 1.8106e-12, 9.1151e-10, 3.8396e-09, 3.8556e-10,
        4.1832e-12, 3.3620e-09, 5.0744e-09, 4.3329e-10, 5.1246e-09, 8.6245e-10,
        8.8167e-09, 1.8087e-12, 2.1886e-10, 1.8087e-12, 1.4394e-09, 2.3161e-09,
        2.2429e-09, 1.8106e-12])
difference  : tensor([-0.7245, -0.5676, -0.3841, -0.7138, -0.5115, -0.5954, -0.6301, -0.3808,
        -0.4842, -0.6984, -0.7026, -0.6204, -0.4985, -0.3325, -0.5307, -0.5177,
        -0.3542, -0.4297, -0.3765, -0.4596], grad_fn=<SubBackward0>)

So the question is: how come the difference (model - target) works out to be so different from the targets (the targets are all roughly ~1e-10, give or take)? Is there some gradient magic happening somewhere?
