Partial derivative of output with respect to the inputs in PyTorch


I wish to calculate the mixed partial derivative of exp_reducer with respect to both X[:,0] and X[:,1] (X is 2-dimensional for now). What I need is the result computed by the derivative function in the code below. But when I use the grad function, I only get the first-order partial derivative with respect to each dimension of X. Does anyone know how to do this using grad? In my actual problem the function will not be this simple or low-dimensional, so I would like a generic solution.

In other words, for y = x0^3 * x1^3, I want the mixed second-order partial derivative d2y/(dx0 dx1) = 9 * x0^2 * x1^2.

import numpy as np
import torch
from torch.autograd import grad

X = torch.from_numpy(np.array([[2, 3], [4, 5]], dtype=np.float32))
X.requires_grad = True

def derivative(X):
    # the result I want: 9 * x0^2 * x1^2 for each row
    res = 9 * (X[:, 0] ** 2) * (X[:, 1] ** 2)
    return res

def exp_reducer(x):
    # y_i = x_i0^3 * x_i1^3
    return x.pow(3).prod(axis=1)

y = exp_reducer(X)

print(derivative(X))  # This is what I need
print(grad(y, X, grad_outputs=torch.ones_like(y)))  # only gives first-order partials
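For what it's worth, a single grad call only produces first-order partials; one way to reach the mixed second-order term is to differentiate twice, passing create_graph=True on the first call so the gradient itself remains differentiable. A minimal sketch of that idea (my own attempt, not a confirmed answer):

```python
import torch
from torch.autograd import grad

X = torch.tensor([[2., 3.], [4., 5.]], requires_grad=True)

def exp_reducer(x):
    return x.pow(3).prod(dim=1)  # y_i = x_i0^3 * x_i1^3

y = exp_reducer(X)

# First-order partials; create_graph=True keeps the graph
# so the gradient can be differentiated a second time.
(g,) = grad(y, X, grad_outputs=torch.ones_like(y), create_graph=True)
# g[:, 0] = dy/dx0 = 3 * x0^2 * x1^3

# Differentiate dy/dx0 with respect to X once more.
(h,) = grad(g[:, 0], X, grad_outputs=torch.ones_like(g[:, 0]))
mixed = h[:, 1]  # d2y/(dx0 dx1) = 9 * x0^2 * x1^2
print(mixed)     # matches derivative(X): tensor([ 324., 3600.])
```

This still requires picking out the right column by hand, so it is not yet the generic solution I am after.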

There are 0 answers