I have a NN that is a function f : (t, v) -> (x, z). Can I calculate the autograd partial derivative df/dt? I want to use that derivative in a regularization term in my loss function.
yhat = net((t,v))
# calculate the current value of df/dt here
penalized_loss(yhat, y) = loss(yhat, y) + penalty(df/dt)
I want to do something like
df/dt = gradient(net,t)
but I don't know how to tell the gradient function which input (t) to differentiate with respect to.
Based on the documentation, you can use gradient this way:

gradient(ps) do ... end

This is the Julian idiom for passing an anonymous function (the body of the do block) as the first argument to gradient, which then differentiates that body with respect to the parameters ps. gradient comes from Zygote.jl; you can read more about it here.
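A minimal sketch of how this could look with Zygote, using a toy closed-form stand-in for net (the names df_dt, λ, and penalized_loss are illustrative, not from the original post):

```julia
using Zygote

# Toy stand-in for the network: f : (t, v) -> (x, z).
net(t, v) = (t^2 + v, t * v)

# Partial derivative of the first output x w.r.t. t, with v held fixed.
# gradient(f, args...) returns a tuple with one gradient per argument,
# so we index with [1] to get the derivative w.r.t. t.
df_dt(t, v) = gradient(t_ -> net(t_, v)[1], t)[1]

# Regularized loss: penalize the squared derivative (λ is illustrative).
λ = 0.1
loss(yhat, y) = sum(abs2, yhat .- y)

function penalized_loss(t, v, y)
    yhat = collect(net(t, v))
    loss(yhat, y) + λ * df_dt(t, v)^2
end
```

Note that training with this loss means differentiating through df_dt itself, i.e. second-order differentiation, which Zygote supports only partially; a common workaround is ForwardDiff for the inner derivative and Zygote for the outer one.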