
Partial Derivative Using Autograd

I have a function that takes in a multivariate argument x. Here x = [x1,x2,x3]. Let's say my function looks like: f(x,T) = np.dot(x,T) + np.exp(np.dot(x,T)), where T is a constant. I want to compute the partial derivative of f with respect to x.

Solution 1:

I found the following description of the grad function in the autograd source code:

def grad(fun, x):
    """Returns a function which computes the gradient of `fun` with
    respect to positional argument number `argnum`. The returned
    function takes the same arguments as `fun`, but returns the
    gradient instead. The function `fun` should be scalar-valued. The
    gradient has the same type as the argument."""

So

import autograd.numpy as np
from autograd import grad

def h(x, t):
    return np.dot(x, t) + np.exp(np.dot(x, t))

h_x = grad(h, 0)  # derivative with respect to x
h_t = grad(h, 1)  # derivative with respect to t

Also, make sure to use the numpy library that comes with autograd:

import autograd.numpy as np

instead of

import numpy as np

in order to make use of all numpy functions.
