Partial derivatives using JAX?


I'm confused by the JAX documentation. Here's what I'm trying to do:

from jax import grad

def line(m,x,b):
  return m*x + b

grad(line)(1,2,3)

And the error:

---------------------------------------------------------------------------
FilteredStackTrace                        Traceback (most recent call last)
<ipython-input-48-d14b17620b30> in <module>()
      3 
----> 4 grad(line)(1,2,3)

FilteredStackTrace: TypeError: grad requires real- or complex-valued inputs (input dtype that is a sub-dtype of np.floating or np.complexfloating), but got int32. If you want to use integer-valued inputs, use vjp or set allow_int to True.

The stack trace above excludes JAX-internal frames.
The following is the original exception that occurred, unmodified.

--------------------

The above exception was the direct cause of the following exception:

TypeError                                 Traceback (most recent call last)
6 frames
/usr/local/lib/python3.7/dist-packages/jax/api.py in _check_input_dtype_revderiv(name, holomorphic, allow_int, x)
    844   elif not allow_int and not (dtypes.issubdtype(aval.dtype, np.floating) or
    845                               dtypes.issubdtype(aval.dtype, np.complexfloating)):
--> 846     raise TypeError(f"{name} requires real- or complex-valued inputs (input dtype that "
    847                     "is a sub-dtype of np.floating or np.complexfloating), "
    848                     f"but got {aval.dtype.name}. If you want to use integer-valued "

TypeError: grad requires real- or complex-valued inputs (input dtype that is a sub-dtype of np.floating or np.complexfloating), but got int32. If you want to use integer-valued inputs, use vjp or set allow_int to True.

I'm referencing the official tutorial code:

import jax.numpy as jnp
from jax import grad, jit, vmap
from jax import random

key = random.PRNGKey(0)

def sigmoid(x):
    return 0.5 * (jnp.tanh(x / 2) + 1)

# Outputs probability of a label being true.
def predict(W, b, inputs):
    return sigmoid(jnp.dot(inputs, W) + b)

# Build a toy dataset.
inputs = jnp.array([[0.52, 1.12, 0.77],
                    [0.88, -1.08, 0.15],
                    [0.52, 0.06, -1.30],
                    [0.74, -2.49, 1.39]])
targets = jnp.array([True, True, False, True])

# Training loss is the negative log-likelihood of the training examples.
def loss(W, b):
    preds = predict(W, b, inputs)
    label_probs = preds * targets + (1 - preds) * (1 - targets)
    return -jnp.sum(jnp.log(label_probs))

# Initialize random model coefficients
key, W_key, b_key = random.split(key, 3)
W = random.normal(W_key, (3,))
b = random.normal(b_key, ())

W_grad = grad(loss, argnums=0)(W, b)
print('W_grad', W_grad)

And the result:

W_grad [-0.16965576 -0.8774648  -1.4901345 ]

What am I doing wrong here? I gather key is being used in some important way, but I can't figure out why or how it's necessary. To answer this question, please adjust the code in the first block as necessary to remove the error.


There are 2 answers

tomy0608 (BEST ANSWER)

I think the error message here is clear:

TypeError: grad requires real- or complex-valued inputs (input dtype that is a sub-dtype of np.floating or np.complexfloating), but got int32. If you want to use integer-valued inputs, use vjp or set allow_int to True.

To use grad(line)(1,2,3) with int32 inputs, change it to grad(line, allow_int=True)(1,2,3).
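
A minimal, self-contained sketch of that option (the import, comments, and print are my additions): per the JAX docs, with allow_int=True the call no longer raises, but the gradient with respect to an integer input comes back with the trivial float0 dtype, which carries no derivative information, so this is rarely what you want when you actually need a slope.

from jax import grad

def line(m, x, b):
    return m * x + b

# allow_int=True disables the dtype check; the "gradient" with respect
# to the integer argument m has JAX's trivial float0 dtype rather than
# a usable derivative value.
g = grad(line, allow_int=True)(1, 2, 3)
print(g.dtype)  # JAX's trivial float0 dtype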

Seon

JAX is telling you it doesn't like integer inputs. grad(line)(1., 2., 3.) (using floats) fixes the problem.
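
To get the partial derivatives the question's title asks about, you can combine this float fix with argnums, the same keyword the tutorial code above uses; a minimal sketch (the argnums=(0, 1, 2) tuple and the expected values in the comment are my additions):

from jax import grad

def line(m, x, b):
    return m * x + b

# Partial derivatives with respect to m, x, and b in one call.
dm, dx, db = grad(line, argnums=(0, 1, 2))(1., 2., 3.)
print(dm, dx, db)  # d/dm = x = 2.0, d/dx = m = 1.0, d/db = 1.0

As for the key in the tutorial code: random.PRNGKey just seeds JAX's functional random number generator so W and b can be initialized; it has nothing to do with grad or with the error above.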