How to fix some dimensions of a kernel lengthscale in GPflow?


I have a 2D kernel:

import gpflow

# X, Y: training data, with X having two input columns
k = gpflow.kernels.RBF(lengthscales=[24*5, 1e-5])
m = gpflow.models.GPR(data=(X, Y), kernel=k, mean_function=None)

and I want to fix the lengthscale in the 2nd dimension, and just optimise the other.

I can disable optimisation of all the lengthscales with

gpflow.set_trainable(m.kernel.lengthscales, False) 

but I can't pass just one dimension to this method.
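
As far as I can tell, both lengthscales live in a single Parameter, so trainability is all-or-nothing. A quick check (assuming m is the model above):

print(type(m.kernel.lengthscales))   # a single gpflow.Parameter of shape (2,)
gpflow.utilities.print_summary(m)    # shows one "trainable" flag for the whole lengthscales vector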

In GPy we would call m.kern.lengthscale[1:].fixed() or something.

Maybe I could use a transform to roughly achieve this (e.g. here), but that's quite complicated.

1 Answer

Answered by STJ (accepted):

GPflow uses a single tf.Variable for each parameter (such as a kernel's lengthscales), and TensorFlow only lets you change the trainable status of a Variable as a whole. Implementing a separate parameter per dimension for arbitrary dimensions would not be easy, but you can easily subclass the kernel you want and override lengthscales with a property, as follows:

import gpflow
import tensorflow as tf

class MyKernel(gpflow.kernels.SquaredExponential):  # or whichever kernel you want
    @property
    def lengthscales(self) -> tf.Tensor:
        # Stitch the trainable and the fixed lengthscale back into the
        # length-2 vector that the kernel's methods expect.
        return tf.stack([self.lengthscale_0, self.lengthscale_1])

    @lengthscales.setter
    def lengthscales(self, value):
        # Called from the superclass __init__: split the incoming vector into
        # a trainable Parameter (dimension 0) and a fixed value (dimension 1).
        self.lengthscale_0 = gpflow.Parameter(value[0], transform=gpflow.utilities.positive())
        self.lengthscale_1 = value[1]  # fixed: a plain value, never optimised

Then you can simply use k = MyKernel(lengthscales=[24*5, 1e-5]). (Though a lengthscale of 1e-5 doesn't look right! But that's outside the scope of this question.)
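
As a minimal end-to-end sketch (the random X and Y below are placeholder data purely for illustration; use your own training set), only the first lengthscale moves during optimisation:

import numpy as np
import gpflow

X = np.random.rand(50, 2)                             # placeholder 2-column inputs
Y = np.sin(X[:, :1]) + 0.1 * np.random.randn(50, 1)   # placeholder targets

k = MyKernel(lengthscales=[24 * 5, 1e-5])
m = gpflow.models.GPR(data=(X, Y), kernel=k, mean_function=None)

# Only lengthscale_0 (plus the kernel and likelihood variances) is trainable,
# so the second lengthscale stays at 1e-5 throughout:
opt = gpflow.optimizers.Scipy()
opt.minimize(m.training_loss, m.trainable_variables)
print(k.lengthscales)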

This works because the superclass's __init__ (in gpflow.kernels.Stationary) assigns self.lengthscales = Parameter(lengthscales, transform=positive()); in the custom class that assignment instead calls the property setter, which creates the two separate attributes. The property getter then stitches them back together for the methods that actually expect the two-dimensional vector.
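
You can verify the split by listing the kernel's trainable variables (the exact variable names will depend on your GPflow/TensorFlow versions):

k = MyKernel(lengthscales=[24 * 5, 1e-5])
for v in k.trainable_variables:
    print(v.name, v.shape)
# Expect two scalars: the kernel variance and lengthscale_0.
# lengthscale_1 is a plain tensor rather than a Variable, so it never
# appears here and the optimiser never touches it.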