GPyTorch, how to set initial value for "lengthscale" hyperparameter?


I am using GPyTorch regressor according to the documentation.

I would like to set an initial value for the "lengthscale" hyperparameter in RBF kernel.

I want to set a constant number as initial value for "lengthscale" (similar to what we can do in scikit-learn Gaussian Process Regressor).

If you have any idea, please let me know.

Accepted answer (by aiish):

There are two cases that follow from your question:

  1. You want to initialize the lengthscale to some value, which the optimizer then continues to tune

    Assuming you have the same model as given in the documentation you have linked, just add the following before your training loop:

    init_lengthscale = 0.1
    # assigning to .lengthscale stores the value internally as raw_lengthscale
    model.covar_module.base_kernel.lengthscale = init_lengthscale
    

    Here model.covar_module is the full (scaled) kernel, and its base_kernel attribute is the underlying RBF kernel whose lengthscale is being set.
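Under the hood, GPyTorch kernels store an unconstrained raw_lengthscale and map it through a Positive constraint (softplus by default); assigning to .lengthscale applies the inverse transform. A minimal plain-Python sketch of that round trip (the function names here are illustrative, not GPyTorch API):

```python
import math

# softplus: the default Positive-constraint transform in GPyTorch,
# plus its inverse, which is what a direct .lengthscale assignment uses
def softplus(x):
    return math.log1p(math.exp(x))

def inv_softplus(y):
    return math.log(math.expm1(y))

init_lengthscale = 0.1
raw = inv_softplus(init_lengthscale)   # what gets stored as raw_lengthscale
recovered = softplus(raw)              # what .lengthscale reports back
```

This is why the optimizer (in case 2 below) sees a parameter named raw_lengthscale rather than lengthscale itself.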

  2. You want to fix the lengthscale at a constant value that will not be optimized further

    In addition to the code for the first case, exclude the lengthscale from the hyperparameters handed to the optimizer:

    all_params = set(model.parameters())
    # drop the raw (unconstrained) lengthscale parameter before building the optimizer
    final_params = list(all_params - {model.covar_module.base_kernel.raw_lengthscale})
    optimizer = torch.optim.Adam(final_params, lr=0.1)
    

    We subtract the raw_lengthscale parameter from all_params to build final_params, which we pass to the optimizer; since the optimizer never updates it, the lengthscale keeps its initial value.
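The same exclusion pattern works for any torch.nn.Module parameter. A self-contained sketch, using a plain Linear layer as a stand-in for the GP model (since the full GPyTorch setup is not reproduced here):

```python
import torch

torch.manual_seed(0)
model = torch.nn.Linear(1, 1)   # stand-in for the GP model
frozen = model.bias             # stand-in for raw_lengthscale

# build the optimizer's parameter list without the frozen parameter
all_params = set(model.parameters())
final_params = list(all_params - {frozen})
optimizer = torch.optim.SGD(final_params, lr=0.1)

before = frozen.detach().clone()
loss = model(torch.ones(1, 1)).sum()
loss.backward()
optimizer.step()
# the excluded parameter still receives a gradient, but is never stepped
```

Note that the parameter still accumulates gradients; it is only excluded from the update, which is exactly what keeps the lengthscale fixed in case 2.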
