Here are some examples highlighting GPyTorch’s more advanced features.
GPyTorch makes it possible to train and perform inference on a batch of Gaussian processes in parallel. This can be useful for a number of applications:
- Modeling a function with multiple (independent) outputs
- Performing efficient cross-validation
- Parallel acquisition function sampling for Bayesian optimization
- And more!
Here we highlight a number of common batch GP scenarios and how to construct them in GPyTorch.
- Multi-output functions (with independent outputs). Batch GPs are extremely efficient at modeling multi-output functions when the output functions are independent of one another. See the Batch Independent Multioutput GP example for more details.
- For cross-validation, or for some BayesOpt applications, it may make sense to evaluate the GP on different batches of test data. This can be accomplished with a standard (non-batch) GP model: at test time, feeding a b x n x d tensor into the model returns b batches of n test points. See the Batch Mode Regression example for more details.