gpytorch.optim
NGD
class gpytorch.optim.NGD(params: Iterable[Union[torch.nn.parameter.Parameter, dict]], num_data: int, lr: float = 0.1)[source]

    Implements a natural gradient descent step. It can only be used in conjunction with a _NaturalVariationalDistribution.
    Example:

    >>> ngd_optimizer = gpytorch.optim.NGD(model.variational_parameters(), num_data=train_y.size(0), lr=0.1)
    >>> ngd_optimizer.zero_grad()
    >>> mll(gp_model(input), target).backward()
    >>> ngd_optimizer.step()
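To give intuition for what a natural gradient step does, here is a minimal, self-contained sketch (not gpytorch's implementation, and independent of its API): for an exponential-family distribution, the natural gradient with respect to the natural parameters equals the ordinary gradient with respect to the expectation parameters. The example fits a 1-D Gaussian q(x) = N(m, s2) to N(0, 1) by stepping in natural-parameter space; the function name `natural_gradient_step` and the toy objective are illustrative choices.

```python
def natural_gradient_step(m, s2, lr):
    """One natural gradient descent step on KL(N(m, s2) || N(0, 1)),
    taken in the Gaussian's natural-parameter space."""
    # Natural parameters of N(m, s2).
    eta1 = m / s2
    eta2 = -0.5 / s2
    # Gradients of the KL w.r.t. the expectation parameters
    # mu1 = E[x] = m and mu2 = E[x^2] = m^2 + s2.
    g_mu1 = m / s2
    g_mu2 = 0.5 * (1.0 - 1.0 / s2)
    # Natural gradient descent: move the natural parameters along the
    # expectation-parameter gradient (equal to F^{-1} times the
    # natural-parameter gradient for exponential families).
    eta1 -= lr * g_mu1
    eta2 -= lr * g_mu2
    # Map back to mean and variance.
    s2_new = -0.5 / eta2
    m_new = eta1 * s2_new
    return m_new, s2_new

m, s2 = 2.0, 0.5
for _ in range(100):
    m, s2 = natural_gradient_step(m, s2, lr=0.1)
print(m, s2)  # approaches the target mean 0 and variance 1
```

With lr=1.0, a single step of this sketch lands exactly on the target, reflecting the well-known efficiency of natural gradients on exponential-family variational parameters; in practice (as in the example above) a smaller step size such as 0.1 is used because the objective is only locally well-approximated.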