It would be great if `gpytorch.Module` had methods for getting different specific kinds of parameters. For example, `model.named_kernel_parameters()` might only get parameters registered to `Kernel` modules. This already exists for variational parameters, but the implementation is extremely hacky: https://github.com/cornellius-gp/gpytorch/blob/7450719cdfda2e5e00efe00cc6a13e17a03a09c5/gpytorch/module.py#L41-L44

A better solution would be to get parameters belonging to modules that are instances of `VariationalDistribution` or `VariationalStrategy`.

It might make sense for these methods to live in more specific classes than `Module`. For example, `named_variational_parameters` might live in `AbstractVariationalGP`, while `named_kernel_parameters` might live in the `GP` superclass.
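One possible shape for this, sketched with plain `torch.nn.Module` stand-ins rather than gpytorch's real `Kernel`/`Mean` classes (the helper name `named_parameters_of_type` and the model layout are hypothetical, just to illustrate the `isinstance`-based filtering):

```python
import torch
from torch import nn


class Kernel(nn.Module):  # stand-in for gpytorch.kernels.Kernel
    def __init__(self):
        super().__init__()
        self.raw_lengthscale = nn.Parameter(torch.zeros(1))


class Mean(nn.Module):  # stand-in for gpytorch.means.Mean
    def __init__(self):
        super().__init__()
        self.constant = nn.Parameter(torch.zeros(1))


class GPModel(nn.Module):
    def __init__(self):
        super().__init__()
        self.mean_module = Mean()
        self.covar_module = Kernel()


def named_parameters_of_type(model, module_type):
    """Yield (name, param) pairs registered directly on modules of the given type."""
    for module_name, module in model.named_modules():
        if isinstance(module, module_type):
            # recurse=False: only parameters registered on this module itself,
            # so a parameter is reported under the module that owns it.
            for param_name, param in module.named_parameters(recurse=False):
                full_name = f"{module_name}.{param_name}" if module_name else param_name
                yield full_name, param


model = GPModel()
kernel_params = dict(named_parameters_of_type(model, Kernel))
print(sorted(kernel_params))  # ['covar_module.raw_lengthscale']
```

A `named_kernel_parameters()` method could then just be `named_parameters_of_type(self, Kernel)`, and the variational variant the same call with `(VariationalDistribution, VariationalStrategy)`, replacing the current string-matching hack.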