Closed: vandalt closed this issue 4 weeks ago
Hi @vandalt, the `ndims_params` is available from the instantiated `RandomVariable` Op, but indeed it is not passed to the classmethod `rng_fn`. Would it be a solution to just hardcode `ndims_params` in your call to `broadcast_params`?
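To illustrate the hardcoding suggestion, here is a minimal NumPy sketch of what a `broadcast_params`-style helper does with a hardcoded `ndims_params` (this mirrors, but is not, `pytensor.tensor.random.utils.broadcast_params`; the multivariate-normal-like shapes are assumptions for illustration):

```python
import numpy as np

def broadcast_params(params, ndims_params):
    """Broadcast parameters to a common batch shape, keeping each
    parameter's trailing ndims_params[i] core dimensions intact.
    (NumPy sketch of pytensor's broadcast_params helper.)"""
    batch_shapes = [p.shape[: p.ndim - nd] for p, nd in zip(params, ndims_params)]
    batch_shape = np.broadcast_shapes(*batch_shapes)
    return [
        np.broadcast_to(p, batch_shape + p.shape[p.ndim - nd :])
        for p, nd in zip(params, ndims_params)
    ]

# Hardcoded ndims_params for a multivariate-normal-like RV:
# mean is a vector (1 core dim), cov is a matrix (2 core dims).
mean = np.zeros(3)                            # batch shape ()
cov = np.broadcast_to(np.eye(3), (4, 3, 3))   # batch shape (4,)
mean_bc, cov_bc = broadcast_params([mean, cov], ndims_params=[1, 2])
print(mean_bc.shape, cov_bc.shape)            # (4, 3) (4, 3, 3)
```

The point is simply that `[1, 2]` is written out by hand instead of being read from the (no longer accessible) class attribute.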
Also, there is some inefficiency in your `rng_fn`. If `size` is provided (`size is not None`), you shouldn't have to broadcast the params together: it's enough to keep your call below where you broadcast each param to `size` (there you are also implicitly hardcoding `ndims_params` when you do `param.shape[-n:]`, by the way). Only when `size` is not provided do you need to broadcast the params together, since `size` is then implicitly the broadcasted batch shape of all the parameters.
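The two cases above can be sketched as follows. This is a hypothetical multivariate-normal-like `rng_fn` in plain NumPy (the `cls` argument and the actual celerite2 distribution are omitted; parameter names and core dimensions are assumptions for illustration):

```python
import numpy as np

# Hardcoded core-dimension counts: mean is a vector, cov is a matrix.
NDIMS_PARAMS = (1, 2)

def rng_fn(rng, mean, cov, size):
    """Sketch of the two size cases for a multivariate-normal-like rng_fn."""
    if size is not None:
        # size given: broadcast each param directly to size + its core shape;
        # no need to broadcast the params against each other first.
        size = tuple(np.atleast_1d(size))
        mean = np.broadcast_to(mean, size + mean.shape[mean.ndim - 1 :])
        cov = np.broadcast_to(cov, size + cov.shape[cov.ndim - 2 :])
    else:
        # No size: the implied size is the broadcasted batch shape of the params.
        batch_shape = np.broadcast_shapes(mean.shape[:-1], cov.shape[:-2])
        mean = np.broadcast_to(mean, batch_shape + mean.shape[-1:])
        cov = np.broadcast_to(cov, batch_shape + cov.shape[-2:])
    # Draw one multivariate normal sample per batch entry.
    out = np.empty(mean.shape)
    for idx in np.ndindex(mean.shape[:-1]):
        out[idx] = rng.multivariate_normal(mean[idx], cov[idx])
    return out
```

For example, `rng_fn(np.random.default_rng(0), np.zeros(3), np.eye(3), size=(5, 2))` returns an array of shape `(5, 2, 3)`, while with `size=None` and a `(4, 3)` mean the batch shape `(4,)` is inferred from the parameters.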
Here is an example of how we're using it internally, also hardcoded: https://github.com/pymc-devs/pytensor/blob/5d4b0c4b9a1e478dda48e912ee708a9e557e9343/pytensor/tensor/random/basic.py#L1797-L1801
Thanks! Hardcoding works and I opened a PR in celerite2. I'll open a separate issue regarding the inefficiencies in `rng_fn()`.
Describe the issue:

I'm trying to fix a deprecation warning in celerite2 following the recent update, namely the one about `ndims_params` and `ndim_supp`. I updated the custom `RandomVariable` with a `signature` attribute. However, the `rng_fn` classmethod (link here) in celerite2 used `ndim_params`, which is no longer available via `cls` because it is now created when instantiating the Op. I saw that some distributions in PyMC use the private method `_parse_gufunc_signature` along with `_class_or_instancemethod`, but I was not sure this was the best way forward for code outside PyMC.

I'm not sure this migration/update issue is technically a bug, but any advice on how to update this code following the recent update would be welcome!
Reproducible code example:
Error message:
PyTensor version information:
Context for the issue:
No response