When a constant parametrization on the reals domain is used in the categorical layer, the probabilities are later consumed in the log domain and the evaluation results in NaN.
Minimal code to reproduce the issue:
```python
import numpy as np
import torch

# Import paths for Scope, PipelineContext and Circuit added for completeness;
# adjust them if they differ in your cirkit version.
from cirkit.pipeline import PipelineContext
from cirkit.symbolic.circuit import Circuit
from cirkit.symbolic.layers import CategoricalLayer
from cirkit.symbolic.parameters import Parameter, ConstantParameter
from cirkit.utils.scope import Scope

cl = CategoricalLayer(
    Scope([0]), 1, 1, num_categories=2,
    probs=Parameter.from_input(
        ConstantParameter(1, 1, 2, value=np.array([0.0, 1.0]).reshape(1, 1, -1))
    ),
)

ctx = PipelineContext(backend='torch', fold=False, optimize=False, semiring='sum-product')
symbolic_circuit = Circuit(1, [cl], {}, [cl])
circuit = ctx.compile(symbolic_circuit)

circuit(torch.tensor([1]).reshape(1, 1, 1))
# >>> tensor([[[nan]]])
```
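One plausible mechanism (not a confirmed diagnosis): if the compiled torch layer stores the constant probabilities as log-probabilities, an exact zero entry becomes `-inf`, and arithmetic that mixes `-inf` with zero weights (e.g. `0 * -inf`) produces NaN. A minimal sketch in plain PyTorch, independent of cirkit, illustrating this:

```python
import torch

# Probabilities given directly on the reals domain, including an exact zero.
probs = torch.tensor([0.0, 1.0])

# If the layer works in the log domain, it effectively takes log(probs).
log_probs = torch.log(probs)
print(log_probs)                         # tensor([-inf, 0.])

# Mixing -inf with a zero coefficient (as can happen in a weighted sum
# over categories) yields NaN rather than the expected 0 contribution.
print(torch.tensor(0.0) * log_probs[0])  # tensor(nan)
```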