Closed by loreloc 1 year ago
Some comments on nomenclature:

* Do we really need `pc` in `cirkit.models.pcs.tensorized_pc`? I'd go with `cirkit.models.tensorized_circuit`, as we do not need to stick to PCs now, and one more layer in the hierarchy does not feel useful to disambiguate among what we will have.
* `factorized` feels a bit ambiguous. Why not have `sum_product` directly after `layers`?
Having `cirkit.models.tensorized_circuit` is ok for me, although we have only PCs for now.

We can have `cirkit.layers.sum_product`, but I am not sure yet whether this kind of layer will require a common interface that is different from the one of `Layer`. We could specialize it later, if needed.
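If a common interface does turn out to be useful later, it could be specialized along these lines. This is a minimal sketch under stated assumptions: `Layer`, `SumProductLayer`, `CPLayer`, and all method and attribute names here are hypothetical illustrations, not the actual cirkit API.

```python
from abc import ABC, abstractmethod


class Layer(ABC):
    """Hypothetical base class for all circuit layers (an assumption, not cirkit's actual Layer)."""

    @abstractmethod
    def forward(self, x):
        """Compute this layer's output units from its input units."""


class SumProductLayer(Layer):
    """Hypothetical base for layers whose sums and products are fused (e.g., CP, Tucker2).

    For now it only fixes a common constructor signature; a richer interface
    can be added later if fused layers turn out to need one.
    """

    def __init__(self, num_input_units: int, num_output_units: int) -> None:
        self.num_input_units = num_input_units
        self.num_output_units = num_output_units


class CPLayer(SumProductLayer):
    """Toy CP-style layer: fuses a product over inputs with a trivial sum, just to show the interface."""

    def forward(self, x):
        # x: list of input unit values; all output units share the fused product here.
        prod = 1.0
        for v in x:
            prod *= v
        return [prod] * self.num_output_units
```

The point of the sketch is that `SumProductLayer` stays a thin marker over `Layer` until a genuinely different interface is needed.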
Moved `cirkit.models.einet` to `cirkit.models.pcs.tensorized_pc`.
Removed `.einet_address` from the entire code base: `einet_address` was an attribute of region graph nodes that was updated in-place, carried over from the previous EiNets implementation. It was replaced by a "throw-away" dictionary that maps region graph node IDs to (i) the corresponding index in the folded tensor and (ii) the layer that computes the folded tensor.
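The bookkeeping change can be pictured as follows. This is an illustrative sketch only: the dictionary name, node IDs, and layer name are hypothetical, not the actual cirkit implementation.

```python
# Instead of storing an address on each region graph node (as einet_address did,
# mutating the nodes in-place), a temporary dictionary maps each node's ID to
# (fold index, layer computing that fold) and is discarded after the build.
# All names below are illustrative assumptions, not cirkit code.

fold_addresses: dict[int, tuple[int, str]] = {}

# Pretend these region graph node IDs are assigned to folds of a layer "sum_layer_0".
for fold_idx, node_id in enumerate([101, 102, 103]):
    fold_addresses[node_id] = (fold_idx, "sum_layer_0")

# During compilation, look up where a node's output lives in the folded tensor;
# once all layers are wired up, fold_addresses can simply be thrown away.
fold_idx, layer_name = fold_addresses[102]
```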
The `replica_idx` attribute of region graph nodes was retained instead, as it is used to fold the input layers. I propose to keep it for now, as it can be a useful attribute for all those region graphs that are "replicated" (e.g., the binary randomized structure in RAT-SPNs), hence requiring "replicated" input layers.
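Folding input layers by `replica_idx` amounts to grouping input region nodes per replica, sketched below. The node representation and variable names are hypothetical assumptions for illustration, not cirkit's data structures.

```python
from collections import defaultdict

# Hypothetical input region nodes, each carrying a replica_idx
# (e.g., one replica per randomized binary tree in a RAT-SPN-style region graph).
# The dict-based node representation is an illustrative assumption.
input_nodes = [
    {"id": 0, "scope": {0}, "replica_idx": 0},
    {"id": 1, "scope": {1}, "replica_idx": 0},
    {"id": 2, "scope": {0}, "replica_idx": 1},
    {"id": 3, "scope": {1}, "replica_idx": 1},
]

# Group node IDs by replica index, so that each replica's input layer
# can be folded (and parameterized) together.
folds = defaultdict(list)
for node in input_nodes:
    folds[node["replica_idx"]].append(node["id"])
```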
`EinsumNetworks`: added `cirkit.layers.factorized`, which will contain all those layers whose sums and products are "fused", e.g., the CP and Tucker2 implementations. The base class for this kind of layer is found in `cirkit.layers.factorized.sum_product`.

Closes #73.