Added a `TensorizedPC.integrate()` instance method for computing arbitrary marginals/integrals over sets of variables. It lazily instantiates a new layer that wraps the input layer. Typical usage:
```python
x: torch.Tensor
pc: TensorizedPC
ivars = [3, 4]               # marginalize two variables, or ...
ivars = [[3, 4], [3, 4, 5]]  # ... a batch of marginals, assuming x.shape[0] == 2
log_mar_scores = pc.integrate(x, ivars)
```
Added an `integrate()` standalone function that does not copy parameters; a typical usage scenario is sketched below.
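Since the exact signature is not shown here, the following is a minimal sketch under the assumption that the standalone `integrate(pc, ivars)` returns a new `TensorizedPC` that shares (rather than copies) the parameters of the original circuit:

```python
# Assumed signature for illustration only: integrate() builds a marginalized
# circuit that shares pc's parameters instead of copying them.
mar_pc = integrate(pc, ivars=[3, 4])
log_mar_scores = mar_pc(x)  # evaluate the marginalized circuit on the batch x
```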
It seems that shallow copies do not work with torch modules. Therefore, the addition above required introducing a `TensorizedPC.from_region_graph()` class method that builds a folded & tensorized circuit given a region graph, layers, and hyper-parameters. The `__init__()` method instead requires passing the modules (e.g., input and inner layers) and the book-keeping data structure. As such, the constructor is expected to be used internally only.
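A minimal sketch of the resulting construction split; the argument names and the `_compile()` helper below are illustrative placeholders, not the actual API:

```python
import torch

def _compile(region_graph, **hparams):
    # Hypothetical helper standing in for the actual compilation logic: it
    # would build the folded input/inner layers and the book-keeping structure
    # from the region graph, layer classes, and hyper-parameters.
    raise NotImplementedError

class TensorizedPC(torch.nn.Module):
    def __init__(self, input_layer, inner_layers, bookkeeping):
        # Internal-only constructor: receives already-built modules plus the
        # book-keeping structure describing how the folds are wired together.
        super().__init__()
        self.input_layer = input_layer
        self.inner_layers = torch.nn.ModuleList(inner_layers)
        self.bookkeeping = bookkeeping

    @classmethod
    def from_region_graph(cls, region_graph, **hparams):
        # Public entry point: compiles the region graph into folded layers,
        # then delegates to the internal constructor.
        input_layer, inner_layers, bookkeeping = _compile(region_graph, **hparams)
        return cls(input_layer, inner_layers, bookkeeping)
```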
Added a `ScopeLayer` to disentangle the re-ordering of the input layer's outputs from the input layer itself, as intended in the original EinsumNetworks implementation by Robert. This was necessary to implement the `IntegralInputLayer`, which applies integration/marginalization masks.
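As a rough illustration (not the actual implementation), a layer wrapping the input layer could apply such a mask as follows, relying on the fact that a normalized input distribution integrates to 1, i.e., contributes log 1 = 0 in log-space:

```python
import torch

class IntegralInputLayer(torch.nn.Module):
    """Illustrative sketch: wraps an input layer and zeroes out, in log-space,
    the outputs of variables that are being integrated/marginalized."""

    def __init__(self, input_layer: torch.nn.Module):
        super().__init__()
        self.input_layer = input_layer  # wrapped module; parameters are shared

    def forward(self, x: torch.Tensor, mask: torch.Tensor) -> torch.Tensor:
        # mask: boolean of shape (batch, num_vars), True where a variable is
        # integrated out. Assumes the wrapped layer returns per-variable
        # log-densities of shape (batch, num_vars, ...).
        log_probs = self.input_layer(x)
        mask = mask.view(*mask.shape, *([1] * (log_probs.dim() - 2)))
        # Integrated-out variables contribute log 1 = 0.
        return log_probs.masked_fill(mask, 0.0)
```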
Also fixed some names in the tests, since I was already adding tests for marginalization in PCs.
Closes #94. Closes #106 (minor issue).
Will rename commits before merging.