This PR fixes the `bp` edge case where the network contracts to the scalar `0`, which would previously yield `NaN`.
This is achieved by:
- Updating the `default_message_update` function to only divide by the norm if it is not zero, avoiding the creation of message tensors with `NaN` entries (a minimal sketch of this guard follows the list below).
- Adding to `logscalar(alg::Algorithm"bp", tn::AbstractITensorNetwork)` a check for whether any of the denominator terms are zero. If so, `log(tn)` should be `-Inf`. This is because, on a tree, if the product of `message(pe)` and `message(reverse(pe))` on any `partitionedge` is zero, then the contraction of the tree must be zero (assuming the messages are converged). A sketch of this check is shown further below.
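Roughly, the guarded normalization from the first item looks like the sketch below. This is a minimal illustration with a hypothetical function name, not the exact `default_message_update` code: the contracted message is only divided by its norm when that norm is nonzero, so an all-zero message stays zero instead of becoming `NaN`.

```julia
using ITensors, LinearAlgebra

# Illustrative sketch of the guard (the function name is hypothetical,
# not the actual ITensorNetworks.jl implementation).
function normalized_message(tensors::Vector{ITensor})
  m = contract(tensors)      # contract the incoming message tensors
  message_norm = norm(m)
  # Dividing a zero tensor by its (zero) norm would produce NaN entries,
  # so only normalize when the norm is nonzero.
  if !iszero(message_norm)
    m /= message_norm
  end
  return m
end
```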
A quick test is added.
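The `-Inf` early return from the second item can be pictured as in the following sketch. The helper name and plain scalar inputs are illustrative; the real `logscalar` works on the BP numerator and denominator terms of the partitioned network.

```julia
# Sketch of the zero-denominator check. numerator_terms and denominator_terms
# stand in for the per-partition and per-edge scalars that BP combines into
# the log of the network's contraction.
function bp_logscalar(numerator_terms, denominator_terms)
  # On a tree with converged messages, a zero denominator term
  # (i.e. message(pe) * message(reverse(pe)) == 0) implies the full
  # contraction is zero, so its log is -Inf.
  any(iszero, denominator_terms) && return -Inf
  return sum(log, numerator_terms) - sum(log, denominator_terms)
end

bp_logscalar([2.0, 3.0], [1.5])  # ≈ log(2) + log(3) - log(1.5)
bp_logscalar([2.0, 3.0], [0.0])  # -Inf instead of NaN
```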
@ryanlevy this should fix the issue with `bp` in ITensorNumericalAnalysis.jl.