TensorBFS / TensorInference.jl

Probabilistic inference using contraction of tensor networks
https://tensorbfs.github.io/TensorInference.jl/
MIT License

Rescale tensors and choose correct contraction order optimizer in tests #16

Closed GiggleLiu closed 2 years ago

GiggleLiu commented 2 years ago

In this PR, I rescale tensors and choose the correct contraction order optimizer in tests.

Test Report

For the "relational" dataset, none of the optimizers in OMEinsum finds a proper contraction order (there are more than 60k tensors; I think the problem size is too large).

All remaining instances pass except linkage_15 and ObjectDetection_35; I still have no clue why these two instances are so special.

GiggleLiu commented 2 years ago

Update

Now I use the following solution to the rescaling problem:

  1. Create a new array type `RescaledArray`:

         struct RescaledArray{T, N, AT<:AbstractArray{T, N}} <: AbstractArray{T, N}
             logf::T
             value::AT
         end

     It represents the array `exp(logf) * value`.

  2. Implement `einsum` and some other function interfaces:

         for CT in [:DynamicEinCode, :StaticEinCode]
             @eval function OMEinsum.einsum(code::$CT, @nospecialize(xs::NTuple{N,RescaledArray}), size_dict::Dict) where N
                 res = einsum(code, getfield.(xs, :value), size_dict)
                 return rescale_array(RescaledArray(sum(x->x.logf, xs), res))
             end
         end

  3. Use this new array type for tensor network contraction; the overflow problem should then disappear.
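The steps above can be sketched as a minimal, self-contained example. Note that `rescale_array` is referenced in the PR but its body is not shown, so the normalization convention below (pulling the maximum absolute entry into the log prefactor) is an assumption for illustration, not the package's exact implementation:

```julia
# A tensor is stored as `exp(logf) * value`; keeping the large magnitude in
# the log-domain prefactor `logf` avoids floating-point overflow.

struct RescaledArray{T, N, AT<:AbstractArray{T, N}} <: AbstractArray{T, N}
    logf::T
    value::AT
end

Base.size(a::RescaledArray) = size(a.value)
Base.getindex(a::RescaledArray, i::Int...) = exp(a.logf) * a.value[i...]

# Assumed helper: normalize so the stored entries have unit maximum
# magnitude, moving the scale into `logf`.
function rescale_array(a::RescaledArray)
    m = maximum(abs, a.value)
    iszero(m) && return a
    return RescaledArray(a.logf + log(m), a.value ./ m)
end

# Two tensors whose raw product would overflow Float64 (1e300 * 1e300 = Inf):
x = rescale_array(RescaledArray(0.0, fill(1e300, 2, 2)))
y = rescale_array(RescaledArray(0.0, fill(1e300, 2, 2)))

# Elementwise product in rescaled form: multiply values, add log prefactors.
z = rescale_array(RescaledArray(x.logf + y.logf, x.value .* y.value))
println(z.logf)  # ≈ 1381.55, i.e. 2 * log(1e300); no overflow
```

The same pattern generalizes to contraction: contract the `value` fields normally, sum the `logf` prefactors, and rescale the result, which is exactly what the `einsum` overload above does.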