TensorBFS / TensorInference.jl

Probabilistic inference using contraction of tensor networks
https://tensorbfs.github.io/TensorInference.jl/
MIT License

Performance evaluation is missing #68

Closed: gdalle closed this issue 1 year ago

gdalle commented 1 year ago

This package exists because tensor networks are assumed to be performant, but this claim is not backed by quantitative results. I see that there are already benchmarks in the repo, would it be hard to compare them with other existing algorithms (at the very least JunctionTrees.jl)?

https://github.com/openjournals/joss-reviews/issues/5700

mroavi commented 1 year ago

Thanks for bringing this up, and you're absolutely right: quantitative results are essential for backing up performance claims. We're already in the process of comparing our package against Merlin and libDAI, two C++ libraries that have performed well in past UAI inference competitions. These benchmarks should provide a good point of reference for the performance of our package.

Adding some of these results to the paper is a great idea. We'll make sure to include them to strengthen our claims about the package's performance. And yes, throwing in a comparison with JunctionTrees.jl could also be valuable. We'll look into it.

mroavi commented 1 year ago

Hi @gdalle . We just wrapped up a performance comparison against three other libraries: Merlin, libDAI, and, of course, JunctionTrees.jl. Merlin and libDAI are C++ libraries that have historically performed well in UAI inference competitions. JunctionTrees.jl is essentially the predecessor of TensorInference.jl and is also implemented in Julia. You can check out all the benchmark results on our docs dev site: TensorInference Performance Evaluation.
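For readers who want to reproduce a comparison like this, here is a minimal sketch of how a single query could be timed in Julia. The function names (`read_model_file`, `TensorNetworkModel`, `marginals`) are assumed from the TensorInference.jl documentation, and `problem.uai` is a placeholder path, not a file shipped with the repo; check the docs for exact signatures before running.

```julia
# Hedged sketch: timing one marginals query with TensorInference.jl.
# `read_model_file`, `TensorNetworkModel`, and `marginals` are assumed
# from the package docs; "problem.uai" is a placeholder model file.
using TensorInference, BenchmarkTools

model = read_model_file("problem.uai")   # load a UAI-format model
tn = TensorNetworkModel(model)           # build the tensor network
@btime marginals($tn)                    # benchmark the marginal computation
```

The same model file can then be fed to Merlin, libDAI, or JunctionTrees.jl (all of which read the UAI format) to get directly comparable wall-clock numbers.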

mroavi commented 1 year ago

We added a reference in the paper to the performance evaluation section in the documentation.