dgasmith / opt_einsum

⚡️Optimizing einsum functions in NumPy, Tensorflow, Dask, and more with contraction order optimization.
https://dgasmith.github.io/opt_einsum/
MIT License

Small fixes: 'dp' and memory_limit + tensordot axes order #154

Closed · jcmgray closed this 3 years ago

jcmgray commented 3 years ago

Description

This PR:

  1. Fixes the bug where too low a memory_limit caused the 'dp' optimizer to search forever (#153); see the first sketch after this list.
  2. Puts tensordot axes into a canonical order (the order in which they appear on the first operand), so that performance should not change unexpectedly (#143); see the second sketch below.
  3. Replaces an isinstance call with infer_backend (arguably cleaner, and it fixes a rare bug when mixing input types, e.g. in a jax-compiled contraction); see the third sketch below.
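A minimal reproduction of the memory_limit issue, under assumed shapes and an intentionally tiny limit (the equation and values below are illustrative, not taken from #153): before this fix, the 'dp' search could fail to terminate when no intermediate fit under the limit; with the fix it should return a path instead.

```python
import numpy as np
import opt_einsum as oe

# Illustrative operands; a memory_limit smaller than any feasible intermediate
# is the situation that made the pre-fix 'dp' search loop forever (#153).
a, b, c = (np.random.rand(8, 8) for _ in range(3))

# With this PR the call should complete and return a contraction path
# rather than hang.
path, info = oe.contract_path('ab,bc,cd->ad', a, b, c,
                              optimize='dp', memory_limit=2)
print(path)
```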
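A rough illustration of the canonical axes ordering, not opt_einsum's internal code (the helper name is made up): the contracted axis pairs are sorted by their position on the first operand before calling tensordot, so logically equivalent contractions always reach the backend with the same axes argument.

```python
import numpy as np

def tensordot_canonical(x, y, axes_x, axes_y):
    # Hypothetical helper: sort the contracted axis pairs by their position on
    # the first operand, then contract, so the axes order the caller happened
    # to use no longer affects which tensordot call is made.
    order = sorted(range(len(axes_x)), key=lambda i: axes_x[i])
    axes_x = tuple(axes_x[i] for i in order)
    axes_y = tuple(axes_y[i] for i in order)
    return np.tensordot(x, y, axes=(axes_x, axes_y))

x = np.random.rand(3, 4, 5)
y = np.random.rand(5, 4, 2)
# These two calls describe the same contraction; after canonicalization both
# hit np.tensordot with axes ((1, 2), (1, 0)).
out1 = tensordot_canonical(x, y, (2, 1), (0, 1))
out2 = tensordot_canonical(x, y, (1, 2), (1, 0))
print(out1.shape, np.allclose(out1, out2))  # (3, 2) True
```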
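A simplified sketch of the idea behind infer_backend (the body below is an assumption about the general approach, not the library's actual implementation): the backend is inferred from the module that defines the array's type, so wrapped or traced arrays, such as inputs flowing through a jax-compiled contraction, still dispatch correctly where an isinstance check against a concrete class would fail.

```python
import numpy as np

def infer_backend_sketch(array):
    # Assumed approach: decide the backend from the top-level module that
    # defines the array's type, rather than isinstance against known classes.
    return type(array).__module__.split('.')[0]

print(infer_backend_sketch(np.ones(3)))  # -> 'numpy'
```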

Status

codecov[bot] commented 3 years ago

Codecov Report

Merging #154 into master will increase coverage by 0.00%. The diff coverage is 100.00%.