dgasmith / opt_einsum

⚡️Optimizing einsum functions in NumPy, Tensorflow, Dask, and more with contraction order optimization.
https://dgasmith.github.io/opt_einsum/
MIT License

Remove NumPy as a hard dependency #204

Closed: dgasmith closed this 5 days ago

dgasmith commented 1 year ago

Description

This PR removes NumPy from opt_einsum as a runtime dependency. The change has little effect on the opt_einsum code base itself but significantly affects the testing infrastructure. Several testing strategies are currently employed to help determine which is most effective.

Feedback welcome!
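For context, the usual way a library drops a hard dependency like NumPy while keeping NumPy-backed fast paths is a guarded import. The sketch below is illustrative of that general pattern only, not necessarily the exact approach taken in this PR:

```python
# Illustrative sketch of the optional-dependency pattern (not necessarily
# the exact approach in this PR): import NumPy if available, and degrade
# gracefully otherwise.
try:
    import numpy as np
    HAS_NUMPY = True
except ImportError:
    np = None
    HAS_NUMPY = False


def asarray_if_possible(data):
    """Return a NumPy array when NumPy is installed, else the input unchanged."""
    if HAS_NUMPY:
        return np.asarray(data)
    return data
```

The rest of the code base can then branch on a flag like `HAS_NUMPY` instead of assuming NumPy is importable.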

Todos

Notable points that this PR has either accomplished or will accomplish.

Questions

Status

dgasmith commented 1 year ago

I didn't get to every test yet. I'll hack a bit this weekend.

dgasmith commented 1 year ago

@janeyx99 Is this still of interest to Torch? I think we're relatively close to removing the last of the NumPy dependencies.

janeyx99 commented 1 year ago

Yes! Removing the last dependencies would be awesome!

dgasmith commented 1 year ago

Apologies, I started a new job. This is on my mind and I hope to hack a little tomorrow night!

dgasmith commented 2 months ago

@janeyx99 I'm looking to push another release shortly; is this still a requested feature?

janeyx99 commented 1 month ago

@dgasmith yes, torch still doesn't have a hard dependency on numpy (our tests do) so this is still desirable! Sorry it took a week to get back to you--this fell in my personal inbox.

dgasmith commented 1 month ago

I'll get #232 in first and then finalize this PR. It will be quite tidy to have no formal dependencies.

codecov[bot] commented 2 weeks ago

Codecov Report

Attention: Patch coverage is 92.68293% with 12 lines in your changes missing coverage. Please review.

Project coverage is 93.77%. Comparing base (1a984b7) to head (8211264). Report is 5 commits behind head on master.

dgasmith commented 2 weeks ago

@jcmgray I think this is getting much closer. I updated the outstanding items in the OP.

dgasmith commented 1 week ago

@jcmgray and @janeyx99 This is ready for review, but likely has a few bugs still to iron out. In particular, @janeyx99, are there any tests you would recommend for PyTorch-only envs?

janeyx99 commented 1 week ago

> @jcmgray and @janeyx99 This is ready for review, but likely has a few bugs to still iron out. In particular @janeyx99 is there any tests you would recommend for PyTorch only envs?

Thanks for furthering this effort! The main reason this would be nice for PyTorch is that it lets opt-einsum be a more accessible dependency, rather than one the user has to specify when pip installing torch.

To that end, we have a few test commands written in bash to ensure that we don't accidentally depend on numpy: https://github.com/pytorch/pytorch/blob/ca5d13c67287efe78a09dcf9b06f7e4fba898689/.ci/pytorch/test.sh#L742-L752

I don't think it's crucial for opt-einsum to have similar tests, because we will exercise these paths as a downstream library, but I'm mentioning it in case you were looking for similar sanity checks.
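A minimal standalone version of that kind of sanity check (a hypothetical sketch modeled on the idea behind the linked script, not copied from it) imports a module in a subprocess and reports whether NumPy was pulled in as a side effect:

```python
import subprocess
import sys


def imports_numpy(module_name: str) -> bool:
    """Return True if importing `module_name` pulls numpy into sys.modules.

    Note: this also returns True if the import itself fails, so confirm the
    module is installed before trusting a True result.
    """
    probe = (
        "import sys, importlib\n"
        f"importlib.import_module({module_name!r})\n"
        "sys.exit(1 if 'numpy' in sys.modules else 0)\n"
    )
    return subprocess.run([sys.executable, "-c", probe]).returncode != 0
```

With this PR in place, `imports_numpy("opt_einsum")` should come back False in a NumPy-free environment; a stdlib module like `json` serves as a quick negative control.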

dgasmith commented 1 week ago

@janeyx99 I do have a torch-only env explicitly for testing non-NumPy evaluations. I like the idea of adding a check to ensure that NumPy is not present.