pyg-team / pytorch_geometric

Graph Neural Network Library for PyTorch
https://pyg.org
MIT License

Packages for conda-forge #2790

Open raimis opened 3 years ago

raimis commented 3 years ago

@rusty1s and other developers,

I'm excited to see that PyTorch Geometric can already be installed with conda (https://github.com/rusty1s/pytorch_geometric#anaconda) and seems to be compatible with conda-forge.

Do you have any specific reason not to contribute the packages directly to conda-forge? There has already been some effort to create PyTorch Geometric packages for conda-forge (https://github.com/conda-forge/pytorch_geometric-feedstock, ping @oblute @rluria14).

For context, we are expanding OpenMM (https://github.com/openmm/openmm) capabilities to use ML for molecular simulations (https://github.com/openmm/openmm-torch). The common problem is that many packages from PyPI and other conda channels are binary-incompatible, which makes them hard to use together. conda-forge solves many of these issues.

Recently, we have migrated OpenMM to conda-forge. If you are interested, @jchodera could connect you with the right people in the conda-forge community.

jchodera commented 3 years ago

It looks like there is a PR to update to the latest pytorch-geometric release that appears to be stalled due to a pinned dependency that causes an ABI inconsistency. It may be a simple matter to update to the latest versions of everything to eliminate the ABI incompatibility?

rusty1s commented 3 years ago

Thanks for reaching out. I actually didn't know about the efforts of putting PyTorch Geometric into conda-forge. This is great to see.

I put my packages in my own channel because dealing with different CUDA versions and PyTorch ABI issues is quite cumbersome, and my own channel allows me to update packages more conveniently whenever a new release/PyTorch version/CUDA version is available. If you think this is feasible in conda-forge, I'm happy to join forces. However, it is unclear to me what the advantages are of putting packages into conda-forge rather than my own channel, given that PyTorch does the same.

jchodera commented 3 years ago

@rusty1s : The conda-forge ecosystem aims to make it easier to solve compatibility issues (like the ABI compatibility problems) by using a harmonized compiler suite for all packages built in the conda-forge channels. There is an elegant feedstock-based automatic mechanism for keeping packages up to date when new releases are cut. Recently, we've made significant progress in building GPU-enabled packages that use CUDA/OpenCL for tools like OpenMM.

A brief overview of advantages of migrating tools like pytorch to conda-forge is given here, but one of the principal advantages is that it makes it possible to build conda-installable packages that depend on your library (here, PyTorch Geometric) since it can be pulled from conda-forge as a dependency. This isn't possible if it lives in a separate channel without more complicated installation procedures (like distributing an environment.yml or having users manually install dependencies).
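To make the dependency argument concrete, here is a hypothetical environment.yml for a downstream project (the project name and pinned Python version are illustrative): because pytorch_geometric is available on conda-forge, it can be declared as an ordinary dependency next to everything else from a single channel, with no extra channel setup or manual install steps for users.

```yaml
# Hypothetical environment.yml for a downstream project.
# With pytorch_geometric on conda-forge, it resolves like any
# other dependency from the same channel.
name: my-gnn-project
channels:
  - conda-forge
dependencies:
  - python=3.9
  - pytorch
  - pytorch_geometric
```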

jchodera commented 3 years ago

Oh, I should note that these aren't exclusive distribution mechanisms! It's perfectly reasonable to have both conda-forge and non-conda-forge channels at the same time. But there are significant advantages to the conda-forge ecosystem in making it easy to build tools/applications on top of other libraries within conda-forge.

sarthakpati commented 3 years ago

Hi,

An incomplete conda-forge package is available via https://github.com/conda-forge/torch-scatter-feedstock

I agree that distributing through your own mechanisms definitely makes targeting all OSes easier (e.g., Windows PyTorch is only available through the pytorch channel), and I feel that having both options is definitely feasible. I am more than happy to help, of course.

rusty1s commented 3 years ago

Thanks. Is there a roadmap for what is missing in conda-forge/torch-scatter-feedstock, or where you might need help? I'm happy to help if time allows.

sarthakpati commented 3 years ago

There isn't a roadmap, per se.

As far as missing items go, I had to use this line [ref] to ensure the conda-forge builds worked correctly:

skip: true  # [win or (linux and cuda_compiler_version in (undefined, 'None', '10.2'))]

But this basically means that only the OSX CPU builds get picked up, and nothing else. I tried multiple variants to get the Windows and Linux builds to work but didn't get far.
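For readers unfamiliar with conda-build preprocessing selectors, the effect of that `skip` line can be sketched in plain Python. This only illustrates the selector's boolean logic, not conda-build's actual implementation, and the build-matrix entries below are assumptions (a matrix covering CPU and CUDA 10.2 variants, as the comment suggests).

```python
# A rough sketch (NOT conda-build itself) of the selector:
#   skip: true  # [win or (linux and cuda_compiler_version in (undefined, 'None', '10.2'))]
# It skips every Windows build, plus any Linux build whose CUDA
# compiler version is undefined, 'None', or '10.2'.

def is_skipped(platform, cuda_compiler_version):
    """Mirror the selector's boolean logic for one matrix entry."""
    win = platform == "win"
    linux = platform == "linux"
    return win or (linux and cuda_compiler_version in (None, "None", "10.2"))

# Hypothetical (platform, cuda_compiler_version) build-matrix entries.
matrix = [
    ("osx", None),       # macOS CPU
    ("win", None),       # Windows CPU
    ("linux", "None"),   # Linux CPU
    ("linux", "10.2"),   # Linux CUDA 10.2
]

kept = [entry for entry in matrix if not is_skipped(*entry)]
print(kept)  # only the macOS CPU build survives this selector
```

Under that assumed matrix, the only surviving entry is the macOS CPU build, which matches the observation above.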

rusty1s commented 3 years ago

Can you show me the error produced by the failing builds (happy to move this discussion somewhere else, e.g., directly into conda-forge/torch-scatter-feedstock)? I build with GitHub Actions, and no modifications are required to make the build succeed.

sarthakpati commented 3 years ago

Tracking this here: https://github.com/conda-forge/torch-scatter-feedstock/issues/3

sarthakpati commented 3 years ago

This should be operational: https://anaconda.org/conda-forge/torch-scatter

raimis commented 3 years ago

@sarthakpati thanks for your effort!

Do you have plans to make packages for PyTorch Geometric and its other dependencies?

sarthakpati commented 3 years ago

I had a quick look and it does seem to be pretty involved. I won't be able to get to it any time soon but I am happy to help out if you are starting this on your own.

hadim commented 2 years ago

Maybe some quick feedback here: we have been using the pytorch_geometric conda-forge package for a while now, and it works great for us. We moved away from the pyg channel since it was creating many dependency-solving and binary-incompatibility issues when used together with conda-forge.

The only missing piece compared to the packages on the pyg channel is windows support: https://github.com/conda-forge/pytorch_geometric-feedstock/issues/25

I also want to highlight that, in addition to moving away from the pyg channel, we also moved away from the pytorch and anaconda channels (for the same reasons as above). We now rely solely on the conda-forge channel, and our dependency-solving issues have decreased significantly since then.
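A conda-forge-only setup like the one described above is typically expressed in `~/.condarc`; this is a minimal sketch of such a configuration (both keys are standard conda settings, but treat the exact file as illustrative). Strict channel priority tells the solver never to mix in binaries from lower-priority channels, which is what avoids the incompatibility issues mentioned here.

```yaml
# Sketch of a ~/.condarc for a conda-forge-only setup.
# A single channel with strict priority prevents mixing binaries
# from the pytorch, anaconda, or pyg channels.
channels:
  - conda-forge
channel_priority: strict
```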