silx-kit / pyFAI

Fast Azimuthal Integration in Python

Spack Package for pyFAI #1465

Open RobertRosca opened 3 years ago

RobertRosca commented 3 years ago

Hey, I'm a member of the Data Analysis team at European XFEL and for a while we've been thinking of using a tool called Spack to manage the software environments we provide our users. For some context, here's a brief description of what Spack is:

Spack is a package management tool designed to support multiple versions and configurations of software on a wide variety of platforms and environments. It was designed for large supercomputing centers, where many users and application teams share common installations of software on clusters with exotic architectures, using libraries that do not have a standard ABI. Spack is non-destructive: installing a new version does not break existing installations, so many configurations can coexist on the same system.

You can find more details on the Spack repo (https://github.com/spack/spack/) or on their docs pages (https://spack.readthedocs.io/en/latest/).

Basically, Spack lets you define the build process for a piece of software as a Python class; many build systems are supported, with Python being one of them. Spack has thousands of packages defined already, including hundreds of Python packages.

Creating a 'standard' setuptools-based Python package in Spack is pretty simple; here are some examples:
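For readers unfamiliar with the format, a minimal recipe for a setuptools-based package looks roughly like this (the package name, version, and checksum below are placeholders for illustration, not a real recipe, and the exact import style varies between Spack versions):

```python
# var/spack/repos/builtin/packages/py-example/package.py
from spack import *  # older Spack versions; newer ones use spack.package


class PyExample(PythonPackage):
    """Hypothetical setuptools-based Python package."""

    homepage = "https://example.org"
    pypi = "example/example-1.0.0.tar.gz"

    # Checksum below is a placeholder, not a real digest.
    version("1.0.0", sha256="0000000000000000000000000000000000000000000000000000000000000000")

    depends_on("py-setuptools", type="build")
    depends_on("py-numpy", type=("build", "run"))
```

Spack infers the `pip install`-style build steps from the `PythonPackage` base class, so a simple recipe only needs metadata, versions, and dependencies.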

It's also possible to make Spack packages for more complex Python packages as well; pytorch is probably one of the more complex Spack Python packages, as it's a CUDA package as well as a Python one and has a lot of compilation/build-environment options.

I've been working with @julianhoersch (a student hired to help with this project) on creating Spack packages for our use at European XFEL, and we'd like to make a Spack package for pyFAI as well, but it has a pretty complex setup and neither of us feels we understand the build process well enough to implement it as a Spack package.

Does creating a Spack package for pyFAI seem like it would be useful? If anybody on the pyFAI team is interested in this I'd be happy to have a meeting where we can discuss this further and try to work out the best way to develop a Spack package for pyFAI.

Thanks,

Robert

kif commented 3 years ago

Hi Robert,

We can have a chat on the subject. I would invite @t20100 since he set up generic build-bots for us as well.

Besides this, the policy of ESRF is to provide end users with Python wheels which can be installed everywhere (trading performance for genericity). Internally we use Debian packaging with optimal settings for performance, but those packages won't work externally due to a different installation stack.

The more packaging, the better: Soleil provides official Debian packaging, NSLS-II helps with conda-forge, and for EuXFEL it would basically be Spack. There has recently been a release of pyFAI, so I guess it is the perfect time for packaging.

The trickiest part is usually the dependency tree. There are requirements.txt files in the root of the source, others in the ci directory, and the dependencies in the package directory. Your configuration file should probably go there.
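For what it's worth, translating a requirements file into a Spack recipe is mostly mechanical: each runtime requirement becomes a `depends_on` declaration, prefixed with `py-` and with `>=` bounds rewritten as `@min:` ranges. A small sketch of that mapping (a helper I wrote for illustration, handling only the plain `name` and `name>=min` forms, not pyFAI's actual requirements):

```python
import re


def to_spack_depends(req: str, build_only: bool = False) -> str:
    """Translate a simple requirements.txt line ('name' or 'name>=min')
    into a Spack depends_on declaration. Illustrative sketch only."""
    m = re.fullmatch(r"([A-Za-z0-9_.-]+)\s*(?:>=\s*([0-9.]+))?", req.strip())
    if not m:
        raise ValueError(f"unsupported requirement: {req!r}")
    name, minver = m.groups()
    spec = f"py-{name.lower()}" + (f"@{minver}:" if minver else "")
    dep_type = '"build"' if build_only else '("build", "run")'
    return f'depends_on("{spec}", type={dep_type})'


print(to_spack_depends("numpy>=1.12"))
# → depends_on("py-numpy@1.12:", type=("build", "run"))
print(to_spack_depends("cython", build_only=True))
# → depends_on("py-cython", type="build")
```

Build-only tools (cython, setuptools, and the like) are marked `type="build"` so they don't end up in the runtime environment.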

Cheers,

Jerome

RobertRosca commented 3 years ago

Hey Jerome,

> We can have a chat on the subject. I would invite @t20100 since he set up generic build-bots for us as well.

Sounds good, any particular time you'd be free to discuss this?

> Besides this, the policy of ESRF is to provide end users with Python wheels which can be installed everywhere (trading performance for genericity). Internally we use Debian packaging with optimal settings for performance, but those packages won't work externally due to a different installation stack.

Aha, this is a good use case for Spack. A nice feature of Spack is its build pipelines, where you can define a matrix of dependencies:

  - compilers:
    - '%gcc@5.5.0'
    - '%gcc@6.5.0'
    - '%gcc@7.3.0'
    - '%clang@6.0.0'
    - '%clang@6.0.1'
  - oses:
    - os=ubuntu18.04
    - os=centos7
  - arches:
    - target=skylake
    - target=ivybridge
    - target=x86_64

Spack will then build your package against each combination and can provide binaries/wheels as artifacts at the end of the pipeline; this way you can provide specialised Python wheels for different architectures as well.

> The trickiest part is usually the dependency tree. There are requirements.txt files in the root of the source, others in the ci directory, and the dependencies in the package directory. Your configuration file should probably go there.

It could be stored there, although Spack (at least currently; this might change later) stores all package definitions in the Spack repository itself. The idea is that you make a PR against Spack to add your package and set yourself as a maintainer; the recipe can then be updated for new releases of your package, and if anybody else wants to modify it you are notified automatically, as long as you're on the maintainers list for the package.
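Concretely, the maintainers mechanism is just an attribute on the recipe class; anyone listed there gets pinged on PRs touching the package. A sketch (the class name matches Spack's `py-` naming convention, and the GitHub handles are placeholders, not a commitment from anyone):

```python
class PyPyfai(PythonPackage):
    """pyFAI: fast azimuthal integration in Python."""

    # GitHub handles of people notified when this recipe changes;
    # placeholder names, to be replaced by actual volunteers.
    maintainers = ["some-ghuser", "another-ghuser"]
```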

If you and @t20100 are free on Wednesday that would be a good time to meet as @julianhoersch works with us on Wednesdays usually.

Cheers,

Robert

kif commented 3 years ago

I sent an invitation to all the people involved. Others can join, just ask here.