simpeg / aurora

software for processing natural source electromagnetic data
MIT License

PyPI / Conda Forge Release #255

Open kkappler opened 1 year ago

kkappler commented 1 year ago

Phase 3 SOW requires this be complete by Sept 2023.

jcapriot commented 1 year ago

I can help you get this setup pretty quickly if you need it. It can all be automated.

kkappler commented 1 year ago

Thanks @jcapriot - that would be appreciated. I'd like to release this tagged version of main as we have it now as a placeholder if possible. The plan would be then to update the release in September with a substantially different approach to data management (storing intermediate data structures in mth5 -- issue #57 etc.).

kkappler commented 11 months ago

PyPI

A good outcome:

View at: https://test.pypi.org/project/aurora/0.3.8/

Create an environment to test the testpypi release

    conda remove -n pypi --all -y   # if the env already exists
    conda create -n pypi python=3.9 -y
    conda activate pypi

Then pip install code you just pushed

    pip install --extra-index-url https://testpypi.python.org/pypi aurora==0.3.8

Within this env you can try importing aurora. To test notebooks, install jupyter within your env:

    pip install jupyterlab
    jupyter-lab

Now try running the notebooks in mt_examples.

Conda Forge:

The build initially failed with:

    raise ResolvePackageNotFound(bad_deps)
    conda.exceptions.ResolvePackageNotFound:
      - obspy

I was able to build by first calling

    conda config --add channels conda-forge

Then the build succeeded (after a very long time at Solving environment: ...working...), but warned of a SHA mismatch:

    Success
    INFO:conda_build.source:Success
    SHA256 mismatch: '45d5f1608d41e61f3e46776b5dd4156de78c67f4b37b2909f6d2b02c27cd2b1e' != 'ca3261d1e0e73fadd353c3e992c2fb98081af9775a084020412f52e4455e'

So I updated the SHA to the 45d one and rebuilt
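A quick way to check this yourself before editing the recipe (a sketch; the tarball name here is illustrative) is to compute the digest of the downloaded sdist locally and compare it to the sha256 field in recipe/meta.yaml:

```shell
# Compute the sha256 of the sdist and compare it with the sha256 field
# in recipe/meta.yaml (the tarball filename is illustrative).
sha256sum aurora-0.3.2.tar.gz | awk '{print $1}'
```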

Almost there, but alas, the build is still failing:

    Preparing transaction: ...working... done
    Verifying transaction: ...working... done
    Executing transaction: ...working... done
    export PREFIX=/home/kkappler/anaconda3/conda-bld/aurora_1695942886534/_test_env_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placeho
    export SRC_DIR=/home/kkappler/anaconda3/conda-bld/aurora_1695942886534/test_tmp
    import: 'aurora'
    import: 'aurora'

    pip check
    aurora 0.3.2 requires dask, which is not installed.
    aurora 0.3.2 requires deprecated, which is not installed.
    aurora 0.3.2 requires fortranformat, which is not installed.
    aurora 0.3.2 requires mt-metadata, which is not installed.
    aurora 0.3.2 requires mth5, which is not installed.
    aurora 0.3.2 requires psutil, which is not installed.
    aurora 0.3.2 has requirement pandas<1.5, but you have pandas 2.1.1.
    Tests failed for aurora-0.3.2-py_0.tar.bz2 - moving package to /home/kkappler/anaconda3/conda-bld/broken
    WARNING:conda_build.build:Tests failed for aurora-0.3.2-py_0.tar.bz2 - moving package to /home/kkappler/anaconda3/conda-bld/broken

The solution to this was to add these dependencies to recipe/meta.yaml.
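As a sketch, the run section of recipe/meta.yaml would grow roughly like this. The package list comes from the pip check output above; the python floor is an illustrative assumption, and the pandas pin is the one the log complained about:

```yaml
requirements:
  run:
    - python >=3.8
    - dask
    - deprecated
    - fortranformat
    - mt-metadata
    - mth5
    - psutil
    - pandas <1.5
```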

PR has been requested on conda-forge

PyPI release for 0.3.2 is done.

Testing Conda

    conda create -n conda_aurora python=3.9 -y
    conda activate conda_aurora
    conda install aurora
    pip install jupyterlab
    conda install ipykernel
    python -m ipykernel install --user --name conda_aurora
    jupyter-lab

Now try running the operate_aurora notebook.

kkappler commented 11 months ago

The PR on conda-forge failed on the docker build (but passed locally).

The error, exit with status 137 (128 + SIGKILL, the usual signature of the kernel's OOM killer), suggests the build used too much memory.

A local conda build shows the following output:

    ####################################################################################
    Resource usage summary:

    Total time: 0:25:46.8
    CPU usage: sys=0:00:00.0, user=0:00:00.0
    Maximum memory usage observed: 14.1M
    Total disk usage observed (not including envs): 97.6K
    ####################################################################################

To verify these numbers, I wrapped the conda build invocation with

    /usr/bin/time -f "mem=%M RSS=%M elapsed=%E cpu.sys=%S user=%U" conda build recipe/

Which shows:

    mem=13146688 RSS=13146688 elapsed=25:52.38 cpu.sys=13.42 user=1530.94

13146688 is the peak RSS in kilobytes (%M reports kilobytes), which corresponds to roughly 13.1 GB.
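For the record, the unit conversion as shell arithmetic (integer-truncated), which also shows why the decimal "13 GB" figure and the binary GiB figure Docker reports are not directly comparable:

```shell
kb=13146688
# %M from /usr/bin/time is peak RSS in kilobytes
echo "$(( kb / 1000 / 1000 )) GB"    # decimal gigabytes -> 13 GB
echo "$(( kb / 1024 / 1024 )) GiB"   # binary gibibytes  -> 12 GiB
```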

The docker process on azure says Total Memory: 6.76GiB, so it looks like there are two things to try:

  1. Try to reduce the amount of memory consumed by the build ... (it seems to happen during the "Solving environment: ...working..." step, which ideally would be faster)
  2. Increase the allocation of memory in the docker container

To try 1. above without uploading to PyPI, you can just replace the source in the meta.yaml to point at a local aurora repo:

source:
  path: .

This allows iterating on the repo without needing to update PyPI to test the conda build.
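For contrast, a typical release-mode source section (the form the local path above temporarily replaces) looks roughly like this — the URL pattern is the usual conda-forge convention, and the digest is a placeholder:

```yaml
source:
  url: https://pypi.io/packages/source/a/aurora/aurora-{{ version }}.tar.gz
  sha256: "<sha256 of the sdist>"
```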

kkappler commented 11 months ago

Have merged v0.3.3 into main, which uses a mere 6.69 G of memory, just under the 6.76 GiB limit.

Despite a (I thought successful) namespace claim on TestPyPI today, I get

    Received "503: Service Unavailable"
    Package upload appears to have failed. Retry 1 of 5.

when executing

    python -m twine upload --repository testpypi dist/*

So I will try skipping TestPyPI. But, sadly, I heeded the 2FA warning today, so I need to set up API tokens to avoid:

    Error during upload. Retry with the --verbose option for more details.
    ERROR HTTPError: 401 Unauthorized from https://upload.pypi.org/legacy/
    User kkappler has two factor auth enabled, an API Token or Trusted Publisher must be used to upload in place of password.

The solution is to use __token__ as the username and then paste your token (including the pypi- prefix) as the password.
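To avoid retyping the token on every upload, a minimal ~/.pypirc sketch (the token values are placeholders; twine reads this file to resolve --repository names):

```ini
[distutils]
index-servers =
    pypi
    testpypi

[pypi]
username = __token__
password = pypi-<your-api-token>

[testpypi]
repository = https://test.pypi.org/legacy/
username = __token__
password = pypi-<your-testpypi-token>
```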

kkappler commented 11 months ago

That passed; now we just need an owner (@lheagy) to review the Pull Request. A lot of lessons learned. FWIW, in future, when we do depend on dask (as the earthscope_tests branch actually does), we'll need a scalable solution, such as instructions for changing the memory allocation for docker on conda-forge.

kkappler commented 6 days ago

@kujaku11

I'm working on the 0.3.14 release -- here is a screengrab of testpypi.

I noticed that the badges on the page are showing 0.3.13 (not 0.3.14) -- do you know how to update these? Or maybe they will update on the actual release...

[screenshot: TestPyPI page]

kujaku11 commented 6 days ago

They will update once uploaded to PyPI, after a few minutes.