wilson-eft / wilson

A Python package for the running and matching of Wilson coefficients above and below the electroweak scale
https://wilson-eft.github.io
MIT License
26 stars 19 forks

EFT WET not defined #75

Closed klieret closed 2 years ago

klieret commented 2 years ago

Hi @jasonaebischerGIT ,

since the latest release, ClusterKinG testing fails with:

Traceback (most recent call last):
  File "/opt/hostedtoolcache/Python/3.8.11/x64/lib/python3.8/site-packages/wilson/wcxf/classes.py", line 371, in validate
    eft_instance = EFT[self.eft]
  File "/opt/hostedtoolcache/Python/3.8.11/x64/lib/python3.8/site-packages/wilson/wcxf/classes.py", line 115, in __getitem__
    return cls.get_instance(item)
  File "/opt/hostedtoolcache/Python/3.8.11/x64/lib/python3.8/site-packages/wilson/wcxf/classes.py", line 146, in get_instance
    return cls.instances[_name]
AttributeError: type object 'EFT' has no attribute 'instances'

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/opt/hostedtoolcache/Python/3.8.11/x64/lib/python3.8/multiprocessing/pool.py", line 125, in worker
    result = (True, func(*args, **kwds))
  File "/home/runner/work/clusterking/clusterking/clusterking/scan/scanner.py", line 69, in calc
    spoint = self._prepare_spoint(spoint)
  File "/home/runner/work/clusterking/clusterking/clusterking/scan/wilsonscanner.py", line 24, in _prepare_spoint
    return wilson.Wilson(
  File "/opt/hostedtoolcache/Python/3.8.11/x64/lib/python3.8/site-packages/wilson/classes.py", line 124, in __init__
    self.wc.validate()
  File "/opt/hostedtoolcache/Python/3.8.11/x64/lib/python3.8/site-packages/wilson/wcxf/classes.py", line 373, in validate
    raise ValueError("EFT {} not defined".format(self.eft))
ValueError: EFT WET not defined

see here.

It's not apparent from the latest release notes that any breaking changes were introduced, so this looks like a bug in wilson to me.

Cheers, Kilian

DavidMStraub commented 2 years ago

Hi Kilian, you are right. It seems the new deploy pipeline (#73) generates a PyPI package that does not include the wcxf/bases git submodule.

@dvandyk @jasonaebischerGIT

DavidMStraub commented 2 years ago

It does say with: submodules: true here https://github.com/wilson-eft/wilson/blob/master/.github/workflows/check%2Bdeploy.yaml#L14-L18, so I have no idea why that doesn't work...
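For reference, the relevant step presumably looks something like this (a sketch of an actions/checkout configuration, not copied verbatim from the repository):

```yaml
# Sketch: a checkout step that also fetches git submodules,
# which is what wcxf/bases needs to end up in the build.
- uses: actions/checkout@v2
  with:
    submodules: true
```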

klieret commented 2 years ago

Hmm, according to the log from the deploy, the submodule was checked out

dvandyk commented 2 years ago

I upload a zip of all generated distributions as a GitHub Actions "artifact". For the new version, the artifact is available here: https://github.com/wilson-eft/wilson/suites/3444819328/artifacts/81704384

Checking the tarball (wilson-2.2.tar.gz), I find that the bases are present! Checking the wheel (wilson-2.2-py3-none-any.whl), I also find that the bases are present.
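For anyone who wants to reproduce this check locally: a wheel is just a zip archive, so its contents can be listed programmatically. A sketch (the file names and the prefix below are illustrative):

```python
import zipfile

def packaged_files(wheel_path, prefix):
    """List the files under a given prefix inside a wheel.

    A wheel is a plain zip archive, so we can simply enumerate
    its member names and filter by prefix.
    """
    with zipfile.ZipFile(wheel_path) as wheel:
        return [name for name in wheel.namelist() if name.startswith(prefix)]

# Illustrative usage (adjust path and prefix to the release being checked):
# packaged_files("wilson-2.2-py3-none-any.whl", "wilson/wcxf/bases/")
```

An empty result for the bases prefix would indicate the submodule data was not packaged.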

dvandyk commented 2 years ago

So, I don't know who uploaded the release to PyPI, but it was not the new GitHub workflow.

a) There is only a single tarball on PyPI, but no wheel; however, the workflow built both.
b) There is no GitHub release, which is what triggers the upload to PyPI.
c) The workflow log lists the "upload to PyPI" step as not executed.

In short, not my fault!

DavidMStraub commented 2 years ago

I think there is a problem with the workflow. It only triggers on pushes, but then checks for the presence of a tag. This will not work, though, when doing the usual thing on GitHub: pushing a final commit and then creating a release via the web interface.

I guess the workflow is missing something like

on:
  release:
    types:
      - created

(please check, I'm just guessing.)
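An alternative, equally a guess, would be to trigger directly on version tags, matching the existing tag check:

```yaml
# Hypothetical alternative: run the workflow whenever a tag
# starting with "v" is pushed, independent of GitHub releases.
on:
  push:
    tags:
      - 'v*'
```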

@jasonaebischerGIT, did you push the release to PyPI manually with twine?

dvandyk commented 2 years ago

I think there is a problem with the workflow. It only triggers on pushes, but then checks for the presence of a tag. This will not work, though, when doing the usual thing on GitHub: pushing a final commit and then creating a release via the web interface.

I guess the workflow is missing something like

on:
  release:
    types:
      - created

(please check, I'm just guessing.)

Ah, my bad. I have a different workflow. I use

@DavidMStraub I will check your suggested fix.

jasonaebischerGIT commented 2 years ago

@DavidMStraub Yes, I pushed the release manually with twine. From your discussion I understand that this is not the right way to do it? How should it be done correctly?

DavidMStraub commented 2 years ago

Hi, it used to be the right way to do it, but Danny's new workflow should in principle do it automatically (only that it didn't).

jasonaebischerGIT commented 2 years ago

Oh, I see. Thanks for the explanation

klieret commented 2 years ago

I'm also a bit confused by your workflow. Do you really wish to release on every push?

Aside from how you release, it also pays off to add tests that install the released package and then run the unit tests. These would also fail if you, e.g., forget to bundle a data file in the package, or if something else goes wrong during packaging/release/installation.

I currently have the following setup:

  1. Normal testing builds the package in one job, uploads it as an artifact, then downloads the package artifact and installs it under a variety of OSes. This tests all of the packaging; only the upload to PyPI isn't tested.
  2. Release testing uploads to TestPyPI and installs from there, again testing various OSes. It is manually run (it can only be run once per version number). After the run you can also check TestPyPI to see, e.g., that the README is rendered properly.
  3. Releasing. Almost a copy-paste of release testing (maybe there are ways to reduce the code duplication?), except that it uploads to the real PyPI. Also manually run (it can only be run once per version number).
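A minimal sketch of step 1 as a GitHub Actions workflow (job names, action versions, and the test command are illustrative, not taken from any of the repositories discussed here):

```yaml
# Sketch: build the package once, then install and test the built
# artifact on several OSes, so the packaging itself is exercised.
jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v2
        with:
          submodules: true
      - run: python -m build              # builds sdist and wheel into dist/
      - uses: actions/upload-artifact@v2
        with:
          name: dist
          path: dist/
  install-and-test:
    needs: build
    strategy:
      matrix:
        os: [ubuntu-latest, macos-latest, windows-latest]
    runs-on: ${{ matrix.os }}
    steps:
      - uses: actions/download-artifact@v2
        with:
          name: dist
          path: dist
      - run: pip install dist/*.whl       # install the wheel, not the source tree
      - run: pytest --pyargs wilson       # run the packaged tests "out of tree"
```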

MJKirk commented 2 years ago

Just to +1 this issue, the new wilson version also breaks new installs of flavio and smelli, so it would be good to get a fix.

klieret commented 2 years ago

How about you yank the release for now? This can be done as follows: log in to PyPI, go to your projects, and click on "Manage". Then click on "Options" for the broken release and hit "Yank".

What's a "yanked" release? A yanked release is a release that is always ignored by an installer, unless it is the only release that matches a version specifier (using either == or ===). See PEP 592 for more information.

This would avoid the trouble with all packages that depend on wilson because they wouldn't update automatically.

dvandyk commented 2 years ago

I'm also a bit confused by your workflow. Do you really wish to release on every push?

No, and the workflow doesn't do that. The step 'Upload to PyPI' is only run if: ${{ startsWith(github.ref, 'refs/tags/v') }}, i.e., if the tag starts with a v.
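That is, the step is guarded by a condition roughly along these lines (paraphrased, not copied verbatim from the workflow):

```yaml
# Sketch: the upload step only runs for refs of the form refs/tags/v*,
# i.e. pushed tags whose name starts with "v".
- name: Upload to PyPI
  if: ${{ startsWith(github.ref, 'refs/tags/v') }}
  run: twine upload dist/*
```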

Asides from how you release, it also pays off to put additional tests that install the released package and then run the unittests. This would also fail if you e.g., forget to bundle a data file in the package or something else goes wrong during the packaging/release/installation.

I have little experience with Python unit tests. How do you force nose to run the tests "out of tree", as it were? A quick Google search did not yield meaningful results, and I see that your example uses pytest instead.

I currently have the following setup:

  1. Normal testing builds the package in one job, uploads it as an artifact, then downloads the package artifact and installs it under a variety of OSes. This tests all of the packaging; only the upload to PyPI isn't tested.

Since we build a py3-none-any wheel, using a variety of OSes would test PyPI/pip more than wilson. However, I agree that it's a good idea to test the installation of the wheel, and this should be done prior to uploading to PyPI. Added.

  1. Release testing uploads to TestPyPI and installs from there, again testing various OSes. It is manually run (it can only be run once per version number). After the run you can also check TestPyPI to see, e.g., that the README is rendered properly.

The idea of the workflow was to avoid having to manually deploy or manually run any aspect of the test/deployment.

klieret commented 2 years ago

No, and the workflow doesn't do that. The step 'Upload to PyPI' is only run

Oh, overlooked that :sweat_smile:

Since we build a py3-none-any wheel, using a variety of OSes would test PyPI/pip more than wilson.

No, this would build the package once and then install and test it under different OSes. This would, for example, catch issues where paths are hard-coded with slashes (which would fail on Windows) or other file-name subtleties. Admittedly, it's rare that this catches additional bugs, but it has happened to me a couple of times with OS-dependent things.

Not an expert on nose unfortunately.

DavidMStraub commented 2 years ago

Not an expert on nose unfortunately.

I'd recommend using pytest nowadays. nose is no longer being maintained, and nose2 is much less popular/active IMO.
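As for running the tests "out of tree" with pytest: assuming the test files are shipped inside the package, the --pyargs option resolves its argument as an importable package rather than a file-system path, so the tests of the installed copy are run instead of the source tree:

```shell
# Install the released package from PyPI, then run its bundled tests
# against the installed copy ("out of tree"), not the checkout.
pip install wilson
python -m pytest --pyargs wilson
```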