ai2cm / fv3net

explore the FV3 data for parameterization

`make lock_pip` and dataflow-versions.txt conflicts #1939

Open · frodre opened this issue 2 years ago

frodre commented 2 years ago

I was trying to recreate the constraints file with some additional packages for a PyTorch environment and ran into incompatibilities between our pip requirements and the pinned versions from the Beam image (.dataflow-versions.txt).

What's the protocol for deciding which version is correct?

Overall, it seems like pip-compile doesn't take the more flexible constraints from the other input files into account when they conflict with an exact pin.

Steps to reproduce the problem:

  1. `rm constraints.txt`
  2. `make lock_pip`
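
For context, a rough sketch of what the `lock_pip` target presumably runs; the actual Makefile recipe and flags may differ, and the input list here is inferred from the resolver output quoted below:

```python
# Hypothetical sketch of the lock step -- not the repo's actual Makefile recipe.
# pip-compile resolves every input together, so an exact pin in
# .dataflow-versions.txt can clash with a looser range from another file.
import subprocess

subprocess.run(
    [
        "python", "-m", "piptools", "compile",
        "pip-requirements.txt",
        ".dataflow-versions.txt",
        "external/vcm/setup.py",
        "external/fv3fit/setup.py",
        "--output-file", "constraints.txt",
    ],
    check=True,
)
```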
frodre commented 2 years ago

google-cloud-core:

There are incompatible versions in the resolved dependencies:
  google-cloud-core==1.7.2 (from -r .dataflow-versions.txt (line 34))
  google-cloud-core<2.0dev,>=1.4.1 (from google-cloud-spanner==1.19.1->-r .dataflow-versions.txt (line 42))
  google-cloud-core<2,>=0.28.1 (from apache-beam==2.37.0->-r pip-requirements.txt (line 1))
  google-cloud-core<3.0dev,>=2.3.0 (from google-cloud-storage==2.4.0->vcm (external/vcm/setup.py))
  google-cloud-core<3.0.0dev,>=1.4.1 (from google-cloud-bigquery==2.32.0->-r .dataflow-versions.txt (line 31))
  google-cloud-core<2.0dev,>=1.4.1 (from google-cloud-bigtable==1.7.0->-r .dataflow-versions.txt (line 33))
  google-cloud-core<2.0dev,>=1.4.0 (from google-cloud-datastore==1.15.3->-r .dataflow-versions.txt (line 35))

Fixed by adjusting the external/vcm/setup.py requirement from `google-cloud-storage>=1.18.1` to `google-cloud-storage<2.1.0,>=1.18.1`.
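
In setup.py terms, the change amounts to adding an upper bound on the storage client. A minimal sketch, assuming a conventional setuptools layout; everything other than the google-cloud-storage specifier is illustrative:

```python
# Sketch of the external/vcm/setup.py change; fields other than the
# google-cloud-storage specifier are placeholders.
from setuptools import setup, find_packages

setup(
    name="vcm",
    packages=find_packages(),
    install_requires=[
        # was: "google-cloud-storage>=1.18.1"
        # capping the storage client lets the resolver pick a release whose
        # google-cloud-core requirement overlaps the pins coming from
        # .dataflow-versions.txt
        "google-cloud-storage<2.1.0,>=1.18.1",
        # ... other requirements unchanged
    ],
)
```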

frodre commented 2 years ago

tenacity:

There are incompatible versions in the resolved dependencies:
  tenacity==5.1.5 (from -r .dataflow-versions.txt (line 112))
  tenacity>=6.2.0 (from plotly==5.9.0->-r pip-requirements.txt (line 21))

Fixed by removing the tenacity pin from the .dataflow-versions.txt output.
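
One way to apply that kind of fix is to post-process the generated file and drop the offending exact pins. A hypothetical helper, not part of the repo, which also covers the packaging pin discussed in the next comment:

```python
# filter_pins.py -- hypothetical post-processing step, not part of fv3net.
# Drops exact pins from .dataflow-versions.txt that conflict with looser
# ranges elsewhere (e.g. plotly requires tenacity>=6.2.0, but the file pins
# tenacity==5.1.5).
from pathlib import Path

DROP = {"tenacity", "packaging"}  # pins removed by hand in this issue

path = Path(".dataflow-versions.txt")
kept = [
    line
    for line in path.read_text().splitlines()
    if line.split("==")[0].strip().lower() not in DROP
]
path.write_text("\n".join(kept) + "\n")
```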

frodre commented 2 years ago

packaging:

There are incompatible versions in the resolved dependencies:
  packaging==21.3 (from -r .dataflow-versions.txt (line 78))
  packaging (from h5netcdf==1.0.1->vcm (external/vcm/setup.py))
  packaging>=16.8 (from bokeh==2.4.3->wandb[media]==0.12.21->fv3fit (external/fv3fit/setup.py))
  packaging (from jupyterlab==3.4.3->-r pip-requirements.txt (line 53))
  packaging (from jupyter-server==1.18.1->jupyterlab==3.4.3->-r pip-requirements.txt (line 53))
  packaging (from pytest-mpl==0.16.0->-r pip-requirements.txt (line 57))
  packaging (from jupyterlab-server==2.15.0->jupyterlab==3.4.3->-r pip-requirements.txt (line 53))
  packaging (from holoviews==1.15.0->-r pip-requirements.txt (line 19))
  packaging>=14.1 (from streamlit==1.11.0->-r pip-requirements.txt (line 20))
  packaging<21.0,>=20.4 (from poetry==1.1.14->conda-lock==1.1.0->-r pip-requirements.txt (line 48))
  packaging>=20.0 (from xarray==2022.3.0->vcm (external/vcm/setup.py))
  packaging (from sphinx==5.0.2->-r pip-requirements.txt (line 7))
  packaging (from deprecation==2.1.0->-r .dataflow-versions.txt (line 14))
  packaging>=14.3 (from google-api-core[grpc,grpcgcp]==1.31.5->vcm (external/vcm/setup.py))
  packaging>=14.3 (from google-cloud-bigquery==2.32.0->-r .dataflow-versions.txt (line 31))
  packaging>=20.0 (from dask[array]==2022.7.0->vcm (external/vcm/setup.py))
  packaging>=19.0 (from build==0.8.0->pip-tools==6.8.0->-r pip-requirements.txt (line 49))
  packaging>=20.0 (from pooch==1.6.0->metpy==1.3.1->vcm (external/vcm/setup.py))
  packaging>=14 (from tox==3.25.1->-r pip-requirements.txt (line 50))
  packaging>=20.0 (from matplotlib==3.5.2->fv3fit (external/fv3fit/setup.py))
  packaging (from pytest==4.6.11->vcm (external/vcm/setup.py))

Fixed by removing the packaging pin from the .dataflow-versions.txt output.

frodre commented 2 years ago

Seems to work fine if I don't delete the constraints file before regenerating it. I'm guessing pip-compile just updates the minimal set of pins needed to accommodate the new requirements.