catalystneuro / buzsaki-lab-to-nwb


Update contents of `make_env.yml` to match `requirements.txt` #40

Closed: h-mayorquin closed this issue 2 years ago

h-mayorquin commented 2 years ago

The contents of `make_env.yml` seem to be outdated and out of sync with `requirements.txt`:

make_env.yml https://github.com/catalystneuro/buzsaki-lab-to-nwb/blob/7e7c9c2f7469c652b3de3f1d6bc2ffe06b24d426/make_env.yml#L1-L21

requirements.txt https://github.com/catalystneuro/buzsaki-lab-to-nwb/blob/7e7c9c2f7469c652b3de3f1d6bc2ffe06b24d426/requirements.txt#L1-L19

Is there a reason for this mismatch? I intend to update the `make_env.yml` file to make it match, as I have been using it on DANDI to run our conversion.
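
For reference, here is a minimal sketch of what a synchronized env file could look like, assuming conda's support for passing a `-r requirements.txt` line through the `pip:` section (the file name, channel, and contents below are illustrative, not the repo's actual files):

```bash
# Illustrative only: an env file that defers its pip packages to
# requirements.txt, so the two specifications cannot drift apart
cat > make_env_example.yml << 'EOF'
name: buzsaki-lab-to-nwb
channels:
  - conda-forge
dependencies:
  - python=3.7
  - pip
  - pip:
    - -r requirements.txt
EOF
conda env create -f make_env_example.yml
```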

Can I move forward with this @CodyCBakerPhD ?

CodyCBakerPhD commented 2 years ago

I've never used `make_env.yml` myself, so go right ahead if you want to synchronize the two.

> as I have been using it on DANDI to run our conversion.

Not sure what you mean by this, though. You should just be able to clone the repo, `pip install -e buzsaki-lab-to-nwb` in a fresh environment, and run it from there. A full environment specification is more for communicating Python versions and/or conda vs. pip install sources.

h-mayorquin commented 2 years ago

`pip install -e buzsaki-lab-to-nwb` is failing on DANDI for me and producing the following error:

ERROR: No matching distribution found for scipy==1.4.1

CodyCBakerPhD commented 2 years ago

The following just worked fine for me on the Hub

```bash
# log in
bash
conda create -n test_buz_install python=3.7
git clone https://github.com/catalystneuro/buzsaki-lab-to-nwb
pip install -e buzsaki-lab-to-nwb
```

Do make sure you always have the intended python version and a fresh environment.

When debugging matching-distribution errors, it's also helpful to check PyPI itself for the published version history: https://pypi.org/project/scipy/#history
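
For example, a few quick checks from the terminal (a sketch; `pip index versions` needs pip >= 21.2, and the PyPI JSON endpoint is the generic per-release API, nothing specific to this repo):

```bash
# Confirm which interpreter and pip the environment is actually using
python --version
pip --version

# List the scipy releases this pip/Python combination can see (pip >= 21.2)
pip index versions scipy

# Check the declared Python requirement of the pinned release via PyPI's JSON API
curl -s https://pypi.org/pypi/scipy/1.4.1/json \
  | python -c "import json, sys; print(json.load(sys.stdin)['info']['requires_python'])"
```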

CodyCBakerPhD commented 2 years ago

Also, if all else fails in these situations, you can always match the installation steps of the continuous integration test that routinely verifies the package is installable in this fashion: https://github.com/catalystneuro/buzsaki-lab-to-nwb/blob/master/.github/workflows/install.yml
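
To see exactly what that workflow runs without leaving the terminal, you can fetch the raw file (this is the standard raw.githubusercontent.com form of the path linked above, assuming the file still lives there on master):

```bash
# Print the CI install workflow to see which Python version and install
# commands it uses
curl -sL https://raw.githubusercontent.com/catalystneuro/buzsaki-lab-to-nwb/master/.github/workflows/install.yml
```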

bendichter commented 2 years ago

@CodyCBakerPhD can we bump the scipy dependency version? I don't think the current pin is compatible with the latest versions of Python.

CodyCBakerPhD commented 2 years ago

@bendichter Sure, I can just update everything to Python 3.9 and include looser versioning on the central tools as well. I'll try to get that out tomorrow.

bendichter commented 2 years ago

@CodyCBakerPhD See my PR here

h-mayorquin commented 2 years ago

> The following just worked fine for me on the Hub
>
> ```bash
> # log in
> bash
> conda create -n test_buz_install python=3.7
> git clone https://github.com/catalystneuro/buzsaki-lab-to-nwb
> pip install -e buzsaki-lab-to-nwb
> ```
>
> Do make sure you always have the intended python version and a fresh environment.
>
> When debugging matching-distribution errors, it's also helpful to check PyPI itself for the published version history: https://pypi.org/project/scipy/#history

Indeed, as per Ben's comment, it was the version of Python causing the problem.

Question: the latest version of nwb-conversion-tools is one patch ahead of our requirements here (the requirements pin `nwb-conversion-tools==0.9.3`, whereas the latest release is `nwb-conversion-tools==0.9.4`). How do we handle these issues? When do we align the requirements? With minor releases?

CodyCBakerPhD commented 2 years ago

> Question: the latest version of nwb-conversion-tools is one patch ahead of our requirements here (the requirements pin `nwb-conversion-tools==0.9.3`, whereas the latest release is `nwb-conversion-tools==0.9.4`). How do we handle these issues? When do we align the requirements? With minor releases?

There shouldn't be any major breaks in backward compatibility until larger version increments like v0.10.0, so until then we keep the versions in a repo like this pinned as they are. I guess we really should make separate requirements for each conversion, though, unless we want to go back and update previous conversions with new API changes. For all intents and purposes, you can always just use the master branch of nwb-conversion-tools and you'll be fine.
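
Concretely, two options (a sketch; the `~=` operator is standard PEP 440, while the git URL assumes the package's source lives under the catalystneuro org, so adjust it if the project is hosted elsewhere):

```bash
# Option 1: compatible-release pin - accepts 0.9.x patch releases such as
# 0.9.4, but not 0.10.0
pip install "nwb-conversion-tools~=0.9.3"

# Option 2: track the development branch directly (repo URL is an assumption)
pip install "git+https://github.com/catalystneuro/nwb-conversion-tools.git@master"
```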

CodyCBakerPhD commented 2 years ago

@h-mayorquin Can you make a PR for this? It should be really easy; I know you were already more or less doing this locally on your end.