fusion-flap / flap

Fusion Library of Analysis Programs
MIT License

Upgrading our dependencies #50

Closed Csega closed 3 years ago

Csega commented 4 years ago

I just took a look at the conda_setup.yml file and noticed that, for example, our numpy dependency is a year old. Since then two new numpy release series have come out: we have 1.15.4 (released 4 November 2018), while the latest is 1.17.3 (released 17 October 2019).

That means the last release on the branch we are on (1.15) was almost a year ago! Even on Anaconda Cloud the latest version is 1.16.5 (28 August 2019) for Windows and 1.17.2 (7 September 2019) for Linux.

And this is just one example; almost all of our dependencies are moving ahead of us.

I talked this over with @mvecsei, and our suggestion is that we review our dependencies at least once a year. Otherwise we can easily slip into the trap of supporting an old version from which it is really painful to upgrade to the newest one, whereas right now the upgrade would only need some minor modifications, worst case. And if it turns out that a newer version works and the old versions are still fine, we simply widen the range of accepted versions.

This would force us to keep all our tests up to date (which is a good thing) and would let us use new features from the dependencies whenever we want to.

thelampire commented 4 years ago

I should have answered this issue, or at least reacted to it, earlier. I am not using conda_setup.yml at all. I looked at it a while ago and thought it was important, but everything has been running without setting up anything. I hope I have not used anything from numpy or any other library that is not in conda_setup.yml. In my opinion, this file should be tracked and maintained if we release a conda package. I think that would still be a great option at some point, especially once the package reaches a state where it can be handed over to students, who could then get it with conda install. Then there would be no need to set up PYTHONPATH or any other config file (except the single one for the machine-related options).

Csega commented 4 years ago

Actually, we could also keep a requirements.txt, which lets pip determine the necessary package versions. So tracking the versions we use (and the versions we could use) is not a bad idea.
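
As a rough illustration only, such a file could look something like the sketch below; the version bounds are placeholders based on the releases mentioned in this issue, not an agreed or tested set.

```
# requirements.txt -- illustrative sketch; the bounds are placeholders based on the
# releases discussed in this issue, not a tested configuration.
numpy>=1.15.4,<1.18
scipy
matplotlib
```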

mvecsei commented 4 years ago

I also think that we should refresh the conda dependencies, as e.g. the matplotlib version is quite outdated. Originally, the library version numbers were chosen so that every library was compatible with the others on both Unix and Windows systems and a conda package was available for that version on both OSs. So even back then, not all of the library versions were the latest releases. First we would need to find a proper configuration. We also need to check how our scripts behave with the different library versions. I have been writing test functions for all of my code, so I can check my own contribution, but we should verify that every module works properly before deciding which library versions we are updating to.

Csega commented 4 years ago

In my opinion, this means that we should organize a coding camp where we write test functions for all the existing functionality and review the existing test functions, because they may be outdated. What is your opinion on this?

sandorzoletnik commented 4 years ago

I agree we could set up an automated test procedure.
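
As a structural sketch only: if the checks were split into small pytest-style functions, each module could get a self-contained regression test that is easy to rerun after a dependency upgrade. The example below uses only numpy so that it runs as-is; real tests would call the FLAP routines under test, whose names are deliberately not guessed here.

```python
# test_spectrum_example.py -- structural sketch of a small regression test,
# assuming pytest as the runner; a real test would exercise a FLAP module instead.
import numpy as np
import pytest


def test_power_spectrum_peak():
    # Synthetic 10 Hz sine sampled at 1 kHz; the spectral peak must stay at 10 Hz
    # regardless of which numpy version the environment provides.
    fs = 1000.0
    t = np.arange(0.0, 1.0, 1.0 / fs)
    sig = np.sin(2 * np.pi * 10.0 * t)

    freqs = np.fft.rfftfreq(sig.size, d=1.0 / fs)
    spectrum = np.abs(np.fft.rfft(sig))

    assert freqs[np.argmax(spectrum)] == pytest.approx(10.0, abs=1.0)
```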

thelampire commented 4 years ago

I have actually never used the file that pins the right versions (conda_setup.yml). In my experience so far, the packages we use (numpy, matplotlib, scipy, etc.) are updated in a backwards-compatible way. When there is a backwards-compatibility issue, or one is expected, they state it clearly in a warning message, so developers can prepare well ahead. I don't think we should over-engineer our package. An automated test procedure for every code block would probably mean hours of coding just to make sure one small module works correctly under multiple versions of its dependencies. Instead, I think we should focus on the other issues that still exist with FLAP.

mvecsei commented 4 years ago

I have recently started using a new conda environment: conda_setup.zip (I cannot upload the yml file directly). It seems to be working for my purposes. Would anyone care to start checking it on their system? It uses the latest available packages conda could find.
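
For orientation only (the attached file is not reproduced here), an updated conda_setup.yml restricted to the libraries named in this thread might look roughly like the following; the version bounds and the environment name are illustrative, not the actual contents of the zip.

```yaml
# conda_setup.yml -- illustrative sketch only, not the attached environment.
# The lower bounds follow the release series mentioned earlier in this issue,
# and the environment name is a placeholder.
name: flap
channels:
  - defaults
dependencies:
  - python>=3.6
  - numpy>=1.16
  - scipy
  - matplotlib
```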

sandorzoletnik commented 4 years ago

Have you run the test programs? At present flap_test.py is the only test program, and it has a lot of steps. If it runs through, the new environment should be OK.
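
If we ever want this check to happen automatically, one possible setup (untested, and assuming GitHub Actions with conda_setup.yml and flap_test.py in the repository root) would be a workflow that builds the conda environment and runs the test script on both operating systems.

```yaml
# .github/workflows/test.yml -- illustrative sketch assuming GitHub Actions;
# file locations and the environment name are assumptions based on this thread.
name: flap tests
on: [push, pull_request]
jobs:
  run-flap-test:
    strategy:
      matrix:
        os: [ubuntu-latest, windows-latest]
    runs-on: ${{ matrix.os }}
    steps:
      - uses: actions/checkout@v2
      - uses: conda-incubator/setup-miniconda@v2
        with:
          environment-file: conda_setup.yml
          activate-environment: flap
      - name: Run flap_test.py
        shell: bash -l {0}
        run: python flap_test.py
```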

mvecsei commented 4 years ago

Yes, it seems to work both on Windows and Ubuntu.

I noticed that at some point opencv was added to the conda_setup.yml file without being checked on Ubuntu. Getting that package to work properly on different systems is not straightforward: on Ubuntu I could only get it working on my own PC, and I do not even understand how exactly I did it (I cannot get it to work on our server). I think we should tread carefully when adding elaborate libraries, as I would not expect all functionalities to be fully cross-platform by default. For example, one can also create and save animations with matplotlib; I would only use opencv if we were actually doing computer vision or machine learning.
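
For the animation use case specifically, a matplotlib-only route (no opencv) could look roughly like the sketch below; the Pillow GIF writer is used because it needs no external encoder, though ffmpeg would also work where available.

```python
# Sketch of saving an animation with matplotlib alone (no opencv).
# The Pillow writer keeps it cross-platform without extra system packages.
import numpy as np
import matplotlib.pyplot as plt
from matplotlib.animation import FuncAnimation

fig, ax = plt.subplots()
x = np.linspace(0, 2 * np.pi, 200)
line, = ax.plot(x, np.sin(x))

def update(frame):
    # Shift the phase of the sine wave on every frame.
    line.set_ydata(np.sin(x + 0.1 * frame))
    return (line,)

anim = FuncAnimation(fig, update, frames=100, interval=50, blit=True)
anim.save("demo.gif", writer="pillow", fps=20)
```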