resonai / ybt

Yet another Build Tool
Apache License 2.0

Handling python tests dependencies the same way as our generated code #104

Open dlehavi-resonai opened 5 years ago

dlehavi-resonai commented 5 years ago

Our python tests import a lot of modules, which we typically do not need in either the run or the build environment. Our current solution is to have a dedicated docker image (python test base) which we manually update, which is horrible.

After a discussion with @aviresonai, we suggest adapting the solution we use for generated code: exposing certain directories, containing the needed libraries, to the build-environment docker.

This means we will need to install packages into a custom prefix (see https://stackoverflow.com/questions/2915471/install-a-python-package-into-a-different-directory-using-pip and https://stackoverflow.com/questions/5226311/installing-specific-package-versions-with-pip ): pip install --install-option="--prefix=$PREFIX_PATH" package_name==
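As a side note, installing under a prefix changes where the packages land, so the test runner has to know the resulting site-packages path. A minimal sketch of computing that path with the stdlib, assuming a hypothetical prefix of /opt/pydeps (newer pip versions also accept `--prefix` or `--target` directly, without `--install-option`):

```python
import sysconfig

# Where `pip install --install-option="--prefix=/opt/pydeps" ...` would
# place pure-python packages; /opt/pydeps is an assumed example prefix.
site_pkgs = sysconfig.get_path(
    "purelib", vars={"base": "/opt/pydeps", "platbase": "/opt/pydeps"}
)
print(site_pkgs)  # e.g. /opt/pydeps/lib/python3.X/site-packages on Linux
```

This is the directory that would then be exposed to the build-environment docker and put on PYTHONPATH.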

This also means that python tests will have to declare which libraries they need.

dlehavi-resonai commented 5 years ago

From an implementation POV, the closest existing function to what I suggest here is, I think, link_cpp_artifacts. Only instead of placing soft links, we have two options

itamaro commented 5 years ago

I think the issue is more basic, and applies to any kind of builder that has a "testenv_image" / "buildenv_image" property: there's no easy way to distinguish between deps that go into the artifact of that target and deps that need to be added to the "[build/test]env_image" in order to build/test the target.
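To illustrate the distinction, a purely hypothetical target definition (not actual ybt syntax) might separate the two kinds of deps into different fields:

```python
# Hypothetical sketch only -- field names are invented for illustration,
# not taken from ybt's actual build-file schema.
target = {
    "name": "my_python_test",
    "deps": ["//lib:mylib"],        # deps that go into the target's artifact
    "testenv_deps": ["somepkg"],    # deps needed only inside the testenv image
}
print(target["testenv_deps"])
```

Under such a split, only "testenv_deps" would be installed into (or exposed to) the [build/test]env_image.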

If we solve this issue in a more robust way, I think the reported behavior will also be resolved, and the suggested fix will not be necessary.

WDYT?

dlehavi-resonai commented 5 years ago

Here is the way I see it:

itamaro commented 5 years ago

Discussed; agreed on a solution that should work as long as we don't have version conflicts (pip install the packages under yabtwork, bind-mount that directory, and append it to PYTHONPATH).
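The agreed flow can be sketched end-to-end. Here the pip-install step is simulated by writing a module file into a temp directory (standing in for the yabtwork subdirectory that would be bind-mounted), and the PYTHONPATH append is done via sys.path, which is what PYTHONPATH feeds into; the module name fake_dep is invented for the demo:

```python
import os
import sys
import tempfile

# Stand-in for a <yabtwork>/pypi-deps directory (hypothetical path name).
target_dir = tempfile.mkdtemp(prefix="pypi-deps-")

# Simulate `pip install --target <target_dir> fake_dep` by dropping a
# trivial module there instead of hitting the network.
with open(os.path.join(target_dir, "fake_dep.py"), "w") as f:
    f.write("VERSION = '1.0'\n")

# Inside the container, the dir is bind-mounted and appended to PYTHONPATH;
# appending to sys.path is the in-process equivalent.
sys.path.append(target_dir)

import fake_dep
print(fake_dep.VERSION)  # -> 1.0
```

Note the caveat from the discussion: because everything shares one flat directory on one path entry, two tests pinning different versions of the same package would conflict.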