ipocentro87 opened 1 month ago
Why was it killed? It's been known to run for a while.
Also, if this doesn't work for you, there is a file at https://github.com/sparcians/map/blob/master/scripts/rendered_safe_environment.yaml that you can try directly.
We've been having trouble with a safe rendering of conda. Thinking about going with something else.
conda env create -f scripts/rendered_safe_environment.yaml
Hi @klingaard, thanks for your reply. I don't know, actually; this is what the prompt is giving me.
Out of curiosity, is conda used only for the visualisation tools, or is it also used by the C++ framework?
By the way, I get the following error when I try to run it with the safe environment. Here's the log:
> conda env create -f scripts/rendered_safe_environment.yaml;
/root/miniconda3/lib/python3.12/argparse.py:2000: FutureWarning: `remote_definition` is deprecated and will be removed in 25.9. Use `conda env create --file=URL` instead.
action(self, namespace, argument_values, option_string)
/root/miniconda3/lib/python3.12/site-packages/conda/base/context.py:982: FutureWarning: Adding 'defaults' to the channel list implicitly is deprecated and will be removed in 25.3.
To remove this warning, please choose a default channel explicitly via 'conda config --add channels <name>', e.g. 'conda config --add channels defaults'.
deprecated.topic(
/root/miniconda3/lib/python3.12/site-packages/conda/base/context.py:982: FutureWarning: Adding 'defaults' to the channel list implicitly is deprecated and will be removed in 25.3.
To remove this warning, please choose a default channel explicitly via 'conda config --add channels <name>', e.g. 'conda config --add channels defaults'.
deprecated.topic(
Channels:
- conda-forge
- defaults
Platform: linux-64
Collecting package metadata (repodata.json): done
Solving environment: \ Killed
The log doesn't show an explicit reason why the command failed. If you have any clue about how I can debug this, it would be very helpful!
When it says "Killed" is that you or did it kill itself?
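For what it's worth, a bare "Killed" on Linux usually points at the kernel OOM killer rather than the user, since the conda solver can need several GB of RAM. A small diagnostic sketch (the dmesg check has to run on the host and may need root):

```shell
# A process terminated by SIGKILL (the signal the OOM killer sends) reports
# exit status 137 (128 + 9); an OOM-killed solver exits the same way.
sh -c 'kill -KILL $$'
echo "exit status: $?"   # prints "exit status: 137"

# On the host (kernel logs are usually not visible from inside a container):
# dmesg -T | grep -iE 'killed process|out of memory'
```

So checking the exit status of the failed conda command (echo $? right after it dies) is a quick way to tell whether it was killed from outside.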
We're trying to use conda to provide a known/working environment for sparta's C++ needs plus the visualization tools. Truth be told, the list of requirements for just sparta is small; you can find it here: https://github.com/sparcians/map/blob/master/sparta/cmake/sparta-config.cmake
I finally managed to build sparta through Docker. The Docker image needed more GB of RAM.
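For anyone hitting the same wall, it can help to confirm how much memory the build environment actually has before retrying. A minimal sketch (the --memory flag exists on both docker build and podman build; the image tag is a placeholder):

```shell
# Host memory as reported by /proc/meminfo. Note: inside a container this
# shows the host's total, not the container's cgroup limit.
grep MemTotal /proc/meminfo

# The container's actual limit, if any, lives in the cgroup files,
# e.g. (path varies between cgroup v1 and v2):
# cat /sys/fs/cgroup/memory.max

# Hypothetical rebuild with a larger memory limit:
# podman build --memory=8g -t sparta-build .
```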
I still have problems with podman build, though; the compilation seems to hang here:
-- Found Boost: /root/miniconda3/envs/sparta/lib/cmake/Boost-1.78.0/BoostConfig.cmake (found suitable version "1.78.0", minimum required is "1.74.0") found components: date_time iostreams serialization timer program_options
-- Using BOOST 1.78.0
-- Using YAML CPP 0.8.0
-- Using RapidJSON CPP 1.1.0
Any feedback would be very helpful 🙏
BTW, I was wondering whether linux/aarch64 is supported. In the list of supported environments I only see arm64 support on macOS, but not Linux.
I unfortunately don't have much experience with container builds, so I can't help you with your specific container (unless you want to share it).
Is cmake hanging? This is the expected output past the RapidJSON check:
-- Using BOOST 1.78.0
-- Using YAML CPP 0.8.0
-- Using RapidJSON CPP 1.1.0
-- Using SQLite3 3.46.0
-- Using zlib 1.3.1
-- Using HDF5 1.14.3
... more stuff related to my machine
> I was wondering whether linux/aarch64 is supported
Well, it's not not supported. :wink: We've never tried, but I don't expect any issues.
I can confirm that the compilation hangs here (with cmake <...> --trace) on a linux/amd64 podman container:
> mkdir release && cd release
> cmake -DCMAKE_BUILD_TYPE=Release .. --trace
...
-- Using RapidJSON CPP 1.1.0
...
/root/miniconda3/envs/sparta/share/cmake-3.30/Modules/FindPkgConfig.cmake(145): execute_process(COMMAND ${PKG_CONFIG_EXECUTABLE} ${PKG_CONFIG_ARGN} --libs-only-l sqlite3 OUTPUT_VARIABLE _pkgconfig_invoke_result RESULT_VARIABLE _pkgconfig_failed OUTPUT_STRIP_TRAILING_WHITESPACE )
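Since the trace stops inside FindPkgConfig while it queries sqlite3, one thing worth trying (just a debugging sketch; the 10-second bound is arbitrary) is to run the same pkg-config call by hand and see whether pkg-config itself is what hangs:

```shell
# Same query cmake issues in the trace above, bounded by a timeout:
timeout 10 pkg-config --libs-only-l sqlite3
echo "pkg-config exit status: $?"   # 124 = timed out (i.e. it hangs), 127 = pkg-config not found
```

If it returns promptly, the hang is more likely in cmake's handling of the result than in pkg-config itself.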
With linux/arm64, I had a problem with wxpython not being compiled for this platform. I did a custom build this way:
pip install grayskull
grayskull pypi wxpython
conda-build wxpython
However, I ran into another issue with package dependencies. Basically it wanted 3.12.* *_cpython, but the yaml platform configuration has 3.10.* *_cpython. I tried to change this, but I ended up in another packaging problem, so I gave up for now.
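In case anyone retries this: the mismatch above suggests pinning the interpreter in the environment yaml to match the locally built package. A hypothetical fragment (the exact version is an assumption and depends on which Python your wxpython build targeted):

```yaml
dependencies:
  - python=3.12.*   # must match the *_cpython build string of the local wxpython package
```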
I can share the Dockerfile with you and give you instructions on how to build it with podman, if you are curious :)
Hi all, I've been trying to build Sparta in a Docker image emulating a Debian Linux x86_64 machine. After installing the requirements, I am stuck here:
Output (collapsed)
Do you have an idea on how to fix this? Thanks in advance.