Hi, we would like to test AlphaPept for our QC on a Linux cluster (AMD64). The AlphaPept Docker container that caused the issue was built from this file (v0.5.0): https://github.com/MannLabs/alphapept/blob/master/Dockerfile_thermo

The AlphaPept version actually installed in the container was v0.5.2.

An exception occurred running AlphaPept version 0.5.2:

```
2024-04-02 09:47:27> Size check:
2024-04-02 09:47:27> Size of job (raw files) 0.97 Gb
2024-04-02 09:47:27> Required disk space for / - 0.97 Gb, Available 0.02 Gb.
2024-04-02 09:47:27> Not enough disk space for analysis. Please free disk space.
```
The actually available space is certainly large enough (tested in multiple locations), and the equivalent setup worked on Windows (without Docker).
I assume the free space is calculated at line 251 in the utils.py script (v0.5.0):

```python
free = psutil.disk_usage(base).free/1024**3
```
If I understand it correctly, the calculation is based on the drives of the individual raw files (line 234 in the utils.py script, v0.5.0):

```python
base_dirs = [os.path.splitdrive(os.path.abspath(_))[0] for _ in settings['experiment']['file_paths']]
```
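The reason this breaks on Linux can be shown directly: on POSIX systems `os.path.splitdrive()` always returns an empty string for the drive component, so the computed base path becomes `''` rather than a usable mount point. A minimal demonstration (using `posixpath` explicitly, which is what `os.path` resolves to on Linux):

```python
import posixpath  # os.path is an alias for posixpath on Linux

# splitdrive() on POSIX never finds a drive letter, so the drive
# component is always the empty string.
drive, tail = posixpath.splitdrive("/data/raw/sample1.raw")
print(repr(drive))  # → ''
print(repr(tail))   # → '/data/raw/sample1.raw'
```

On Windows the same call on a path like `C:\data\raw\sample1.raw` would return `('C:', '\\data\\raw\\sample1.raw')`, which is why the original code works there.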
This works nicely on Windows but seemingly causes problems in the current configuration on Linux (see, e.g., https://www.geeksforgeeks.org/python-os-path-splitdrive-method/).
I assume using `os.path.dirname` instead of `os.path.splitdrive` should work on both systems. We would be very happy if we could get it to run on our Linux cluster.
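A minimal sketch of what the suggested change could look like (a hypothetical helper, not AlphaPept's actual function; `shutil.disk_usage` from the standard library is used here as a stand-in for `psutil.disk_usage`, since both return a result with a `.free` attribute):

```python
import os
import shutil  # stdlib stand-in for psutil; shutil.disk_usage() also exposes .free

def free_space_gb(file_paths):
    """Free disk space (GB) for each directory containing a raw file.

    os.path.dirname() yields a real directory on both Windows and Linux,
    unlike os.path.splitdrive(), whose drive component is '' on POSIX.
    """
    base_dirs = {os.path.dirname(os.path.abspath(p)) for p in file_paths}
    return {base: shutil.disk_usage(base).free / 1024**3 for base in base_dirs}

# Example: free space of the directory that would hold a raw file.
print(free_space_gb([os.path.join(os.getcwd(), "sample1.raw")]))
```

Deduplicating the base directories first (the set comprehension) also avoids querying the same file system once per raw file when many files share a directory.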