I'm working on a service that uses your library, building an image based on mcr.microsoft.com/dotnet/aspnet:6.0.16-jammy.
My Dockerfile looks like this:
FROM mcr.microsoft.com/dotnet/aspnet:6.0.16-jammy AS base
RUN apt-get update -y && apt-get install python3 -y && apt-get install python3-pip -y
CMD ["python3"]
RUN pip install pyequilib
RUN pip install exif
RUN pip install torch torchvision torchaudio --no-cache-dir
RUN pip install laspy
RUN pip install requests
RUN pip install pyopencl
RUN pip install pyquaternion
RUN pip install simplejpeg
RUN pip install opencv-python
RUN pip install omegaconf
RUN pip install easydict
WORKDIR /app
COPY ./Schnider2 /app/Schnider2
WORKDIR /app/Schnider2/lama
RUN export TORCH_HOME=$(pwd) && export PYTHONPATH=$(pwd)
RUN pip install numpy
RUN pip install -r requirements.txt
WORKDIR /app
FROM base AS final
WORKDIR /source
WORKDIR /target
WORKDIR /app
COPY ./app .
ENTRYPOINT ["dotnet", "ProcessingWorkerService.dll"]
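One thing I noticed while writing this up: the `RUN export TORCH_HOME=$(pwd) && export PYTHONPATH=$(pwd)` line only affects that single build layer, so neither variable exists in later layers or when the service actually runs. A sketch of what I believe the persistent equivalent looks like with ENV (the path is just my layout from above):

```dockerfile
# ENV persists into later build layers and the final container,
# unlike `RUN export`, which is scoped to that one RUN shell.
ENV TORCH_HOME=/app/Schnider2/lama \
    PYTHONPATH=/app/Schnider2/lama
```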
I've run into trouble installing requirements.txt.
Here are the logs of the failed installation:
root@fedora:/home/berkunov/Documents/GitHub/DIT-pano-office/processingWorkerService# docker build -t processing_worker_service:v1.1.0 -f ./Dockerfile .
[+] Building 145.5s (23/28) docker:default
=> [internal] load build definition from Dockerfile 0.0s
=> => transferring dockerfile: 953B 0.0s
=> [internal] load metadata for mcr.microsoft.com/dotnet/aspnet:6.0.16-jammy 0.5s
=> [internal] load .dockerignore 0.0s
=> => transferring context: 2B 0.0s
=> [internal] load build context 1.7s
=> => transferring context: 448.88MB 1.6s
=> [base 1/20] FROM mcr.microsoft.com/dotnet/aspnet:6.0.16-jammy@sha256:aa07ba18bd133dded9cd46bdba4c77530623182ab7ecf402e24d7f189087820e 0.0s
=> CACHED [base 2/20] RUN apt-get update -y && apt-get install python3 -y && apt-get install python3-pip -y 0.0s
=> CACHED [base 3/20] RUN pip install pyequilib 0.0s
=> CACHED [base 4/20] RUN pip install exif 0.0s
=> CACHED [base 5/20] RUN pip install torch torchvision torchaudio --no-cache-dir 0.0s
=> [base 6/20] RUN pip install laspy 2.0s
=> [base 7/20] RUN pip install requests 2.7s
=> [base 8/20] RUN pip install pyopencl 2.3s
=> [base 9/20] RUN pip install pyquaternion 1.2s
=> [base 10/20] RUN pip install simplejpeg 1.7s
=> [base 11/20] RUN pip install opencv-python 7.5s
=> [base 12/20] RUN pip install omegaconf 2.6s
=> [base 13/20] RUN pip install easydict 1.3s
=> [base 14/20] WORKDIR /app 0.0s
=> [base 15/20] COPY ./Schnider2 /app/Schnider2 0.6s
=> [base 16/20] WORKDIR /app/Schnider2/lama 0.0s
=> [base 17/20] RUN export TORCH_HOME=$(pwd) && export PYTHONPATH=$(pwd) 0.2s
=> [base 18/20] RUN pip install numpy 0.8s
=> ERROR [base 19/20] RUN pip install -r requirements.txt 121.9s
------
> [base 19/20] RUN pip install -r requirements.txt:
0.515 Requirement already satisfied: pyyaml in /usr/local/lib/python3.10/dist-packages (from -r requirements.txt (line 1)) (6.0.1)
0.833 Collecting tqdm
1.103 Downloading tqdm-4.66.4-py3-none-any.whl (78 kB)
1.233 ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 78.3/78.3 KB 567.5 kB/s eta 0:00:00
1.241 Requirement already satisfied: numpy in /usr/local/lib/python3.10/dist-packages (from -r requirements.txt (line 3)) (2.0.0)
1.296 Collecting easydict==1.9.0
1.357 Downloading easydict-1.9.tar.gz (6.4 kB)
1.380 Preparing metadata (setup.py): started
1.550 Preparing metadata (setup.py): finished with status 'done'
1.746 Collecting scikit-image==0.17.2
1.807 Downloading scikit-image-0.17.2.tar.gz (29.8 MB)
4.048 ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 29.8/29.8 MB 10.4 MB/s eta 0:00:00
4.985 Preparing metadata (setup.py): started
5.472 Preparing metadata (setup.py): finished with status 'done'
5.473 Requirement already satisfied: opencv-python in /usr/local/lib/python3.10/dist-packages (from -r requirements.txt (line 6)) (4.10.0.84)
5.754 Collecting tensorflow
5.817 Downloading tensorflow-2.16.1-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (589.8 MB)
64.41 ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 589.8/589.8 MB 3.4 MB/s eta 0:00:00
66.04 Collecting joblib
66.10 Downloading joblib-1.4.2-py3-none-any.whl (301 kB)
66.14 ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 301.8/301.8 KB 9.3 MB/s eta 0:00:00
66.52 Collecting matplotlib
66.58 Downloading matplotlib-3.9.0-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (8.3 MB)
67.49 ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 8.3/8.3 MB 9.2 MB/s eta 0:00:00
67.86 Collecting pandas
67.92 Downloading pandas-2.2.2-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (13.0 MB)
69.23 ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 13.0/13.0 MB 9.8 MB/s eta 0:00:00
69.33 Collecting albumentations==0.5.2
69.39 Downloading albumentations-0.5.2-py3-none-any.whl (72 kB)
69.41 ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 72.2/72.2 KB 8.6 MB/s eta 0:00:00
69.51 Collecting hydra-core==1.1.0
69.57 Downloading hydra_core-1.1.0-py3-none-any.whl (144 kB)
69.58 ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 144.6/144.6 KB 9.6 MB/s eta 0:00:00
69.70 Collecting pytorch-lightning==1.2.9
69.76 Downloading pytorch_lightning-1.2.9-py3-none-any.whl (841 kB)
69.85 ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 841.9/841.9 KB 9.9 MB/s eta 0:00:00
69.92 Collecting tabulate
69.98 Downloading tabulate-0.9.0-py3-none-any.whl (35 kB)
70.10 Collecting kornia==0.5.0
70.16 Downloading kornia-0.5.0-py2.py3-none-any.whl (271 kB)
70.19 ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 271.5/271.5 KB 10.3 MB/s eta 0:00:00
70.38 Collecting webdataset
70.44 Downloading webdataset-0.2.86-py3-none-any.whl (70 kB)
70.45 ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 70.4/70.4 KB 9.8 MB/s eta 0:00:00
70.55 Collecting packaging
70.61 Downloading packaging-24.1-py3-none-any.whl (53 kB)
70.62 ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 54.0/54.0 KB 9.9 MB/s eta 0:00:00
70.85 Collecting scikit-learn==0.24.2
70.91 Downloading scikit-learn-0.24.2.tar.gz (7.5 MB)
71.65 ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 7.5/7.5 MB 10.1 MB/s eta 0:00:00
72.35 Installing build dependencies: started
85.41 Installing build dependencies: finished with status 'done'
85.42 Getting requirements to build wheel: started
85.75 Getting requirements to build wheel: finished with status 'done'
85.75 Preparing metadata (pyproject.toml): started
121.1 Preparing metadata (pyproject.toml): finished with status 'error'
121.2 error: subprocess-exited-with-error
121.2
121.2 × Preparing metadata (pyproject.toml) did not run successfully.
121.2 │ exit code: 1
121.2 ╰─> [1498 lines of output]
121.2 Partial import of sklearn during the build process.
121.2 setup.py:116: DeprecationWarning:
121.2
121.2 `numpy.distutils` is deprecated since NumPy 1.23.0, as a result
121.2 of the deprecation of `distutils` itself. It will be removed for
121.2 Python >= 3.12. For older Python versions it will remain present.
121.2 It is recommended to use `setuptools < 60.0` for those Python versions.
121.2 For more details, see:
121.2 https://numpy.org/devdocs/reference/distutils_status_migration.html
121.2
121.2
121.2 from numpy.distutils.command.build_ext import build_ext # noqa
121.2 Compiling sklearn/__check_build/_check_build.pyx because it changed.
121.2 Compiling sklearn/preprocessing/_csr_polynomial_expansion.pyx because it changed.
121.2 [... ~50 similar "Compiling sklearn/... because it changed" lines omitted ...]
121.2 Compiling sklearn/_isotonic.pyx because it changed.
121.2 warning: sklearn/cluster/_dbscan_inner.pyx:17:5: Only extern functions can throw C++ exceptions.
121.2 warning: sklearn/neighbors/_dist_metrics.pxd:19:64: The keyword 'nogil' should appear at the end of the function signature line. Placing it before 'except' or 'noexcept' will be disallowed in a future version of Cython.
121.2 [... 7 similar "'nogil' should appear at the end of the function signature line" warnings omitted ...]
121.2 performance hint: sklearn/cluster/_k_means_elkan.pyx:336:5: Exception check on '_update_chunk_dense' will always require the GIL to be acquired.
121.2 Possible solutions:
121.2 1. Declare '_update_chunk_dense' as 'noexcept' if you control the definition and you're sure you don't want the function to raise exceptions.
121.2 2. Use an 'int' return type on '_update_chunk_dense' to allow an error code to be returned.
121.2 [... many similar Cython 'nogil' warnings and "Exception check ... will always require the GIL" performance hints omitted; the output runs to ~1498 lines in total ...]
121.2 performance hint: sklearn/ensemble/_hist_gradient_boosting/histogram.pyx:391:6: Exception check on '_build_histogram_root' will always require the GIL to be acquired.
121.2 Possible solutions:
121.2 1. Declare '_build_histogram_root' as 'noexcept' if you control the definition and you're sure you don't want the function to raise exceptions.
121.2 2. Use an 'int' return type on '_build_histogram_root' to allow an error code to be returned.
121.2 performance hint: sklearn/ensemble/_hist_gradient_boosting/histogram.pyx:444:6: Exception check on '_build_histogram_root_no_hessian' will always require the GIL to be acquired.
121.2 Possible solutions:
121.2 1. Declare '_build_histogram_root_no_hessian' as 'noexcept' if you control the definition and you're sure you don't want the function to raise exceptions.
121.2 2. Use an 'int' return type on '_build_histogram_root_no_hessian' to allow an error code to be returned.
121.2 performance hint: sklearn/ensemble/_hist_gradient_boosting/histogram.pyx:158:60: Exception check after calling '_compute_histogram_brute_single_feature' will always require the GIL to be acquired.
121.2 Possible solutions:
121.2 1. Declare '_compute_histogram_brute_single_feature' as 'noexcept' if you control the definition and you're sure you don't want the function to raise exceptions.
121.2 2. Use an 'int' return type on '_compute_histogram_brute_single_feature' to allow an error code to be returned.
121.2 performance hint: sklearn/ensemble/_hist_gradient_boosting/histogram.pyx:190:48: Exception check after calling '_build_histogram_root_no_hessian' will always require the GIL to be acquired.
121.2 Possible solutions:
121.2 1. Declare '_build_histogram_root_no_hessian' as 'noexcept' if you control the definition and you're sure you don't want the function to raise exceptions.
121.2 2. Use an 'int' return type on '_build_histogram_root_no_hessian' to allow an error code to be returned.
121.2 performance hint: sklearn/ensemble/_hist_gradient_boosting/histogram.pyx:194:37: Exception check after calling '_build_histogram_root' will always require the GIL to be acquired.
121.2 Possible solutions:
121.2 1. Declare '_build_histogram_root' as 'noexcept' if you control the definition and you're sure you don't want the function to raise exceptions.
121.2 2. Use an 'int' return type on '_build_histogram_root' to allow an error code to be returned.
121.2 performance hint: sklearn/ensemble/_hist_gradient_boosting/histogram.pyx:199:43: Exception check after calling '_build_histogram_no_hessian' will always require the GIL to be acquired.
121.2 Possible solutions:
121.2 1. Declare '_build_histogram_no_hessian' as 'noexcept' if you control the definition and you're sure you don't want the function to raise exceptions.
121.2 2. Use an 'int' return type on '_build_histogram_no_hessian' to allow an error code to be returned.
121.2 performance hint: sklearn/ensemble/_hist_gradient_boosting/histogram.pyx:203:32: Exception check after calling '_build_histogram' will always require the GIL to be acquired.
121.2 Possible solutions:
121.2 1. Declare '_build_histogram' as 'noexcept' if you control the definition and you're sure you don't want the function to raise exceptions.
121.2 2. Use an 'int' return type on '_build_histogram' to allow an error code to be returned.
121.2 performance hint: sklearn/ensemble/_hist_gradient_boosting/histogram.pyx:244:32: Exception check after calling '_subtract_histograms' will always require the GIL to be acquired.
121.2 Possible solutions:
121.2 1. Declare '_subtract_histograms' as 'noexcept' if you control the definition and you're sure you don't want the function to raise exceptions.
121.2 2. Use an 'int' return type on '_subtract_histograms' to allow an error code to be returned.
121.2 warning: sklearn/ensemble/_hist_gradient_boosting/splitting.pyx:19:0: The 'IF' statement is deprecated and will be removed in a future Cython version. Consider using runtime conditions or C macros instead. See https://github.com/cython/cython/issues/4310
121.2 warning: sklearn/ensemble/_hist_gradient_boosting/splitting.pyx:309:12: The 'IF' statement is deprecated and will be removed in a future Cython version. Consider using runtime conditions or C macros instead. See https://github.com/cython/cython/issues/4310
121.2
121.2 Error compiling Cython file:
121.2 ------------------------------------------------------------
121.2 ...
121.2 if n_used_bins <= 1:
121.2 free(cat_infos)
121.2 return
121.2
121.2 qsort(cat_infos, n_used_bins, sizeof(categorical_info),
121.2 compare_cat_infos)
121.2 ^
121.2 ------------------------------------------------------------
121.2
121.2 sklearn/ensemble/_hist_gradient_boosting/splitting.pyx:912:14: Cannot assign type 'int (const void *, const void *) except? -1 nogil' to 'int (*)(const void *, const void *) noexcept nogil'. Exception values are incompatible. Suggest adding 'noexcept' to the type of 'compare_cat_infos'.
121.2 Traceback (most recent call last):
121.2 File "/tmp/pip-build-env-tbdoo6n2/overlay/local/lib/python3.10/dist-packages/Cython/Build/Dependencies.py", line 1345, in cythonize_one_helper
121.2 return cythonize_one(*m)
121.2 File "/tmp/pip-build-env-tbdoo6n2/overlay/local/lib/python3.10/dist-packages/Cython/Build/Dependencies.py", line 1321, in cythonize_one
121.2 raise CompileError(None, pyx_file)
121.2 Cython.Compiler.Errors.CompileError: sklearn/ensemble/_hist_gradient_boosting/splitting.pyx
121.2
121.2 Error compiling Cython file:
121.2 ------------------------------------------------------------
121.2 ...
121.2 # Max value for our rand_r replacement (near the bottom).
121.2 # We don't use RAND_MAX because it's different across platforms and
121.2 # particularly tiny on Windows/MSVC.
121.2 RAND_R_MAX = 0x7FFFFFFF
121.2
121.2 cpdef sample_without_replacement(np.int_t n_population,
121.2 ^
121.2 ------------------------------------------------------------
121.2
121.2 sklearn/utils/_random.pxd:18:33: 'int_t' is not a type identifier
121.2
121.2 Error compiling Cython file:
121.2 ------------------------------------------------------------
121.2 ...
121.2 # We don't use RAND_MAX because it's different across platforms and
121.2 # particularly tiny on Windows/MSVC.
121.2 RAND_R_MAX = 0x7FFFFFFF
121.2
121.2 cpdef sample_without_replacement(np.int_t n_population,
121.2 np.int_t n_samples,
121.2 ^
121.2 ------------------------------------------------------------
121.2
121.2 sklearn/utils/_random.pxd:19:33: 'int_t' is not a type identifier
...........
121.2 Cython.Compiler.Errors.CompileError: sklearn/ensemble/_hist_gradient_boosting/splitting.pyx
121.2 [end of output]
121.2
121.2 note: This error originates from a subprocess, and is likely not a problem with pip.
121.2 error: metadata-generation-failed
121.2
121.2 × Encountered error while generating package metadata.
121.2 ╰─> See above for output.
121.2
121.2 note: This is an issue with the package mentioned above, not pip.
121.2 hint: See above for details.
------
Dockerfile:26
--------------------
24 | RUN export TORCH_HOME=$(pwd) && export PYTHONPATH=$(pwd)
25 | RUN pip install numpy
26 | >>> RUN pip install -r requirements.txt
27 |
28 |
--------------------
ERROR: failed to solve: process "/bin/sh -c pip install -r requirements.txt" did not complete successfully: exit code: 1
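A workaround I plan to try (an assumption on my side: the repeated 'noexcept' compile errors suggest the scikit-learn source release being built predates Cython 3.0 support) is to constrain Cython below 3.0 for the build step. This relies on pip propagating the `PIP_CONSTRAINT` environment variable into its isolated build environments; the constraints file path here is hypothetical:

```dockerfile
# Hypothetical sketch: pin the Cython used when pip builds sdists,
# without changing the pins in requirements.txt itself.
RUN printf 'cython<3\n' > /tmp/constraints.txt
ENV PIP_CONSTRAINT=/tmp/constraints.txt
RUN pip install -r requirements.txt
```

Alternatively, if the scikit-learn pin in requirements.txt can be relaxed, a newer release that ships prebuilt cp310 manylinux wheels would avoid compiling from source entirely.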