QiuYannnn / Local-File-Organizer

An AI-powered file management tool that preserves privacy by organizing local texts and images. Using the Llama 3.2 3B and LLaVA v1.6 models with the Nexa SDK, it intuitively scans, restructures, and organizes files for quick, seamless access and easy retrieval.

Problem during step 3 (Install Nexa SDK), Linux Debian 12: fatal: not a git repository #5

Closed: abclution closed this issue 1 month ago

abclution commented 1 month ago

Using the command to install the Nexa SDK (CPU) as stated fails. I've been doing some Google searching but am coming up short so far on a solution.
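
For reference, the command in question is the CPU install from the project instructions, presumably the same one that reappears later in this thread minus the CMAKE_ARGS workaround:

pip install nexaai --prefer-binary --index-url https://nexaai.github.io/nexa-sdk/whl/cpu --extra-index-url https://pypi.org/simple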

The most important lines, as far as I can tell, are:

-- Found Git: /usr/bin/git (found version "2.39.5")
fatal: not a git repository: /tmp/pip-install-__v1_j7m/nexaai_9a64b9b039d74eb5ac93ba9ce74781f2/dependency/llama.cpp/../../.git/modules/dependency/llama.cpp
fatal: not a git repository: /tmp/pip-install-__v1_j7m/nexaai_9a64b9b039d74eb5ac93ba9ce74781f2/dependency/llama.cpp/../../.git/modules/dependency/llama.cpp

Building wheels for collected packages: nexaai
Building wheel for nexaai (pyproject.toml) ... error
error: subprocess-exited-with-error

× Building wheel for nexaai (pyproject.toml) did not run successfully.
│ exit code: 1
╰─> [148 lines of output]
*** scikit-build-core 0.10.7 using CMake 3.30.3 (wheel)
*** Configuring CMake...
loading initial cache file /tmp/tmpb8vo6nlh/build/CMakeInit.txt
-- The C compiler identification is GNU 12.2.0
-- The CXX compiler identification is GNU 12.2.0
-- Detecting C compiler ABI info
-- Detecting C compiler ABI info - done
-- Check for working C compiler: /usr/bin/gcc - skipped
-- Detecting C compile features
-- Detecting C compile features - done
-- Detecting CXX compiler ABI info
-- Detecting CXX compiler ABI info - done
-- Check for working CXX compiler: /usr/bin/g++ - skipped
-- Detecting CXX compile features
-- Detecting CXX compile features - done
-- Build shared library
/tmp/pip-install-__v1_j7m/nexaai_9a64b9b039d74eb5ac93ba9ce74781f2/dependency/stable-diffusion.cpp/clip.hpp/tmp/pip-install-__v1_j7m/nexaai_9a64b9b039d74eb5ac93ba9ce74781f2/dependency/stable-diffusion.cpp/common.hpp/tmp/pip-install-__v1_j7m/nexaai_9a64b9b039d74eb5ac93ba9ce74781f2/dependency/stable-diffusion.cpp/conditioner.hpp/tmp/pip-install-__v1_j7m/nexaai_9a64b9b039d74eb5ac93ba9ce74781f2/dependency/stable-diffusion.cpp/control.hpp/tmp/pip-install-__v1_j7m/nexaai_9a64b9b039d74eb5ac93ba9ce74781f2/dependency/stable-diffusion.cpp/denoiser.hpp/tmp/pip-install-__v1_j7m/nexaai_9a64b9b039d74eb5ac93ba9ce74781f2/dependency/stable-diffusion.cpp/diffusion_model.hpp/tmp/pip-install-__v1_j7m/nexaai_9a64b9b039d74eb5ac93ba9ce74781f2/dependency/stable-diffusion.cpp/esrgan.hpp/tmp/pip-install-__v1_j7m/nexaai_9a64b9b039d74eb5ac93ba9ce74781f2/dependency/stable-diffusion.cpp/flux.hpp/tmp/pip-install-__v1_j7m/nexaai_9a64b9b039d74eb5ac93ba9ce74781f2/dependency/stable-diffusion.cpp/ggml_extend.hpp/tmp/pip-install-__v1_j7m/nexaai_9a64b9b039d74eb5ac93ba9ce74781f2/dependency/stable-diffusion.cpp/lora.hpp/tmp/pip-install-__v1_j7m/nexaai_9a64b9b039d74eb5ac93ba9ce74781f2/dependency/stable-diffusion.cpp/mmdit.hpp/tmp/pip-install-__v1_j7m/nexaai_9a64b9b039d74eb5ac93ba9ce74781f2/dependency/stable-diffusion.cpp/model.cpp/tmp/pip-install-__v1_j7m/nexaai_9a64b9b039d74eb5ac93ba9ce74781f2/dependency/stable-diffusion.cpp/model.h/tmp/pip-install-__v1_j7m/nexaai_9a64b9b039d74eb5ac93ba9ce74781f2/dependency/stable-diffusion.cpp/pmid.hpp/tmp/pip-install-__v1_j7m/nexaai_9a64b9b039d74eb5ac93ba9ce74781f2/dependency/stable-diffusion.cpp/preprocessing.hpp/tmp/pip-install-__v1_j7m/nexaai_9a64b9b039d74eb5ac93ba9ce74781f2/dependency/stable-diffusion.cpp/rng.hpp/tmp/pip-install-__v1_j7m/nexaai_9a64b9b039d74eb5ac93ba9ce74781f2/dependency/stable-diffusion.cpp/rng_philox.hpp/tmp/pip-install-__v1_j7m/nexaai_9a64b9b039d74eb5ac93ba9ce74781f2/dependency/stable-diffusion.cpp/stable-diffusion.cpp/tmp/pip-install-__v1_j7m/nexaai_9a64b9b039d74eb5ac93ba9ce74781f2/dependency/stable-diffusion.cpp/stable-diffusion.h/tmp/pip-install-__v1_j7m/nexaai_9a64b9b039d74eb5ac93ba9ce74781f2/dependency/stable-diffusion.cpp/t5.hpp/tmp/pip-install-__v1_j7m/nexaai_9a64b9b039d74eb5ac93ba9ce74781f2/dependency/stable-diffusion.cpp/tae.hpp/tmp/pip-install-__v1_j7m/nexaai_9a64b9b039d74eb5ac93ba9ce74781f2/dependency/stable-diffusion.cpp/unet.hpp/tmp/pip-install-__v1_j7m/nexaai_9a64b9b039d74eb5ac93ba9ce74781f2/dependency/stable-diffusion.cpp/upscaler.cpp/tmp/pip-install-__v1_j7m/nexaai_9a64b9b039d74eb5ac93ba9ce74781f2/dependency/stable-diffusion.cpp/util.cpp/tmp/pip-install-__v1_j7m/nexaai_9a64b9b039d74eb5ac93ba9ce74781f2/dependency/stable-diffusion.cpp/util.h/tmp/pip-install-__v1_j7m/nexaai_9a64b9b039d74eb5ac93ba9ce74781f2/dependency/stable-diffusion.cpp/vae.hpp/tmp/pip-install-__v1_j7m/nexaai_9a64b9b039d74eb5ac93ba9ce74781f2/dependency/stable-diffusion.cpp/vocab.hpp
-- Performing Test CMAKE_HAVE_LIBC_PTHREAD
-- Performing Test CMAKE_HAVE_LIBC_PTHREAD - Success
-- Found Threads: TRUE
-- Found OpenMP_C: -fopenmp (found version "4.5")
-- Found OpenMP_CXX: -fopenmp (found version "4.5")
-- Found OpenMP: TRUE (found version "4.5")
-- OpenMP found
-- ccache found, compilation results will be cached. Disable with GGML_CCACHE=OFF.
-- CMAKE_SYSTEM_PROCESSOR: x86_64
-- x86 detected
-- SKBUILD_PLATLIB_DIR: /tmp/tmpb8vo6nlh/wheel/platlib
-- Found Git: /usr/bin/git (found version "2.39.5")
fatal: not a git repository: /tmp/pip-install-__v1_j7m/nexaai_9a64b9b039d74eb5ac93ba9ce74781f2/dependency/llama.cpp/../../.git/modules/dependency/llama.cpp
fatal: not a git repository: /tmp/pip-install-__v1_j7m/nexaai_9a64b9b039d74eb5ac93ba9ce74781f2/dependency/llama.cpp/../../.git/modules/dependency/llama.cpp
-- OpenMP found
-- ccache found, compilation results will be cached. Disable with GGML_CCACHE=OFF.
-- CMAKE_SYSTEM_PROCESSOR: x86_64
-- x86 detected
CMake Warning at dependency/llama.cpp/common/CMakeLists.txt:26 (message):
Git index not found in git repository.

CMake Warning (dev) at CMakeLists.txt:78 (install):
Target llama has PUBLIC_HEADER files but no PUBLIC_HEADER DESTINATION.
Call Stack (most recent call first):
CMakeLists.txt:142 (llama_cpp_python_install_target)
This warning is for project developers.  Use -Wno-dev to suppress it.

CMake Warning (dev) at CMakeLists.txt:86 (install):
Target llama has PUBLIC_HEADER files but no PUBLIC_HEADER DESTINATION.
Call Stack (most recent call first):
CMakeLists.txt:142 (llama_cpp_python_install_target)
This warning is for project developers.  Use -Wno-dev to suppress it.

CMake Warning (dev) at CMakeLists.txt:78 (install):
Target ggml_llama has PUBLIC_HEADER files but no PUBLIC_HEADER DESTINATION.
Call Stack (most recent call first):
CMakeLists.txt:143 (llama_cpp_python_install_target)
This warning is for project developers.  Use -Wno-dev to suppress it.

CMake Warning (dev) at CMakeLists.txt:86 (install):
Target ggml_llama has PUBLIC_HEADER files but no PUBLIC_HEADER DESTINATION.
Call Stack (most recent call first):
CMakeLists.txt:143 (llama_cpp_python_install_target)
This warning is for project developers.  Use -Wno-dev to suppress it.

-- Configuring done (0.4s)
-- Generating done (0.0s)
CMake Warning:
Manually-specified variables were not used by the project:

CMAKE_BUILD_PARALLEL_LEVEL

-- Build files have been written to: /tmp/tmpb8vo6nlh/build
*** Building project with Ninja...
Change Dir: '/tmp/tmpb8vo6nlh/build'

Run Build Command(s): /usr/bin/ninja -v
[1/43] cd /tmp/pip-install-__v1_j7m/nexaai_9a64b9b039d74eb5ac93ba9ce74781f2/dependency/llama.cpp && /tmp/pip-build-env-e3n39ldc/normal/lib/python3.12/site-packages/cmake/data/bin/cmake -DMSVC= -DCMAKE_C_COMPILER_VERSION=12.2.0 -DCMAKE_C_COMPILER_ID=GNU -DCMAKE_VS_PLATFORM_NAME= -DCMAKE_C_COMPILER=/usr/bin/gcc -P /tmp/pip-install-__v1_j7m/nexaai_9a64b9b039d74eb5ac93ba9ce74781f2/dependency/llama.cpp/common/cmake/build-info-gen-cpp.cmake
-- Found Git: /usr/bin/git (found version "2.39.5")
fatal: not a git repository: /tmp/pip-install-__v1_j7m/nexaai_9a64b9b039d74eb5ac93ba9ce74781f2/dependency/llama.cpp/../../.git/modules/dependency/llama.cpp
fatal: not a git repository: /tmp/pip-install-__v1_j7m/nexaai_9a64b9b039d74eb5ac93ba9ce74781f2/dependency/llama.cpp/../../.git/modules/dependency/llama.cpp
[2/43] ccache /usr/bin/g++  -pthread -B /home/abclution/miniconda3/envs/local_file_organizer/compiler_compat   -O3 -DNDEBUG -fPIC -MD -MT dependency/llama.cpp/common/CMakeFiles/build_info.dir/build-info.cpp.o -MF dependency/llama.cpp/common/CMakeFiles/build_info.dir/build-info.cpp.o.d -o dependency/llama.cpp/common/CMakeFiles/build_info.dir/build-info.cpp.o -c /tmp/pip-install-__v1_j7m/nexaai_9a64b9b039d74eb5ac93ba9ce74781f2/dependency/llama.cpp/common/build-info.cpp
[3/43] ccache /usr/bin/g++  -pthread -B /home/abclution/miniconda3/envs/local_file_organizer/compiler_compat  -I/tmp/pip-install-__v1_j7m/nexaai_9a64b9b039d74eb5ac93ba9ce74781f2/dependency/llama.cpp/common/. -I/tmp/pip-install-__v1_j7m/nexaai_9a64b9b039d74eb5ac93ba9ce74781f2/dependency/llama.cpp/src/. -I/tmp/pip-install-__v1_j7m/nexaai_9a64b9b039d74eb5ac93ba9ce74781f2/dependency/llama.cpp/src/../include -I/tmp/pip-install-__v1_j7m/nexaai_9a64b9b039d74eb5ac93ba9ce74781f2/dependency/llama.cpp/ggml_llama/src/../include -O3 -DNDEBUG -fPIC -MD -MT dependency/llama.cpp/common/CMakeFiles/common.dir/console.cpp.o -MF dependency/llama.cpp/common/CMakeFiles/common.dir/console.cpp.o.d -o dependency/llama.cpp/common/CMakeFiles/common.dir/console.cpp.o -c /tmp/pip-install-__v1_j7m/nexaai_9a64b9b039d74eb5ac93ba9ce74781f2/dependency/llama.cpp/common/console.cpp
[4/43] ccache /usr/bin/gcc  -pthread -B /home/abclution/miniconda3/envs/local_file_organizer/compiler_compat -DGGML_BUILD -DGGML_SCHED_MAX_COPIES=4 -DGGML_SHARED -DGGML_USE_OPENMP -D_GNU_SOURCE -D_XOPEN_SOURCE=600 -Dggml_llama_EXPORTS -I/tmp/pip-install-__v1_j7m/nexaai_9a64b9b039d74eb5ac93ba9ce74781f2/dependency/llama.cpp/ggml_llama/src/../include -I/tmp/pip-install-__v1_j7m/nexaai_9a64b9b039d74eb5ac93ba9ce74781f2/dependency/llama.cpp/ggml_llama/src/. -O3 -DNDEBUG -std=gnu11 -fPIC -Wshadow -Wstrict-prototypes -Wpointer-arith-Wmissing-prototypes -Werror=implicit-int -Werror=implicit-function-declaration -Wall -Wextra -Wpedantic -Wcast-qual -Wno-unused-function -Wdouble-promotion -march=native -fopenmp -MD -MT dependency/llama.cpp/ggml_llama/src/CMakeFiles/ggml_llama.dir/ggml-alloc.c.o -MF dependency/llama.cpp/ggml_llama/src/CMakeFiles/ggml_llama.dir/ggml-alloc.c.o.d -o dependency/llama.cpp/ggml_llama/src/CMakeFiles/ggml_llama.dir/ggml-alloc.c.o -c /tmp/pip-install-__v1_j7m/nexaai_9a64b9b039d74eb5ac93ba9ce74781f2/dependency/llama.cpp/ggml_llama/src/ggml-alloc.c
[5/43] ccache /usr/bin/gcc  -pthread -B /home/abclution/miniconda3/envs/local_file_organizer/compiler_compat -DGGML_MAX_NAME=128 -DGGML_SCHED_MAX_COPIES=4 -DGGML_USE_OPENMP -DSD_BUILD_SHARED_LIB -D_GNU_SOURCE -D_XOPEN_SOURCE=600 -I/tmp/pip-install-__v1_j7m/nexaai_9a64b9b039d74eb5ac93ba9ce74781f2/dependency/stable-diffusion.cpp/ggml/src/../include -I/tmp/pip-install-__v1_j7m/nexaai_9a64b9b039d74eb5ac93ba9ce74781f2/dependency/stable-diffusion.cpp/ggml/src/. -O3 -DNDEBUG -std=gnu11 -fPIC -Wshadow -Wstrict-prototypes -Wpointer-arith -Wmissing-prototypes -Werror=implicit-int -Werror=implicit-function-declaration -Wall -Wextra -Wpedantic -Wcast-qual -Wno-unused-function -Wdouble-promotion -march=native -fopenmp -MD -MT dependency/stable-diffusion.cpp/ggml/src/CMakeFiles/ggml.dir/ggml-alloc.c.o -MF dependency/stable-diffusion.cpp/ggml/src/CMakeFiles/ggml.dir/ggml-alloc.c.o.d -o dependency/stable-diffusion.cpp/ggml/src/CMakeFiles/ggml.dir/ggml-alloc.c.o -c /tmp/pip-install-__v1_j7m/nexaai_9a64b9b039d74eb5ac93ba9ce74781f2/dependency/stable-diffusion.cpp/ggml/src/ggml-alloc.c
[6/43] ccache /usr/bin/gcc  -pthread -B /home/abclution/miniconda3/envs/local_file_organizer/compiler_compat -DGGML_BUILD -DGGML_SCHED_MAX_COPIES=4 -DGGML_SHARED -DGGML_USE_OPENMP -D_GNU_SOURCE -D_XOPEN_SOURCE=600 -Dggml_llama_EXPORTS -I/tmp/pip-install-__v1_j7m/nexaai_9a64b9b039d74eb5ac93ba9ce74781f2/dependency/llama.cpp/ggml_llama/src/../include -I/tmp/pip-install-__v1_j7m/nexaai_9a64b9b039d74eb5ac93ba9ce74781f2/dependency/llama.cpp/ggml_llama/src/. -O3 -DNDEBUG -std=gnu11 -fPIC -Wshadow -Wstrict-prototypes -Wpointer-arith-Wmissing-prototypes -Werror=implicit-int -Werror=implicit-function-declaration -Wall -Wextra -Wpedantic -Wcast-qual -Wno-unused-function -Wdouble-promotion -march=native -fopenmp -MD -MT dependency/llama.cpp/ggml_llama/src/CMakeFiles/ggml_llama.dir/ggml-aarch64.c.o-MF dependency/llama.cpp/ggml_llama/src/CMakeFiles/ggml_llama.dir/ggml-aarch64.c.o.d -o dependency/llama.cpp/ggml_llama/src/CMakeFiles/ggml_llama.dir/ggml-aarch64.c.o -c /tmp/pip-install-__v1_j7m/nexaai_9a64b9b039d74eb5ac93ba9ce74781f2/dependency/llama.cpp/ggml_llama/src/ggml-aarch64.c
[7/43] ccache /usr/bin/gcc  -pthread -B /home/abclution/miniconda3/envs/local_file_organizer/compiler_compat -DGGML_MAX_NAME=128 -DGGML_SCHED_MAX_COPIES=4 -DGGML_USE_OPENMP -DSD_BUILD_SHARED_LIB -D_GNU_SOURCE -D_XOPEN_SOURCE=600 -I/tmp/pip-install-__v1_j7m/nexaai_9a64b9b039d74eb5ac93ba9ce74781f2/dependency/stable-diffusion.cpp/ggml/src/../include -I/tmp/pip-install-__v1_j7m/nexaai_9a64b9b039d74eb5ac93ba9ce74781f2/dependency/stable-diffusion.cpp/ggml/src/. -O3 -DNDEBUG -std=gnu11 -fPIC -Wshadow -Wstrict-prototypes -Wpointer-arith -Wmissing-prototypes -Werror=implicit-int -Werror=implicit-function-declaration -Wall -Wextra -Wpedantic -Wcast-qual -Wno-unused-function -Wdouble-promotion -march=native -fopenmp -MD -MT dependency/stable-diffusion.cpp/ggml/src/CMakeFiles/ggml.dir/ggml-aarch64.c.o -MF dependency/stable-diffusion.cpp/ggml/src/CMakeFiles/ggml.dir/ggml-aarch64.c.o.d -o dependency/stable-diffusion.cpp/ggml/src/CMakeFiles/ggml.dir/ggml-aarch64.c.o -c /tmp/pip-install-__v1_j7m/nexaai_9a64b9b039d74eb5ac93ba9ce74781f2/dependency/stable-diffusion.cpp/ggml/src/ggml-aarch64.c
[8/43] ccache /usr/bin/gcc  -pthread -B /home/abclution/miniconda3/envs/local_file_organizer/compiler_compat -DGGML_MAX_NAME=128 -DGGML_SCHED_MAX_COPIES=4 -DGGML_USE_OPENMP -DSD_BUILD_SHARED_LIB -D_GNU_SOURCE -D_XOPEN_SOURCE=600 -I/tmp/pip-install-__v1_j7m/nexaai_9a64b9b039d74eb5ac93ba9ce74781f2/dependency/stable-diffusion.cpp/ggml/src/../include -I/tmp/pip-install-__v1_j7m/nexaai_9a64b9b039d74eb5ac93ba9ce74781f2/dependency/stable-diffusion.cpp/ggml/src/. -O3 -DNDEBUG -std=gnu11 -fPIC -Wshadow -Wstrict-prototypes -Wpointer-arith -Wmissing-prototypes -Werror=implicit-int -Werror=implicit-function-declaration -Wall -Wextra -Wpedantic -Wcast-qual -Wno-unused-function -Wdouble-promotion -march=native -fopenmp -MD -MT dependency/stable-diffusion.cpp/ggml/src/CMakeFiles/ggml.dir/ggml-backend.c.o -MF dependency/stable-diffusion.cpp/ggml/src/CMakeFiles/ggml.dir/ggml-backend.c.o.d -o dependency/stable-diffusion.cpp/ggml/src/CMakeFiles/ggml.dir/ggml-backend.c.o -c /tmp/pip-install-__v1_j7m/nexaai_9a64b9b039d74eb5ac93ba9ce74781f2/dependency/stable-diffusion.cpp/ggml/src/ggml-backend.c
[9/43] ccache /usr/bin/gcc  -pthread -B /home/abclution/miniconda3/envs/local_file_organizer/compiler_compat -DGGML_BUILD -DGGML_SCHED_MAX_COPIES=4 -DGGML_SHARED -DGGML_USE_OPENMP -D_GNU_SOURCE -D_XOPEN_SOURCE=600 -Dggml_llama_EXPORTS -I/tmp/pip-install-__v1_j7m/nexaai_9a64b9b039d74eb5ac93ba9ce74781f2/dependency/llama.cpp/ggml_llama/src/../include -I/tmp/pip-install-__v1_j7m/nexaai_9a64b9b039d74eb5ac93ba9ce74781f2/dependency/llama.cpp/ggml_llama/src/. -O3 -DNDEBUG -std=gnu11 -fPIC -Wshadow -Wstrict-prototypes -Wpointer-arith-Wmissing-prototypes -Werror=implicit-int -Werror=implicit-function-declaration -Wall -Wextra -Wpedantic -Wcast-qual -Wno-unused-function -Wdouble-promotion -march=native -fopenmp -MD -MT dependency/llama.cpp/ggml_llama/src/CMakeFiles/ggml_llama.dir/ggml-backend.c.o-MF dependency/llama.cpp/ggml_llama/src/CMakeFiles/ggml_llama.dir/ggml-backend.c.o.d -o dependency/llama.cpp/ggml_llama/src/CMakeFiles/ggml_llama.dir/ggml-backend.c.o -c /tmp/pip-install-__v1_j7m/nexaai_9a64b9b039d74eb5ac93ba9ce74781f2/dependency/llama.cpp/ggml_llama/src/ggml-backend.c
[10/43] ccache /usr/bin/g++  -pthread -B /home/abclution/miniconda3/envs/local_file_organizer/compiler_compat -DLLAMA_BUILD -DLLAMA_SHARED -Dllama_EXPORTS -I/tmp/pip-install-__v1_j7m/nexaai_9a64b9b039d74eb5ac93ba9ce74781f2/dependency/llama.cpp/src/. -I/tmp/pip-install-__v1_j7m/nexaai_9a64b9b039d74eb5ac93ba9ce74781f2/dependency/llama.cpp/src/../include -I/tmp/pip-install-__v1_j7m/nexaai_9a64b9b039d74eb5ac93ba9ce74781f2/dependency/llama.cpp/ggml_llama/src/../include -O3 -DNDEBUG -fPIC -MD -MT dependency/llama.cpp/src/CMakeFiles/llama.dir/llama-grammar.cpp.o -MF dependency/llama.cpp/src/CMakeFiles/llama.dir/llama-grammar.cpp.o.d -o dependency/llama.cpp/src/CMakeFiles/llama.dir/llama-grammar.cpp.o -c /tmp/pip-install-__v1_j7m/nexaai_9a64b9b039d74eb5ac93ba9ce74781f2/dependency/llama.cpp/src/llama-grammar.cpp
[11/43] ccache /usr/bin/g++  -pthread -B /home/abclution/miniconda3/envs/local_file_organizer/compiler_compat  -I/tmp/pip-install-__v1_j7m/nexaai_9a64b9b039d74eb5ac93ba9ce74781f2/dependency/llama.cpp/common/. -I/tmp/pip-install-__v1_j7m/nexaai_9a64b9b039d74eb5ac93ba9ce74781f2/dependency/llama.cpp/src/. -I/tmp/pip-install-__v1_j7m/nexaai_9a64b9b039d74eb5ac93ba9ce74781f2/dependency/llama.cpp/src/../include -I/tmp/pip-install-__v1_j7m/nexaai_9a64b9b039d74eb5ac93ba9ce74781f2/dependency/llama.cpp/ggml_llama/src/../include -O3 -DNDEBUG -fPIC -MD -MT dependency/llama.cpp/common/CMakeFiles/common.dir/grammar-parser.cpp.o -MF dependency/llama.cpp/common/CMakeFiles/common.dir/grammar-parser.cpp.o.d -o dependency/llama.cpp/common/CMakeFiles/common.dir/grammar-parser.cpp.o -c /tmp/pip-install-__v1_j7m/nexaai_9a64b9b039d74eb5ac93ba9ce74781f2/dependency/llama.cpp/common/grammar-parser.cpp
[12/43] ccache /usr/bin/g++  -pthread -B /home/abclution/miniconda3/envs/local_file_organizer/compiler_compat -DLLAMA_BUILD -DLLAMA_SHARED -I/tmp/pip-install-__v1_j7m/nexaai_9a64b9b039d74eb5ac93ba9ce74781f2/dependency/llama.cpp/examples/llava/. -I/tmp/pip-install-__v1_j7m/nexaai_9a64b9b039d74eb5ac93ba9ce74781f2/dependency/llama.cpp/examples/llava/../.. -I/tmp/pip-install-__v1_j7m/nexaai_9a64b9b039d74eb5ac93ba9ce74781f2/dependency/llama.cpp/examples/llava/../../common -I/tmp/pip-install-__v1_j7m/nexaai_9a64b9b039d74eb5ac93ba9ce74781f2/dependency/llama.cpp/ggml_llama/src/../include -I/tmp/pip-install-__v1_j7m/nexaai_9a64b9b039d74eb5ac93ba9ce74781f2/dependency/llama.cpp/src/. -I/tmp/pip-install-__v1_j7m/nexaai_9a64b9b039d74eb5ac93ba9ce74781f2/dependency/llama.cpp/src/../include -O3 -DNDEBUG -fPIC -Wno-cast-qual -MD -MT dependency/llama.cpp/examples/llava/CMakeFiles/llava.dir/llava.cpp.o -MF dependency/llama.cpp/examples/llava/CMakeFiles/llava.dir/llava.cpp.o.d -o dependency/llama.cpp/examples/llava/CMakeFiles/llava.dir/llava.cpp.o -c /tmp/pip-install-__v1_j7m/nexaai_9a64b9b039d74eb5ac93ba9ce74781f2/dependency/llama.cpp/examples/llava/llava.cpp
[13/43] ccache /usr/bin/g++  -pthread -B /home/abclution/miniconda3/envs/local_file_organizer/compiler_compat  -I/tmp/pip-install-__v1_j7m/nexaai_9a64b9b039d74eb5ac93ba9ce74781f2/dependency/llama.cpp/common/. -I/tmp/pip-install-__v1_j7m/nexaai_9a64b9b039d74eb5ac93ba9ce74781f2/dependency/llama.cpp/src/. -I/tmp/pip-install-__v1_j7m/nexaai_9a64b9b039d74eb5ac93ba9ce74781f2/dependency/llama.cpp/src/../include -I/tmp/pip-install-__v1_j7m/nexaai_9a64b9b039d74eb5ac93ba9ce74781f2/dependency/llama.cpp/ggml_llama/src/../include -I/tmp/pip-install-__v1_j7m/nexaai_9a64b9b039d74eb5ac93ba9ce74781f2/dependency/llama.cpp/examples/llava/. -I/tmp/pip-install-__v1_j7m/nexaai_9a64b9b039d74eb5ac93ba9ce74781f2/dependency/llama.cpp/examples/llava/../.. -I/tmp/pip-install-__v1_j7m/nexaai_9a64b9b039d74eb5ac93ba9ce74781f2/dependency/llama.cpp/examples/llava/../../common -O3 -DNDEBUG -MD -MT dependency/llama.cpp/examples/llava/CMakeFiles/llama-minicpmv-cli.dir/minicpmv-cli.cpp.o -MF dependency/llama.cpp/examples/llava/CMakeFiles/llama-minicpmv-cli.dir/minicpmv-cli.cpp.o.d -o dependency/llama.cpp/examples/llava/CMakeFiles/llama-minicpmv-cli.dir/minicpmv-cli.cpp.o -c /tmp/pip-install-__v1_j7m/nexaai_9a64b9b039d74eb5ac93ba9ce74781f2/dependency/llama.cpp/examples/llava/minicpmv-cli.cpp
[14/43] ccache /usr/bin/g++  -pthread -B /home/abclution/miniconda3/envs/local_file_organizer/compiler_compat -DLLAMA_BUILD -DLLAMA_SHARED -Dllama_EXPORTS -I/tmp/pip-install-__v1_j7m/nexaai_9a64b9b039d74eb5ac93ba9ce74781f2/dependency/llama.cpp/src/. -I/tmp/pip-install-__v1_j7m/nexaai_9a64b9b039d74eb5ac93ba9ce74781f2/dependency/llama.cpp/src/../include -I/tmp/pip-install-__v1_j7m/nexaai_9a64b9b039d74eb5ac93ba9ce74781f2/dependency/llama.cpp/ggml_llama/src/../include -O3 -DNDEBUG -fPIC -MD -MT dependency/llama.cpp/src/CMakeFiles/llama.dir/llama-sampling.cpp.o -MF dependency/llama.cpp/src/CMakeFiles/llama.dir/llama-sampling.cpp.o.d -o dependency/llama.cpp/src/CMakeFiles/llama.dir/llama-sampling.cpp.o -c /tmp/pip-install-__v1_j7m/nexaai_9a64b9b039d74eb5ac93ba9ce74781f2/dependency/llama.cpp/src/llama-sampling.cpp
[15/43] ccache /usr/bin/g++  -pthread -B /home/abclution/miniconda3/envs/local_file_organizer/compiler_compat  -I/tmp/pip-install-__v1_j7m/nexaai_9a64b9b039d74eb5ac93ba9ce74781f2/dependency/llama.cpp/common/. -I/tmp/pip-install-__v1_j7m/nexaai_9a64b9b039d74eb5ac93ba9ce74781f2/dependency/llama.cpp/src/. -I/tmp/pip-install-__v1_j7m/nexaai_9a64b9b039d74eb5ac93ba9ce74781f2/dependency/llama.cpp/src/../include -I/tmp/pip-install-__v1_j7m/nexaai_9a64b9b039d74eb5ac93ba9ce74781f2/dependency/llama.cpp/ggml_llama/src/../include -O3 -DNDEBUG -fPIC -MD -MT dependency/llama.cpp/common/CMakeFiles/common.dir/ngram-cache.cpp.o -MF dependency/llama.cpp/common/CMakeFiles/common.dir/ngram-cache.cpp.o.d -o dependency/llama.cpp/common/CMakeFiles/common.dir/ngram-cache.cpp.o -c /tmp/pip-install-__v1_j7m/nexaai_9a64b9b039d74eb5ac93ba9ce74781f2/dependency/llama.cpp/common/ngram-cache.cpp
[16/43] ccache /usr/bin/g++  -pthread -B /home/abclution/miniconda3/envs/local_file_organizer/compiler_compat  -I/tmp/pip-install-__v1_j7m/nexaai_9a64b9b039d74eb5ac93ba9ce74781f2/dependency/llama.cpp/common/. -I/tmp/pip-install-__v1_j7m/nexaai_9a64b9b039d74eb5ac93ba9ce74781f2/dependency/llama.cpp/src/. -I/tmp/pip-install-__v1_j7m/nexaai_9a64b9b039d74eb5ac93ba9ce74781f2/dependency/llama.cpp/src/../include -I/tmp/pip-install-__v1_j7m/nexaai_9a64b9b039d74eb5ac93ba9ce74781f2/dependency/llama.cpp/ggml_llama/src/../include -I/tmp/pip-install-__v1_j7m/nexaai_9a64b9b039d74eb5ac93ba9ce74781f2/dependency/llama.cpp/examples/llava/. -I/tmp/pip-install-__v1_j7m/nexaai_9a64b9b039d74eb5ac93ba9ce74781f2/dependency/llama.cpp/examples/llava/../.. -I/tmp/pip-install-__v1_j7m/nexaai_9a64b9b039d74eb5ac93ba9ce74781f2/dependency/llama.cpp/examples/llava/../../common -O3 -DNDEBUG -MD -MT dependency/llama.cpp/examples/llava/CMakeFiles/llama-llava-cli.dir/llava-cli.cpp.o -MF dependency/llama.cpp/examples/llava/CMakeFiles/llama-llava-cli.dir/llava-cli.cpp.o.d -o dependency/llama.cpp/examples/llava/CMakeFiles/llama-llava-cli.dir/llava-cli.cpp.o -c /tmp/pip-install-__v1_j7m/nexaai_9a64b9b039d74eb5ac93ba9ce74781f2/dependency/llama.cpp/examples/llava/llava-cli.cpp
[17/43] ccache /usr/bin/g++  -pthread -B /home/abclution/miniconda3/envs/local_file_organizer/compiler_compat  -I/tmp/pip-install-__v1_j7m/nexaai_9a64b9b039d74eb5ac93ba9ce74781f2/dependency/llama.cpp/common/. -I/tmp/pip-install-__v1_j7m/nexaai_9a64b9b039d74eb5ac93ba9ce74781f2/dependency/llama.cpp/src/. -I/tmp/pip-install-__v1_j7m/nexaai_9a64b9b039d74eb5ac93ba9ce74781f2/dependency/llama.cpp/src/../include -I/tmp/pip-install-__v1_j7m/nexaai_9a64b9b039d74eb5ac93ba9ce74781f2/dependency/llama.cpp/ggml_llama/src/../include -O3 -DNDEBUG -fPIC -MD -MT dependency/llama.cpp/common/CMakeFiles/common.dir/sampling.cpp.o -MF dependency/llama.cpp/common/CMakeFiles/common.dir/sampling.cpp.o.d -o dependency/llama.cpp/common/CMakeFiles/common.dir/sampling.cpp.o -c /tmp/pip-install-__v1_j7m/nexaai_9a64b9b039d74eb5ac93ba9ce74781f2/dependency/llama.cpp/common/sampling.cpp
[18/43] ccache /usr/bin/g++  -pthread -B /home/abclution/miniconda3/envs/local_file_organizer/compiler_compat  -I/tmp/pip-install-__v1_j7m/nexaai_9a64b9b039d74eb5ac93ba9ce74781f2/dependency/llama.cpp/common/. -I/tmp/pip-install-__v1_j7m/nexaai_9a64b9b039d74eb5ac93ba9ce74781f2/dependency/llama.cpp/src/. -I/tmp/pip-install-__v1_j7m/nexaai_9a64b9b039d74eb5ac93ba9ce74781f2/dependency/llama.cpp/src/../include -I/tmp/pip-install-__v1_j7m/nexaai_9a64b9b039d74eb5ac93ba9ce74781f2/dependency/llama.cpp/ggml_llama/src/../include -O3 -DNDEBUG -fPIC -MD -MT dependency/llama.cpp/common/CMakeFiles/common.dir/train.cpp.o -MF dependency/llama.cpp/common/CMakeFiles/common.dir/train.cpp.o.d -o dependency/llama.cpp/common/CMakeFiles/common.dir/train.cpp.o -c /tmp/pip-install-__v1_j7m/nexaai_9a64b9b039d74eb5ac93ba9ce74781f2/dependency/llama.cpp/common/train.cpp
[19/43] ccache /usr/bin/gcc  -pthread -B /home/abclution/miniconda3/envs/local_file_organizer/compiler_compat -DGGML_MAX_NAME=128 -DSD_BUILD_SHARED_LIB -I/tmp/pip-install-__v1_j7m/nexaai_9a64b9b039d74eb5ac93ba9ce74781f2/dependency/stable-diffusion.cpp/thirdparty/. -O3 -DNDEBUG -fPIC -MD -MT dependency/stable-diffusion.cpp/thirdparty/CMakeFiles/zip.dir/zip.c.o -MF dependency/stable-diffusion.cpp/thirdparty/CMakeFiles/zip.dir/zip.c.o.d -o dependency/stable-diffusion.cpp/thirdparty/CMakeFiles/zip.dir/zip.c.o -c /tmp/pip-install-__v1_j7m/nexaai_9a64b9b039d74eb5ac93ba9ce74781f2/dependency/stable-diffusion.cpp/thirdparty/zip.c
In file included from /tmp/pip-install-__v1_j7m/nexaai_9a64b9b039d74eb5ac93ba9ce74781f2/dependency/stable-diffusion.cpp/thirdparty/zip.c:40:
/tmp/pip-install-__v1_j7m/nexaai_9a64b9b039d74eb5ac93ba9ce74781f2/dependency/stable-diffusion.cpp/thirdparty/miniz.h:4988:9: note: ‘#pragma message: Using fopen, ftello, fseeko, stat() etc. path for file I/O - this path may not support large files.’
4988 | #pragma message(                                                               \
|         ^~~~~~~
[20/43] ccache /usr/bin/g++  -pthread -B /home/abclution/miniconda3/envs/local_file_organizer/compiler_compat -DLLAMA_BUILD -DLLAMA_SHARED -Dllama_EXPORTS -I/tmp/pip-install-__v1_j7m/nexaai_9a64b9b039d74eb5ac93ba9ce74781f2/dependency/llama.cpp/src/. -I/tmp/pip-install-__v1_j7m/nexaai_9a64b9b039d74eb5ac93ba9ce74781f2/dependency/llama.cpp/src/../include -I/tmp/pip-install-__v1_j7m/nexaai_9a64b9b039d74eb5ac93ba9ce74781f2/dependency/llama.cpp/ggml_llama/src/../include -O3 -DNDEBUG -fPIC -MD -MT dependency/llama.cpp/src/CMakeFiles/llama.dir/llama-vocab.cpp.o -MF dependency/llama.cpp/src/CMakeFiles/llama.dir/llama-vocab.cpp.o.d -o dependency/llama.cpp/src/CMakeFiles/llama.dir/llama-vocab.cpp.o -c /tmp/pip-install-__v1_j7m/nexaai_9a64b9b039d74eb5ac93ba9ce74781f2/dependency/llama.cpp/src/llama-vocab.cpp
[21/43] ccache /usr/bin/g++  -pthread -B /home/abclution/miniconda3/envs/local_file_organizer/compiler_compat -DGGML_MAX_NAME=128 -DSD_BUILD_DLL -DSD_BUILD_SHARED_LIB -Dstable_diffusion_EXPORTS -I/tmp/pip-install-__v1_j7m/nexaai_9a64b9b039d74eb5ac93ba9ce74781f2/dependency/stable-diffusion.cpp/. -I/tmp/pip-install-__v1_j7m/nexaai_9a64b9b039d74eb5ac93ba9ce74781f2/dependency/stable-diffusion.cpp/thirdparty -I/tmp/pip-install-__v1_j7m/nexaai_9a64b9b039d74eb5ac93ba9ce74781f2/dependency/stable-diffusion.cpp/ggml/src/../include -I/tmp/pip-install-__v1_j7m/nexaai_9a64b9b039d74eb5ac93ba9ce74781f2/dependency/stable-diffusion.cpp/thirdparty/. -O3 -DNDEBUG -fPIC -MD -MT dependency/stable-diffusion.cpp/CMakeFiles/stable-diffusion.dir/upscaler.cpp.o -MF dependency/stable-diffusion.cpp/CMakeFiles/stable-diffusion.dir/upscaler.cpp.o.d -o dependency/stable-diffusion.cpp/CMakeFiles/stable-diffusion.dir/upscaler.cpp.o -c /tmp/pip-install-__v1_j7m/nexaai_9a64b9b039d74eb5ac93ba9ce74781f2/dependency/stable-diffusion.cpp/upscaler.cpp
[22/43] ccache /usr/bin/gcc  -pthread -B /home/abclution/miniconda3/envs/local_file_organizer/compiler_compat -DGGML_BUILD -DGGML_SCHED_MAX_COPIES=4 -DGGML_SHARED -DGGML_USE_OPENMP -D_GNU_SOURCE -D_XOPEN_SOURCE=600 -Dggml_llama_EXPORTS -I/tmp/pip-install-__v1_j7m/nexaai_9a64b9b039d74eb5ac93ba9ce74781f2/dependency/llama.cpp/ggml_llama/src/../include -I/tmp/pip-install-__v1_j7m/nexaai_9a64b9b039d74eb5ac93ba9ce74781f2/dependency/llama.cpp/ggml_llama/src/. -O3 -DNDEBUG -std=gnu11 -fPIC -Wshadow -Wstrict-prototypes -Wpointer-arith -Wmissing-prototypes -Werror=implicit-int -Werror=implicit-function-declaration -Wall -Wextra -Wpedantic -Wcast-qual -Wno-unused-function -Wdouble-promotion -march=native -fopenmp -MD -MT dependency/llama.cpp/ggml_llama/src/CMakeFiles/ggml_llama.dir/ggml-quants.c.o-MF dependency/llama.cpp/ggml_llama/src/CMakeFiles/ggml_llama.dir/ggml-quants.c.o.d -o dependency/llama.cpp/ggml_llama/src/CMakeFiles/ggml_llama.dir/ggml-quants.c.o -c /tmp/pip-install-__v1_j7m/nexaai_9a64b9b039d74eb5ac93ba9ce74781f2/dependency/llama.cpp/ggml_llama/src/ggml-quants.c
[23/43] ccache /usr/bin/gcc  -pthread -B /home/abclution/miniconda3/envs/local_file_organizer/compiler_compat -DGGML_MAX_NAME=128 -DGGML_SCHED_MAX_COPIES=4 -DGGML_USE_OPENMP -DSD_BUILD_SHARED_LIB -D_GNU_SOURCE -D_XOPEN_SOURCE=600 -I/tmp/pip-install-__v1_j7m/nexaai_9a64b9b039d74eb5ac93ba9ce74781f2/dependency/stable-diffusion.cpp/ggml/src/../include -I/tmp/pip-install-__v1_j7m/nexaai_9a64b9b039d74eb5ac93ba9ce74781f2/dependency/stable-diffusion.cpp/ggml/src/. -O3 -DNDEBUG -std=gnu11 -fPIC -Wshadow -Wstrict-prototypes -Wpointer-arith -Wmissing-prototypes -Werror=implicit-int -Werror=implicit-function-declaration -Wall -Wextra -Wpedantic -Wcast-qual -Wno-unused-function -Wdouble-promotion -march=native -fopenmp -MD -MT dependency/stable-diffusion.cpp/ggml/src/CMakeFiles/ggml.dir/ggml-quants.c.o -MF dependency/stable-diffusion.cpp/ggml/src/CMakeFiles/ggml.dir/ggml-quants.c.o.d -o dependency/stable-diffusion.cpp/ggml/src/CMakeFiles/ggml.dir/ggml-quants.c.o -c /tmp/pip-install-__v1_j7m/nexaai_9a64b9b039d74eb5ac93ba9ce74781f2/dependency/stable-diffusion.cpp/ggml/src/ggml-quants.c
[24/43] ccache /usr/bin/g++  -pthread -B /home/abclution/miniconda3/envs/local_file_organizer/compiler_compat -DGGML_MAX_NAME=128 -DSD_BUILD_DLL -DSD_BUILD_SHARED_LIB -Dstable_diffusion_EXPORTS -I/tmp/pip-install-__v1_j7m/nexaai_9a64b9b039d74eb5ac93ba9ce74781f2/dependency/stable-diffusion.cpp/. -I/tmp/pip-install-__v1_j7m/nexaai_9a64b9b039d74eb5ac93ba9ce74781f2/dependency/stable-diffusion.cpp/thirdparty -I/tmp/pip-install-__v1_j7m/nexaai_9a64b9b039d74eb5ac93ba9ce74781f2/dependency/stable-diffusion.cpp/ggml/src/../include -I/tmp/pip-install-__v1_j7m/nexaai_9a64b9b039d74eb5ac93ba9ce74781f2/dependency/stable-diffusion.cpp/thirdparty/. -O3 -DNDEBUG -fPIC -MD -MT dependency/stable-diffusion.cpp/CMakeFiles/stable-diffusion.dir/util.cpp.o -MF dependency/stable-diffusion.cpp/CMakeFiles/stable-diffusion.dir/util.cpp.o.d -o dependency/stable-diffusion.cpp/CMakeFiles/stable-diffusion.dir/util.cpp.o -c /tmp/pip-install-__v1_j7m/nexaai_9a64b9b039d74eb5ac93ba9ce74781f2/dependency/stable-diffusion.cpp/util.cpp
[25/43] ccache /usr/bin/gcc  -pthread -B /home/abclution/miniconda3/envs/local_file_organizer/compiler_compat -DGGML_BUILD -DGGML_SCHED_MAX_COPIES=4 -DGGML_SHARED -DGGML_USE_OPENMP -D_GNU_SOURCE -D_XOPEN_SOURCE=600 -Dggml_llama_EXPORTS -I/tmp/pip-install-__v1_j7m/nexaai_9a64b9b039d74eb5ac93ba9ce74781f2/dependency/llama.cpp/ggml_llama/src/../include -I/tmp/pip-install-__v1_j7m/nexaai_9a64b9b039d74eb5ac93ba9ce74781f2/dependency/llama.cpp/ggml_llama/src/. -O3 -DNDEBUG -std=gnu11 -fPIC -Wshadow -Wstrict-prototypes -Wpointer-arith -Wmissing-prototypes -Werror=implicit-int -Werror=implicit-function-declaration -Wall -Wextra -Wpedantic -Wcast-qual -Wno-unused-function -Wdouble-promotion -march=native -fopenmp -MD -MT dependency/llama.cpp/ggml_llama/src/CMakeFiles/ggml_llama.dir/ggml.c.o -MF dependency/llama.cpp/ggml_llama/src/CMakeFiles/ggml_llama.dir/ggml.c.o.d -o dependency/llama.cpp/ggml_llama/src/CMakeFiles/ggml_llama.dir/ggml.c.o -c /tmp/pip-install-__v1_j7m/nexaai_9a64b9b039d74eb5ac93ba9ce74781f2/dependency/llama.cpp/ggml_llama/src/ggml.c
[26/43] ccache /usr/bin/g++  -pthread -B /home/abclution/miniconda3/envs/local_file_organizer/compiler_compat -DLLAMA_BUILD -DLLAMA_SHARED -Dllama_EXPORTS -I/tmp/pip-install-__v1_j7m/nexaai_9a64b9b039d74eb5ac93ba9ce74781f2/dependency/llama.cpp/src/. -I/tmp/pip-install-__v1_j7m/nexaai_9a64b9b039d74eb5ac93ba9ce74781f2/dependency/llama.cpp/src/../include -I/tmp/pip-install-__v1_j7m/nexaai_9a64b9b039d74eb5ac93ba9ce74781f2/dependency/llama.cpp/ggml_llama/src/../include -O3 -DNDEBUG -fPIC -MD -MT dependency/llama.cpp/src/CMakeFiles/llama.dir/unicode.cpp.o -MF dependency/llama.cpp/src/CMakeFiles/llama.dir/unicode.cpp.o.d -o dependency/llama.cpp/src/CMakeFiles/llama.dir/unicode.cpp.o -c /tmp/pip-install-__v1_j7m/nexaai_9a64b9b039d74eb5ac93ba9ce74781f2/dependency/llama.cpp/src/unicode.cpp
[27/43] : && /usr/bin/gcc  -pthread -B /home/abclution/miniconda3/envs/local_file_organizer/compiler_compat -fPIC -O3 -DNDEBUG   -shared -Wl,-soname,libggml_llama.so -o dependency/llama.cpp/ggml_llama/src/libggml_llama.so dependency/llama.cpp/ggml_llama/src/CMakeFiles/ggml_llama.dir/ggml.c.o dependency/llama.cpp/ggml_llama/src/CMakeFiles/ggml_llama.dir/ggml-alloc.c.o dependency/llama.cpp/ggml_llama/src/CMakeFiles/ggml_llama.dir/ggml-backend.c.o dependency/llama.cpp/ggml_llama/src/CMakeFiles/ggml_llama.dir/ggml-quants.c.o dependency/llama.cpp/ggml_llama/src/CMakeFiles/ggml_llama.dir/ggml-aarch64.c.o  -Wl,-rpath,"\$ORIGIN"  -lm  /usr/lib/gcc/x86_64-linux-gnu/12/libgomp.so && :
[28/43] ccache /usr/bin/gcc  -pthread -B /home/abclution/miniconda3/envs/local_file_organizer/compiler_compat -DGGML_MAX_NAME=128 -DGGML_SCHED_MAX_COPIES=4 -DGGML_USE_OPENMP -DSD_BUILD_SHARED_LIB -D_GNU_SOURCE -D_XOPEN_SOURCE=600 -I/tmp/pip-install-__v1_j7m/nexaai_9a64b9b039d74eb5ac93ba9ce74781f2/dependency/stable-diffusion.cpp/ggml/src/../include -I/tmp/pip-install-__v1_j7m/nexaai_9a64b9b039d74eb5ac93ba9ce74781f2/dependency/stable-diffusion.cpp/ggml/src/. -O3 -DNDEBUG -std=gnu11 -fPIC -Wshadow -Wstrict-prototypes -Wpointer-arith -Wmissing-prototypes -Werror=implicit-int -Werror=implicit-function-declaration -Wall -Wextra -Wpedantic -Wcast-qual -Wno-unused-function -Wdouble-promotion -march=native -fopenmp -MD -MT dependency/stable-diffusion.cpp/ggml/src/CMakeFiles/ggml.dir/ggml.c.o -MFdependency/stable-diffusion.cpp/ggml/src/CMakeFiles/ggml.dir/ggml.c.o.d -o dependency/stable-diffusion.cpp/ggml/src/CMakeFiles/ggml.dir/ggml.c.o -c /tmp/pip-install-__v1_j7m/nexaai_9a64b9b039d74eb5ac93ba9ce74781f2/dependency/stable-diffusion.cpp/ggml/src/ggml.c
[29/43] : && /tmp/pip-build-env-e3n39ldc/normal/lib/python3.12/site-packages/cmake/data/bin/cmake -E rm -f dependency/stable-diffusion.cpp/ggml/src/libggml.a && /usr/bin/ar qc dependency/stable-diffusion.cpp/ggml/src/libggml.a  dependency/stable-diffusion.cpp/ggml/src/CMakeFiles/ggml.dir/ggml.c.o dependency/stable-diffusion.cpp/ggml/src/CMakeFiles/ggml.dir/ggml-alloc.c.o dependency/stable-diffusion.cpp/ggml/src/CMakeFiles/ggml.dir/ggml-backend.c.o dependency/stable-diffusion.cpp/ggml/src/CMakeFiles/ggml.dir/ggml-quants.c.odependency/stable-diffusion.cpp/ggml/src/CMakeFiles/ggml.dir/ggml-aarch64.c.o && /usr/bin/ranlib dependency/stable-diffusion.cpp/ggml/src/libggml.a && :
[30/43] ccache /usr/bin/g++  -pthread -B /home/abclution/miniconda3/envs/local_file_organizer/compiler_compat -DLLAMA_BUILD -DLLAMA_SHARED -I/tmp/pip-install-__v1_j7m/nexaai_9a64b9b039d74eb5ac93ba9ce74781f2/dependency/llama.cpp/examples/llava/. -I/tmp/pip-install-__v1_j7m/nexaai_9a64b9b039d74eb5ac93ba9ce74781f2/dependency/llama.cpp/examples/llava/../.. -I/tmp/pip-install-__v1_j7m/nexaai_9a64b9b039d74eb5ac93ba9ce74781f2/dependency/llama.cpp/examples/llava/../../common -I/tmp/pip-install-__v1_j7m/nexaai_9a64b9b039d74eb5ac93ba9ce74781f2/dependency/llama.cpp/ggml_llama/src/../include -I/tmp/pip-install-__v1_j7m/nexaai_9a64b9b039d74eb5ac93ba9ce74781f2/dependency/llama.cpp/src/. -I/tmp/pip-install-__v1_j7m/nexaai_9a64b9b039d74eb5ac93ba9ce74781f2/dependency/llama.cpp/src/../include -O3 -DNDEBUG -fPIC -Wno-cast-qual -MD -MT dependency/llama.cpp/examples/llava/CMakeFiles/llava.dir/clip.cpp.o -MF dependency/llama.cpp/examples/llava/CMakeFiles/llava.dir/clip.cpp.o.d -o dependency/llama.cpp/examples/llava/CMakeFiles/llava.dir/clip.cpp.o -c /tmp/pip-install-__v1_j7m/nexaai_9a64b9b039d74eb5ac93ba9ce74781f2/dependency/llama.cpp/examples/llava/clip.cpp
[31/43] : && /tmp/pip-build-env-e3n39ldc/normal/lib/python3.12/site-packages/cmake/data/bin/cmake -E rm -f dependency/llama.cpp/examples/llava/libllava_static.a && /usr/bin/ar qc dependency/llama.cpp/examples/llava/libllava_static.a  dependency/llama.cpp/examples/llava/CMakeFiles/llava.dir/llava.cpp.o dependency/llama.cpp/examples/llava/CMakeFiles/llava.dir/clip.cpp.o && /usr/bin/ranlib dependency/llama.cpp/examples/llava/libllava_static.a && :
[32/43] ccache /usr/bin/g++  -pthread -B /home/abclution/miniconda3/envs/local_file_organizer/compiler_compat  -I/tmp/pip-install-__v1_j7m/nexaai_9a64b9b039d74eb5ac93ba9ce74781f2/dependency/llama.cpp/common/. -I/tmp/pip-install-__v1_j7m/nexaai_9a64b9b039d74eb5ac93ba9ce74781f2/dependency/llama.cpp/src/. -I/tmp/pip-install-__v1_j7m/nexaai_9a64b9b039d74eb5ac93ba9ce74781f2/dependency/llama.cpp/src/../include -I/tmp/pip-install-__v1_j7m/nexaai_9a64b9b039d74eb5ac93ba9ce74781f2/dependency/llama.cpp/ggml_llama/src/../include -O3 -DNDEBUG -fPIC -MD -MT dependency/llama.cpp/common/CMakeFiles/common.dir/json-schema-to-grammar.cpp.o -MF dependency/llama.cpp/common/CMakeFiles/common.dir/json-schema-to-grammar.cpp.o.d -o dependency/llama.cpp/common/CMakeFiles/common.dir/json-schema-to-grammar.cpp.o -c /tmp/pip-install-__v1_j7m/nexaai_9a64b9b039d74eb5ac93ba9ce74781f2/dependency/llama.cpp/common/json-schema-to-grammar.cpp
[33/43] ccache /usr/bin/g++  -pthread -B /home/abclution/miniconda3/envs/local_file_organizer/compiler_compat  -I/tmp/pip-install-__v1_j7m/nexaai_9a64b9b039d74eb5ac93ba9ce74781f2/dependency/llama.cpp/common/. -I/tmp/pip-install-__v1_j7m/nexaai_9a64b9b039d74eb5ac93ba9ce74781f2/dependency/llama.cpp/src/. -I/tmp/pip-install-__v1_j7m/nexaai_9a64b9b039d74eb5ac93ba9ce74781f2/dependency/llama.cpp/src/../include -I/tmp/pip-install-__v1_j7m/nexaai_9a64b9b039d74eb5ac93ba9ce74781f2/dependency/llama.cpp/ggml_llama/src/../include -O3 -DNDEBUG -fPIC -MD -MT dependency/llama.cpp/common/CMakeFiles/common.dir/common.cpp.o -MF dependency/llama.cpp/common/CMakeFiles/common.dir/common.cpp.o.d -o dependency/llama.cpp/common/CMakeFiles/common.dir/common.cpp.o -c /tmp/pip-install-__v1_j7m/nexaai_9a64b9b039d74eb5ac93ba9ce74781f2/dependency/llama.cpp/common/common.cpp
[34/43] ccache /usr/bin/g++  -pthread -B /home/abclution/miniconda3/envs/local_file_organizer/compiler_compat -DGGML_MAX_NAME=128 -DSD_BUILD_DLL -DSD_BUILD_SHARED_LIB -Dstable_diffusion_EXPORTS -I/tmp/pip-install-__v1_j7m/nexaai_9a64b9b039d74eb5ac93ba9ce74781f2/dependency/stable-diffusion.cpp/. -I/tmp/pip-install-__v1_j7m/nexaai_9a64b9b039d74eb5ac93ba9ce74781f2/dependency/stable-diffusion.cpp/thirdparty -I/tmp/pip-install-__v1_j7m/nexaai_9a64b9b039d74eb5ac93ba9ce74781f2/dependency/stable-diffusion.cpp/ggml/src/../include -I/tmp/pip-install-__v1_j7m/nexaai_9a64b9b039d74eb5ac93ba9ce74781f2/dependency/stable-diffusion.cpp/thirdparty/. -O3 -DNDEBUG -fPIC -MD -MT dependency/stable-diffusion.cpp/CMakeFiles/stable-diffusion.dir/model.cpp.o -MF dependency/stable-diffusion.cpp/CMakeFiles/stable-diffusion.dir/model.cpp.o.d -o dependency/stable-diffusion.cpp/CMakeFiles/stable-diffusion.dir/model.cpp.o -c /tmp/pip-install-__v1_j7m/nexaai_9a64b9b039d74eb5ac93ba9ce74781f2/dependency/stable-diffusion.cpp/model.cpp
[35/43] ccache /usr/bin/g++  -pthread -B /home/abclution/miniconda3/envs/local_file_organizer/compiler_compat -DLLAMA_BUILD -DLLAMA_SHARED -Dllama_EXPORTS -I/tmp/pip-install-__v1_j7m/nexaai_9a64b9b039d74eb5ac93ba9ce74781f2/dependency/llama.cpp/src/. -I/tmp/pip-install-__v1_j7m/nexaai_9a64b9b039d74eb5ac93ba9ce74781f2/dependency/llama.cpp/src/../include -I/tmp/pip-install-__v1_j7m/nexaai_9a64b9b039d74eb5ac93ba9ce74781f2/dependency/llama.cpp/ggml_llama/src/../include -O3 -DNDEBUG -fPIC -MD -MT dependency/llama.cpp/src/CMakeFiles/llama.dir/unicode-data.cpp.o -MF dependency/llama.cpp/src/CMakeFiles/llama.dir/unicode-data.cpp.o.d -o dependency/llama.cpp/src/CMakeFiles/llama.dir/unicode-data.cpp.o -c /tmp/pip-install-__v1_j7m/nexaai_9a64b9b039d74eb5ac93ba9ce74781f2/dependency/llama.cpp/src/unicode-data.cpp
[36/43] ccache /usr/bin/g++  -pthread -B /home/abclution/miniconda3/envs/local_file_organizer/compiler_compat -DGGML_MAX_NAME=128 -DSD_BUILD_DLL -DSD_BUILD_SHARED_LIB -Dstable_diffusion_EXPORTS -I/tmp/pip-install-__v1_j7m/nexaai_9a64b9b039d74eb5ac93ba9ce74781f2/dependency/stable-diffusion.cpp/. -I/tmp/pip-install-__v1_j7m/nexaai_9a64b9b039d74eb5ac93ba9ce74781f2/dependency/stable-diffusion.cpp/thirdparty -I/tmp/pip-install-__v1_j7m/nexaai_9a64b9b039d74eb5ac93ba9ce74781f2/dependency/stable-diffusion.cpp/ggml/src/../include -I/tmp/pip-install-__v1_j7m/nexaai_9a64b9b039d74eb5ac93ba9ce74781f2/dependency/stable-diffusion.cpp/thirdparty/. -O3 -DNDEBUG -fPIC -MD -MT dependency/stable-diffusion.cpp/CMakeFiles/stable-diffusion.dir/stable-diffusion.cpp.o -MF dependency/stable-diffusion.cpp/CMakeFiles/stable-diffusion.dir/stable-diffusion.cpp.o.d -o dependency/stable-diffusion.cpp/CMakeFiles/stable-diffusion.dir/stable-diffusion.cpp.o -c /tmp/pip-install-__v1_j7m/nexaai_9a64b9b039d74eb5ac93ba9ce74781f2/dependency/stable-diffusion.cpp/stable-diffusion.cpp
[37/43] ccache /usr/bin/g++  -pthread -B /home/abclution/miniconda3/envs/local_file_organizer/compiler_compat -DLLAMA_BUILD -DLLAMA_SHARED -Dllama_EXPORTS -I/tmp/pip-install-__v1_j7m/nexaai_9a64b9b039d74eb5ac93ba9ce74781f2/dependency/llama.cpp/src/. -I/tmp/pip-install-__v1_j7m/nexaai_9a64b9b039d74eb5ac93ba9ce74781f2/dependency/llama.cpp/src/../include -I/tmp/pip-install-__v1_j7m/nexaai_9a64b9b039d74eb5ac93ba9ce74781f2/dependency/llama.cpp/ggml_llama/src/../include -O3 -DNDEBUG -fPIC -MD -MT dependency/llama.cpp/src/CMakeFiles/llama.dir/llama.cpp.o -MF dependency/llama.cpp/src/CMakeFiles/llama.dir/llama.cpp.o.d -o dependency/llama.cpp/src/CMakeFiles/llama.dir/llama.cpp.o -c /tmp/pip-install-__v1_j7m/nexaai_9a64b9b039d74eb5ac93ba9ce74781f2/dependency/llama.cpp/src/llama.cpp
[38/43] : && /usr/bin/g++  -pthread -B /home/abclution/miniconda3/envs/local_file_organizer/compiler_compat -fPIC -O3 -DNDEBUG   -shared -Wl,-soname,libstable-diffusion.so -o bin/libstable-diffusion.so dependency/stable-diffusion.cpp/thirdparty/CMakeFiles/zip.dir/zip.c.o dependency/stable-diffusion.cpp/CMakeFiles/stable-diffusion.dir/model.cpp.o dependency/stable-diffusion.cpp/CMakeFiles/stable-diffusion.dir/stable-diffusion.cpp.o dependency/stable-diffusion.cpp/CMakeFiles/stable-diffusion.dir/upscaler.cpp.o dependency/stable-diffusion.cpp/CMakeFiles/stable-diffusion.dir/util.cpp.o  dependency/stable-diffusion.cpp/ggml/src/libggml.a  /usr/lib/gcc/x86_64-linux-gnu/12/libgomp.so  -lm && :
[39/43] : && /usr/bin/g++  -pthread -B /home/abclution/miniconda3/envs/local_file_organizer/compiler_compat -fPIC -O3 -DNDEBUG   -shared -Wl,-soname,libllama.so -o dependency/llama.cpp/src/libllama.so dependency/llama.cpp/src/CMakeFiles/llama.dir/llama.cpp.o dependency/llama.cpp/src/CMakeFiles/llama.dir/llama-vocab.cpp.o dependency/llama.cpp/src/CMakeFiles/llama.dir/llama-grammar.cpp.o dependency/llama.cpp/src/CMakeFiles/llama.dir/llama-sampling.cpp.o dependency/llama.cpp/src/CMakeFiles/llama.dir/unicode.cpp.o dependency/llama.cpp/src/CMakeFiles/llama.dir/unicode-data.cpp.o  -Wl,-rpath,"\$ORIGIN"  dependency/llama.cpp/ggml_llama/src/libggml_llama.so && :
[40/43] : && /tmp/pip-build-env-e3n39ldc/normal/lib/python3.12/site-packages/cmake/data/bin/cmake -E rm -f dependency/llama.cpp/common/libcommon.a && /usr/bin/ar qc dependency/llama.cpp/common/libcommon.a  dependency/llama.cpp/common/CMakeFiles/build_info.dir/build-info.cpp.o dependency/llama.cpp/common/CMakeFiles/common.dir/common.cpp.o dependency/llama.cpp/common/CMakeFiles/common.dir/sampling.cpp.o dependency/llama.cpp/common/CMakeFiles/common.dir/console.cpp.o dependency/llama.cpp/common/CMakeFiles/common.dir/grammar-parser.cpp.o dependency/llama.cpp/common/CMakeFiles/common.dir/json-schema-to-grammar.cpp.o dependency/llama.cpp/common/CMakeFiles/common.dir/train.cpp.o dependency/llama.cpp/common/CMakeFiles/common.dir/ngram-cache.cpp.o && /usr/bin/ranlib dependency/llama.cpp/common/libcommon.a && :
[41/43] : && /usr/bin/g++  -pthread -B /home/abclution/miniconda3/envs/local_file_organizer/compiler_compat -fPIC -O3 -DNDEBUG   -shared -Wl,-soname,libllava.so -o dependency/llama.cpp/examples/llava/libllava.so dependency/llama.cpp/examples/llava/CMakeFiles/llava.dir/llava.cpp.o dependency/llama.cpp/examples/llava/CMakeFiles/llava.dir/clip.cpp.o  -Wl,-rpath,"\$ORIGIN"  dependency/llama.cpp/src/libllama.so  dependency/llama.cpp/ggml_llama/src/libggml_llama.so && :
[42/43] : && /usr/bin/g++  -pthread -B /home/abclution/miniconda3/envs/local_file_organizer/compiler_compat -O3 -DNDEBUG  dependency/llama.cpp/examples/llava/CMakeFiles/llava.dir/llava.cpp.o dependency/llama.cpp/examples/llava/CMakeFiles/llava.dir/clip.cpp.o dependency/llama.cpp/examples/llava/CMakeFiles/llama-llava-cli.dir/llava-cli.cpp.o -o dependency/llama.cpp/examples/llava/llama-llava-cli  -Wl,-rpath,/tmp/tmpb8vo6nlh/build/dependency/llama.cpp/src:/tmp/tmpb8vo6nlh/build/dependency/llama.cpp/ggml_llama/src:  dependency/llama.cpp/common/libcommon.a  dependency/llama.cpp/src/libllama.so  dependency/llama.cpp/ggml_llama/src/libggml_llama.so && :
FAILED: dependency/llama.cpp/examples/llava/llama-llava-cli
: && /usr/bin/g++  -pthread -B /home/abclution/miniconda3/envs/local_file_organizer/compiler_compat -O3 -DNDEBUG  dependency/llama.cpp/examples/llava/CMakeFiles/llava.dir/llava.cpp.o dependency/llama.cpp/examples/llava/CMakeFiles/llava.dir/clip.cpp.o dependency/llama.cpp/examples/llava/CMakeFiles/llama-llava-cli.dir/llava-cli.cpp.o -o dependency/llama.cpp/examples/llava/llama-llava-cli  -Wl,-rpath,/tmp/tmpb8vo6nlh/build/dependency/llama.cpp/src:/tmp/tmpb8vo6nlh/build/dependency/llama.cpp/ggml_llama/src:  dependency/llama.cpp/common/libcommon.a  dependency/llama.cpp/src/libllama.so  dependency/llama.cpp/ggml_llama/src/libggml_llama.so && :
/home/abclution/miniconda3/envs/local_file_organizer/compiler_compat/ld: warning: libgomp.so.1, needed by dependency/llama.cpp/ggml_llama/src/libggml_llama.so, not found (try using -rpath or -rpath-link)
/home/abclution/miniconda3/envs/local_file_organizer/compiler_compat/ld: dependency/llama.cpp/ggml_llama/src/libggml_llama.so: undefined reference to `GOMP_barrier@GOMP_1.0'
/home/abclution/miniconda3/envs/local_file_organizer/compiler_compat/ld: dependency/llama.cpp/ggml_llama/src/libggml_llama.so: undefined reference to `GOMP_parallel@GOMP_4.0'
/home/abclution/miniconda3/envs/local_file_organizer/compiler_compat/ld: dependency/llama.cpp/ggml_llama/src/libggml_llama.so: undefined reference to `omp_get_thread_num@OMP_1.0'
/home/abclution/miniconda3/envs/local_file_organizer/compiler_compat/ld: dependency/llama.cpp/ggml_llama/src/libggml_llama.so: undefined reference to `GOMP_single_start@GOMP_1.0'
/home/abclution/miniconda3/envs/local_file_organizer/compiler_compat/ld: dependency/llama.cpp/ggml_llama/src/libggml_llama.so: undefined reference to `omp_get_num_threads@OMP_1.0'
collect2: error: ld returned 1 exit status
[43/43] : && /usr/bin/g++  -pthread -B /home/abclution/miniconda3/envs/local_file_organizer/compiler_compat -O3 -DNDEBUG  dependency/llama.cpp/examples/llava/CMakeFiles/llava.dir/llava.cpp.o dependency/llama.cpp/examples/llava/CMakeFiles/llava.dir/clip.cpp.o dependency/llama.cpp/examples/llava/CMakeFiles/llama-minicpmv-cli.dir/minicpmv-cli.cpp.o -o dependency/llama.cpp/examples/llava/llama-minicpmv-cli  -Wl,-rpath,/tmp/tmpb8vo6nlh/build/dependency/llama.cpp/src:/tmp/tmpb8vo6nlh/build/dependency/llama.cpp/ggml_llama/src:  dependency/llama.cpp/common/libcommon.a  dependency/llama.cpp/src/libllama.so  dependency/llama.cpp/ggml_llama/src/libggml_llama.so && :
FAILED: dependency/llama.cpp/examples/llava/llama-minicpmv-cli
: && /usr/bin/g++  -pthread -B /home/abclution/miniconda3/envs/local_file_organizer/compiler_compat -O3 -DNDEBUG  dependency/llama.cpp/examples/llava/CMakeFiles/llava.dir/llava.cpp.o dependency/llama.cpp/examples/llava/CMakeFiles/llava.dir/clip.cpp.o dependency/llama.cpp/examples/llava/CMakeFiles/llama-minicpmv-cli.dir/minicpmv-cli.cpp.o -o dependency/llama.cpp/examples/llava/llama-minicpmv-cli  -Wl,-rpath,/tmp/tmpb8vo6nlh/build/dependency/llama.cpp/src:/tmp/tmpb8vo6nlh/build/dependency/llama.cpp/ggml_llama/src:  dependency/llama.cpp/common/libcommon.a  dependency/llama.cpp/src/libllama.so  dependency/llama.cpp/ggml_llama/src/libggml_llama.so && :
/home/abclution/miniconda3/envs/local_file_organizer/compiler_compat/ld: warning: libgomp.so.1, needed by dependency/llama.cpp/ggml_llama/src/libggml_llama.so, not found (try using -rpath or -rpath-link)
/home/abclution/miniconda3/envs/local_file_organizer/compiler_compat/ld: dependency/llama.cpp/ggml_llama/src/libggml_llama.so: undefined reference to `GOMP_barrier@GOMP_1.0'
/home/abclution/miniconda3/envs/local_file_organizer/compiler_compat/ld: dependency/llama.cpp/ggml_llama/src/libggml_llama.so: undefined reference to `GOMP_parallel@GOMP_4.0'
/home/abclution/miniconda3/envs/local_file_organizer/compiler_compat/ld: dependency/llama.cpp/ggml_llama/src/libggml_llama.so: undefined reference to `omp_get_thread_num@OMP_1.0'
/home/abclution/miniconda3/envs/local_file_organizer/compiler_compat/ld: dependency/llama.cpp/ggml_llama/src/libggml_llama.so: undefined reference to `GOMP_single_start@GOMP_1.0'
/home/abclution/miniconda3/envs/local_file_organizer/compiler_compat/ld: dependency/llama.cpp/ggml_llama/src/libggml_llama.so: undefined reference to `omp_get_num_threads@OMP_1.0'
collect2: error: ld returned 1 exit status
ninja: build stopped: subcommand failed.

*** CMake build failed
[end of output]

note: This error originates from a subprocess, and is likely not a problem with pip.
ERROR: Failed building wheel for nexaai
Failed to build nexaai
ERROR: ERROR: Failed to build installable wheels for some pyproject.toml based projects (nexaai)
abclution commented 1 month ago

I was able to continue using the following:

CMAKE_ARGS="-DCMAKE_CXX_FLAGS=-fopenmp" pip install nexaai --prefer-binary --index-url https://nexaai.github.io/nexa-sdk/whl/cpu --extra-index-url https://pypi.org/simple
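
My best guess from the log above: the failing steps are the final links of llama-llava-cli and llama-minicpmv-cli, where the conda environment's compiler_compat ld cannot find libgomp.so.1, and adding -fopenmp to CMAKE_CXX_FLAGS makes g++ pull the OpenMP runtime into those link lines itself. An alternative sketch under the same assumption (the libgomp directory is taken from the log above and may differ on other systems) would be to hand the linker that directory explicitly:

CMAKE_ARGS="-DCMAKE_EXE_LINKER_FLAGS=-L/usr/lib/gcc/x86_64-linux-gnu/12" pip install nexaai --prefer-binary --index-url https://nexaai.github.io/nexa-sdk/whl/cpu --extra-index-url https://pypi.org/simple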

abclution commented 1 month ago

Hmm, trying to use the program afterwards does not seem to work. It just sits there doing nothing, with no CPU usage either.

python main.py                                                                                               
--------------------------------------------------
Enter the path of the directory you want to organize: sample_data
Input path successfully uploaded: sample_data
--------------------------------------------------
Enter the path to store organized files and folders (press Enter to use 'organized_folder' in the input directory):
Output path successfully upload: organized_folder
--------------------------------------------------
Time taken to load file paths: 0.00 seconds
--------------------------------------------------
Directory tree before renaming:
/home/abclution/Desktop/Local-File-Organizer/sample_data
├── 12222_777.docx
├── IMG_0967.PNG
├── logo.png
├── paper_1col.pdf
├── sub_dir1
│   └── animal.jpg
├── sub_dir2
│   └── BS.txt
└── text_files
└── dsflsdflj.txt
**************************************************
The file upload was successful. It will take some minutes.
**************************************************
abclution commented 1 month ago

Ok, it seems to be downloading models in the background, perhaps it should notify the user?

duoduoyeah commented 1 month ago

Ok, it seems to be downloading models in the background, perhaps it should notify the user?

Hi, I have run into this as well. Where can I find the background download?

abclution commented 1 month ago

~/.cache/nexa/hub/official/*

ex. ~/.cache/nexa/hub/official/llava-v1.6-vicuna-7b
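
If you want to watch the background download, checking the size of that cache directory works (plain shell, nothing Nexa-specific):

du -sh ~/.cache/nexa/hub/official/*
ls -lh ~/.cache/nexa/hub/official/llava-v1.6-vicuna-7b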

QiuYannnn commented 1 month ago

Thanks for your suggestion! Initially, I was trying to keep the output more concise, which caused me to overlook the user-friendliness of the background download feature—sorry about that. I've updated it to be more interactive, so you shouldn't feel lost anymore. You can also use the nexa list command to check which models you already have. I learned these things from the Nexa SDK Reference.
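
For example, to verify which models are already on disk before running the organizer (the exact output depends on your Nexa SDK version):

nexa list    # shows models already downloaded under ~/.cache/nexa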

abclution commented 1 month ago

Tried to get nexaai working with an AMD GPU, but so far no luck. It's looking for some specific ROCm components that are not available, and ROCm from AMD is not installable on Debian 12.

That being said, I have many other AI tools working on AMD on this same system. There seems to be a subset of ROCm packaged in my distro, though I have no idea if it's enough for the Nexa SDK.
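
For anyone comparing setups, a quick way to see which ROCm bits the distro actually provides (generic Debian packaging commands, not specific to the Nexa SDK):

dpkg -l | grep -i -E 'rocm|hip'
apt-cache search rocm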

QiuYannnn commented 1 month ago

It could be a ROCm compatibility issue. I'd recommend raising this on the official Nexa SDK GitHub repository, where the developers can review it. They're usually quick to respond and may have a solution or workaround for you. You can submit an issue on the Nexa SDK issue tracker.