mudler / LocalAI

:robot: The free, Open Source alternative to OpenAI, Claude and others. Self-hosted and local-first. Drop-in replacement for OpenAI, running on consumer-grade hardware. No GPU required. Runs gguf, transformers, diffusers and many more model architectures. Features: Generate Text, Audio, Video, Images, Voice Cloning, Distributed inference
https://localai.io
MIT License

Error on build M1 Metal - /opt/homebrew/Cellar/go/1.20.5/libexec/pkg/tool/darwin_arm64/link: running c++ failed #708

Open genwch opened 1 year ago

genwch commented 1 year ago

LocalAI version: commit 3829aba

Environment, CPU architecture, OS, and Version: Darwin macmini 22.5.0, Darwin Kernel Version 22.5.0; Apple M1 (8 GB), 2020 Mac mini, macOS Ventura

Describe the bug: The build fails.

To Reproduce: make BUILD_TYPE=metal build
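
A fuller sketch of the reproduction, assuming a fresh clone of the repository at the commit reported above (the Makefile fetches the backend sources itself):

    # hypothetical fresh checkout; the failing commit is 3829aba (see LocalAI version above)
    git clone https://github.com/mudler/LocalAI.git
    cd LocalAI
    git checkout 3829aba
    make BUILD_TYPE=metal build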

Expected behavior: The build finishes successfully.

Logs

[100%] Linking CXX executable ../bin/rwkv_quantize
[100%] Built target rwkv_quantize
cd whisper.cpp && make libwhisper.a
I whisper.cpp build info: 
I UNAME_S:  Darwin
I UNAME_P:  arm
I UNAME_M:  arm64
I CFLAGS:   -I.              -O3 -DNDEBUG -std=c11   -fPIC -D_XOPEN_SOURCE=600 -D_DARWIN_C_SOURCE -pthread -DGGML_USE_ACCELERATE
I CXXFLAGS: -I. -I./examples -O3 -DNDEBUG -std=c++11 -fPIC -D_XOPEN_SOURCE=600 -D_DARWIN_C_SOURCE -pthread
I LDFLAGS:   -framework Accelerate
I CC:       Apple clang version 14.0.3 (clang-1403.0.22.14.1)
I CXX:      Apple clang version 14.0.3 (clang-1403.0.22.14.1)

cc  -I.              -O3 -DNDEBUG -std=c11   -fPIC -D_XOPEN_SOURCE=600 -D_DARWIN_C_SOURCE -pthread -DGGML_USE_ACCELERATE   -c ggml.c -o ggml.o
c++ -I. -I./examples -O3 -DNDEBUG -std=c++11 -fPIC -D_XOPEN_SOURCE=600 -D_DARWIN_C_SOURCE -pthread -c whisper.cpp -o whisper.o
ar rcs libwhisper.a ggml.o whisper.o
cd bloomz && make libbloomz.a
I llama.cpp build info: 
I UNAME_S:  Darwin
I UNAME_P:  arm
I UNAME_M:  arm64
I CFLAGS:   -I.              -O3 -DNDEBUG -std=c11   -fPIC -pthread -DGGML_USE_ACCELERATE
I CXXFLAGS: -I. -I./examples -O3 -DNDEBUG -std=c++11 -fPIC -pthread
I LDFLAGS:   -framework Accelerate
I CC:       Apple clang version 14.0.3 (clang-1403.0.22.14.1)
I CXX:      Apple clang version 14.0.3 (clang-1403.0.22.14.1)

cc  -I.              -O3 -DNDEBUG -std=c11   -fPIC -pthread -DGGML_USE_ACCELERATE   -c ggml.c -o ggml.o
c++ -I. -I./examples -O3 -DNDEBUG -std=c++11 -fPIC -pthread -c utils.cpp -o utils.o
c++ -I. -I./examples -O3 -DNDEBUG -std=c++11 -fPIC -pthread bloomz.cpp ggml.o utils.o -o bloomz.o -c  -framework Accelerate
clang: warning: ggml.o: 'linker' input unused [-Wunused-command-line-argument]
clang: warning: utils.o: 'linker' input unused [-Wunused-command-line-argument]
clang: warning: -framework Accelerate: 'linker' input unused [-Wunused-command-line-argument]
ar src libbloomz.a bloomz.o ggml.o utils.o
touch prepare
I local-ai build info:
I BUILD_TYPE: metal
I GO_TAGS: 
I LD_FLAGS:  -X "github.com/go-skynet/LocalAI/internal.Version=v1.20.1-1-g3829aba-dirty" -X "github.com/go-skynet/LocalAI/internal.Commit=3829aba869f8925dde7a1c9f280a4718dda3a18c"
CGO_LDFLAGS=" -framework Foundation -framework Metal -framework MetalKit -framework MetalPerformanceShaders" C_INCLUDE_PATH=~/Documents/git/LocalAI/go-llama:~/Documents/git/LocalAI/go-stable-diffusion/:~/Documents/git/LocalAI/gpt4all/gpt4all-bindings/golang/:~/Documents/git/LocalAI/go-ggml-transformers:~/Documents/git/LocalAI/go-rwkv:~/Documents/git/LocalAI/whisper.cpp:~/Documents/git/LocalAI/go-bert:~/Documents/git/LocalAI/bloomz LIBRARY_PATH=~/Documents/git/LocalAI/go-piper:~/Documents/git/LocalAI/go-llama:~/Documents/git/LocalAI/go-stable-diffusion/:~/Documents/git/LocalAI/gpt4all/gpt4all-bindings/golang/:~/Documents/git/LocalAI/go-ggml-transformers:~/Documents/git/LocalAI/go-rwkv:~/Documents/git/LocalAI/whisper.cpp:~/Documents/git/LocalAI/go-bert:~/Documents/git/LocalAI/bloomz go build -ldflags " -X "github.com/go-skynet/LocalAI/internal.Version=v1.20.1-1-g3829aba-dirty" -X "github.com/go-skynet/LocalAI/internal.Commit=3829aba869f8925dde7a1c9f280a4718dda3a18c"" -tags "" -o local-ai ./
# github.com/go-skynet/go-llama.cpp
binding.cpp:634:15: warning: 'llama_init_from_file' is deprecated: please use llama_load_model_from_file combined with llama_new_context_with_model instead [-Wdeprecated-declarations]
go-llama/llama.cpp/llama.h:162:15: note: 'llama_init_from_file' has been explicitly marked deprecated here
go-llama/llama.cpp/llama.h:30:56: note: expanded from macro 'DEPRECATED'
# github.com/go-skynet/go-bert.cpp
In file included from gobert.cpp:6:
go-bert/bert.cpp/bert.cpp:692:74: warning: format specifies type 'int' but the argument has type 'unsigned long' [-Wformat]
# github.com/go-skynet/go-ggml-transformers.cpp
replit.cpp:65:50: warning: format specifies type 'int' but the argument has type 'std::vector<unsigned long>::value_type' (aka 'unsigned long') [-Wformat]
# github.com/go-skynet/LocalAI
/opt/homebrew/Cellar/go/1.20.5/libexec/pkg/tool/darwin_arm64/link: running c++ failed: exit status 1
duplicate symbol '_set_numa_thread_affinity' in:
    ~/Documents/git/LocalAI/go-ggml-transformers/libtransformers.a(ggml.o)
    ~/Documents/git/LocalAI/go-llama/libbinding.a(ggml.o)
duplicate symbol '_clear_numa_thread_affinity' in:
    ~/Documents/git/LocalAI/go-ggml-transformers/libtransformers.a(ggml.o)
    ~/Documents/git/LocalAI/go-llama/libbinding.a(ggml.o)
ld: 2 duplicate symbols for architecture arm64
clang: error: linker command failed with exit code 1 (use -v to see invocation)

make: *** [build] Error 1

Additional context: N/A
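
For reference, the link error indicates that both libtransformers.a and libbinding.a bundle their own copy of ggml.o, so the NUMA thread-affinity helpers end up defined twice. A minimal way to confirm which archives define the conflicting symbols is to inspect them with nm (a sketch assuming the checkout lives under ~/Documents/git/LocalAI, as in the log above):

    # a 'T' in the second column means the archive defines the symbol rather than just referencing it
    nm ~/Documents/git/LocalAI/go-ggml-transformers/libtransformers.a | grep numa_thread_affinity
    nm ~/Documents/git/LocalAI/go-llama/libbinding.a | grep numa_thread_affinity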

localai-bot commented 1 year ago

:warning::warning::warning::warning::warning:

Hi! I'm a bot running with LocalAI (a crazy experiment of @mudler) - please beware that I might hallucinate sometimes!

But... I can also be funny or helpful :smile_cat: and I can generally provide good tips or point to places to look in the documentation or the code, based on what you wrote in the issue.

Don't engage in conversation with me; I don't support replying (yet)!

:warning::warning::warning::warning::warning:

The issue seems to be related to the build process of the LocalAI project. Several warnings appear during the build (deprecated functions and format-specifier mismatches), but the build actually fails at the final link step with duplicate symbols. It's possible that these issues are caused by conflicting dependencies or incorrect build settings. To troubleshoot this issue, you can try the following steps:

  1. Check the version of the Go toolchain used to build the project (1.20.5) and ensure that it is compatible with the llama bindings (which require at least Go 1.19). You can use the go version command to check the current toolchain (see the command sketch below).
  2. Check the build flags and environment variables used by the project to ensure that they are correct and compatible with the llama library. Running go build with the -x flag prints the commands it invokes, which makes it easier to inspect the flags actually in use.
  3. Ensure that all dependencies are installed and up to date, including the llama library and any other third-party libraries used by the project. You can use go get (or go mod tidy) to install missing dependencies.
  4. Check the code for any potential issues or conflicts, such as duplicate symbols or incorrect function calls. You can use the go vet command to check for potential issues in the codebase.

If these steps do not resolve the issue, you may need to file a bug report with the project maintainers or seek assistance from the community on the project's GitHub page or Discord channel.
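
A minimal sketch of the commands referenced in steps 1-4 above, assuming they are run from the root of the LocalAI checkout:

    # Step 1: check the Go toolchain version
    go version

    # Step 2: print the commands go build invokes, to inspect the flags actually in use
    # (the Makefile sets extra CGO flags, so this is only useful for seeing the commands)
    go build -x -o local-ai ./

    # Step 3: make sure go.mod/go.sum cover all dependencies
    go mod tidy

    # Step 4: run vet over the whole module
    go vet ./...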

Sources: