mudler / LocalAI

:robot: The free, Open Source alternative to OpenAI, Claude and others. Self-hosted and local-first. Drop-in replacement for OpenAI, running on consumer-grade hardware. No GPU required. Runs gguf, transformers, diffusers and many more model architectures. Features: Generate Text, Audio, Video, Images, Voice Cloning, Distributed inference
https://localai.io
MIT License

Failed to Build Locally on Mac M2 #1294

Open YAY-3M-TA3 opened 11 months ago

YAY-3M-TA3 commented 11 months ago

LocalAI version: LocalAI commit 2addb9f

Environment, CPU architecture, OS, and Version: Darwin MacBook-Pro.local 22.5.0 Darwin Kernel Version 22.5.0: Thu Jun 8 22:21:34 PDT 2023; root:xnu-8796.121.3~7/RELEASE_ARM64_T8112 arm64

Describe the bug: I'm trying to build LocalAI for a Mac M2 with the stable diffusion option. I installed everything with brew, git cloned the repo, then ran cd LocalAI.

Additionally, I also brewed these:

    brew install opencv
    brew install ncnn
    brew install protobuf libomp

And added these envs:

    export C_INCLUDE_PATH=/usr/local/include
    export CPLUS_INCLUDE_PATH=/usr/local/include

Then I built with this command:

    make GO_TAGS=stablediffusion BUILD_TYPE=metal build

But I get this build error:


# github.com/go-skynet/go-ggml-transformers.cpp
replit.cpp:65:50: warning: format specifies type 'int' but the argument has type 'std::vector<unsigned long>::value_type' (aka 'unsigned long') [-Wformat]
# github.com/go-skynet/LocalAI/cmd/grpc/falcon-ggml
/usr/local/go/pkg/tool/darwin_arm64/link: running clang++ failed: exit status 1
Undefined symbols for architecture arm64:
  "_ggml_add", referenced from:
      gpt_neox_ff(dollyv2_layer const&, ggml_context*, ggml_tensor*) in 000009.o
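
One thing worth double-checking before digging into the linker error: on Apple Silicon, Homebrew's default prefix is /opt/homebrew rather than /usr/local, so the C_INCLUDE_PATH/CPLUS_INCLUDE_PATH exports above may not point at the brewed headers at all. A minimal sketch of an adjusted setup, assuming a default Homebrew install on an M1/M2 machine (this is not a confirmed fix for the undefined _ggml_add symbols):

```sh
# Homebrew on Apple Silicon lives under /opt/homebrew (Intel Macs use /usr/local).
export HOMEBREW_PREFIX="$(brew --prefix)"            # typically /opt/homebrew on M1/M2
export C_INCLUDE_PATH="${HOMEBREW_PREFIX}/include"
export CPLUS_INCLUDE_PATH="${HOMEBREW_PREFIX}/include"
export LIBRARY_PATH="${HOMEBREW_PREFIX}/lib"

# Optionally start from a clean tree so stale objects are not reused.
make clean
make GO_TAGS=stablediffusion BUILD_TYPE=metal build
```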

To Reproduce

    brew install abseil cmake go grpc protobuf wget
    git clone https://github.com/go-skynet/LocalAI.git

Additionally, I also brewed these:

    brew install opencv
    brew install ncnn
    brew install protobuf libomp

And added these envs:

    export C_INCLUDE_PATH=/usr/local/include
    export CPLUS_INCLUDE_PATH=/usr/local/include

Then I built with this command:

    make GO_TAGS=stablediffusion BUILD_TYPE=metal build

Then I get the build error shown in the Logs section below.

Expected behavior: LocalAI should build properly on a Mac M2.

Logs: This is the beginning part of the error log:

    go mod edit -replace github.com/nomic-ai/gpt4all/gpt4all-bindings/golang=/Users/xxxx/AI/LocalAI/gpt4all/gpt4all-bindings/golang
    go mod edit -replace github.com/go-skynet/go-ggml-transformers.cpp=/Users/xxxx/AI/LocalAI/go-ggml-transformers
    go mod edit -replace github.com/donomii/go-rwkv.cpp=/Users/xxxx/AI/LocalAI/go-rwkv
    go mod edit -replace github.com/ggerganov/whisper.cpp=/Users/xxxx/AI/LocalAI/whisper.cpp
    go mod edit -replace github.com/go-skynet/go-bert.cpp=/Users/xxxx/AI/LocalAI/go-bert
    go mod edit -replace github.com/mudler/go-stable-diffusion=/Users/xxxx/AI/LocalAI/go-stable-diffusion
    go mod edit -replace github.com/mudler/go-piper=/Users/xxxx/AI/LocalAI/go-piper
    go mod download
    touch prepare
    CGO_LDFLAGS=" -lcblas -framework Accelerate -framework Foundation -framework Metal -framework MetalKit -framework MetalPerformanceShaders" C_INCLUDE_PATH=/Users/xxxx/AI/LocalAI/go-ggml-transformers LIBRARY_PATH=/Users/xxxx/AI/LocalAI/go-ggml-transformers \
    go build -ldflags " -X "github.com/go-skynet/LocalAI/internal.Version=v1.40.0-26-g2addb9f" -X "github.com/go-skynet/LocalAI/internal.Commit=2addb9f99a29a5131d2e8c0b841dfff334f9b161"" -tags "stablediffusion" -o backend-assets/grpc/falcon-ggml ./backend/go/llm/falcon-ggml/
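
To see exactly which objects and libraries the failing link step is given, the go build invocation from this log can be re-run by hand with -x, which makes go print every command it executes, including the final clang++ link line. A sketch, assuming the repo root as the working directory and that a previous make run has already applied the go mod edit -replace directives above:

```sh
# Re-run only the failing backend build with the same environment the Makefile
# uses; -x prints the commands go executes so the clang++ link inputs can be
# inspected. (Version -ldflags are omitted here; they do not affect linking.)
CGO_LDFLAGS=" -lcblas -framework Accelerate -framework Foundation -framework Metal -framework MetalKit -framework MetalPerformanceShaders" \
C_INCLUDE_PATH=$PWD/go-ggml-transformers \
LIBRARY_PATH=$PWD/go-ggml-transformers \
go build -x -tags "stablediffusion" -o backend-assets/grpc/falcon-ggml ./backend/go/llm/falcon-ggml/
```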

    # github.com/go-skynet/go-ggml-transformers.cpp
    replit.cpp:65:50: warning: format specifies type 'int' but the argument has type 'std::vector<unsigned long>::value_type' (aka 'unsigned long') [-Wformat]
    # github.com/go-skynet/LocalAI/backend/go/llm/falcon-ggml
    /usr/local/go/pkg/tool/darwin_arm64/link: running clang++ failed: exit status 1
    Undefined symbols for architecture arm64:
      "_ggml_add", referenced from:
          gpt_neox_ff(dollyv2_layer const&, ggml_context*, ggml_tensor*) in 000009.o
          dollyv2_eval(dollyv2_model const&, int, int, std::__1::vector<int, std::__1::allocator<int>> const&, std::__1::vector<float, std::__1::allocator<float>>&, unsigned long&) in 000009.o
          falcon_eval(falcon_model const&, int, int, std::__1::vector<int, std::__1::allocator<int>> const&, std::__1::vector<float, std::__1::allocator<float>>&, unsigned long&) in 000010.o
          gpt2_eval(gpt2_model const&, int, int, std::__1::vector<int, std::__1::allocator<int>> const&, std::__1::vector<float, std::__1::allocator<float>>&, unsigned long&) in 000011.o
          gptj_eval(gptj_model const&, int, int, std::__1::vector<int, std::__1::allocator<int>> const&, std::__1::vector<float, std::__1::allocator<float>>&, unsigned long&) in 000012.o
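
Since the link fails on _ggml_add, a quick sanity check is whether that symbol is defined anywhere in the objects built under the vendored go-ggml-transformers checkout. A sketch, assuming the directory layout from the log above (the paths may not match every checkout):

```sh
# Look for a definition ("T" = defined text symbol; macOS symbols carry the
# leading underscore the linker is complaining about) of _ggml_add in the
# archives and object files built under the vendored ggml bindings.
find go-ggml-transformers \( -name '*.a' -o -name '*.o' \) 2>/dev/null | while read -r f; do
  nm "$f" 2>/dev/null | grep -q ' T _ggml_add' && echo "defined in: $f"
done
```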

Additional context: nothing else.

Aisuko commented 11 months ago

Hi @YAY-3M-TA3, could you dump the whole log as log.txt? I cannot see the key point of failure from this excerpt.

jonmellman commented 11 months ago

+1, exact same issue here. Attaching log.txt.

Aisuko commented 11 months ago

Thanks. I am not sure whether the comment https://github.com/golang/go/issues/39033#issuecomment-627661302 describes the issue here accurately.

The dependencies here:

brew install abseil cmake go grpc protobuf wget

Maybe you guys can run the command make BUILD_TYPE=metal build.

A note with Apple Metal:

  • Set gpu_layers: 1 in your YAML model config file and f16: true
  • Note: only models quantized with q4_0 are supported!
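
For reference, a sketch of what a model config applying that note might look like; the file name and model file below are placeholders, not files shipped with LocalAI:

```sh
# Hypothetical example only: a Metal-oriented model config following the note
# above (gpu_layers: 1, f16: true, q4_0-quantized model).
mkdir -p models
cat > models/my-model.yaml <<'EOF'
name: my-model
parameters:
  model: my-model.q4_0.gguf   # q4_0 quantization per the note above
f16: true
gpu_layers: 1
EOF
```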

localai-bot commented 11 months ago

:warning::warning::warning::warning::warning:

Hi! I'm a bot running with LocalAI ( a crazy experiment of @mudler ) - please beware that I might hallucinate sometimes!

But... I can also be funny or helpful :smile_cat: and I can provide, generally speaking, good tips or places to look in the documentation or in the code based on what you wrote in the issue.

Don't engage in conversation with me, I don't support (yet) replying!

:warning::warning::warning::warning::warning:

ERROR: The prompt size exceeds the context window size and cannot be processed.

Sources:

sweetcard commented 11 months ago

> Thanks, I am not sure the comment golang/go#39033 (comment) can help us to describe the issue here in an accurate way.
>
> The dependencies here:
>
> brew install abseil cmake go grpc protobuf wget
>
> Maybe you guys can run the command make BUILD_TYPE=metal build.
>
> A note with Apple Metal:
>
>   • Set gpu_layers: 1 to your YAML model config file and f16: true
>   • Note: only models quantized with q4_0 are supported!

Q6_K and Q4_K_M are also supported on M3.

MaxKolbysh commented 10 months ago

During the build I have another issue. In my case the protobuf version is already 25.1, but during the build I get this error:

    cd sources/go-stable-diffusion && git checkout -b build 902db5f066fd137697e3b69d0fa10d4782bd2c2f && git submodule update --init --recursive --depth 1
    Switched to a new branch 'build'
    touch get-sources
    go mod edit -replace github.com/nomic-ai/gpt4all/gpt4all-bindings/golang=/Users/sgsg/Documents/Projects/LLM based projects/Local ai/LocalAI/sources/gpt4all/gpt4all-bindings/golang
    go: too many arguments
    make: *** [replace] Error 1
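
The `go: too many arguments` here is most likely caused by the spaces in the checkout path (`LLM based projects/Local ai`): the Makefile passes the directory to `go mod edit -replace` unquoted, so the shell splits it into several arguments. A minimal sketch of a workaround, assuming the repository can simply be re-cloned to a space-free location:

```sh
# Work around the unquoted-path issue by building from a path without spaces.
mkdir -p ~/src
git clone https://github.com/go-skynet/LocalAI.git ~/src/LocalAI
cd ~/src/LocalAI
make BUILD_TYPE=metal build
```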

themeaningofmeaning commented 9 months ago

> During the build I have another issue: In my case protobuf version already 25.1
>
> But during the build such error:
>
>     cd sources/go-stable-diffusion && git checkout -b build 902db5f066fd137697e3b69d0fa10d4782bd2c2f && git submodule update --init --recursive --depth 1
>     Switched to a new branch 'build'
>     touch get-sources
>     go mod edit -replace github.com/nomic-ai/gpt4all/gpt4all-bindings/golang=/Users/sgsg/Documents/Projects/LLM based projects/Local ai/LocalAI/sources/gpt4all/gpt4all-bindings/golang
>     go: too many arguments
>     make: *** [replace] Error 1

@MaxKolbysh Exact same issue on my MacBook M1 Pro. Does anyone know if there is a previous version of this repo that runs well on Apple Silicon machines?
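
A hedged way to try that is to build from a tagged release rather than the current main branch; v1.40.0 below is only the tag the failing build above is based on (per the version string in the log), not a version confirmed to build cleanly on Apple Silicon:

```sh
# Check out a tagged release instead of the current main branch and rebuild.
cd LocalAI
git fetch --tags
git checkout v1.40.0        # placeholder: any released tag can be substituted
make BUILD_TYPE=metal build
```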