mudler / LocalAI

:robot: The free, Open Source alternative to OpenAI, Claude and others. Self-hosted and local-first. Drop-in replacement for OpenAI, running on consumer-grade hardware. No GPU required. Runs gguf, transformers, diffusers and many more model architectures. Features: Generate Text, Audio, Video, Images, Voice Cloning, Distributed inference
https://localai.io
MIT License

Build from source failed on a mac (arm64) with protoc version mismatch. #2571

Open kastakhov opened 3 months ago

kastakhov commented 3 months ago

LocalAI version: 96a7a3b59ff73f71bfcbd080bc49094d3f30d101

Environment, CPU architecture, OS, and Version: macOS 14.5 Apple M1 Pro

Environment prepared according to the build guide.

Describe the bug When running make build, I got the following error:

_PROTOBUF_PROTOC=/Users/kastakhov/wiseinfotec-prj/LocalAI/backend/cpp/grpc/installed_packages/bin/proto \
    _GRPC_CPP_PLUGIN_EXECUTABLE=/Users/kastakhov/wiseinfotec-prj/LocalAI/backend/cpp/grpc/installed_packages/bin/grpc_cpp_plugin \
    PATH="/Users/kastakhov/wiseinfotec-prj/LocalAI/backend/cpp/grpc/installed_packages/bin:/opt/local/bin:/opt/local/sbin:/opt/homebrew/bin:/opt/homebrew/sbin:/usr/local/bin:/System/Cryptexes/App/usr/bin:/usr/bin:/bin:/usr/sbin:/sbin:/var/run/com.apple.security.cryptexd/codex.system/bootstrap/usr/local/bin:/var/run/com.apple.security.cryptexd/codex.system/bootstrap/usr/bin:/var/run/com.apple.security.cryptexd/codex.system/bootstrap/usr/appleinternal/bin:/opt/X11/bin" \
    CMAKE_ARGS=" -DLLAMA_AVX=on -DLLAMA_AVX2=off -DLLAMA_AVX512=off -DLLAMA_FMA=off -DLLAMA_F16C=off -Dabsl_DIR=/Users/kastakhov/wiseinfotec-prj/LocalAI/backend/cpp/grpc/installed_packages/lib/cmake/absl -DProtobuf_DIR=/Users/kastakhov/wiseinfotec-prj/LocalAI/backend/cpp/grpc/installed_packages/lib/cmake/protobuf -Dutf8_range_DIR=/Users/kastakhov/wiseinfotec-prj/LocalAI/backend/cpp/grpc/installed_packages/lib/cmake/utf8_range -DgRPC_DIR=/Users/kastakhov/wiseinfotec-prj/LocalAI/backend/cpp/grpc/installed_packages/lib/cmake/grpc -DCMAKE_CXX_STANDARD_INCLUDE_DIRECTORIES=/Users/kastakhov/wiseinfotec-prj/LocalAI/backend/cpp/grpc/installed_packages/include" \
    LLAMA_VERSION=172c8256840ffd882ab9992ecedbb587d9b21f15 \
    /Applications/Xcode.app/Contents/Developer/usr/bin/make -C backend/cpp/llama-avx grpc-server
git clone --recurse-submodules https://github.com/ggerganov/llama.cpp llama.cpp
Cloning into 'llama.cpp'...
Submodule 'kompute' (https://github.com/nomic-ai/kompute.git) registered for path 'kompute'
Cloning into '/Users/kastakhov/wiseinfotec-prj/LocalAI/backend/cpp/llama-avx/llama.cpp/kompute'...
Submodule path 'kompute': checked out '4565194ed7c32d1d2efa32ceab4d3c6cae006306'
if [ -z "172c8256840ffd882ab9992ecedbb587d9b21f15" ]; then \
        exit 1; \
    fi
cd llama.cpp && git checkout -b build 172c8256840ffd882ab9992ecedbb587d9b21f15 && git submodule update --init --recursive --depth 1
Switched to a new branch 'build'
mkdir -p llama.cpp/examples/grpc-server
bash prepare.sh
json.hpp -> llama.cpp/examples/grpc-server/json.hpp
utils.hpp -> llama.cpp/examples/grpc-server/utils.hpp
llama.cpp/examples/llava/clip.h -> llama.cpp/examples/grpc-server/clip.h
llama.cpp/examples/llava/llava.cpp -> llama.cpp/examples/grpc-server/llava.cpp
llama.cpp/examples/llava/clip.cpp -> llama.cpp/examples/grpc-server/clip.cpp
Building grpc-server with metal build type and  -DLLAMA_AVX=on -DLLAMA_AVX2=off -DLLAMA_AVX512=off -DLLAMA_FMA=off -DLLAMA_F16C=off -Dabsl_DIR=/Users/kastakhov/wiseinfotec-prj/LocalAI/backend/cpp/grpc/installed_packages/lib/cmake/absl -DProtobuf_DIR=/Users/kastakhov/wiseinfotec-prj/LocalAI/backend/cpp/grpc/installed_packages/lib/cmake/protobuf -Dutf8_range_DIR=/Users/kastakhov/wiseinfotec-prj/LocalAI/backend/cpp/grpc/installed_packages/lib/cmake/utf8_range -DgRPC_DIR=/Users/kastakhov/wiseinfotec-prj/LocalAI/backend/cpp/grpc/installed_packages/lib/cmake/grpc -DCMAKE_CXX_STANDARD_INCLUDE_DIRECTORIES=/Users/kastakhov/wiseinfotec-prj/LocalAI/backend/cpp/grpc/installed_packages/include
cd llama.cpp && mkdir -p build && cd build && cmake ..  -DLLAMA_AVX=on -DLLAMA_AVX2=off -DLLAMA_AVX512=off -DLLAMA_FMA=off -DLLAMA_F16C=off -Dabsl_DIR=/Users/kastakhov/wiseinfotec-prj/LocalAI/backend/cpp/grpc/installed_packages/lib/cmake/absl -DProtobuf_DIR=/Users/kastakhov/wiseinfotec-prj/LocalAI/backend/cpp/grpc/installed_packages/lib/cmake/protobuf -Dutf8_range_DIR=/Users/kastakhov/wiseinfotec-prj/LocalAI/backend/cpp/grpc/installed_packages/lib/cmake/utf8_range -DgRPC_DIR=/Users/kastakhov/wiseinfotec-prj/LocalAI/backend/cpp/grpc/installed_packages/lib/cmake/grpc -DCMAKE_CXX_STANDARD_INCLUDE_DIRECTORIES=/Users/kastakhov/wiseinfotec-prj/LocalAI/backend/cpp/grpc/installed_packages/include && /Applications/Xcode.app/Contents/Developer/usr/bin/make
-- The C compiler identification is AppleClang 15.0.0.15000309
-- The CXX compiler identification is AppleClang 15.0.0.15000309
-- Detecting C compiler ABI info
-- Detecting C compiler ABI info - done
-- Check for working C compiler: /Applications/Xcode.app/Contents/Developer/Toolchains/XcodeDefault.xctoolchain/usr/bin/cc - skipped
-- Detecting C compile features
-- Detecting C compile features - done
-- Detecting CXX compiler ABI info
-- Detecting CXX compiler ABI info - done
-- Check for working CXX compiler: /Applications/Xcode.app/Contents/Developer/Toolchains/XcodeDefault.xctoolchain/usr/bin/c++ - skipped
-- Detecting CXX compile features
-- Detecting CXX compile features - done
-- Found Git: /usr/bin/git (found version "2.39.3 (Apple Git-146)")
-- Performing Test CMAKE_HAVE_LIBC_PTHREAD
-- Performing Test CMAKE_HAVE_LIBC_PTHREAD - Success
-- Found Threads: TRUE
-- Accelerate framework found
-- Metal framework found
-- Could NOT find OpenMP_C (missing: OpenMP_C_FLAGS OpenMP_C_LIB_NAMES) 
-- Could NOT find OpenMP_CXX (missing: OpenMP_CXX_FLAGS OpenMP_CXX_LIB_NAMES) 
-- Could NOT find OpenMP (missing: OpenMP_C_FOUND OpenMP_CXX_FOUND) 
CMake Warning at CMakeLists.txt:311 (message):
  OpenMP not found

-- Looking for dgemm_
-- Looking for dgemm_ - found
-- Found BLAS: /Applications/Xcode.app/Contents/Developer/Platforms/MacOSX.platform/Developer/SDKs/MacOSX14.5.sdk/System/Library/Frameworks/Accelerate.framework
-- BLAS found, Libraries: /Applications/Xcode.app/Contents/Developer/Platforms/MacOSX.platform/Developer/SDKs/MacOSX14.5.sdk/System/Library/Frameworks/Accelerate.framework
-- BLAS found, Includes: 
-- Warning: ccache not found - consider installing it for faster compilation or disable this warning with LLAMA_CCACHE=OFF
-- CMAKE_SYSTEM_PROCESSOR: arm64
-- ARM detected
-- Performing Test COMPILER_SUPPORTS_FP16_FORMAT_I3E
-- Performing Test COMPILER_SUPPORTS_FP16_FORMAT_I3E - Failed
-- Found ZLIB: /Applications/Xcode.app/Contents/Developer/Platforms/MacOSX.platform/Developer/SDKs/MacOSX14.5.sdk/usr/lib/libz.tbd (found version "1.2.12")
-- Using protobuf version 24.3.0 | Protobuf_INCLUDE_DIRS:  | CMAKE_CURRENT_BINARY_DIR: /Users/kastakhov/wiseinfotec-prj/LocalAI/backend/cpp/llama-avx/llama.cpp/build/examples/grpc-server
-- Configuring done (1.6s)
-- Generating done (0.3s)
-- Build files have been written to: /Users/kastakhov/wiseinfotec-prj/LocalAI/backend/cpp/llama-avx/llama.cpp/build

...

[ 94%] Generating backend.pb.cc, backend.pb.h, backend.grpc.pb.cc, backend.grpc.pb.h
[ 95%] Building CXX object examples/grpc-server/CMakeFiles/hw_grpc_proto.dir/backend.grpc.pb.cc.o
In file included from /Users/kastakhov/wiseinfotec-prj/LocalAI/backend/cpp/llama-avx/llama.cpp/build/examples/grpc-server/backend.grpc.pb.cc:5:
/Users/kastakhov/wiseinfotec-prj/LocalAI/backend/cpp/llama-avx/llama.cpp/build/examples/grpc-server/backend.pb.h:13:2: error: "This file was generated by a newer version of protoc which is"
#error "This file was generated by a newer version of protoc which is"
 ^
/Users/kastakhov/wiseinfotec-prj/LocalAI/backend/cpp/llama-avx/llama.cpp/build/examples/grpc-server/backend.pb.h:14:2: error: "incompatible with your Protocol Buffer headers. Please update"
#error "incompatible with your Protocol Buffer headers. Please update"
 ^
/Users/kastakhov/wiseinfotec-prj/LocalAI/backend/cpp/llama-avx/llama.cpp/build/examples/grpc-server/backend.pb.h:15:2: error: "your headers."
#error "your headers."
 ^
In file included from /Users/kastakhov/wiseinfotec-prj/LocalAI/backend/cpp/llama-avx/llama.cpp/build/examples/grpc-server/backend.grpc.pb.cc:5:
In file included from /Users/kastakhov/wiseinfotec-prj/LocalAI/backend/cpp/llama-avx/llama.cpp/build/examples/grpc-server/backend.pb.h:24:
In file included from /opt/homebrew/include/google/protobuf/io/coded_stream.h:112:
In file included from /opt/homebrew/include/absl/strings/cord.h:80:
In file included from /opt/homebrew/include/absl/crc/internal/crc_cord_state.h:23:
In file included from /opt/homebrew/include/absl/crc/crc32c.h:32:
In file included from /opt/homebrew/include/absl/strings/str_format.h:84:
In file included from /opt/homebrew/include/absl/strings/internal/str_format/bind.h:29:
/opt/homebrew/include/absl/strings/internal/str_format/parser.h:225:11: warning: 'enable_if' is a clang extension [-Wgcc-compat]
          enable_if(str_format_internal::EnsureConstexpr(format),
          ^
/opt/homebrew/include/absl/strings/internal/str_format/parser.h:227:11: warning: 'enable_if' is a clang extension [-Wgcc-compat]
          enable_if(str_format_internal::ValidFormatImpl<C...>(format),
          ^
In file included from /Users/kastakhov/wiseinfotec-prj/LocalAI/backend/cpp/llama-avx/llama.cpp/build/examples/grpc-server/backend.grpc.pb.cc:5:
In file included from /Users/kastakhov/wiseinfotec-prj/LocalAI/backend/cpp/llama-avx/llama.cpp/build/examples/grpc-server/backend.pb.h:24:
In file included from /opt/homebrew/include/google/protobuf/io/coded_stream.h:112:
In file included from /opt/homebrew/include/absl/strings/cord.h:80:
In file included from /opt/homebrew/include/absl/crc/internal/crc_cord_state.h:23:
In file included from /opt/homebrew/include/absl/crc/crc32c.h:32:
In file included from /opt/homebrew/include/absl/strings/str_format.h:84:
/opt/homebrew/include/absl/strings/internal/str_format/bind.h:143:11: warning: 'enable_if' is a clang extension [-Wgcc-compat]
          enable_if(str_format_internal::EnsureConstexpr(s), "constexpr trap"),
          ^
/opt/homebrew/include/absl/strings/internal/str_format/bind.h:149:22: warning: 'enable_if' is a clang extension [-Wgcc-compat]
      __attribute__((enable_if(str_format_internal::EnsureConstexpr(s),
                     ^
/opt/homebrew/include/absl/strings/internal/str_format/bind.h:158:22: warning: 'enable_if' is a clang extension [-Wgcc-compat]
      __attribute__((enable_if(ValidFormatImpl<Args...>(s), "bad format trap")))
                     ^
/opt/homebrew/include/absl/strings/internal/str_format/bind.h:162:22: warning: 'enable_if' is a clang extension [-Wgcc-compat]
      __attribute__((enable_if(ValidFormatImpl<Args...>(s), "bad format trap")))
                     ^
In file included from /Users/kastakhov/wiseinfotec-prj/LocalAI/backend/cpp/llama-avx/llama.cpp/build/examples/grpc-server/backend.grpc.pb.cc:5:
In file included from /Users/kastakhov/wiseinfotec-prj/LocalAI/backend/cpp/llama-avx/llama.cpp/build/examples/grpc-server/backend.pb.h:27:
In file included from /opt/homebrew/include/google/protobuf/generated_message_bases.h:18:
In file included from /opt/homebrew/include/google/protobuf/message.h:106:
In file included from /opt/homebrew/include/google/protobuf/descriptor.h:45:
In file included from /opt/homebrew/include/absl/container/btree_map.h:57:
In file included from /opt/homebrew/include/absl/container/internal/btree.h:72:
/opt/homebrew/include/absl/types/compare.h:78:22: warning: 'enable_if' is a clang extension [-Wgcc-compat]
      __attribute__((enable_if(n == 0, "Only literal `0` is allowed."))) {}
                     ^
In file included from /Users/kastakhov/wiseinfotec-prj/LocalAI/backend/cpp/llama-avx/llama.cpp/build/examples/grpc-server/backend.grpc.pb.cc:5:
/Users/kastakhov/wiseinfotec-prj/LocalAI/backend/cpp/llama-avx/llama.cpp/build/examples/grpc-server/backend.pb.h:287:3: error: unknown type name 'PROTOBUF_ATTRIBUTE_REINITIALIZES'
  PROTOBUF_ATTRIBUTE_REINITIALIZES void Clear() final;
  ^
/Users/kastakhov/wiseinfotec-prj/LocalAI/backend/cpp/llama-avx/llama.cpp/build/examples/grpc-server/backend.pb.h:288:30: error: only virtual member functions can be marked 'final'
  bool IsInitialized() const final;
                             ^~~~~
/Users/kastakhov/wiseinfotec-prj/LocalAI/backend/cpp/llama-avx/llama.cpp/build/examples/grpc-server/backend.pb.h:291:96: error: only virtual member functions can be marked 'final'
  const char* _InternalParse(const char* ptr, ::google::protobuf::internal::ParseContext* ctx) final;
                                                                                               ^~~~~
/Users/kastakhov/wiseinfotec-prj/LocalAI/backend/cpp/llama-avx/llama.cpp/build/examples/grpc-server/backend.pb.h:294:29: error: only virtual member functions can be marked 'final'
  int GetCachedSize() const final { return _impl_._cached_size_.Get(); }
                            ^~~~~~
/Users/kastakhov/wiseinfotec-prj/LocalAI/backend/cpp/llama-avx/llama.cpp/build/examples/grpc-server/backend.pb.h:299:38: error: only virtual member functions can be marked 'final'
  void SetCachedSize(int size) const final;
                                     ^~~~~
/Users/kastakhov/wiseinfotec-prj/LocalAI/backend/cpp/llama-avx/llama.cpp/build/examples/grpc-server/backend.pb.h:314:52: error: only virtual member functions can be marked 'final'
  ::google::protobuf::Metadata GetMetadata() const final;
                                                   ^~~~~
/Users/kastakhov/wiseinfotec-prj/LocalAI/backend/cpp/llama-avx/llama.cpp/build/examples/grpc-server/backend.pb.h:213:9: error: use of undeclared identifier 'GetOwningArena'
    if (GetOwningArena() == from.GetOwningArena()
        ^
/Users/kastakhov/wiseinfotec-prj/LocalAI/backend/cpp/llama-avx/llama.cpp/build/examples/grpc-server/backend.pb.h:213:34: error: no member named 'GetOwningArena' in 'backend::RerankRequest'
    if (GetOwningArena() == from.GetOwningArena()
                            ~~~~ ^
/Users/kastakhov/wiseinfotec-prj/LocalAI/backend/cpp/llama-avx/llama.cpp/build/examples/grpc-server/backend.pb.h:260:9: error: use of undeclared identifier 'GetOwningArena'
    if (GetOwningArena() == other->GetOwningArena()) {
        ^
/Users/kastakhov/wiseinfotec-prj/LocalAI/backend/cpp/llama-avx/llama.cpp/build/examples/grpc-server/backend.pb.h:260:36: error: no member named 'GetOwningArena' in 'backend::RerankRequest'
    if (GetOwningArena() == other->GetOwningArena()) {
                            ~~~~~  ^
/Users/kastakhov/wiseinfotec-prj/LocalAI/backend/cpp/llama-avx/llama.cpp/build/examples/grpc-server/backend.pb.h:269:17: error: use of undeclared identifier 'GetOwningArena'
    ABSL_DCHECK(GetOwningArena() == other->GetOwningArena());
                ^
/Users/kastakhov/wiseinfotec-prj/LocalAI/backend/cpp/llama-avx/llama.cpp/build/examples/grpc-server/backend.pb.h:269:44: error: no member named 'GetOwningArena' in 'backend::RerankRequest'
    ABSL_DCHECK(GetOwningArena() == other->GetOwningArena());
                                    ~~~~~  ^
/opt/homebrew/include/absl/log/absl_check.h:47:34: note: expanded from macro 'ABSL_DCHECK'
  ABSL_LOG_INTERNAL_DCHECK_IMPL((condition), #condition)
                                 ^~~~~~~~~
/opt/homebrew/include/absl/log/internal/check_impl.h:43:41: note: expanded from macro 'ABSL_LOG_INTERNAL_DCHECK_IMPL'
  ABSL_LOG_INTERNAL_CHECK_IMPL(true || (condition), "true")
                                        ^~~~~~~~~
/opt/homebrew/include/absl/log/internal/check_impl.h:27:58: note: expanded from macro 'ABSL_LOG_INTERNAL_CHECK_IMPL'
                                    ABSL_PREDICT_FALSE(!(condition))) \
                                                         ^~~~~~~~~
/opt/homebrew/include/absl/base/optimization.h:178:59: note: expanded from macro 'ABSL_PREDICT_FALSE'
#define ABSL_PREDICT_FALSE(x) (__builtin_expect(false || (x), false))
                                                          ^
/opt/homebrew/include/absl/log/internal/conditions.h:172:40: note: expanded from macro 'ABSL_LOG_INTERNAL_CONDITION_FATAL'
  ABSL_LOG_INTERNAL_##type##_CONDITION(condition)
                                       ^~~~~~~~~
/opt/homebrew/include/absl/log/internal/conditions.h:68:7: note: expanded from macro 'ABSL_LOG_INTERNAL_STATELESS_CONDITION'
    !(condition) ? (void)0 : ::absl::log_internal::Voidify()&&
      ^~~~~~~~~
In file included from /Users/kastakhov/wiseinfotec-prj/LocalAI/backend/cpp/llama-avx/llama.cpp/build/examples/grpc-server/backend.grpc.pb.cc:5:
/Users/kastakhov/wiseinfotec-prj/LocalAI/backend/cpp/llama-avx/llama.cpp/build/examples/grpc-server/backend.pb.h:276:12: error: use of undeclared identifier 'CreateMaybeMessage'
    return CreateMaybeMessage<RerankRequest>(arena);
           ^
/Users/kastakhov/wiseinfotec-prj/LocalAI/backend/cpp/llama-avx/llama.cpp/build/examples/grpc-server/backend.pb.h:493:3: error: unknown type name 'PROTOBUF_ATTRIBUTE_REINITIALIZES'
  PROTOBUF_ATTRIBUTE_REINITIALIZES void Clear() final;
  ^
/Users/kastakhov/wiseinfotec-prj/LocalAI/backend/cpp/llama-avx/llama.cpp/build/examples/grpc-server/backend.pb.h:494:30: error: only virtual member functions can be marked 'final'
  bool IsInitialized() const final;
                             ^~~~~
/Users/kastakhov/wiseinfotec-prj/LocalAI/backend/cpp/llama-avx/llama.cpp/build/examples/grpc-server/backend.pb.h:497:96: error: only virtual member functions can be marked 'final'
  const char* _InternalParse(const char* ptr, ::google::protobuf::internal::ParseContext* ctx) final;
                                                                                               ^~~~~
fatal error: too many errors emitted, stopping now [-ferror-limit=]
7 warnings and 20 errors generated.
make[5]: *** [examples/grpc-server/CMakeFiles/hw_grpc_proto.dir/backend.grpc.pb.cc.o] Error 1
make[4]: *** [examples/grpc-server/CMakeFiles/hw_grpc_proto.dir/all] Error 2
make[3]: *** [all] Error 2
make[2]: *** [grpc-server] Error 2
make[1]: *** [build-llama-cpp-grpc-server] Error 2
make: *** [backend-assets/grpc/llama-cpp-avx] Error 2

My make command: make build BUILD_GRPC_FOR_BACKEND_LLAMA=true BUILD_TYPE=metal

% brew info protobuf
==> protobuf: stable 27.0 (bottled)
Protocol buffers (Google's data interchange format)
https://protobuf.dev/
Installed
/opt/homebrew/Cellar/protobuf/27.0 (430 files, 14.6MB) *
  Poured from bottle using the formulae.brew.sh API on 2024-06-13 at 23:20:08
From: https://github.com/Homebrew/homebrew-core/blob/HEAD/Formula/p/protobuf.rb
License: BSD-3-Clause
==> Dependencies
Build: cmake ✔, googletest ✘
Required: abseil ✔
==> Caveats
Emacs Lisp files have been installed to:
  /opt/homebrew/share/emacs/site-lisp/protobuf
==> Analytics
install: 57,915 (30 days), 164,506 (90 days), 690,647 (365 days)
install-on-request: 25,902 (30 days), 77,756 (90 days), 322,341 (365 days)
build-error: 260 (30 days)

% protoc --version
libprotoc 27.0

% ./backend/cpp/grpc/installed_packages/bin/protoc --version
libprotoc 24.3
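The two outputs above show the mismatch directly: the generated stubs come from the bundled protoc 24.3, while the compiler resolves protobuf headers from Homebrew's 27.0 install. A minimal sketch of that check, with the version strings hard-coded from the outputs above (in practice you would capture them from each protoc binary, e.g. `bundled=$(./backend/cpp/grpc/installed_packages/bin/protoc --version)`):

```shell
# Sketch of the mismatch diagnosis; version strings are copied from the
# outputs above rather than captured live.
bundled="libprotoc 24.3"   # the bundled protoc that generated backend.pb.h
system="libprotoc 27.0"    # the Homebrew protobuf whose headers get included

bundled_ver=${bundled##* }   # keep everything after the last space -> "24.3"
system_ver=${system##* }     # -> "27.0"

if [ "$bundled_ver" != "$system_ver" ]; then
  echo "protoc mismatch: stubs=$bundled_ver, headers=$system_ver"
fi
```

Any mismatch here means the `#error "This file was generated by a newer version of protoc..."` guard in the generated header will fire, which is exactly what the build log shows.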

To Reproduce Perform a git clone and run make build.

mudler commented 3 months ago

See https://github.com/mudler/LocalAI/issues/2562 - the next release will bundle the grpc libs alongside the binary, so compiling from scratch shouldn't be required.

mudler commented 3 months ago

Also relevant: https://github.com/Homebrew/homebrew-core/pull/172734

kastakhov commented 3 months ago

Yes, I saw it, but building from scratch might still be required for some reason, and this issue blocks it. Moreover, I'm not sure whether other platforms are affected or only macOS.

sunfw2008 commented 3 months ago

👍🏻 Keep it up, everyone! I can't help much, but thanks for letting me feel involved!

— Reply to this email directly, view it on GitHubhttps://github.com/mudler/LocalAI/issues/2571#issuecomment-2169158286, or unsubscribehttps://github.com/notifications/unsubscribe-auth/ADS4MBVQ7TGWC5D2IV2HOSLZHPNKFAVCNFSM6AAAAABJK74MQCVHI2DSMVQWIX3LMV43OSLTON2WKQ3PNVWWK3TUHMZDCNRZGE2TQMRYGY. You are receiving this because you are subscribed to this thread.Message ID: @.***>