li-plus / chatglm.cpp

C++ implementation of ChatGLM-6B & ChatGLM2-6B & ChatGLM3 & GLM4(V)
MIT License

error: wheels for chatglm.cpp on windows #287

Open srdevore opened 5 months ago

srdevore commented 5 months ago

`pip install chatglm.cpp` fails with a wheel build error despite multiple troubleshooting attempts. Details:

Context: I want to use Chinese LLMs in xinference.

- Windows machine (local); 100+ GB free space
- Python 3.9.12
- CMake 3.29.1 installed
- Visual Studio 2022, including "Desktop development with C++" (build tools included)

These installs fixed a similar issue with llama-cpp-python, so the configuration doesn't seem to be the problem.

ERROR: Could not build wheels for chatglm.cpp, which is required to install pyproject.toml-based projects

The full output is attached (chatglm_output.txt), but the root cause seems to be here:

    -- Building for: Visual Studio 17 2022
    -- Selecting Windows SDK version 10.0.22621.0 to target Windows 10.0.22631.
    -- The CXX compiler identification is MSVC 19.39.33523.0
    -- Detecting CXX compiler ABI info
    -- Detecting CXX compiler ABI info - failed
    -- Check for working CXX compiler: C:/Program Files/Microsoft Visual

Any help is greatly appreciated!
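One way to surface the underlying MSVC error that pip hides is to run the CMake configure step by hand. This is only a sketch: the clone URL and generator name below assume a standard chatglm.cpp checkout and are not taken from this thread, and the commands must be run from an "x64 Native Tools Command Prompt for VS 2022" so that `cl.exe` is on `PATH`. A failed "Detecting CXX compiler ABI info" step usually points at the build environment (wrong prompt, broken SDK install) rather than the project itself.

```shell
# Sketch (assumptions noted above): configure/build chatglm.cpp manually
# to see the raw CMake/MSVC error instead of pip's summarized failure.
git clone --recursive https://github.com/li-plus/chatglm.cpp.git
cd chatglm.cpp

# Explicitly select the VS 2022 generator and 64-bit toolchain; the first
# configure step will print the full compiler-detection log on failure.
cmake -B build -G "Visual Studio 17 2022" -A x64
cmake --build build --config Release
```

If this configure step fails the same way, the problem is in the MSVC/CMake environment rather than in the chatglm.cpp Python packaging.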

BradKML commented 5 months ago

Apparently the package published online has not been updated (so a direct download/build from GitHub may be required)? https://github.com/xorbitsai/inference/issues/1393
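If the published package is indeed stale, a possible workaround is to have pip build straight from the GitHub source. This is a hedged sketch: the `git+` URL form and `-v` flag are standard pip usage, not something confirmed by the linked issue, and the source build still requires the working MSVC/CMake setup described above.

```shell
# Sketch: install the Python bindings from the GitHub source instead of
# the (possibly outdated) published wheel. -v shows the full build log.
pip install -v git+https://github.com/li-plus/chatglm.cpp.git
```

pip clones the repository (including submodules) and compiles the extension locally, so any wheel-build error will be reproduced here with the complete CMake output visible.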