helqasem opened 1 year ago
Yeah, I'm getting the exact same error. My setup is the same, I'm running the same commands from the llm instructions as you are, and I get the same result. I too had to copy the zstd.dll file when running `llm mlc setup` in order to get the setup to work. The only difference I can see is that I'm on Windows 10 instead of 11. 🤷
Hi There,
While attempting to run a basic prompt against the local Llama2-7b plugin, I received an AttributeError: "AttributeError: function 'TVMGetLastPythonError' not found. Did you mean: 'TVMGetLastError'?"
Environment: Windows 11, Python 3.11.
During setup I hit the same TVM issue detailed here: https://github.com/mlc-ai/mlc-llm/issues/875. I was able to progress by downloading libzstd and renaming it to zstd.dll, as noted in that issue.
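For anyone scripting that workaround, here is a minimal sketch. The helper name and the idea of passing in the DLL directory are my own (adapt the path to wherever TVM's DLLs actually live in your environment); it just automates the copy-and-rename step from the linked issue:

```python
from pathlib import Path
import shutil

def ensure_zstd_dll(dll_dir: str) -> Path:
    """Copy libzstd.dll to zstd.dll in dll_dir if zstd.dll is missing.

    Mirrors the manual workaround from mlc-ai/mlc-llm#875: TVM on Windows
    looks for zstd.dll, but the library may be shipped as libzstd.dll.
    """
    dll_path = Path(dll_dir)
    src = dll_path / "libzstd.dll"
    dst = dll_path / "zstd.dll"
    if dst.exists():
        return dst  # already in place, nothing to do
    if not src.exists():
        raise FileNotFoundError(f"libzstd.dll not found in {dll_path}")
    shutil.copy2(src, dst)  # copy rather than rename, keeping the original
    return dst
```

Copying (rather than renaming) keeps the original libzstd.dll intact in case another tool looks for it under that name.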
I've followed the instructions here for installing and using plugins: https://pypi.org/project/llm/
After install and setup, the first suggested use is `llm -m llama2 'difference between a llama and an alpaca'`. The AttributeError is raised after running this command.
Full stack trace:
The same error occurs when attempting `llm chat -m llama2`.
Any assistance appreciated.