eth-sri / lmql

A language for constraint-guided and efficient LLM programming.
https://lmql.ai
Apache License 2.0

Starting out support for lmql with conda on wsl #355

Open Gayanukaa opened 3 weeks ago

Gayanukaa commented 3 weeks ago

To test out LMQL as per the guidance in the documentation, I did the following steps, during which I encountered several issues:

  1. I started in a blank folder in a Linux (WSL) environment and ran conda env create -f requirements.yml -n lmql-dev with the requirements file
  2. After activating the conda environment, I ran the activation script with source activate-dev.sh
  3. Then I installed lmql in the environment so it runs on my GPU, using pip install lmql[hf]
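For what it's worth, after step 3 a quick sanity check of the install can be done with a minimal query against a small local Hugging Face model. This is just a sketch: the model id (gpt2) and the constraint are illustrative, and it assumes lmql[hf] installed correctly so the local transformers backend is available.

```python
import lmql

# Minimal LMQL query to verify the lmql[hf] installation.
# "local:gpt2" runs the model in-process via transformers;
# gpt2 is chosen only because it is small enough to download quickly.
@lmql.query(model="local:gpt2")
def sanity_check():
    '''lmql
    "Say hello:[GREETING]" where len(TOKENS(GREETING)) < 10
    '''

if __name__ == "__main__":
    # If this prints a short completion, the local backend works.
    print(sanity_check())
```

If this fails before any generation happens, the problem is with the environment rather than with llama.cpp or the GGUF file.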

To test a local model (Llama-2-7B):

  1. I installed llama-cpp-python using CMAKE_ARGS="-DLLAMA_BLAS=ON -DLLAMA_BLAS_VENDOR=OpenBLAS" pip install llama-cpp-python, as per the llama-cpp-python instructions
  2. To load the .gguf of Llama-2-7B-GGUF, I used lmql serve-model llama.cpp:/home/gayanukaa/llm-test/lmql-test/llama-2-7b.Q4_K_M.gguf --cuda --port 9999 --trust_remote_code True
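Once serve-model is up, the served model is queried from Python over LMQL's inference protocol (LMTP). A sketch of the client side is below; the model string and port mirror the serve command above, the tokenizer id (huggyllama/llama-7b) is illustrative since GGUF files ship no tokenizer, and the exact keyword arguments may differ between LMQL versions.

```python
import lmql

# Connect to the model started with `lmql serve-model` on port 9999.
# The llama.cpp model string must match the one passed to serve-model.
m = lmql.model(
    "llama.cpp:/home/gayanukaa/llm-test/lmql-test/llama-2-7b.Q4_K_M.gguf",
    tokenizer="huggyllama/llama-7b",  # illustrative tokenizer repo id
    endpoint="localhost:9999",
)

@lmql.query(model=m)
def ask():
    '''lmql
    "Q: What is LMQL? A:[ANSWER]" where len(TOKENS(ANSWER)) < 60
    '''

if __name__ == "__main__":
    print(ask())
```

Keeping the serve process and the client in separate terminals makes it easier to see which side an error originates from.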

From this point onwards, I faced several issues:

I would appreciate any help or guidance in resolving these issues so I can run and learn LMQL.