outlines-dev / outlines

Missing Modules for llamacpp_example.py #1066

Closed. Treide1 closed this issue 1 month ago.

Treide1 commented 1 month ago

Describe the issue as clearly as possible:

After a clean installation in a .venv (pydantic, llama-cpp-python, outlines), running your llama_cpp example fails on import torch with ModuleNotFoundError: No module named 'torch'.

The llama_cpp installation itself worked; I can run their examples just fine.

After some testing, I see the same effect for other missing libraries, e.g. transformers. There may be more.

Installing torch (pip install torch) leads to a cascade of incompatibilities, presumably due to version mismatches.

Is there a way to find out which libraries, in which versions, the examples need? Have I missed some documentation that discusses this?

Thanks in advance.
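As a side note: the dependencies a package declares can be read from its metadata with the standard library. A minimal sketch (this lists only declared requirements; lazy runtime imports such as the torch import that fails here need not show up, which is presumably part of the problem):

from importlib.metadata import requires

# Print the requirements outlines declares in its package metadata.
# Imports made lazily at runtime (like the "import torch" in the
# traceback below) need not appear here, so pip can leave them uninstalled.
for req in requires("outlines") or []:
    print(req)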

Steps/code to reproduce the bug:

import outlines
import pydantic
import llama_cpp

# Shortened version of the schema; the exact fields don't matter.
class Character(pydantic.BaseModel):
    name: str
    age: int

# Download the model from Hugging Face (if necessary)
# curl -L -o mistral-7b-instruct-v0.2.Q5_K_M.gguf https://huggingface.co/TheBloke/Mistral-7B-Instruct-v0.2-GGUF/resolve/main/mistral-7b-instruct-v0.2.Q5_K_M.gguf

model_path = r".\mistral-7b-instruct-v0.2.Q5_K_M.gguf"  # raw string, so the backslash is not treated as an escape
model = outlines.models.LlamaCpp(
    model=llama_cpp.Llama(model_path=model_path)
)

# Construct structured sequence generator
generator = outlines.generate.json(model, Character) # ERROR
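For completeness, the example would continue with a call like the one below; it is never reached, since outlines.generate.json already fails (the prompt text is just an illustration):

# Never reached in this repro: would return a Character instance
# parsed from the model's JSON-constrained output.
character = generator("Describe a fantasy character.")
print(character)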

Expected result:

# Some Output, no error

Error message:

Traceback (most recent call last):
  File "c:\git\outlines_playground\llamacpp_example1.py", line 43, in <module>
    generator = outlines.generate.json(model, Character)
                ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\luh\.pyenv\pyenv-win\versions\3.11.0b4\Lib\functools.py", line 909, in wrapper
    return dispatch(args[0].__class__)(*args, **kw)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "c:\git\outlines_playground\.venv\Lib\site-packages\outlines\generate\json.py", line 49, in json
    generator = regex(model, regex_str, sampler)
                ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\luh\.pyenv\pyenv-win\versions\3.11.0b4\Lib\functools.py", line 909, in wrapper
    return dispatch(args[0].__class__)(*args, **kw)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "c:\git\outlines_playground\.venv\Lib\site-packages\outlines\generate\regex.py", line 59, in regex_llamacpp
    from outlines.integrations.llamacpp import RegexLogitsProcessor
  File "c:\git\outlines_playground\.venv\Lib\site-packages\outlines\integrations\llamacpp.py", line 32, in <module>
    import torch
ModuleNotFoundError: No module named 'torch'
Exception ignored in: <function Llama.__del__ at 0x0000018DA9FE5440>
Traceback (most recent call last):
  File "c:\git\outlines_playground\.venv\Lib\site-packages\llama_cpp\llama.py", line 2091, in __del__
  File "c:\git\outlines_playground\.venv\Lib\site-packages\llama_cpp\llama.py", line 2086, in close
  File "C:\Users\luh\.pyenv\pyenv-win\versions\3.11.0b4\Lib\contextlib.py", line 594, in close
  File "C:\Users\luh\.pyenv\pyenv-win\versions\3.11.0b4\Lib\contextlib.py", line 586, in __exit__
  File "C:\Users\luh\.pyenv\pyenv-win\versions\3.11.0b4\Lib\contextlib.py", line 571, in __exit__
  File "C:\Users\luh\.pyenv\pyenv-win\versions\3.11.0b4\Lib\contextlib.py", line 345, in __exit__
  File "c:\git\outlines_playground\.venv\Lib\site-packages\llama_cpp\_internals.py", line 66, in close
  File "C:\Users\luh\.pyenv\pyenv-win\versions\3.11.0b4\Lib\contextlib.py", line 594, in close
  File "C:\Users\luh\.pyenv\pyenv-win\versions\3.11.0b4\Lib\contextlib.py", line 586, in __exit__
  File "C:\Users\luh\.pyenv\pyenv-win\versions\3.11.0b4\Lib\contextlib.py", line 571, in __exit__
  File "C:\Users\luh\.pyenv\pyenv-win\versions\3.11.0b4\Lib\contextlib.py", line 454, in _exit_wrapper
  File "c:\git\outlines_playground\.venv\Lib\site-packages\llama_cpp\_internals.py", line 60, in free_model
TypeError: 'NoneType' object is not callable

Outlines/Python version information:

# python -c "from outlines import _version; print(_version.version)"
0.0.46

# python -c "import sys; print('Python', sys.version)"
Python 3.11.0b4 (main, Jul 11 2022, 15:47:56) [MSC v.1932 64 bit (AMD64)]

# pip freeze
aiohttp==3.9.5
aiosignal==1.3.1
annotated-types==0.7.0
attrs==23.2.0
certifi==2024.7.4
charset-normalizer==3.3.2
cloudpickle==3.0.0
colorama==0.4.6
datasets==2.20.0
dill==0.3.8
diskcache==5.6.3
filelock==3.15.4
frozenlist==1.4.1
fsspec==2024.5.0
huggingface-hub==0.24.1
idna==3.7
interegular==0.3.3
Jinja2==3.1.4
jsonschema==4.23.0
jsonschema-specifications==2023.12.1
lark==1.1.9
llama_cpp_python==0.2.83
llvmlite==0.43.0
MarkupSafe==2.1.5
multidict==6.0.5
multiprocess==0.70.16
nest-asyncio==1.6.0
numba==0.60.0
numpy==1.26.4
outlines==0.0.46
packaging==24.1
pandas==2.2.2
pyairports==2.1.1
pyarrow==17.0.0
pyarrow-hotfix==0.6
pycountry==24.6.1
pydantic==2.8.2
pydantic_core==2.20.1
python-dateutil==2.9.0.post0
pytz==2024.1
PyYAML==6.0.1
referencing==0.35.1
requests==2.32.3
rpds-py==0.19.0
six==1.16.0
tqdm==4.66.4
typing_extensions==4.12.2
tzdata==2024.1
urllib3==2.2.2
xxhash==3.4.1
yarl==1.9.4

Context for the issue:

No response

Treide1 commented 1 month ago

P.S.: I use the pre-built llama-cpp-python wheel on Windows 10: pip install llama-cpp-python --extra-index-url https://abetlen.github.io/llama-cpp-python/whl/cpu

For clarity: after installing torch and transformers, I run into this: OSError: [WinError 127] The specified procedure could not be found. Error loading "c:\git\outlines_playground\.venv\Lib\site-packages\torch\lib\torch_python.dll" or one of its dependencies.

rlouf commented 1 month ago

It sounds like a torch install issue to me. Are you able to install and import torch independently of Outlines?
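A quick check in the style of the version commands above (if this also fails, e.g. with the WinError 127 from the P.S., the torch wheel itself is broken, independent of Outlines):

# python -c "import torch; print(torch.__version__)"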

lapp0 commented 1 month ago

@Treide1 could you please try pip install git+https://github.com/outlines-dev/outlines?

outlines.generate.json no longer uses outlines.integrations.llamacpp in main

Treide1 commented 1 month ago

Thanks for your advice! Installing from current main resolved the issue 👍

Here are my steps, for completeness:
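Roughly, inside the same .venv, using only commands that already appear in this thread (a sketch):

# pip install git+https://github.com/outlines-dev/outlines
# python -c "from outlines import _version; print(_version.version)"
# python llamacpp_example1.py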

In conclusion, the outlines release on PyPI (0.0.46) was broken for me as installed.