run-llama / llama-hub

A library of data loaders for LLMs made by the community -- to be used with LlamaIndex and/or LangChain
https://llamahub.ai/
MIT License

[Bug]: The pack didn't work out of the box throwing TypeErrors #786

Closed: tichomir closed this issue 9 months ago

tichomir commented 9 months ago

Bug Description

I installed the pack and used it in a very simple application. It didn't work out of the box, throwing the following error:

Traceback (most recent call last):
  File "/Users/tichomir/Downloads/Simple/test.py", line 21, in <module>
    ollama_pack = OllamaQueryEnginePack(model="llama2",documents=documents)
  File "/Users/tichomir/Downloads/Simple/ollama_pack/base.py", line 25, in __init__
    llm = Ollama(self._model, base_url=self._base_url)
TypeError: BaseModel.__init__() takes 1 positional argument but 2 were given

In class OllamaQueryEnginePack(BaseLlamaPack), the line

    llm = Ollama(self._model, base_url=self._base_url)

had to be changed to

    llm = Ollama(model=self._model, base_url=self._base_url)
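The error arises because the Ollama class is a pydantic model, and pydantic model constructors accept keyword arguments only. A minimal, dependency-free sketch of that behaviour (the Ollama class here is a stand-in, not the real llama-index class):

```python
class Ollama:
    """Stand-in mimicking a pydantic model's keyword-only constructor."""

    def __init__(self, **data):
        # pydantic's BaseModel.__init__ likewise accepts only keyword
        # arguments, which is why a positional model name blows up.
        self.model = data["model"]
        self.base_url = data.get("base_url", "http://localhost:11434")


try:
    Ollama("llama2")  # positional -> TypeError, as in the traceback above
except TypeError as err:
    print(err)

llm = Ollama(model="llama2", base_url="http://localhost:11434")  # works
print(llm.model)
```

Passing `model=` as a keyword avoids the TypeError regardless of how the model's fields are ordered.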

Also, in class OllamaEmbedding(BaseEmbedding): I had to add a definition for the private variable: _verbose: bool = False
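For reference, here is one way such a private attribute can be declared on a pydantic model. OllamaEmbedding below is a simplified stand-in (the real class subclasses BaseEmbedding, not BaseModel); PrivateAttr is pydantic's mechanism for instance state that is not a validated field:

```python
from pydantic import BaseModel, PrivateAttr


class OllamaEmbedding(BaseModel):
    """Simplified stand-in for the real OllamaEmbedding."""

    model_name: str
    # Underscore-prefixed names are not pydantic fields; declaring one
    # via PrivateAttr gives it a default without a validation error.
    _verbose: bool = PrivateAttr(default=False)


emb = OllamaEmbedding(model_name="llama2")
print(emb._verbose)  # False
```

A bare `_verbose: bool = False` annotation also works on recent pydantic versions, which matches the fix described above.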

Version

Latest

Steps to Reproduce

Write a simple program that uses this llama pack and give it a text file to analyse.

Relevant Logs/Tracebacks

Traceback (most recent call last):
  File "/Users/tichomir/Downloads/Simple/test.py", line 21, in <module>
    ollama_pack = OllamaQueryEnginePack(model="llama2",documents=documents)
                  ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/tichomir/Downloads/Simple/ollama_pack/base.py", line 25, in __init__
    llm = Ollama(self._model, base_url=self._base_url)
          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
TypeError: BaseModel.__init__() takes 1 positional argument but 2 were given