ollama / ollama-python

Ollama Python library
https://ollama.com
MIT License
2.67k stars · 220 forks

ollama._types.ResponseError: Unsupported type: 'bool' #157

Closed · guitmonk-1290 closed this 3 weeks ago

guitmonk-1290 commented 1 month ago

I have this code for generating a SQL query based on my SQL database:

def run(self, query: str):
  self.vector_index_dict = self.index_all_tables()

  table_schema_objs = self.obj_retriever.retrieve(query)
  for table in table_schema_objs:
      print(f"[matched_table]: {table.table_name}")

  context_str = self.get_table_context_and_rows_str(query_str=query, table_schema_objs=table_schema_objs)
  context_str += f"Write a SQL query for this user query and nothing else: '{query}'"
  print(f"[LLM_CONTEXT_STR]: {context_str}")

  response = ollama.chat(
      model='stablelm-zephyr',
      messages=[{'role': 'user', 'content': context_str}],
      stream=False,
      keep_alive=True
  )

However I am encountering this error:

response = ollama.chat(
               ^^^^^^^^^^^^
  File "C:\Users\aditya\source\repos\Spectra Tech\Chatbot\flask\.venv\Lib\site-packages\ollama\_client.py", line 177, in chat
    return self._request_stream(
           ^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\aditya\source\repos\Spectra Tech\Chatbot\flask\.venv\Lib\site-packages\ollama\_client.py", line 97, in _request_stream
    return self._stream(*args, **kwargs) if stream else self._request(*args, **kwargs).json()
                                                        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\aditya\source\repos\Spectra Tech\Chatbot\flask\.venv\Lib\site-packages\ollama\_client.py", line 73, in _request
    raise ResponseError(e.response.text, e.response.status_code) from None
ollama._types.ResponseError: Unsupported type: 'bool'

There was no such error before I updated ollama. I'm not sure what the problem is.

Thanks.

ruaanviljoen commented 3 weeks ago


I had the same issue. The problem is your keep_alive argument: a boolean is no longer accepted. I didn't dig into why it worked before, but if you look at the chat function definition, bool is not among the supported types. keep_alive now configures how long the model stays loaded after the request rather than acting as an on/off flag. For the valid options (currently a float or a str), see faq.md here: https://github.com/ollama/ollama/blob/d4a86102fd5f84cca50757af00296606ac191890/docs/faq.md?plain=1#L237

I updated mine and it works fine now.
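To make the fix concrete, here is a minimal sketch. Per the linked FAQ, keep_alive accepts a duration string (e.g. "5m", "24h"), a number of seconds, "0" to unload immediately, or a negative value to keep the model loaded indefinitely. The helper name below is my own for illustration, not part of the ollama library:

```python
def keep_alive_for(minutes: float) -> str:
    """Hypothetical helper: turn a minute count into a duration string.

    minutes < 0  -> keep the model loaded indefinitely ("-1m")
    minutes == 0 -> unload right after the request ("0m")
    otherwise    -> e.g. 5 -> "5m"
    """
    if minutes < 0:
        return "-1m"
    return f"{minutes:g}m"

# The failing call from the issue would then become something like:
#
# response = ollama.chat(
#     model='stablelm-zephyr',
#     messages=[{'role': 'user', 'content': context_str}],
#     stream=False,
#     keep_alive=keep_alive_for(5),   # "5m" instead of True
# )
```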

guitmonk-1290 commented 3 weeks ago

@ruaanviljoen Ah, I see. So it's a duration setting now instead of a flag. Thanks a bunch.