PromptEngineer48 / Ollama

This repo brings numerous use cases from the open-source Ollama
Apache License 2.0

Error when I typed the query #6

Open Polly2014 opened 10 months ago

Polly2014 commented 10 months ago

I ran `python privateGPT.py` and hit this error. Could you take a look? Thanks.

```
python privateGPT.py

Enter a query: give me a summary
Traceback (most recent call last):
  File "/Users/polly/miniforge3/envs/Ollama/lib/python3.11/site-packages/requests/models.py", line 971, in json
    return complexjson.loads(self.text, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/polly/miniforge3/envs/Ollama/lib/python3.11/json/__init__.py", line 346, in loads
    return _default_decoder.decode(s)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/polly/miniforge3/envs/Ollama/lib/python3.11/json/decoder.py", line 337, in decode
    obj, end = self.raw_decode(s, idx=_w(s, 0).end())
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/polly/miniforge3/envs/Ollama/lib/python3.11/json/decoder.py", line 355, in raw_decode
    raise JSONDecodeError("Expecting value", s, err.value) from None
json.decoder.JSONDecodeError: Expecting value: line 1 column 1 (char 0)

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/Users/polly/Downloads/Sublime_Workspace/GitHub_Workspace/ollama/examples/langchain-python-rag-privategpt/privateGPT.py", line 80, in <module>
    main()
  File "/Users/polly/Downloads/Sublime_Workspace/GitHub_Workspace/ollama/examples/langchain-python-rag-privategpt/privateGPT.py", line 51, in main
    res = qa(query)
          ^^^^^^^^^
  File "/Users/polly/miniforge3/envs/Ollama/lib/python3.11/site-packages/langchain/chains/base.py", line 282, in __call__
    raise e
  File "/Users/polly/miniforge3/envs/Ollama/lib/python3.11/site-packages/langchain/chains/base.py", line 276, in __call__
    self._call(inputs, run_manager=run_manager)
  File "/Users/polly/miniforge3/envs/Ollama/lib/python3.11/site-packages/langchain/chains/retrieval_qa/base.py", line 139, in _call
    answer = self.combine_documents_chain.run(
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/polly/miniforge3/envs/Ollama/lib/python3.11/site-packages/langchain/chains/base.py", line 480, in run
    return self(kwargs, callbacks=callbacks, tags=tags, metadata=metadata)[
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/polly/miniforge3/envs/Ollama/lib/python3.11/site-packages/langchain/chains/base.py", line 282, in __call__
    raise e
  File "/Users/polly/miniforge3/envs/Ollama/lib/python3.11/site-packages/langchain/chains/base.py", line 276, in __call__
    self._call(inputs, run_manager=run_manager)
  File "/Users/polly/miniforge3/envs/Ollama/lib/python3.11/site-packages/langchain/chains/combine_documents/base.py", line 105, in _call
    output, extra_return_dict = self.combine_docs(
                                ^^^^^^^^^^^^^^^^^^
  File "/Users/polly/miniforge3/envs/Ollama/lib/python3.11/site-packages/langchain/chains/combine_documents/stuff.py", line 171, in combine_docs
    return self.llm_chain.predict(callbacks=callbacks, **inputs), {}
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/polly/miniforge3/envs/Ollama/lib/python3.11/site-packages/langchain/chains/llm.py", line 255, in predict
    return self(kwargs, callbacks=callbacks)[self.output_key]
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/polly/miniforge3/envs/Ollama/lib/python3.11/site-packages/langchain/chains/base.py", line 282, in __call__
    raise e
  File "/Users/polly/miniforge3/envs/Ollama/lib/python3.11/site-packages/langchain/chains/base.py", line 276, in __call__
    self._call(inputs, run_manager=run_manager)
  File "/Users/polly/miniforge3/envs/Ollama/lib/python3.11/site-packages/langchain/chains/llm.py", line 91, in _call
    response = self.generate([inputs], run_manager=run_manager)
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/polly/miniforge3/envs/Ollama/lib/python3.11/site-packages/langchain/chains/llm.py", line 101, in generate
    return self.llm.generate_prompt(
           ^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/polly/miniforge3/envs/Ollama/lib/python3.11/site-packages/langchain/llms/base.py", line 467, in generate_prompt
    return self.generate(prompt_strings, stop=stop, callbacks=callbacks, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/polly/miniforge3/envs/Ollama/lib/python3.11/site-packages/langchain/llms/base.py", line 602, in generate
    output = self._generate_helper(
             ^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/polly/miniforge3/envs/Ollama/lib/python3.11/site-packages/langchain/llms/base.py", line 504, in _generate_helper
    raise e
  File "/Users/polly/miniforge3/envs/Ollama/lib/python3.11/site-packages/langchain/llms/base.py", line 491, in _generate_helper
    self._generate(
  File "/Users/polly/miniforge3/envs/Ollama/lib/python3.11/site-packages/langchain/llms/ollama.py", line 220, in _generate
    final_chunk = super()._stream_with_aggregation(
                  ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/polly/miniforge3/envs/Ollama/lib/python3.11/site-packages/langchain/llms/ollama.py", line 156, in _stream_with_aggregation
    for stream_resp in self._create_stream(prompt, stop, **kwargs):
                       ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/polly/miniforge3/envs/Ollama/lib/python3.11/site-packages/langchain/llms/ollama.py", line 140, in _create_stream
    optional_detail = response.json().get("error")
                      ^^^^^^^^^^^^^^^
  File "/Users/polly/miniforge3/envs/Ollama/lib/python3.11/site-packages/requests/models.py", line 975, in json
    raise RequestsJSONDecodeError(e.msg, e.doc, e.pos)
requests.exceptions.JSONDecodeError: Expecting value: line 1 column 1 (char 0)
```
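For context on where this blows up: the innermost frame is LangChain's Ollama client calling `response.json()` on the server's reply, so the `Expecting value: line 1 column 1` error means the Ollama server returned a non-JSON body (commonly because the server is not running at the expected address, or the requested model has not been pulled). The sketch below is a quick way to check; it assumes the default server address `http://localhost:11434` and a model named `mistral`, so adjust both to match your setup.

```python
# Minimal connectivity check for the Ollama server that privateGPT.py talks to.
# Assumptions: default address http://localhost:11434 and a model named "mistral".
import requests

BASE_URL = "http://localhost:11434"

# Is the server up, and which models has it pulled?
tags = requests.get(f"{BASE_URL}/api/tags", timeout=10)
print(tags.status_code, tags.text[:200])

# Does a one-off generation come back as JSON rather than a plain-text error?
gen = requests.post(
    f"{BASE_URL}/api/generate",
    json={"model": "mistral", "prompt": "hello", "stream": False},
    timeout=60,
)
print(gen.status_code, gen.text[:200])
```

If the second request prints a non-200 status with a plain-text body, pull the model first (`ollama pull mistral`) and make sure `ollama serve` is running before re-running `python privateGPT.py`.
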
ahbon123 commented 8 months ago

I get the same error.