deepset-ai / haystack

:mag: LLM orchestration framework to build customizable, production-ready LLM applications. Connect components (models, vector DBs, file converters) to pipelines or agents that can interact with your data. With advanced retrieval methods, it's best suited for building RAG, question answering, semantic search or conversational agent chatbots.
https://haystack.deepset.ai
Apache License 2.0

TypeError: Object of type PromptTemplate is not JSON serializable #6146

Closed · muazhari closed this issue 1 month ago

muazhari commented 8 months ago

Describe the bug
Error when using this code:

import logging

from haystack.nodes import AnswerParser, PromptNode, PromptTemplate

logging.basicConfig(format="%(levelname)s - %(name)s -  %(message)s", level=logging.WARNING)
logging.getLogger("haystack").setLevel(logging.DEBUG)


def get_online_generator(self, generator_body: GeneratorBody) -> PromptNode:
    # GeneratorBody is a request model defined elsewhere in my application.
    prompt_template = PromptTemplate(
        prompt=generator_body.prompt,
        output_parser=AnswerParser()
    )

    generator: PromptNode = PromptNode(
        model_name_or_path=generator_body.generator_model.model,
        default_prompt_template=prompt_template,
        max_length=generator_body.answer_max_length,
        api_key=generator_body.generator_model.api_key,
        use_gpu=True,
    )
    return generator

Error message

DEBUG - haystack.telemetry -  Telemetry couldn't make a POST request to PostHog.
Traceback (most recent call last):
  File "/usr/local/lib/python3.10/dist-packages/haystack/telemetry.py", line 98, in send_event
    json.dumps({**self.event_properties, **dynamic_specs, **event_properties}, sort_keys=True)
  File "/usr/lib/python3.10/json/__init__.py", line 238, in dumps
    **kw).encode(obj)
  File "/usr/lib/python3.10/json/encoder.py", line 199, in encode
    chunks = self.iterencode(o, _one_shot=True)
  File "/usr/lib/python3.10/json/encoder.py", line 257, in iterencode
    return _iterencode(o, 0)
  File "/usr/lib/python3.10/json/encoder.py", line 179, in default
    raise TypeError(f'Object of type {o.__class__.__name__} '
TypeError: Object of type PromptTemplate is not JSON serializable
DEBUG - haystack.utils.openai_utils -  Using tiktoken cl100k_base tokenizer
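
For context on the failure mode: json.dumps raises this TypeError for any value the default encoder cannot handle, unless a default= fallback is supplied. A minimal sketch with a stand-in class (not Haystack's actual PromptTemplate) that reproduces the same exception:

import json

class FakePromptTemplate:  # stand-in for illustration only, not Haystack's class
    def __init__(self, prompt: str):
        self.prompt = prompt

event = {"default_prompt_template": FakePromptTemplate("Answer: {query}")}

try:
    json.dumps(event, sort_keys=True)
except TypeError as err:
    print(err)  # Object of type FakePromptTemplate is not JSON serializable

# A fallback serializer avoids the exception by stringifying unknown objects:
print(json.dumps(event, sort_keys=True, default=str))

Judging from the log above, the telemetry module catches the exception itself, so the only symptom is the DEBUG entry; the PromptNode is still built and the tokenizer message follows.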

Expected behavior
No error shown.

Additional context

To Reproduce
Enable a DEBUG-level logger for haystack and use a PromptTemplate as in the code above; a self-contained sketch follows.
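
A self-contained version of the reproduction, with the GeneratorBody fields replaced by placeholder values (the model name, API key, and prompt text here are hypothetical; any PromptNode configured with a PromptTemplate while DEBUG logging is enabled for haystack should surface the same log entry):

import logging

from haystack.nodes import AnswerParser, PromptNode, PromptTemplate

logging.basicConfig(format="%(levelname)s - %(name)s -  %(message)s", level=logging.WARNING)
logging.getLogger("haystack").setLevel(logging.DEBUG)

# Placeholder values standing in for the GeneratorBody fields used above.
prompt_template = PromptTemplate(
    prompt="Answer the question: {query}",  # hypothetical prompt text
    output_parser=AnswerParser(),
)
generator = PromptNode(
    model_name_or_path="gpt-3.5-turbo",  # hypothetical model choice
    default_prompt_template=prompt_template,
    max_length=256,
    api_key="sk-...",  # hypothetical key
    use_gpu=True,
)
# The TypeError from haystack.telemetry shows up in the DEBUG log output rather than being raised.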

FAQ Check

System:
OS: Windows 11
GPU/CPU: RTX4090/i9-13900K
Haystack version (commit or version number): https://github.com/deepset-ai/haystack/commit/2a45e7cc06bc74526907c4da38d647163dca0dd2
DocumentStore: -
Reader: -
Retriever: -
Ranker: -

masci commented 6 months ago

Can you try disabling the telemetry as described in https://docs.haystack.deepset.ai/docs/telemetry#how-can-i-opt-out ?
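
For reference, a minimal sketch of the opt-out described on that page: set the HAYSTACK_TELEMETRY_ENABLED environment variable to "False" before Haystack is imported (variable name taken from the linked documentation), so the telemetry module never attempts to serialize event properties such as the PromptTemplate.

import os

# Opt out of telemetry before any haystack import.
os.environ["HAYSTACK_TELEMETRY_ENABLED"] = "False"

from haystack.nodes import PromptNode, PromptTemplate  # noqa: E402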