deepset-ai / hayhooks

Deploy Haystack pipelines behind a REST API.
https://haystack.deepset.ai
Apache License 2.0

Internal Server Error #8

Closed LuisYordano closed 1 month ago

LuisYordano commented 2 months ago

Currently, I am testing Hayhooks, and I get an Internal Server Error.

-----example1.yml-------

components:
  converter:
    init_parameters:
      extractor_type: DefaultExtractor
    type: haystack.components.converters.html.HTMLToDocument
  fetcher:
    init_parameters:
      raise_on_failure: true
      retry_attempts: 2
      timeout: 3
      user_agents:
      - haystack/LinkContentFetcher/2.0.1
    type: haystack.components.fetchers.link_content.LinkContentFetcher
  llm:
    init_parameters:
      generation_kwargs: {}
      model: orca-mini
      raw: false
      streaming_callback: null
      system_prompt: null
      template: null
      timeout: 1200
      url: http://localhost:11434/api/generate
    type: haystack_integrations.components.generators.ollama.generator.OllamaGenerator
  prompt:
    init_parameters:
      template: |
        "According to the contents of this website:
        {% for document in documents %}
          {{document.content}}
        {% endfor %}
        Answer the given question: {{query}}
        Answer:
        "
    type: haystack.components.builders.prompt_builder.PromptBuilder
connections:
- receiver: converter.sources
  sender: fetcher.streams
- receiver: prompt.documents
  sender: converter.documents
- receiver: llm.prompt
  sender: prompt.prompt

metadata: {}

Request body

{
  "converter": {
    "meta": {}
  },
  "fetcher": {
    "urls": [
      "https://haystack.deepset.ai/overview/quick-start"
    ]
  },
  "llm": {
    "generation_kwargs": {}
  },
  "prompt": {
    "query": "Which components do I need for a RAG pipeline?"
  }
}

Could you indicate what is the correct curl command?
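One plausible shape for that command, assuming Hayhooks is listening on its default localhost:1416 and the pipeline above was deployed under the name example1 (both assumptions, not confirmed by the maintainers), posting the request body shown above:

```shell
# Hypothetical invocation: host, port, and pipeline name are assumptions
curl -X POST http://localhost:1416/example1 \
  -H "Content-Type: application/json" \
  -d '{
    "converter": {"meta": {}},
    "fetcher": {"urls": ["https://haystack.deepset.ai/overview/quick-start"]},
    "llm": {"generation_kwargs": {}},
    "prompt": {"query": "Which components do I need for a RAG pipeline?"}
  }'
```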


----example2.yml----------

components:
  llm:
    init_parameters:
      generation_kwargs: {}
      model: orca-mini
      raw: false
      streaming_callback: null
      system_prompt: null
      template: null
      timeout: 1200
      url: http://localhost:11434/api/generate
    type: haystack_integrations.components.generators.ollama.generator.OllamaGenerator
  prompt_builder:
    init_parameters:
      template: |
          "Given these documents, answer the question.
          Documents:
          {% for doc in documents %}
              {{ doc.content }}
          {% endfor %}
          Question: {{query}}
          Answer:"
    type: haystack.components.builders.prompt_builder.PromptBuilder
  retriever:
    init_parameters:
      document_store:
        init_parameters:
          collection_name: documents
          embedding_function: default
          persist_path: .
        type: haystack_integrations.document_stores.chroma.document_store.ChromaDocumentStore
      filters: null
      top_k: 10
    type: haystack_integrations.components.retrievers.chroma.retriever.ChromaEmbeddingRetriever
  text_embedder:
    init_parameters:
      generation_kwargs: {}
      model: orca-mini
      timeout: 1200
      url: http://localhost:11434/api/embeddings
    type: haystack_integrations.components.embedders.ollama.text_embedder.OllamaTextEmbedder
connections:
- receiver: retriever.query_embedding
  sender: text_embedder.embedding
- receiver: prompt_builder.documents
  sender: retriever.documents
- receiver: llm.prompt
  sender: prompt_builder.prompt
max_loops_allowed: 100
metadata: {}

Request body

{
  "llm": {
    "generation_kwargs": {}
  },
  "prompt_builder": {
    "query": "How old was he when he died?"
  },
  "retriever": {
    "filters": {},
    "top_k": 3
  },
  "text_embedder": {
    "text": "How old was he when he died?",
    "generation_kwargs": {}
  }
}

Could you indicate what is the correct curl command?
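As with the first pipeline, a sketch of the call, assuming the default localhost:1416 address and a pipeline deployed under the name example2 (both unverified assumptions), posting the request body shown above:

```shell
# Hypothetical invocation: host, port, and pipeline name are assumptions
curl -X POST http://localhost:1416/example2 \
  -H "Content-Type: application/json" \
  -d '{
    "llm": {"generation_kwargs": {}},
    "prompt_builder": {"query": "How old was he when he died?"},
    "retriever": {"filters": {}, "top_k": 3},
    "text_embedder": {"text": "How old was he when he died?", "generation_kwargs": {}}
  }'
```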

masci commented 2 months ago

Hey @LuisYordano thanks for trying out Hayhooks and thanks for reporting the issue!

I'll look into this and will let you know 👍

jacksteussie commented 2 months ago

I don't know if it's the same issue, but when I get the Internal Server Error on my deployment and check the FastAPI logs, I see a TypeError: Object of type timedelta is not JSON serializable.
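For reference, that error comes from Python's standard json encoder, which has no handler for datetime.timedelta. A minimal reproduction outside FastAPI, plus one possible workaround (a sketch only; this is not necessarily how hayhooks resolved it):

```python
import json
from datetime import timedelta

# json.dumps has no built-in encoder for timedelta, which is exactly
# what produces "Object of type timedelta is not JSON serializable"
try:
    json.dumps({"elapsed": timedelta(seconds=5)})
except TypeError as err:
    print(err)  # Object of type timedelta is not JSON serializable

# One workaround: supply a `default` hook that converts timedelta
# values to seconds before serialization
def encode_default(obj):
    if isinstance(obj, timedelta):
        return obj.total_seconds()
    raise TypeError(f"Object of type {type(obj).__name__} is not JSON serializable")

print(json.dumps({"elapsed": timedelta(seconds=5)}, default=encode_default))
# {"elapsed": 5.0}
```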

masci commented 1 month ago

Your example works with the latest version, 0.0.13.