-
**Describe the bug**
I get a Hugging Face-related warning when using the AmazonBedrockGenerator on Colab, although everything works as expected.
Warning:
![image](https://github.com/deepset-ai/haystac…
-
**Describe the bug**
- Unable to load Tinyllama on 0.4.0-41 nightly with Windows hotfixes, even after 5 minutes
- Error message in console indicates Nitro process has exited
**To Reproduce**
Ste…
-
I'm playing around with Qdrant as a vector store index using sentence-transformer embeddings from HuggingFace.
However, when I try to create an index, I get an error message `Did not find openai_a…
-
Hi, I am trying to run the Llama-2 example and it runs fine, but then the app stops. My goal is to deploy a Llama-2 API that I can use with a front end. Is Modal.com a good choice for me? Currently, I c…
-
Why is my endpoint still connecting to the OpenAI API as the default LLM, as well as the Mistral AI API at the same time, even though I've assigned llm=mistral_client to every agent already? May I know what'…
-
## Problem
I am currently experimenting with TinyLlama on Google Colab for various NLP tasks (question/answer generation, text summarization). I have encountered a peculiar issue where, after aski…
-
A quick search did not find where it is used.
If you remove it, then this project can at least be installed on macOS (auto-gptq does not yet support macOS, since it very interestingly depends on sp…
-
**Describe the bug**
- I encounter a bug where Nitro process exits immediately when trying to load model
- See screenshot below
**To Reproduce**
Steps to reproduce the behavior:
1. Run Jan and…
-
I have downloaded the moondream model from the official Ollama site (https://ollama.com/library/moondream), but while running the model in Ollama I get this error: `ERROR source=routes.go:120 msg="error loa…`