-
Hello, and thanks for this repo :)
Do you have plans to add self-hosted LLM support, such as Ollama?
Best,
Orkut
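A hedged sketch for the Ollama question above, not the repo's own API: Ollama exposes an OpenAI-compatible endpoint under `/v1`, so an existing OpenAI-style client can usually be pointed at it with nothing more than a different `base_url` and model name. The URL and model below assume a default local Ollama install with `llama3` already pulled.

```python
from openai import OpenAI

# Ollama's OpenAI-compatible endpoint; the API key is required by the client
# but ignored by Ollama.
client = OpenAI(base_url="http://localhost:11434/v1", api_key="ollama")

resp = client.chat.completions.create(
    model="llama3",  # any model already pulled with `ollama pull`
    messages=[{"role": "user", "content": "Hello from a self-hosted model!"}],
)
print(resp.choices[0].message.content)
```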
-
Hi, great paper!
I'm wondering which Mixtral model you used in your experiments.
I've been trying to use your code with this model: `mixtral:8x7b-instruct-v0.1-q6_K`, but it's very rare that …
-
Hello,
I attempted to use the LLMExtractionStrategy code provided in the documentation for OpenAI and adapted it to work with Hugging Face. However, I encountered the following error:
Provider Li…
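The truncated error looks like LiteLLM's "LLM Provider" message, which usually means the model string is missing its provider prefix. Assuming LLMExtractionStrategy routes through LiteLLM-style model strings (an assumption here, not something confirmed above), a Hugging Face model would be addressed as `huggingface/<repo-id>` with a token in the environment — a minimal sketch:

```python
import os
import litellm

os.environ["HUGGINGFACE_API_KEY"] = "hf_..."  # your Hugging Face token (placeholder)

# The provider prefix ("huggingface/") is what LiteLLM uses to pick a backend;
# omitting it typically triggers the "provider" error shown above.
response = litellm.completion(
    model="huggingface/mistralai/Mistral-7B-Instruct-v0.2",
    messages=[{"role": "user", "content": "Extract the page title as JSON."}],
)
print(response.choices[0].message.content)
```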
-
Congrats on the recent release!
I was hoping to test PaperQA2 out with an open-source LLM server, but I was struggling to get PaperQA2 to work with one. I have used Ollama previously and I was able…
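A sketch under assumptions: PaperQA2 routes its model calls through LiteLLM, so a locally served model can usually be addressed by a LiteLLM model string such as `ollama/llama3`. The `Settings` fields below (`llm`, `llm_config`, `summary_llm`, `summary_llm_config`) follow the locally-hosted example pattern in the project's README and may differ between versions, so treat them as an assumption rather than a guaranteed API.

```python
from paperqa import Settings, ask

# LiteLLM-style routing config for a local Ollama server (default port 11434).
local_config = {
    "model_list": [
        {
            "model_name": "ollama/llama3",
            "litellm_params": {
                "model": "ollama/llama3",
                "api_base": "http://localhost:11434",
            },
        }
    ]
}

answer = ask(
    "What is retrieval-augmented generation?",
    settings=Settings(
        llm="ollama/llama3",
        llm_config=local_config,
        summary_llm="ollama/llama3",
        summary_llm_config=local_config,
    ),
)
```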
-
Currently, the corpus generalization module can only use the Qwen Inference Service served by Aliyun. Supporting local open-source LLMs like CodeLlama for corpus generalization can give users more opti…
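One hedged way to get a local CodeLlama behind a similar HTTP interface, not the module's current API: serve the model with vLLM's OpenAI-compatible server and point an OpenAI-style client at it. The port, model id, and prompt below are assumptions about a default local setup.

```python
# Run the server separately first, e.g.:
#   vllm serve codellama/CodeLlama-7b-Instruct-hf
from openai import OpenAI

client = OpenAI(base_url="http://localhost:8000/v1", api_key="not-needed")

resp = client.chat.completions.create(
    model="codellama/CodeLlama-7b-Instruct-hf",
    messages=[{"role": "user", "content": "Generalize this corpus entry: ..."}],
)
print(resp.choices[0].message.content)
```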
-
Hi! Have you tried using open-source LLMs for your results? If so, would you be willing to share some instructions on deploying an open-source LLM, such as Llama3?
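For the deployment question above, a minimal local-inference sketch using Hugging Face `transformers` (an assumption about one possible setup, not the authors' pipeline). It needs access to the gated `meta-llama/Meta-Llama-3-8B-Instruct` weights, `accelerate` installed, and a GPU with enough memory.

```python
import torch
from transformers import pipeline

generator = pipeline(
    "text-generation",
    model="meta-llama/Meta-Llama-3-8B-Instruct",
    model_kwargs={"torch_dtype": torch.bfloat16},
    device_map="auto",
)

messages = [{"role": "user", "content": "Summarize retrieval-augmented generation in two sentences."}]
outputs = generator(messages, max_new_tokens=256)
# The pipeline returns the chat history with the assistant reply appended last.
print(outputs[0]["generated_text"][-1]["content"])
```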
-
Hey, we're starting to use BPMN Assistant to generate models for https://lexipedia.xyz/. I'd like to add an Ollama LLM integration point; could you help advise on how best to approach that? We h…
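Not the project's actual integration API, just a hedged sketch of the piece being asked about: Ollama's native chat endpoint, which a provider class inside BPMN Assistant could wrap. The host, model, and function name are illustrative assumptions about a default local Ollama install.

```python
import requests


def ollama_chat(messages, model="llama3", host="http://localhost:11434"):
    """Send a chat request to a local Ollama server and return the reply text."""
    resp = requests.post(
        f"{host}/api/chat",
        json={"model": model, "messages": messages, "stream": False},
        timeout=120,
    )
    resp.raise_for_status()
    return resp.json()["message"]["content"]


print(ollama_chat([{"role": "user", "content": "Describe a simple order-handling BPMN process."}]))
```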
-
This task requires abstracting the LLM calls behind a common interface and adding provider support through that interface (see the sketch after this list). The following is a list of models we want to cover initially:
- OpenAI (already has some support)
- M…
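A minimal sketch of the common-interface idea referenced above, not the repo's actual design; class and method names are illustrative assumptions. One OpenAI-compatible implementation already covers OpenAI itself plus any local server exposing the same `/v1` chat API.

```python
from typing import Optional, Protocol

from openai import OpenAI


class ChatModel(Protocol):
    """The single method the rest of the codebase would depend on."""

    def complete(self, prompt: str) -> str: ...


class OpenAICompatibleChat:
    """Covers OpenAI and any server exposing the same /v1 chat API (vLLM, Ollama, ...)."""

    def __init__(self, model: str, base_url: Optional[str] = None, api_key: Optional[str] = None):
        self._client = OpenAI(base_url=base_url, api_key=api_key)
        self._model = model

    def complete(self, prompt: str) -> str:
        resp = self._client.chat.completions.create(
            model=self._model,
            messages=[{"role": "user", "content": prompt}],
        )
        return resp.choices[0].message.content
```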
-
Hi, I would like to use an open-source LLM like Mistral or Llama instead of the OpenAI models. What modifications to the code are needed to do that?
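A hedged note for the question above: if the code already uses the OpenAI Python SDK, the modification is often just configuration, because the client reads `OPENAI_BASE_URL` and `OPENAI_API_KEY` from the environment. The endpoint below assumes a local OpenAI-compatible server (Ollama, vLLM, llama.cpp's server, ...) hosting a Mistral model.

```python
# export OPENAI_BASE_URL=http://localhost:11434/v1
# export OPENAI_API_KEY=not-needed
from openai import OpenAI

client = OpenAI()  # picks up the environment variables above; no other code changes

resp = client.chat.completions.create(
    model="mistral",  # whatever model name your local server exposes
    messages=[{"role": "user", "content": "ping"}],
)
print(resp.choices[0].message.content)
```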