run-llama / llama-hub

A library of data loaders for LLMs made by the community -- to be used with LlamaIndex and/or LangChain
https://llamahub.ai/
MIT License

[Feature Request]: Ollama Query Engine Pack #677

Closed shauryr closed 10 months ago

shauryr commented 11 months ago

Feature Description

Ollama is a great way of hosting models locally. It would be great if we could have an Ollama pack, just like the Zephyr query engine pack. Ollama makes it easy to swap in and try different open-source models.

Reason

No response

Value of Feature

No response

anoopshrma commented 10 months ago

https://llamahub.ai/l/llama_packs-ollama_query_engine?from=all

I hope this solves your issue 🙂. If so, would you mind closing it?

Thanks!
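For anyone landing here later, the linked pack can typically be pulled down with LlamaIndex's `download_llama_pack` helper. This is a sketch, not the pack's official docs: it assumes a llama-index 0.9.x-era install, a running local Ollama server, and the `OllamaQueryEnginePack` name and `(model, documents)` constructor shown on the llama-hub listing; the `./data` path and function name here are illustrative.

```python
def build_ollama_query_engine(data_dir: str, model: str = "llama2"):
    """Download the Ollama query engine pack and wire it to local documents.

    Assumptions: llama-index 0.9.x APIs, an Ollama server running locally,
    and the pack signature (model, documents) from the llama-hub listing.
    """
    # Imports kept inside the function so this sketch only needs
    # llama-index at call time, not at import time.
    from llama_index import SimpleDirectoryReader
    from llama_index.llama_pack import download_llama_pack

    # Fetch the pack's source from llama-hub into a local directory.
    OllamaQueryEnginePack = download_llama_pack(
        "OllamaQueryEnginePack", "./ollama_pack"
    )

    # Load whatever documents you want to query over.
    documents = SimpleDirectoryReader(data_dir).load_data()

    # The pack bundles the Ollama LLM with an index + query engine.
    return OllamaQueryEnginePack(model=model, documents=documents)


if __name__ == "__main__":
    # Requires network access (to download the pack) and Ollama serving
    # the named model locally, so it is guarded rather than run on import.
    pack = build_ollama_query_engine("./data")
    print(pack.run("What do these documents describe?"))
```

Swapping models is then just a matter of changing the `model` argument to any model tag your Ollama server has pulled, which is the flexibility the feature request asks for.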

shauryr commented 10 months ago

Thank you so much!!