raznem / parsera

Lightweight library for scraping websites with LLMs
https://parsera.org
GNU General Public License v2.0
732 stars 47 forks

Add ollama support. #5

Open mmar58 opened 1 month ago

mmar58 commented 1 month ago

Hi, Ollama runs offline on a personal computer or laptop, so enabling Ollama for website scraping would give a boost to end users and developers: they could test it for free.

Moreover, Ollama also has a Python API, so adding support would be a great benefit for developers. Your project would also get a boost, since it would let developers test and develop while keeping everything on their own machine, which is also better for privacy.
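For reference, a minimal sketch of what calling a local model through the Ollama Python package looks like (assuming the `ollama` package is installed, an Ollama server is running locally, and a model such as "llama3" has already been pulled; the model name and prompt here are just placeholders):

```python
# Minimal sketch: query a locally running Ollama model via the ollama Python package.
# Assumes `pip install ollama`, a running Ollama server, and the "llama3" model pulled.
import ollama

response = ollama.chat(
    model="llama3",
    messages=[
        {"role": "user", "content": "Extract the title from this HTML: <h1>Hello</h1>"}
    ],
)
print(response["message"]["content"])
```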

XD-coder commented 1 month ago

Also, using Ollama makes it easier to swap models and keeps the project future-proof in case new models come out that work better for this particular use case.

cognitivetech commented 1 month ago

+1

spinagon commented 4 weeks ago

LangChain supports it (https://python.langchain.com/v0.2/docs/integrations/chat/ollama/), so you should be able to just use that.
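A rough sketch of how that wiring could look, using LangChain's `ChatOllama` integration (assumes `langchain-ollama` is installed and an Ollama server is running with the "llama3" model pulled; passing the LangChain model to Parsera through a `model=` argument is an assumption here, not confirmed project API):

```python
# Sketch: plug a local Ollama model into Parsera via LangChain's ChatOllama.
# Assumes `pip install langchain-ollama parsera` and a running Ollama server.
from langchain_ollama import ChatOllama
from parsera import Parsera

# Local chat model served by Ollama (model name is a placeholder).
llm = ChatOllama(model="llama3", temperature=0)

# Fields to extract; keys are output names, values are natural-language descriptions.
elements = {"title": "Page title", "price": "Product price"}

scraper = Parsera(model=llm)  # assumption: Parsera accepts a custom LangChain model here
result = scraper.run(url="https://example.com/product", elements=elements)
print(result)
```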

raznem commented 4 weeks ago

@SamTheCoder777 implemented HuggingFace Transformers support, which can be a way to go for a quick local run atm.