inluxc opened this issue 10 months ago
What needs to be done to support it?
Basically, Ollama is a free, open-source tool for running LLMs (such as Meta's Llama models) locally, as an alternative to ChatGPT. https://ollama.ai/
But does it work with OpenAI SDK?
Using a universal connector would probably satisfy everyone, and facilitate adoption: https://github.com/BerriAI/litellm.
That looks good. Would you be open to implementing that?
Considering Langchain Python/JS already fits the use case (playwright and non-OpenAI frameworks), I'm not sure this is really needed.
I also got interested: it seems the Playwright toolkit is only available in LangChain's Python version (https://python.langchain.com/docs/integrations/toolkits/playwright) — I can't find a JS/TS equivalent. Am I missing it?
Ollama is a self-hosted drop-in replacement for the OpenAI API and should work out of the box with this project. There is a blog post from Ollama (https://ollama.com/blog/openai-compatibility) that includes an example.
As far as I can see, only a baseURL needs to be provided besides the model. An API key is not necessary for self-hosted Ollama and is ignored when passed.
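To illustrate, here is a minimal sketch of the request shape that Ollama's OpenAI-compatible endpoint accepts. The port 11434 is Ollama's default, and "llama2" is just a placeholder for whatever model you have pulled locally; with the OpenAI SDK itself you would instead pass `{ baseURL, apiKey }` to `new OpenAI(...)`, where the key is required by the SDK but ignored by Ollama.

```typescript
// Default address of a locally running Ollama server's OpenAI-compatible API.
const OLLAMA_BASE_URL = "http://localhost:11434/v1";

// Build the same chat-completion request the OpenAI SDK would send,
// but pointed at the local Ollama server instead of api.openai.com.
function buildChatRequest(model: string, prompt: string) {
  return {
    url: `${OLLAMA_BASE_URL}/chat/completions`,
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      model, // e.g. "llama2" — any model pulled via `ollama pull`
      messages: [{ role: "user", content: prompt }],
    }),
  };
}
```

Sending `buildChatRequest("llama2", "Hello")` with `fetch` against a running Ollama instance should return an OpenAI-style chat completion response.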
A small change here would probably be enough: https://github.com/lucgagan/auto-playwright/blob/22c8f2f1e70e794795bc7b4c2d3118e2180d8300/src/completeTask.ts#L12
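A rough sketch of what that change might look like — note the option name `openaiBaseUrl` is my assumption for illustration, not part of the project's current API:

```typescript
// Hypothetical options type: `openaiBaseUrl` is an assumed name, added so the
// OpenAI client can be pointed at a self-hosted Ollama server.
type TaskOptions = {
  openaiApiKey?: string;
  openaiBaseUrl?: string; // e.g. "http://localhost:11434/v1" for Ollama
};

// Where completeTask.ts constructs its OpenAI client, the base URL could be
// threaded through like this and passed to `new OpenAI(clientConfig(options))`.
function clientConfig(options: TaskOptions = {}) {
  return {
    apiKey: options.openaiApiKey ?? "ollama", // Ollama ignores the key
    baseURL: options.openaiBaseUrl, // undefined -> SDK default (api.openai.com)
  };
}
```

Leaving `baseURL` undefined keeps today's behavior, so existing users of the hosted OpenAI API would be unaffected.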
Add Ollama support as an alternative to ChatGPT.