ogre-run / miniogre

Automate the management of software dependencies with AI, to ensure your Python code runs on any computer.
https://docs.ogre.run
Apache License 2.0

feat: enable ogre LLM provider #22

Closed: wilderlopes closed this 2 months ago

wilderlopes commented 2 months ago

Ogre LLM provider

This feature adds ogre to the list of LLM providers that can be selected with the --provider option. The default provider remains Google's Gemini Pro, but users can now point to the ogre LLM provider, which is currently running llama3.1.

Full command to use the ogre provider within the run pipeline:

miniogre run --provider ogre
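The provider selection described above can be sketched with a minimal argument parser. This is a hypothetical illustration, not the actual miniogre implementation: the provider registry, subcommand wiring, and defaults shown here are assumptions, apart from the --provider flag and the gemini/ogre choices mentioned in the description.

```python
import argparse

# Assumed provider registry; only "ogre" and the Gemini default are
# confirmed by the PR description.
SUPPORTED_PROVIDERS = ["gemini", "ogre"]


def build_parser() -> argparse.ArgumentParser:
    """Sketch of a CLI with a `run` subcommand and a --provider option."""
    parser = argparse.ArgumentParser(prog="miniogre")
    subcommands = parser.add_subparsers(dest="command")
    run = subcommands.add_parser("run")
    # Default stays Gemini; "ogre" routes requests to the ogre LLM
    # provider (currently serving llama3.1, per the description).
    run.add_argument("--provider", choices=SUPPORTED_PROVIDERS, default="gemini")
    return parser


# Parsing the command from the description selects the ogre provider.
args = build_parser().parse_args(["run", "--provider", "ogre"])
print(args.provider)
```

Running `miniogre run` with no flag would fall back to the default provider, while an unknown value (e.g. `--provider foo`) would be rejected by `choices`.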