Abraxas-365 / langchain-rust

🦜️🔗LangChain for Rust, the easiest way to write LLM-based programs in Rust

Add Groq LLM integration (or ability to create custom LLM models) #239

Open arek-e opened 1 month ago

arek-e commented 1 month ago

Is your feature request related to a problem? Please describe. I used the Python LangChain library in a previous project with a Django backend. Now I'm working on a new project that uses Tauri and its Rust backend, so naturally I was looking for a LangChain implementation in Rust. In Python I started using the Groq chat model; I really liked its simple pricing model and fast inference, so I'd like to use it in this project as well, but currently there is no support for it.

Describe the solution you'd like I would like a Groq LLM integration for langchain-rust, the same way OpenAI, Claude, and Ollama are supported. There are currently some unofficial SDKs for Groq, such as groq-api-rust and groq-rust.

Describe alternatives you've considered The other, more scalable solution to issues like this would be the ability to create a custom chat model, similar to how the Python SDK lets you wrap an LLM with BaseChatModel (a rough sketch of what that could look like in Rust follows the link below).

https://python.langchain.com/docs/how_to/custom_chat_model/
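
As far as I can tell, langchain-rust already has an `LLM` trait that the built-in backends implement, so a Groq backend could in principle be written outside the crate. A very rough sketch of that route; the trait path and the `GenerateResult`/`Message` shapes here are assumptions based on how the existing backends appear to be structured, not verified against the current API:

```rust
use async_trait::async_trait;
use langchain_rust::language_models::{llm::LLM, GenerateResult, LLMError};
use langchain_rust::schemas::Message;

// Hypothetical custom backend; field names are illustrative only.
#[derive(Clone)]
pub struct GroqChat {
    api_key: String,
    model: String, // e.g. "llama3-8b-8192"
}

#[async_trait]
impl LLM for GroqChat {
    async fn generate(&self, messages: &[Message]) -> Result<GenerateResult, LLMError> {
        // Translate `messages` into a Groq chat-completions request,
        // POST it to https://api.groq.com/openai/v1/chat/completions,
        // then map the response text back into a GenerateResult.
        todo!("call Groq's API (e.g. with reqwest) and build a GenerateResult")
    }
}
```

If the trait is already public, this alternative may not even need changes in langchain-rust itself.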

prabirshrestha commented 1 month ago

Seems like Groq supports OpenAI compatibility based on their docs. Would this be enough?
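
If so, something like the sketch below might already work with the existing OpenAI integration, assuming the OpenAI wrapper accepts a custom API base via `with_config` the way the crate's Ollama-over-OpenAI example does. The base URL and model id are taken from Groq's docs and may change:

```rust
use langchain_rust::language_models::llm::LLM;
use langchain_rust::llm::openai::OpenAI;
use langchain_rust::llm::OpenAIConfig;

#[tokio::main]
async fn main() {
    // Point the existing OpenAI client at Groq's OpenAI-compatible endpoint.
    let groq = OpenAI::default()
        .with_config(
            OpenAIConfig::default()
                .with_api_base("https://api.groq.com/openai/v1")
                .with_api_key(std::env::var("GROQ_API_KEY").expect("set GROQ_API_KEY")),
        )
        .with_model("llama3-8b-8192"); // example Groq model id

    let response = groq.invoke("Why is the sky blue?").await.unwrap();
    println!("{response}");
}
```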