sobelio / llm-chain

`llm-chain` is a powerful Rust crate for building chains of prompts in large language models, letting you summarise text and complete complex tasks.
https://llm-chain.xyz
MIT License
1.32k stars · 130 forks

add support for Mistral using TGI / vllm / candle #225

Open pabl-o-ce opened 11 months ago

pabl-o-ce commented 11 months ago

Hi guys, love your project.

I was wondering if you could add support for Mistral via TGI, vllm, or candle, so it can be used through endpoints; these projects also have active support for new LLM architectures such as Mistral.
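Servers like TGI expose Mistral over plain HTTP, so an llm-chain backend would mostly be request construction plus an HTTP client. As a minimal sketch (not llm-chain's actual API), this builds the JSON body for a TGI `/generate` call using only the standard library; a real executor would send it with an HTTP client such as `reqwest`, and the endpoint URL below is an assumed local default:

```rust
// Sketch: construct the request body TGI's /generate endpoint expects,
// i.e. {"inputs": "...", "parameters": {"max_new_tokens": ...}}.
// Only std is used here; sending the request is left to an HTTP client.
fn tgi_generate_body(prompt: &str, max_new_tokens: u32) -> String {
    format!(
        r#"{{"inputs":"{}","parameters":{{"max_new_tokens":{}}}}}"#,
        prompt.replace('"', "\\\""), // naive escaping for the sketch
        max_new_tokens
    )
}

fn main() {
    let body = tgi_generate_body("Summarise: Rust is a systems language.", 64);
    println!("{body}");
    // POST this to e.g. http://localhost:8080/generate with
    // Content-Type: application/json against a TGI server running Mistral.
}
```

In practice a proper implementation would use `serde_json` for serialization instead of hand-building strings, but the payload shape is the part that matters for a new backend.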

williamhogman commented 11 months ago

Hey sounds like a very good idea :)

If anyone wants to add this it would be a most welcome contribution.

andychenbruce commented 11 months ago

Llama, Mistral, and Zephyr with GPU acceleration in only ~450 lines using candle: https://github.com/huggingface/candle/blob/main/candle-examples/examples/quantized/main.rs

If Mistral support is added with candle, it should be fairly trivial to also support Llama and Zephyr.
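Since candle's quantized runner handles all three families through one code path, a candle-backed executor could hide the model choice behind a single interface. A hypothetical sketch, with all names illustrative (none of these are llm-chain's or candle's real types):

```rust
// Hypothetical shape of a shared local-model interface; the generate()
// body is a stub standing in for candle's tokenize/forward/sample loop.
trait LocalModel {
    fn generate(&self, prompt: &str, max_tokens: usize) -> String;
}

// Which quantized (GGUF) model family to load.
enum Family {
    Llama,
    Mistral,
    Zephyr,
}

struct QuantizedModel {
    family: Family,
}

impl LocalModel for QuantizedModel {
    fn generate(&self, prompt: &str, _max_tokens: usize) -> String {
        // Real code would run the candle forward pass and sample tokens;
        // this stub just labels the output by family.
        let name = match self.family {
            Family::Llama => "llama",
            Family::Mistral => "mistral",
            Family::Zephyr => "zephyr",
        };
        format!("[{name}] completion for: {prompt}")
    }
}

fn main() {
    let m = QuantizedModel { family: Family::Mistral };
    println!("{}", m.generate("Hello", 16));
}
```

The point of the trait is that llm-chain's chain logic would only see `LocalModel`, so adding Zephyr after Mistral becomes a new enum variant and weight loader rather than a new backend.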

01PrathamS commented 10 months ago

I have some experience with Rust, although my familiarity with LLMs is somewhat limited. I can take on this challenge; it would be my first contribution to llm-chain.

williamhogman commented 10 months ago

Sounds like a great idea :)