austin-starks / Promptimizer

An Automated AI-Powered Prompt Optimization Framework
https://nexustrade.io/

Amazing project! What about Groq and fast inference on open source models? #1

Open agnoldo opened 3 months ago

agnoldo commented 3 months ago

Congratulations on your achievements, @austin-starks ! I see a huge potential for this project!

I was wondering if you could implement support for Groq and open source fast models such as Llama 3.1 8B. Imagine improving prompts for such a fast model, running at 1200 tokens/second! And cheaply. Or even locally, for those who need complete privacy...

Do you think this is feasible?

Thanks!

lukmay commented 2 months ago

Hi @agnoldo

I completely agree with you. This is a very good idea! I wanted to add that you can actually achieve something similar using the PrivateGPT project. With PrivateGPT, you can run a model like Llama 3.1 8B locally and set up the same interfaces that OpenAI provides. This means you can easily swap out API calls to OpenAI with calls to your locally running model.
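To make the swap concrete, here's a minimal sketch of what a call against such a local server could look like. Assumptions on my part: the server exposes the OpenAI-compatible `/v1/chat/completions` route on `localhost:8001` (PrivateGPT's default API port), and the model name is illustrative.

```typescript
// Base URL of a locally running OpenAI-compatible server (assumed port).
const LOCAL_BASE_URL = "http://localhost:8001/v1";

interface ChatMessage {
  role: "system" | "user" | "assistant";
  content: string;
}

// Join the base URL and route, tolerating a trailing slash.
function chatEndpoint(baseURL: string): string {
  return `${baseURL.replace(/\/+$/, "")}/chat/completions`;
}

// Same request/response shape the OpenAI API uses, so the calling code
// stays the same; only the base URL changes.
async function chatCompletion(
  messages: ChatMessage[],
  model = "llama3.1-8b"
): Promise<string> {
  const res = await fetch(chatEndpoint(LOCAL_BASE_URL), {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      // Local servers typically ignore the key, but the header keeps
      // the request shape identical to a real OpenAI call.
      Authorization: "Bearer not-needed-locally",
    },
    body: JSON.stringify({ model, messages }),
  });
  if (!res.ok) throw new Error(`Local model server returned ${res.status}`);
  const data = await res.json();
  return data.choices[0].message.content;
}
```

Since the wire format matches, this is the whole migration: no prompt-handling code has to change.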

This approach could be a first workaround to get Promptimizer to work with local models.

Thanks for bringing this up!

austin-starks commented 2 months ago

Hey @agnoldo! Sorry for responding late; I never saw your initial message!

It should absolutely be possible! I've never used Groq, but if we could get it to work, that would be game-changing. Ollama is unfortunately too slow, at least on my computer.

I don't have time to implement this myself, but I'm open to PRs! It should be relatively straightforward to add.

pressdarling commented 1 month ago

> I was wondering if you could implement support for Groq and open source fast models such as Llama 3.1 8B. Imagine improving prompts for such a fast model, running at 1200 tokens/second! And cheaply. Or even locally, for those who need complete privacy...

@agnoldo For local inference, Ollama should work, and any other local inference engine that exposes an OpenAI-compatible API should too, right? LlamaCpp, etc.

I haven't tried it myself yet in this project and don't have time to add a PR right now, but the breadcrumbs should all be here:

https://github.com/austin-starks/Promptimizer/blob/3acdcbf01e29defefb0a524c648b9f67855693f7/src/services/OpenAIServiceClient.ts#L27

https://github.com/austin-starks/Promptimizer/blob/3acdcbf01e29defefb0a524c648b9f67855693f7/src/services/OpenAIServiceClient.ts#L30

@austin-starks There's probably a much smarter way to do this, but this looks enough like a nail for my hammer-minded approach...
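Concretely, one low-effort route from those two lines would be to make the base URL configurable where the OpenAI client is constructed; Groq and Ollama both speak the OpenAI wire format, so nothing else in the call path has to change. A sketch, with the caveat that the provider names and env-var handling below are my assumptions and not code from this repo (Groq's OpenAI-compatible base URL is `https://api.groq.com/openai/v1`, Ollama's is `http://localhost:11434/v1`):

```typescript
// Hypothetical provider table; none of these names exist in the repo.
// undefined means "let the SDK default to api.openai.com".
const PROVIDER_BASE_URLS: Record<string, string | undefined> = {
  openai: undefined,
  groq: "https://api.groq.com/openai/v1",
  ollama: "http://localhost:11434/v1",
};

// Resolve the base URL for a provider, failing loudly on typos.
function resolveBaseURL(provider: string): string | undefined {
  if (!(provider in PROVIDER_BASE_URLS)) {
    throw new Error(`Unknown provider: ${provider}`);
  }
  return PROVIDER_BASE_URLS[provider];
}

// The client construction in OpenAIServiceClient.ts could then read
// something like (illustrative, not the actual code):
//   const client = new OpenAI({
//     apiKey: process.env.API_KEY,
//     baseURL: resolveBaseURL(process.env.PROVIDER ?? "openai"),
//   });
```

With that in place, switching to Groq or a local model would be a matter of setting two environment variables rather than touching the service client.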