machinewrapped / gpt-subtrans

Open Source project using LLMs to translate SRT subtitles

Azure OpenAI Support #77

Closed tiagoefreitas closed 3 months ago

tiagoefreitas commented 8 months ago

Hi, can you add support for the Azure endpoint? It's the only way to get access to gpt-4-32k. Thanks!

ishaan-jaff commented 8 months ago

Hi @tiagoefreitas @machinewrapped - I believe we can make this easier. I'm the maintainer of LiteLLM - we let you deploy an LLM proxy that calls 100+ LLMs (Azure OpenAI, PaLM, Bedrock, OpenAI, Anthropic, etc.) through one OpenAI-compatible format: https://github.com/BerriAI/litellm/tree/main/openai-proxy.

If this looks useful (we're used in production), please let me know how we can help.

Usage

Azure request

curl http://0.0.0.0:8000/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
     "model": "azure/<your-deployment-name>",
     "messages": [{"role": "user", "content": "Say this is a test!"}],
     "temperature": 0.7
   }'

gpt-3.5-turbo request

curl http://0.0.0.0:8000/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
     "model": "gpt-3.5-turbo",
     "messages": [{"role": "user", "content": "Say this is a test!"}],
     "temperature": 0.7
   }'

claude-2 request

curl http://0.0.0.0:8000/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
     "model": "claude-2",
     "messages": [{"role": "user", "content": "Say this is a test!"}],
     "temperature": 0.7
   }'
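
Calling the proxy from Python

For reference, any OpenAI-compatible client can be pointed at the proxy by overriding its base URL. Below is a minimal sketch using the openai Python SDK (v1+); the base URL and placeholder API key are assumptions for illustration, since the proxy holds the real provider credentials:

from openai import OpenAI

# Point the standard OpenAI client at the LiteLLM proxy (assumed to be listening on port 8000).
client = OpenAI(
    base_url="http://0.0.0.0:8000/v1",
    api_key="anything",  # placeholder - the proxy supplies the real provider keys
)

response = client.chat.completions.create(
    model="azure/<your-deployment-name>",  # or "gpt-3.5-turbo", "claude-2", ...
    messages=[{"role": "user", "content": "Say this is a test!"}],
    temperature=0.7,
)
print(response.choices[0].message.content)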
machinewrapped commented 8 months ago

Thanks @ishaan-jaff, sounds interesting - I should have some time next week(end); I'll take a look.

IlmariKu commented 5 months ago

It was relatively easy to do with the existing libraries - I already have it working locally. I can open a PR this weekend or during next week, @tiagoefreitas.

machinewrapped commented 4 months ago

That would be amazing!

machinewrapped commented 4 months ago

@IlmariKu you might be interested in this branch. It's a major architectural overhaul to support multiple translation services. https://github.com/machinewrapped/gpt-subtrans/tree/translation-providers

It's still WIP and I'm paying the price now for decisions made at the start of the project, so there are still some wrinkles to iron out, but it's at least close to being usable. It should make it much easier to add support for Azure/Gemini/LiteLLM, each with its own settings (see OpenAiProvider for an example).

I'd appreciate the additional testing, if nothing else 😅

I'm focusing on the GUI since that's where most of the hard work is, so the CLI version might be broken at the moment. My plan is to provide a separate command-line entry script for each provider (gpt-translate.py, gemini-translate.py, azure-translate.py, etc.), so that each can support the arguments that make sense for that provider - probably separate install scripts too.
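
As an illustration of that plan only - the option names below are hypothetical, not the actual gpt-subtrans arguments - a per-provider entry script could be little more than an argument parser that knows that provider's settings:

# azure-translate.py - hypothetical per-provider entry script; the options shown
# here are illustrative, not the actual gpt-subtrans command-line interface.
import argparse

def main():
    parser = argparse.ArgumentParser(description="Translate SRT subtitles using Azure OpenAI")
    parser.add_argument("input", help="Path to the input .srt file")
    # Azure-specific settings that wouldn't make sense for other providers
    parser.add_argument("--api-base", required=True, help="Azure OpenAI endpoint URL")
    parser.add_argument("--deployment", required=True, help="Azure deployment name")
    parser.add_argument("--api-version", default="2023-05-15", help="Azure OpenAI API version")
    args = parser.parse_args()
    # ... build the Azure provider from these settings and run the translation ...

if __name__ == "__main__":
    main()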

machinewrapped commented 4 months ago

The provider framework is now merged into main. https://github.com/machinewrapped/gpt-subtrans/releases/tag/v0.6.0

You can see how adding a new provider works in the gemini-provider branch, which should be merged into main tomorrow too (I just need to write the top-level command-line and installer scripts). https://github.com/machinewrapped/gpt-subtrans/tree/gemini-provider

I'd still appreciate assistance in adding Azure support, @IlmariKu ... I was defeated by the Azure sign-up process, which seemed to assume I would be deploying my own models (probably I was in the wrong place)... it should be pretty easy to set up the provider classes if you can get past that :D
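
For anyone picking this up: once an Azure OpenAI resource and deployment exist, the provider side reduces to a chat completions call parameterised by endpoint, API key, API version and deployment name. A minimal sketch using the openai Python SDK's AzureOpenAI client - the environment variable names are assumptions for illustration, not gpt-subtrans settings:

import os
from openai import AzureOpenAI

# Assumed environment variables for illustration; a provider class would take these
# values from its own settings instead.
client = AzureOpenAI(
    azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],  # e.g. https://<resource>.openai.azure.com
    api_key=os.environ["AZURE_OPENAI_API_KEY"],
    api_version="2023-05-15",
)

response = client.chat.completions.create(
    model=os.environ["AZURE_OPENAI_DEPLOYMENT"],  # the deployment name, not the base model name
    messages=[{"role": "user", "content": "Say this is a test!"}],
    temperature=0.7,
)
print(response.choices[0].message.content)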

IlmariKu commented 3 months ago

I did it! Sorry for the delay - this has been on my mind for weeks, but I've been too busy with work.

I re-did the work using your provider framework. It's not super-clean code, but it works and adds Azure support to the project. I'll open a PR.