acon96 / home-llm

A Home Assistant integration & Model to control your smart home using a Local LLM
490 stars 56 forks

Any translation option for responses? #51

Open neowisard opened 5 months ago

neowisard commented 5 months ago

For international users, voice interaction with Assist can use wyoming-faster-whisper STT with its translation capability: when I speak in my native language, the transcript comes out in English, and that English text goes to the model. The next part is text-to-speech for international users: if I send a request in English, I receive an answer in English, which should then be promptly translated and passed on to TTS/Assist.

Could you add an option to use the DeepL API for translation, with three parameters: base URL, token, and target_lang? The API is:

```shell
curl -X POST 'https://api-free.deepl.com/v2/translate' \
  --header 'Authorization: DeepL-Auth-Key [yourAuthKey]' \
  --data-urlencode 'text=Hello, world!' \
  --data-urlencode 'target_lang=DE'
```

Example response:

```json
{
  "translations": [
    {
      "detected_source_language": "EN",
      "text": "Hallo, Welt!"
    }
  ]
}
```
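A minimal Python sketch of what such a translation step could look like, using only the stdlib. The function names (`translate_response`, `extract_translation`) and the idea of configurable `base_url`/`token`/`target_lang` parameters are illustrative assumptions, not part of home-llm; only the endpoint, header, and response shape come from the DeepL example above.

```python
# Hypothetical sketch of the requested translation step against the
# DeepL REST API. Function names here are illustrative, not home-llm's.
import json
import urllib.parse
import urllib.request


def extract_translation(payload: dict) -> str:
    # DeepL replies with {"translations": [{"detected_source_language": ..., "text": ...}]}
    return payload["translations"][0]["text"]


def translate_response(text: str, token: str, target_lang: str,
                       base_url: str = "https://api-free.deepl.com/v2/translate") -> str:
    """Translate the model's English reply before handing it to TTS."""
    data = urllib.parse.urlencode({
        "text": text,
        "target_lang": target_lang,
    }).encode("utf-8")
    req = urllib.request.Request(
        base_url,
        data=data,
        headers={"Authorization": f"DeepL-Auth-Key {token}"},
    )
    with urllib.request.urlopen(req) as resp:
        return extract_translation(json.load(resp))
```

The parsing is split into `extract_translation` so the response handling can be checked without network access; error handling and async integration into Home Assistant's event loop are left out of the sketch.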

For my part, I would prepare a short guide on how to run the assistant in any language. I have tried installing almost every kind of API to execute functions/tools locally, but have not succeeded.

Small models can't do multilingual inference.

v1-valux commented 2 months ago

Apart from the fact that I would advise against making important, security-relevant information and functions of your smart home dependent on online services (which may not even be hosted in your own country)...

I don't think it's even a good choice to translate prompts or responses for use with English models (in smart-home scenarios), because there are a million different ways different models could interpret your translation. It would rather make your responses unusable for a smart home (that's what I experienced).

BUT - I have had very good experience with the following German models so far:

They need a little more prompt engineering in your language than in English, but they generally work with a lot of integrations (thanks to methods like in-context learning, i.e. adjusting your prompt for better use with those models), as long as the integrations do not rely on function_calling, which many open-source models/APIs (like Ollama) didn't support yet.
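The in-context-learning approach mentioned above can be sketched as follows: a few German example exchanges are prepended to the prompt so that a model without native function calling still emits commands in a parseable form. The prompt wording, entity IDs, and `build_prompt` helper are all made-up illustrations, not home-llm's actual prompt template.

```python
# Illustrative in-context-learning sketch: few-shot German examples in the
# prompt teach the model the expected command format. All names here
# (entity IDs, prompt wording) are assumptions for demonstration only.

EXAMPLES = [
    ("Schalte das Wohnzimmerlicht ein.",            # "Turn on the living-room light."
     'light.turn_on(entity_id="light.wohnzimmer")'),
    ("Mach die Kaffeemaschine aus.",                 # "Turn off the coffee machine."
     'switch.turn_off(entity_id="switch.kaffeemaschine")'),
]


def build_prompt(user_request: str) -> str:
    """Assemble a few-shot prompt ending at the assistant's turn."""
    lines = ["Du bist ein Smart-Home-Assistent. Antworte nur mit einem Befehl."]
    for request, command in EXAMPLES:
        lines.append(f"Benutzer: {request}")
        lines.append(f"Assistent: {command}")
    lines.append(f"Benutzer: {user_request}")
    lines.append("Assistent:")
    return "\n".join(lines)
```

Because the examples pin down both the language and the output format, the completion after the final `Assistent:` tends to follow the same command pattern, which is then easy to parse without relying on a function-calling API.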