barun-saha / slide-deck-ai

Co-create a PowerPoint presentation with Generative AI
https://huggingface.co/spaces/barunsaha/slide-deck-ai
MIT License

Use local model, besides Huggingface API #47

Open GhostBP112 opened 1 month ago

GhostBP112 commented 1 month ago

Is it planned, or possible, to use a local LLM for processing? I see this as a way to significantly increase generation speed (given appropriate hardware) and also to use the model offline.

barun-saha commented 1 month ago

Hi,

Thanks for your interest in SlideDeck AI.

There is no concrete plan for this as such. However, the use of local LLMs has been on my mind lately.

Regarding the speed: yes, token generation with Mistral Nemo does appear to take longer. I have been contemplating switching back to Mistral, or at least providing it as an alternative.

Let me create some tasks toward this general direction.
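To make the discussion concrete, here is a minimal sketch of how the generation backend could be made pluggable, so that either the Hugging Face Inference API or a local model (e.g. one served by Ollama or llama.cpp) could be used interchangeably. All class and function names below are hypothetical illustrations, not part of the actual SlideDeck AI codebase; the local backend is a stand-in stub so the sketch runs without any model installed.

```python
from abc import ABC, abstractmethod


class TextGenerator(ABC):
    """Hypothetical common interface for swappable LLM backends."""

    @abstractmethod
    def generate(self, prompt: str, max_tokens: int = 512) -> str:
        ...


class HFInferenceGenerator(TextGenerator):
    """Sketch of a Hugging Face Inference API backend.

    A real implementation would POST the prompt to the hosted model
    endpoint; the network call is omitted to keep this sketch
    offline-friendly.
    """

    def __init__(self, model_id: str, api_token: str):
        self.model_id = model_id
        self.api_token = api_token

    def generate(self, prompt: str, max_tokens: int = 512) -> str:
        raise NotImplementedError("network call omitted in this sketch")


class LocalEchoGenerator(TextGenerator):
    """Stand-in for a local model backend; it just echoes the prompt
    so the example runs without any local LLM installed."""

    def generate(self, prompt: str, max_tokens: int = 512) -> str:
        return f"[local] {prompt[:max_tokens]}"


def build_slide_content(prompt: str, generator: TextGenerator) -> str:
    """The app would depend only on the TextGenerator interface, so
    switching from the hosted API to a local model would require no
    changes to the slide-generation logic itself."""
    return generator.generate(prompt)


print(build_slide_content(
    "Outline a 5-slide deck on renewable energy",
    LocalEchoGenerator(),
))
```

With this kind of seam in place, offline use would reduce to wiring in a backend that talks to a locally running model server instead of the hosted API.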