CyberTimon / Powerpointer-For-Local-LLMs

Local Powerpointer - A beautiful PowerPoint generator that uses the power of locally running large language models to generate the slides.
Apache License 2.0

Small questions on Powerpoint Generators #6

Open BradKML opened 3 weeks ago

BradKML commented 3 weeks ago
  1. Can this work with Ollama (with or without LiteLLM) instead of Oobabooga? Maybe there are low-footprint models like Phi-3.5 that would be applicable.
  2. Are there ways to adjust per-slide word count or line count? Guidelines like 10-20-30 (fewer than 10 slides, about 20 minutes, body font at 24-30pt and headers at 36-48pt) and 8x8 (fewer than 8 lines or bullet points per slide, fewer than 8 words per line, under 30 words total per slide) seem to offer some guidance for system prompts.
  3. Could Stable Diffusion be hooked up to generate complementary images? This is something that can easily be found on other sites (the StabilityMatrix suite seems to be a popular option).
  4. What do you think of other projects that have similar functions? https://github.com/otahina/PowerPoint-Generator-Python-Project https://github.com/parthgupta1208/PresentSmart
BradKML commented 3 weeks ago

@Zoraaa-z what is that for, brother?

BradKML commented 2 weeks ago

@Zoraaa-z nah don't lie bruh, if you wanted people to click a script kiddie MEGA link and activate gcc to hack computers, that is just low. Also 2FA or at least get a password manager. Now LEAVE

CyberTimon commented 2 weeks ago

Hi @BradKML. First of all, the account @Zoraaa-z has been banned.

Next, I'll answer your questions:

  1. Yes, it wouldn't be hard to add support for other OpenAI-compatible APIs such as Ollama's. Since I'm no longer developing this project except for critical bug fixes, I currently have no plans to add this myself, but roughly it would look something like the first sketch after this list.

  2. Adding support for accurate line or word counts is tricky, as small language models tend to be unreliable when asked to count lines and words (see the second sketch after this list for a possible workaround).

  3. I've already experimented with SD/Bing integration and it worked surprisingly well, so yes, it's definitely possible to add such functionality without much effort (the third sketch after this list shows one possible way to wire it up).

  4. There are a lot of great open-source projects like this one, each with its own pros and cons. The ones you linked are more web-oriented projects with frontends built on the official (paid) OpenAI GPT-3.5 API, whereas this project takes a simpler, bare-bones approach and connects directly to local, offline language models.
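
On question 1, here is a minimal, untested sketch of what pointing the generator at Ollama's OpenAI-compatible endpoint could look like. It assumes Ollama is running locally on its default port (11434) and that the model name used here (`phi3.5`) has already been pulled; the prompts are placeholders, not the project's actual prompts.

```python
# Untested sketch: using the OpenAI Python client (v1.x) against a local Ollama server.
# Assumes Ollama is running and exposing its OpenAI-compatible API at /v1.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:11434/v1",  # Ollama's OpenAI-compatible endpoint
    api_key="ollama",                      # ignored by Ollama, but the client requires a value
)

response = client.chat.completions.create(
    model="phi3.5",  # example model name; must already be pulled (e.g. `ollama pull phi3.5`)
    messages=[
        {"role": "system", "content": "You write concise PowerPoint slide outlines."},
        {"role": "user", "content": "Outline a 5-slide presentation about solar energy."},
    ],
)
print(response.choices[0].message.content)
```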
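
On question 2, one workaround (not something this project does) is to enforce a rule like 8x8 deterministically in post-processing instead of trusting the model to count. A rough sketch, using a hypothetical slide format of plain bullet strings:

```python
# Rough sketch: enforcing an "8x8"-style limit after generation.
# The slide format (a plain list of bullet strings) is hypothetical,
# not the structure this project actually uses.
def enforce_8x8(bullets, max_bullets=8, max_words=8):
    """Keep at most max_bullets bullets, each trimmed to max_words words."""
    trimmed = []
    for bullet in bullets[:max_bullets]:
        words = bullet.split()
        trimmed.append(" ".join(words[:max_words]))
    return trimmed

example_slide = [
    "Solar panels convert sunlight directly into electricity using photovoltaic cells",
    "Installation costs have dropped sharply over the past decade worldwide",
]
print(enforce_8x8(example_slide))
```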
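
On question 3, here is a sketch of how an image hookup could look against a local AUTOMATIC1111 Stable Diffusion WebUI started with the --api flag. This is only an illustration, not the integration I actually experimented with, and the prompt, image size and step count are arbitrary examples.

```python
# Sketch: requesting a slide illustration from a local AUTOMATIC1111 WebUI
# (must be started with --api). The response contains base64-encoded images.
import base64
import requests

payload = {
    "prompt": "flat vector illustration of solar panels on a rooftop, minimal style",
    "steps": 25,
    "width": 768,
    "height": 512,
}
resp = requests.post("http://127.0.0.1:7860/sdapi/v1/txt2img", json=payload, timeout=300)
resp.raise_for_status()
image_b64 = resp.json()["images"][0]

with open("slide_image.png", "wb") as f:
    f.write(base64.b64decode(image_b64))
```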

Let me know if you have any further questions.

Best regards, Timon