AmgadHasan closed this issue 1 week ago
I plan to work on this but I believe my time is better spent (and my skills are better suited to) continuing the development of the platform.
If anyone is a YouTuber or knows a YouTuber who'd be interested, please help! The community is growing and publishing the first tutorials seems like a good opportunity.
I would like to write and contribute a step-by-step tutorial, from a beginner programmer's point of view, on how to use StreamSync with a local LLM backend (e.g. LM Studio) via the OpenAI API, as I couldn't find any.
However, I am stuck on getting StreamSync's chatbot component to display the stream returned by the API chunk by chunk (for a typewriter effect).
The alternative of waiting for the full response to be generated before passing it to the chatbot works, but it means an undesirably long wait after every prompt.
Any idea how to solve the streaming part?
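To be concrete, the pattern I'm after looks something like this. It's only a sketch: `fake_llm_stream`, `handle_prompt`, and the `reply` state key are made-up names, and the real call would use the `openai` client with `stream=True`, reading `chunk.choices[0].delta.content`. The `yield` idea assumes StreamSync event handlers can yield to push intermediate UI updates, which is worth confirming against the docs:

```python
def fake_llm_stream():
    """Stand-in for an OpenAI-compatible streaming response.
    With the real client you'd pass stream=True and read
    chunk.choices[0].delta.content from each chunk."""
    for piece in ["Hello", ", ", "world", "!"]:
        yield piece

def handle_prompt(state):
    """Event-handler sketch: append each chunk to state, then yield
    so the framework could repaint the chatbot between chunks.
    (Assumption: StreamSync handlers support yielding intermediate
    updates -- check the docs for your version.)"""
    state["reply"] = ""
    for chunk in fake_llm_stream():
        state["reply"] += chunk
        yield  # hand control back so the UI can show partial text

# Driving the handler outside the framework, with a plain dict as state,
# shows the partial text growing chunk by chunk:
state = {"reply": ""}
for _ in handle_prompt(state):
    print(state["reply"])
```

Is this roughly how it's supposed to be wired up, or is there a supported way to do it?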
There are now videos on Writer Framework (the new name for StreamSync since the acquisition), for example:
https://www.youtube.com/watch?v=sVLtMNJGxsE
To create chatbots, there's a dedicated Chatbot component with support for streaming.
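For the OpenAI side of such a tutorial, the streamed chunks can be unpacked like this. A sketch assuming the openai v1 client's response shape (`chunk.choices[0].delta.content`); LM Studio exposes the same API at a local `base_url`, and the fake chunks below just stand in for a real `client.chat.completions.create(..., stream=True)` call:

```python
from types import SimpleNamespace

def iter_deltas(chunks):
    """Yield the text pieces from an OpenAI-style chat stream.
    content can be None on role/finish chunks, so those are skipped."""
    for chunk in chunks:
        delta = chunk.choices[0].delta
        if delta.content:
            yield delta.content

def fake_chunk(text):
    """Build an object shaped like one streamed chat-completion chunk."""
    return SimpleNamespace(
        choices=[SimpleNamespace(delta=SimpleNamespace(content=text))]
    )

stream = [fake_chunk(None), fake_chunk("Hi"), fake_chunk(" there")]
print("".join(iter_deltas(stream)))  # -> Hi there
```

Each piece yielded by `iter_deltas` can then be appended to the chatbot's conversation state as it arrives.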
It'd be really helpful if you could record a YouTube tutorial on how to create and locally run a simple web app, and add it to the docs under a "tutorials" section. You might demo a web app that lets users chat with ChatGPT via the OpenAI API. This would make the package much more approachable.
It would also be good to let contributors create and share tutorials in the docs.