toshiakit / MatGPT

MATLAB app to access ChatGPT API from OpenAI
MIT License

How to receive streaming replies from a large model using MATLAB? #23

Closed: huliangbing closed this 6 months ago

huliangbing commented 6 months ago

MatGPT is great work! We can get useful replies from the LLM. I want to know: how can I receive streaming replies from a large model using MATLAB? Thank you!

toshiakit commented 6 months ago

Thanks. It is possible to do so with the matlab.net.http.io.StringConsumer class: https://www.mathworks.com/help/matlab/ref/matlab.net.http.io.stringconsumer-class.html

Typically, streaming is used to start displaying the response text as it comes in, rather than waiting for the fully formed response.

If you are planning to output the result to the command window, this may be a useful thing to do. However, I am not sure how useful it is in MatGPT, because the output is displayed in the app, and the streamed response will not be shown unless the app is constantly updated.
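
For reference, here is a minimal sketch of that approach: a subclass of StringConsumer that overrides its putData method to print each chunk as it arrives. The class name StreamPrinter and the simple byte-to-text conversion are illustrative assumptions, not code from MatGPT:

classdef StreamPrinter < matlab.net.http.io.StringConsumer
    % Illustrative sketch: print response text as it arrives,
    % rather than waiting for the complete response body.
    methods
        function [len, stop] = putData(obj, data)
            % Let the superclass convert and buffer the raw bytes as usual
            [len, stop] = putData@matlab.net.http.io.StringConsumer(obj, data);
            if ~isempty(data)
                % Simplification: assumes a chunk boundary never splits
                % a multi-byte UTF-8 character
                fprintf("%s", native2unicode(data(:)', "UTF-8"));
            end
        end
    end
end

You would then pass an instance of the consumer when sending the request, for example resp = req.send(uri, matlab.net.http.HTTPOptions, StreamPrinter), where req is a matlab.net.http.RequestMessage built for the streaming API call.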

toshiakit commented 6 months ago

Streaming support was added to LLMs with MATLAB. https://github.com/matlab-deep-learning/llms-with-matlab

% Define the function handle. This function prints the returned text
% to the command window as each chunk arrives.
sf = @(x) fprintf("%s", x);
% Create the chat object with the streaming function handle.
chat = openAIChat(StreamFun=sf);
% Generate a response to a prompt in streaming mode.
prompt = "What is Model-Based Design?";
[text, message, response] = generate(chat, prompt);
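
Note that the StreamFun handle is called with each new chunk of generated text as it arrives, so the reply prints incrementally; generate still returns the full text, message, and response once streaming completes.
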
toshiakit commented 6 months ago

This capability is now supported by the LLMs with MATLAB framework.