MatGPT is great work! We can get useful replies from an LLM. I would like to know: how can I receive streaming replies from a large language model using MATLAB? Thank you!
Thanks. It is possible to do so with the matlab.net.http.io.StringConsumer class: https://www.mathworks.com/help/matlab/ref/matlab.net.http.io.stringconsumer-class.html
Typically, streaming is used to start displaying the response text as it comes in, rather than waiting for the fully formed response.
If you are planning to output the result to the command window, this may be a useful thing to do. However, I am not sure how useful it is in MatGPT, because the output is displayed in the app, and the response will not be shown unless the app is constantly updated.
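For reference, here is a minimal, untested sketch of that approach, assuming a custom subclass (the name PrintingConsumer is hypothetical, not anything from MatGPT) that overrides putData to print each chunk of text as it is decoded:

```matlab
% PrintingConsumer.m -- hypothetical streaming consumer (a sketch only).
% It extends matlab.net.http.io.StringConsumer and prints each newly
% decoded piece of the response body as it arrives.
classdef PrintingConsumer < matlab.net.http.io.StringConsumer
    properties (Access = private)
        NumPrinted = 0  % number of characters already printed
    end
    methods
        function [len, stop] = putData(obj, data)
            % Let StringConsumer decode the raw bytes and append the
            % result to obj.Response.Body.Data.
            [len, stop] = putData@matlab.net.http.io.StringConsumer(obj, data);
            % Print only the part of the accumulated text not yet printed.
            if ~isempty(obj.Response.Body.Data)
                text = char(string(obj.Response.Body.Data));
                fprintf("%s", text(obj.NumPrinted+1:end));
                obj.NumPrinted = numel(text);
            end
        end
    end
end
```

You would then pass an instance of it when sending the request, e.g. response = send(request, uri, matlab.net.http.HTTPOptions, PrintingConsumer), so the text prints while the response is still downloading.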
Streaming support has since been added to LLMs with MATLAB (https://github.com/matlab-deep-learning/llms-with-matlab). Here is how to use it:
```matlab
% Define the function handle. This function prints the returned text.
sf = @(x) fprintf("%s", x);

% Create the chat object with the function handle.
chat = openAIChat(StreamFun=sf);

% Generate a response to a prompt in streaming mode.
prompt = "What is Model-Based Design?";
[text, message, response] = generate(chat, prompt);
```
This capability is now supported by the LLMs with MATLAB framework.