Closed huliangbing closed 9 months ago
Thank you very much! I will try using matlab.net.http.io.StringConsumer.
This feature has been added to LLMs with MATLAB.
% Define the function handle. This function prints the streamed text as it arrives.
sf = @(x)fprintf("%s",x);
% Create the chat object with the function handle.
chat = openAIChat(StreamFun=sf);
% Generate response to a prompt in streaming mode.
prompt = "What is Model-Based Design?";
[text, message, response] = generate(chat,prompt);
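If you want the full response as a variable in addition to (or instead of) printing it live, the streamed chunks can be accumulated in a closure and passed as the same StreamFun callback. This is a minimal sketch built on the example above; the helper name makeCollector and its outputs are illustrative, not part of the library:

function [sf, getText] = makeCollector()
    % Returns a streaming callback sf and a getter for the accumulated text.
    buffer = "";
    sf = @append;
    getText = @() buffer;
    function append(chunk)
        buffer = buffer + chunk;   % accumulate each streamed chunk
        fprintf("%s", chunk);      % and echo it as it arrives
    end
end

Usage would then look like: [sf, getText] = makeCollector(); chat = openAIChat(StreamFun=sf); generate(chat, prompt); fullText = getText(); Nested MATLAB functions share the parent function's workspace, which is what lets append update buffer between calls.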
Thanks a lot!
This is supported in version 2.0.1.
This is a duplicate of #23 "How to streamingly receive replies from a large model using Matlab?" Please see my answer there.