Closed: Stevenic closed this pull request 1 week ago
@Stevenic could you mark the class/methods in LLMClient as deprecated? Notes indicating to use StreamingLLMClient instead would be great.
It's not really deprecated though. You should just generally use the derived class. It will trigger warnings if I mark it as deprecated since it's a base class
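A minimal sketch of the problem being described here. If the base class carries a `@deprecated` JSDoc tag, editors and linters flag the `extends LLMClient` clause in the derived class itself, so the recommended `StreamingLLMClient` would produce warnings it cannot avoid. (The `completePrompt`/`streamPrompt` method names below are illustrative assumptions, not the actual API.)

```typescript
/** @deprecated Use StreamingLLMClient instead. */
class LLMClient {
  completePrompt(prompt: string): string {
    return `completed: ${prompt}`;
  }
}

// Because StreamingLLMClient extends the deprecated base class, tooling
// flags the `extends LLMClient` clause here, even though this is the
// class users are supposed to reach for. Runtime behavior is unchanged;
// @deprecated is purely an editor/lint-level signal.
class StreamingLLMClient extends LLMClient {
  streamPrompt(prompt: string): string[] {
    return this.completePrompt(prompt).split(" ");
  }
}

const client = new StreamingLLMClient();
console.log(client.streamPrompt("hello world"));
```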
Okay, so maybe marking it as deprecated isn't the right answer, but I find it confusing that the base class is only used by one derived class. This is done specifically for backwards compatibility, correct? In that case there should be remarks to that effect, and on which scenarios call for using the base class directly. Feel free to let me know if this is just a misconception from my functions-oriented brain, but personally I think the extra comments would be beneficial.
I can merge the two classes
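A rough sketch of what that merge could look like: one `LLMClient` where non-streaming completion stays the default and callers opt into streaming by passing a chunk handler. Every member name here is hypothetical; this is only one possible shape for the merged class, not the library's actual API.

```typescript
// Hypothetical merged client: a single class serving both call styles.
type ChunkHandler = (chunk: string) => void;

class LLMClient {
  completePrompt(prompt: string, onChunk?: ChunkHandler): string {
    const result = `completed: ${prompt}`;
    if (onChunk) {
      // Streaming path: emit the response as word-sized chunks
      // before returning the full result.
      for (const chunk of result.split(" ")) {
        onChunk(chunk);
      }
    }
    return result;
  }
}

const client = new LLMClient();
const received: string[] = [];
client.completePrompt("hello world", (chunk) => received.push(chunk));
console.log(received); // chunks observed by the streaming callback
```

Callers that never pass a handler keep the old synchronous behavior, which is one way to preserve backwards compatibility without a separate base class.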
Linked issues
closes: #1593
Details

Updated the JS `OpenAIModel` class to support streaming and updated the `ActionPlanner` to support streaming chunks to the client.

Change details:

- New `StreamingResponse` class that implements the new activity based chunking protocol.
- Updated the `OpenAIModel` class to implement the new prompt completion model events.
- Updated the `OpenAIModel` class to use the official node library and added streaming support.
- New `StreamingLLMClient` which manages the task of streaming chunks from the model to the client using the new `StreamingResponse` class.
- Updated the `ActionPlanner` class to use the new `StreamingLLMClient` instead of the `LLMClient` class directly.
- Updated the `ActionPlanner` to return an empty plan anytime streaming is used.
- New `TestModel` class to simplify testing of things like streaming support.
- Tests covering the `StreamingResponse`, `TestModel`, `LLMClient`, and `StreamingLLMClient` classes.

Attestation Checklist
[x] My code follows the style guidelines of this project
I have checked for/fixed spelling, linting, and other errors
I have commented my code for clarity
I have made corresponding changes to the documentation (updating the doc strings in the code is sufficient)
My changes generate no new warnings
I have added tests that validate my changes and provide sufficient test coverage. I have tested with:
New and existing unit tests pass locally with my changes
Additional information
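As an illustration of the activity based chunking idea the change list describes: each partial result is sent as an activity carrying the accumulated text plus a stream sequence number, and a final message activity ends the stream. The activity shape and the `queueTextChunk`/`endStream` method names below are assumptions for the sketch, not the actual protocol definition.

```typescript
// Illustrative sketch of activity-based chunking, assuming a simplified
// activity shape with a type, the text so far, and a sequence number.
interface ChunkActivity {
  type: "typing" | "message";
  text: string;
  streamSequence: number;
}

class StreamingResponse {
  private sequence = 0;
  private buffer = "";
  readonly sent: ChunkActivity[] = [];

  // Informative update: partial text accumulated so far,
  // sent as a typing activity.
  queueTextChunk(text: string): void {
    this.buffer += text;
    this.sent.push({
      type: "typing",
      text: this.buffer,
      streamSequence: ++this.sequence,
    });
  }

  // Final chunk: the complete message activity ends the stream.
  endStream(): void {
    this.sent.push({
      type: "message",
      text: this.buffer,
      streamSequence: ++this.sequence,
    });
  }
}

const response = new StreamingResponse();
response.queueTextChunk("Hello");
response.queueTextChunk(", world");
response.endStream();
console.log(response.sent.length); // two partial activities plus one final
```

Sequencing every chunk lets the client render updates in order and replace earlier partial text, which is what allows the model's output to appear incrementally instead of all at once.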