A Python binding for Sourcegraph's Cody Agent that communicates via JSON-RPC over either TCP or stdio, providing seamless, efficient connectivity to the Agent.
I'd like to discuss the future possibilities for handling "Chat" and explain why I've isolated this into a separate object in PR https://github.com/PriNova/codypy/pull/7.
Currently, `new_chat()` and `chat()` are methods within the `CodyAgent` class. This structure doesn't allow for multiple chat sessions to occur simultaneously. However, testing has shown that the Agent can manage multiple chat sessions concurrently, each identified by its own `chat_id`. To avoid confusion with internal protocol parameters, I recommend renaming `chat_id` to `chat_session_id` later on.
Let's consider a scenario where a backend allows 2-3 chat sessions to run concurrently (similar to running Cody with an IDE: one process, multiple chat windows/sessions). In this case, it's necessary to distinguish between different sessions. Additionally, some actions are session-specific. For instance, you can theoretically change the LLM model for a specific session or configure the repo context.
Therefore, I propose that chat sessions should be individual objects, each carrying its own `chat_session_id`. These objects should implement all the methods used in individual chat contexts so far: `ask()` (previously `chat()`), `set_context_repo()`, and `set_model()`. I've renamed `chat()` to `ask()` to prevent confusion with the term chat.
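A minimal sketch of what such a session object could look like. The class name, constructor arguments, and the stubbed `ask()` body are assumptions for illustration only, not the actual codypy API; the real implementation would forward each call to the Agent over JSON-RPC, tagged with the session's id:

```python
import uuid
from dataclasses import dataclass, field
from typing import Optional


@dataclass
class ChatSession:
    """One chat session, identified by its own chat_session_id (sketch only)."""

    agent: object  # the shared CodyAgent/transport; typed loosely in this sketch
    chat_session_id: str = field(default_factory=lambda: str(uuid.uuid4()))
    model: Optional[str] = None
    context_repos: list[str] = field(default_factory=list)

    def set_model(self, model: str) -> None:
        # Session-scoped: changing the model here must not affect other sessions.
        self.model = model

    def set_context_repo(self, repos: list[str]) -> None:
        # Session-scoped repo context for this session only.
        self.context_repos = list(repos)

    def ask(self, message: str) -> str:
        # Placeholder: a real implementation would send a JSON-RPC request
        # carrying this session's chat_session_id and return the Agent's reply.
        return f"[{self.chat_session_id[:8]}] reply to: {message}"
```

With this shape, several sessions can coexist on one Agent process, each with its own model and repo context, mirroring the one-process, multiple-chat-windows scenario above.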
I suggest that response objects should be returned to the caller in their entirety, without any data being removed. In the PR, this is achieved by placing the response in a pydantic model. It might be more robust to always return a `Response` object with `Response.model` referencing the pydantic model of the data, and `Response.raw` referencing the plain deserialized JSON data.
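One way to sketch such a wrapper. A plain dataclass stands in for pydantic here so the example has no external dependency; the `ChatReply` field names and the `parse_chat_response` helper are hypothetical, chosen only to show the two-view shape:

```python
from dataclasses import dataclass
from typing import Any, Generic, TypeVar

M = TypeVar("M")


@dataclass
class Response(Generic[M]):
    """Wraps every Agent reply: nothing is stripped from the payload."""

    model: M              # typed, validated view (a pydantic model in the PR)
    raw: dict[str, Any]   # plain deserialized JSON, exactly as received


@dataclass
class ChatReply:
    # Hypothetical typed view of a chat reply; field names are illustrative.
    text: str
    chat_session_id: str


def parse_chat_response(payload: dict[str, Any]) -> Response[ChatReply]:
    # Callers get both views: .model for convenience, .raw for full fidelity,
    # including any fields the typed view does not yet know about.
    return Response(
        model=ChatReply(
            text=payload.get("text", ""),
            chat_session_id=payload.get("chat_session_id", ""),
        ),
        raw=payload,
    )
```

Because `raw` keeps the untouched JSON, a schema change on the Agent side never silently drops data; higher-level code can fall back to `raw` for anything the typed model omits.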
This approach allows higher-level libraries to run multiple chat sessions simultaneously and provides them with complete flexibility in handling certain input or output data.