Hi there,
AgentChat for .NET looks quite promising for multi-agent chats, especially role-playing scenarios. Incorporating LLamaSharp for local large language model (LLM) inference could add flexibility and efficiency.
Would it be possible to support LLamaSharp for running local LLM inference with AgentChat?