public class CustomExample_LocalLLM
{
    public static async Task RunAsync()
    {
        // Create the LLM config from a JSON config file
        var llmConfig = LLMConfigAPI.ConfigListFromJson(@"PathToModelFile");
        var config = new ConversableAgentConfig
        {
            Temperature = 0,
            ConfigList = llmConfig,
        };

        var assistantAgent = new AssistantAgent(
            name: "assistant",
            systemMessage: "You are an assistant that helps the user to do some tasks.",
            llmConfig: config)
            .RegisterPrintFormatMessageHook();

        // set human input mode to ALWAYS so that the user always provides input
        var userProxyAgent = new UserProxyAgent(
            name: "user",
            humanInputMode: ConversableAgent.HumanInputMode.ALWAYS)
            .RegisterPrintFormatMessageHook();

        // start the conversation
        await userProxyAgent.InitiateChatAsync(
            receiver: assistantAgent,
            message: "Hey assistant, please help me to do some tasks.",
            maxRound: 10);
    }
}
Add the following to Program.cs under AutoGen.BasicSample:
using AutoGen.BasicSample;
await CustomExample_LocalLLM.RunAsync();
Hey @tomasoft, I'm going to mark this issue as won't-fix because in AutoGen.Net it's recommended to use a specific agent like OpenAIChatAgent or MistralClientAgent rather than the general AssistantAgent.
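As a rough illustration of that recommendation, a local OpenAI-compatible server (e.g. Ollama or LM Studio) can be targeted through OpenAIChatAgent instead of a config file. This is only a sketch: the endpoint URL, model name, API key, and the exact constructor overloads and extension methods below depend on your local server and on the AutoGen.Net / OpenAI SDK versions you have installed.

```csharp
using System.ClientModel;
using AutoGen.Core;
using AutoGen.OpenAI;
using AutoGen.OpenAI.Extension;
using OpenAI;

// Point the OpenAI client at a local OpenAI-compatible endpoint.
// "http://localhost:11434/v1" and "llama3" are placeholders for your setup;
// most local servers ignore the API key, but the client still requires one.
var client = new OpenAIClient(
    new ApiKeyCredential("not-needed-for-local"),
    new OpenAIClientOptions { Endpoint = new Uri("http://localhost:11434/v1") });

var assistantAgent = new OpenAIChatAgent(
        chatClient: client.GetChatClient("llama3"),
        name: "assistant",
        systemMessage: "You are an assistant that helps the user to do some tasks.")
    .RegisterMessageConnector()   // convert OpenAI messages to AutoGen messages
    .RegisterPrintMessage();      // print each message to the console
```

The agent can then be used with a UserProxyAgent and InitiateChatAsync the same way as in the snippet above.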
Describe the issue
After cloning and running the basic samples, I wanted to use a local LLM. I tried to use ConfigListFromJson as documented here: https://microsoft.github.io/autogen-for-net/api/AutoGen.LLMConfigAPI.html#AutoGen_LLMConfigAPI_ConfigListFromJson_System_String_System_Collections_Generic_IEnumerable_System_String__ Unfortunately, this has not yet been implemented.
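For context, the corresponding Python helper (config_list_from_json) reads an OAI_CONFIG_LIST-style JSON file, and one would expect the file passed as @"PathToModelFile" to have a similar shape if the .NET method were implemented. The values below are placeholders, and the exact schema the .NET API would accept is an assumption:

```json
[
  {
    "model": "llama3",
    "api_key": "not-needed-for-local",
    "base_url": "http://localhost:11434/v1"
  }
]
```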
Steps to reproduce
Add the following class under AutoGen.BasicSample
Add the following to Program.cs under AutoGen.BasicSample
Screenshots and logs
Additional Information
AutoGen Version: dotnet version