Closed shmuelk closed 6 days ago
Thanks for the report. The issue has been fixed by https://github.com/i-am-bee/bee-agent-framework/commit/d7584ef11a38e3541c53265d2e51e4adacd83096.
The fix will be released in 0.0.28 (expect a release later today or tomorrow).
Thanks, I just pulled from main and the problem is fixed.
Great. Released in v0.0.28.
Describe the bug
When the BAMChatLLM class serializes itself, the createSnapshot function simply does a shallow copy of the config. This is problematic if the BAMChatLLM instance was created via BAMChatLLM.fromPreset, because the messagesToPrompt function in the loaded config references a local variable of the preset factory function in BAMChatLLMPreset.
I assume the same issue exists with the WatsonXChatLLM class.
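To illustrate the failure mode described above, here is a minimal standalone sketch (all names except BAMChatLLM-style identifiers are hypothetical, and the JSON round-trip only approximates what serialize/fromSerialized do): a preset factory returns a config whose messagesToPrompt closes over a local variable, so a shallow copy of the config still carries the closure, but serializing and restoring it loses the function.

```typescript
// Hypothetical stand-in for a preset-built config: messagesToPrompt is a
// closure over the local `template` variable inside the factory.
type ChatConfig = {
  modelId: string;
  messagesToPrompt: (messages: string[]) => string;
};

function fromPreset(modelId: string): ChatConfig {
  // Local variable captured by the closure below -- this is what a
  // shallow snapshot cannot meaningfully serialize.
  const template = `<model:${modelId}>`;
  return {
    modelId,
    messagesToPrompt: (messages) => `${template}\n${messages.join("\n")}`,
  };
}

const llm = fromPreset("meta-llama/llama-3-8b-instruct");

// A shallow copy still shares the live closure...
const snapshot = { ...llm };
console.log(typeof snapshot.messagesToPrompt); // "function"

// ...but a serialize/deserialize round trip drops it, so the restored
// instance has no messagesToPrompt at all.
const restored = JSON.parse(JSON.stringify(snapshot)) as Partial<ChatConfig>;
console.log(typeof restored.messagesToPrompt); // "undefined"
```

A fix therefore has to reconstruct the function on load (e.g. by re-running the preset factory) rather than copying the config field by field.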
To Reproduce
Steps to reproduce the behavior:
```ts
const llm = BAMChatLLM.fromPreset("meta-llama/llama-3-8b-instruct");

// We create an agent
let agent = new BeeAgent({
  llm: llm,
  tools: [new DuckDuckGoSearchTool()],
  memory: new UnconstrainedMemory(),
});

// We ask the agent
let prompt = "Who is the president of USA?";
console.info(prompt);
const response = await agent.run({ prompt });
console.info(response.result.text);

// We can save (serialize) the agent
const json = agent.serialize();

// We reinitialize the agent to the exact state it was in
agent = BeeAgent.fromSerialized(json);

// We continue our conversation
prompt = "When was he born?";
console.info(prompt);
const response2 = await agent.run({ prompt });
console.info(response2.result.text);
```