gkapellmann opened 1 month ago
Trying from home now gives a different error, with the same code:
```
System.NullReferenceException: Object reference not set to an instance of an object.
   at BotSharp.Core.Routing.RoutingService.InvokeAgent(String agentId, List`1 dialogs, Func`2 onFunctionExecuting)
   at BotSharp.Core.Routing.Handlers.RouteToAgentRoutingHandler.Handle(IRoutingService routing, FunctionCallFromLlm inst, RoleDialogModel message, Func`2 onFunctionExecuting)
   at BotSharp.Core.Routing.RoutingService.InstructDirect(Agent agent, RoleDialogModel message)
   at BotSharp.Core.Conversations.Services.ConversationService.SendMessage(String agentId, RoleDialogModel message, PostbackMessageModel replyMessage, Func`2 onMessageReceived, Func`2 onFunctionExecuting, Func`2 onFunctionExecuted)
   at BotSharp.OpenAPI.Controllers.ConversationController.SendMessage(String agentId, String conversationId, NewMessageModel input)
   at lambda_method6(Closure, Object)
   at Microsoft.AspNetCore.Mvc.Infrastructure.ActionMethodExecutor.AwaitableObjectResultExecutor.Execute(ActionContext actionContext, IActionResultTypeMapper mapper, ObjectMethodExecutor executor, Object controller, Object[] arguments)
   at Microsoft.AspNetCore.Mvc.Infrastructure.ControllerActionInvoker.
```
This is very weird.
I have added the Router parameter, but it ends up empty. Is that correct?
Ok, figured it out: I had to add `llmConfig` to the `Agent` structure in the settings. It now looks like this:

```json
"Agent": {
  "DataDir": "agents",
  "TemplateFormat": "liquid",
  //"HostAgentId": "01e2fc5c-2c89-4ec7-8470-7688608b496c",
  //"EnableTranslator": false,
  "llmConfig": {
    "Provider": "llama-sharp",
    "Model": "llama-2-7b-chat.Q8_0.gguf",
    //"is_inherit": true,
    "max_recursion_depth": 3
  }
},
```
But I don't really understand these settings.
Thank you in advance!
The `HostAgentId` is the Routing agent entry when you're using the BotSharp-UI.
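For reference, my understanding is that enabling it would look something like this (the GUID here is just the one commented out in the earlier snippet; the actual value should be your own Routing agent's ID):

```json
"Agent": {
  "DataDir": "agents",
  "TemplateFormat": "liquid",
  // Only needed when using BotSharp-UI; set it to the Routing agent's ID.
  "HostAgentId": "01e2fc5c-2c89-4ec7-8470-7688608b496c"
},
```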
OK! Thanks for clarifying that @Oceania2018
So if I am not using the UI, only API calls, there is really no need for a host agent, I guess?
Hello again,
Following the steps of the documentation, I have configured my LlamaSharp model here:
```json
"LlamaSharp": {
  "Interactive": true,
  "ModelDir": "D:/C#/Llms",
  "DefaultModel": "llama-2-7b-chat.Q4_K_M.gguf",
  "MaxContextLength": 1024,
  "NumberOfGpuLayer": 20
},
```
and in the providers section too:
```json
"LlmProviders": [
  {
    "Provider": "llama-sharp",
    "Models": [
      { "Name": "llama-2-7b-chat.Q4_K_M.gguf", "Type": "chat" }
    ]
  }
],
```
As you can tell, I am only adding LlamaSharp, with no other provider.
When running the app, my plugins folder remains empty, and when trying to send a message to start the conversation I get an exception:
```
System.ArgumentNullException: Value cannot be null. (Parameter 'path2')
   at System.ArgumentNullException.Throw(String paramName)
   at System.IO.Path.Combine(String path1, String path2)
   at BotSharp.Plugins.LLamaSharp.LlamaAiModel.LoadModel(String model)
   at BotSharp.Plugin.LLamaSharp.Providers.ChatCompletionProvider.GetChatCompletions(Agent agent, List`1 conversations)
   at BotSharp.Core.Routing.RoutingService.InvokeAgent(String agentId, List`1 dialogs, Func`2 onFunctionExecuting)
   at BotSharp.Core.Routing.Handlers.RouteToAgentRoutingHandler.Handle(IRoutingService routing, FunctionCallFromLlm inst, RoleDialogModel message, Func`2 onFunctionExecuting)
   at BotSharp.Core.Routing.RoutingService.InstructDirect(Agent agent, RoleDialogModel message)
   at BotSharp.Core.Conversations.Services.ConversationService.SendMessage(String agentId, RoleDialogModel message, PostbackMessageModel replyMessage, Func`2 onMessageReceived, Func`2 onFunctionExecuting, Func`2 onFunctionExecuted)
   at BotSharp.OpenAPI.Controllers.ConversationController.SendMessage(String agentId, String conversationId, NewMessageModel input)
   at lambda_method6(Closure, Object)
   at Microsoft.AspNetCore.Mvc.Infrastructure.ActionMethodExecutor.AwaitableObjectResultExecutor.Execute(ActionContext actionContext, IActionResultTypeMapper mapper, ObjectMethodExecutor executor, Object controller, Object[] arguments)
   at Microsoft.AspNetCore.Mvc.Infrastructure.ControllerActionInvoker.<InvokeActionMethodAsync>g__Awaited|12_0(ControllerActionInvoker invoker, ValueTask`1 actionResultValueTask)
   at Microsoft.AspNetCore.Mvc.Infrastructure.ControllerActionInvoker.g__Awaited|10_0(ControllerActionInvoker invoker, Task lastTask, State next, Scope scope, Object state, Boolean isCompleted)
   at Microsoft.AspNetCore.Mvc.Infrastructure.ControllerActionInvoker.Rethrow(ActionExecutedContextSealed context)
   at Microsoft.AspNetCore.Mvc.Infrastructure.ControllerActionInvoker.Next(State& next, Scope& scope, Object& state, Boolean& isCompleted)
   at Microsoft.AspNetCore.Mvc.Infrastructure.ControllerActionInvoker.InvokeInnerFilterAsync()
--- End of stack trace from previous location ---
   at Microsoft.AspNetCore.Mvc.Infrastructure.ResourceInvoker.g__Awaited|25_0(ResourceInvoker invoker, Task lastTask, State next, Scope scope, Object state, Boolean isCompleted)
   at Microsoft.AspNetCore.Mvc.Infrastructure.ResourceInvoker.Rethrow(ResourceExecutedContextSealed context)
   at Microsoft.AspNetCore.Mvc.Infrastructure.ResourceInvoker.Next(State& next, Scope& scope, Object& state, Boolean& isCompleted)
   at Microsoft.AspNetCore.Mvc.Infrastructure.ResourceInvoker.InvokeFilterPipelineAsync()
--- End of stack trace from previous location ---
   at Microsoft.AspNetCore.Mvc.Infrastructure.ResourceInvoker.g__Awaited|17_0(ResourceInvoker invoker, Task task, IDisposable scope)
   at Microsoft.AspNetCore.Authorization.AuthorizationMiddleware.Invoke(HttpContext context)
   at Microsoft.AspNetCore.Authentication.AuthenticationMiddleware.Invoke(HttpContext context)
   at Microsoft.AspNetCore.Diagnostics.DeveloperExceptionPageMiddlewareImpl.Invoke(HttpContext context)
```

It seems it is not finding the provider, am I right? I have added the plugin "BotSharp.Plugin.LLamaSharp", so I have no idea what I could be missing.
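One observation, in case it helps: the trace fails inside `Path.Combine(String path1, String path2)` with `path2` null, which I read as `LoadModel` being handed a null model filename. My working assumption (not verified against the BotSharp source) is that the model name the agent requests has to resolve to an entry in `LlmProviders` and to a file under `ModelDir`, so the same `.gguf` filename should line up in all three places, something like:

```json
"LlamaSharp": {
  "ModelDir": "D:/C#/Llms",
  // The file D:/C#/Llms/llama-2-7b-chat.Q4_K_M.gguf must actually exist.
  "DefaultModel": "llama-2-7b-chat.Q4_K_M.gguf"
},
"LlmProviders": [
  {
    "Provider": "llama-sharp",
    "Models": [
      // Same filename as DefaultModel above.
      { "Name": "llama-2-7b-chat.Q4_K_M.gguf", "Type": "chat" }
    ]
  }
],
"Agent": {
  "llmConfig": {
    "Provider": "llama-sharp",
    // Same filename again; my earlier Agent snippet pointed at the Q8_0 file instead.
    "Model": "llama-2-7b-chat.Q4_K_M.gguf"
  }
}
```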