deeper-coder opened 3 weeks ago
@deeper-coder we need function calling; if you can get a function-calling model to work reliably, then it will work. But you need a class with a run(task: str) or __call__(task: str) method to integrate into the ToTAgent class.
I plan to use Llama 3 70B, and I noticed that in the OpenAIFunctionCaller class you've implemented the run method as shown in the image. Can I achieve the functionality I want by passing base_url = "http://localhost:7788/v1/" in **kwargs?
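To illustrate the idea behind the question: if OpenAIFunctionCaller forwards extra keyword arguments to the underlying OpenAI client constructor, then a base_url kwarg would redirect requests to the local server. The sketch below is hypothetical — it uses a stand-in client class rather than the real library, and the kwargs-forwarding behaviour is an assumption about OpenAIFunctionCaller, not a quote of its source.

```python
class FakeOpenAIClient:
    """Stand-in for the real OpenAI client, to show how **kwargs would flow."""
    def __init__(self, api_key=None, base_url="https://api.openai.com/v1/"):
        self.api_key = api_key
        self.base_url = base_url


class OpenAIFunctionCallerSketch:
    """Hypothetical sketch: assumes extra kwargs reach the client constructor."""
    def __init__(self, api_key="sk-local", **kwargs):
        # Any extra kwargs (e.g. base_url) are passed straight through.
        self.client = FakeOpenAIClient(api_key=api_key, **kwargs)


caller = OpenAIFunctionCallerSketch(base_url="http://localhost:7788/v1/")
print(caller.client.base_url)  # -> http://localhost:7788/v1/
```

If the real class forwards kwargs this way, the one-line base_url override should be all that is needed; if it does not, a thin wrapper around the client would be required instead.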
Your current code uses OpenAI's API key to access the LLM service by default. I'd like to switch to a local LLM that I have deployed with LLaMA-Factory and that is accessible via a local API, for example at http://localhost:7788/v1/. Could you guide me on how to make this adjustment? Thank you!
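One way to satisfy the maintainer's requirement without touching OpenAIFunctionCaller at all is a small wrapper class exposing run(task: str), which ToTAgent can then consume directly. A minimal sketch, assuming the local LLaMA-Factory server speaks the OpenAI-compatible /chat/completions protocol; the model name and endpoint are placeholders for your deployment:

```python
import json
import urllib.request


class LocalLLM:
    """Wrapper exposing run(task: str), the interface ToTAgent expects."""

    def __init__(self, base_url="http://localhost:7788/v1/", model="llama3-70b"):
        # Assumed OpenAI-compatible route; adjust if your server differs.
        self.url = base_url.rstrip("/") + "/chat/completions"
        self.model = model

    def _payload(self, task: str) -> bytes:
        # Build an OpenAI-style chat request body.
        return json.dumps({
            "model": self.model,
            "messages": [{"role": "user", "content": task}],
        }).encode("utf-8")

    def run(self, task: str) -> str:
        req = urllib.request.Request(
            self.url,
            data=self._payload(task),
            headers={"Content-Type": "application/json"},
        )
        with urllib.request.urlopen(req) as resp:
            body = json.load(resp)
        # Return the assistant message text from the first choice.
        return body["choices"][0]["message"]["content"]

    # Also satisfy the alternative __call__(task: str) interface.
    __call__ = run
```

An instance of this class can then be handed to ToTAgent wherever it expects an agent with run or __call__, keeping the local-API detail entirely inside the wrapper.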