dabhadepratik1 closed this issue 4 months ago
Thank you, I finally got around to fixing the issue. The problem was duplicate tokens. With the fix in place, this code:
# Initialize the generator with the local GGUF model and our functions
generator = Generator(
    functions,
    LlamaModel(".../codellama-7b-instruct.Q2_K.gguf"),
)

# Generate text using a prompt
function_call = generator.generate(
    "What is the weather like today in Brooklyn?", suffix="\n"
)
Returns this:
{'name': 'get_current_weather', 'parameters': '{
"location": "Brooklyn, NY"
}'}
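For anyone landing here later, here is roughly the full setup written out as a minimal, self-contained sketch. The imports, the get_current_weather schema, and the model path are my assumptions based on the project README, so adjust them to whatever your installed version actually exposes; only Generator, LlamaModel, and the generate call above come straight from the code in this thread.

from local_llm_function_calling import Generator
# Assumed import path for the llama.cpp backend; it may differ in your version
from local_llm_function_calling.model.llama import LlamaModel

# Function schema modelled on the README's get_current_weather example (assumed)
functions = [
    {
        "name": "get_current_weather",
        "description": "Get the current weather in a given location",
        "parameters": {
            "type": "object",
            "properties": {
                "location": {
                    "type": "string",
                    "description": "The city and state, e.g. San Francisco, CA",
                    "maxLength": 20,
                },
            },
            "required": ["location"],
        },
    }
]

# Point this at your local GGUF file
generator = Generator(
    functions,
    LlamaModel("codellama-7b-instruct.Q2_K.gguf"),
)

# Generate the constrained function call for a natural-language prompt
function_call = generator.generate(
    "What is the weather like today in Brooklyn?", suffix="\n"
)
print(function_call)

Treat this as a starting point rather than a drop-in snippet.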
Closing this issue now.
The function calling feature in local-llm-function-calling is currently not working as expected: when I try to use it, it either does not produce the intended result or throws an error.

LLM used: codellama-13b-instruct.Q4_K_M.gguf

Code:
function_call = generator.generate("What is the weather like today in Brooklyn?")
print(function_call)

Output:
{'name': 'get_current_weathe', 'parameters': '{\n "location": "Microsoft.Azure.Comm"\n}'}

![function_calling_error](https://github.com/rizerphe/local-llm-function-calling/assets/114305341/09903d9b-0545-44d2-992c-32ad32601aab)