rizerphe / local-llm-function-calling

A tool for generating function arguments and choosing what function to call with local LLMs
https://local-llm-function-calling.readthedocs.io/
MIT License

Function calling Issue #9

Closed — dabhadepratik1 closed this issue 4 months ago

dabhadepratik1 commented 8 months ago

The function calling feature of the "local_llm_function_calling" tool is not working as expected: it either produces garbled results or throws an error. Model used: codellama-13b-instruct.Q4_K_M.gguf

Code:

function_call = generator.generate("What is the weather like today in Brooklyn?")
print(function_call)

Output:

{'name': 'get_current_weathe', 'parameters': '{\n    "location": "Microsoft.Azure.Comm"\n}'}

Note the truncated function name ('get_current_weathe') and the unrelated argument value.
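For context, the generator in the snippet above is constrained by a list of JSON-schema function definitions passed in at construction time. A minimal sketch of such a `functions` list (the weather schema here is an assumed illustration in the OpenAI function-calling style, not taken from this thread):

```python
# Sketch of the JSON-schema function definitions the generator is
# constrained by; the exact weather schema is an assumed illustration.
functions = [
    {
        "name": "get_current_weather",
        "description": "Get the current weather in a given location",
        "parameters": {
            "type": "object",
            "properties": {
                "location": {
                    "type": "string",
                    "description": "The city and state, e.g. Brooklyn, NY",
                },
            },
            "required": ["location"],
        },
    }
]
```

Given a schema like this, a correct generation should emit the exact function name and a `location` string argument, which is what makes the truncated name above a bug rather than a prompting problem.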

rizerphe commented 4 months ago

Thank you, I finally got around to fixing this. The problem was duplicate tokens; with the fix in place, this code:

# Initialize the generator with the Hugging Face model and our functions
generator = Generator(
    functions,
    LlamaModel(".../codellama-7b-instruct.Q2_K.gguf"),
)

# Generate text using a prompt
function_call = generator.generate(
    "What is the weather like today in Brooklyn?", suffix="\n"
)

Returns this:

{'name': 'get_current_weather', 'parameters': '{
    "location": "Brooklyn, NY"
}'}

Closing this issue now.