Maximilian-Winter / llama-cpp-agent

The llama-cpp-agent framework is a tool designed for easy interaction with Large Language Models (LLMs). It allows users to chat with LLM models, execute structured function calls, and get structured output. It also works with models not fine-tuned for JSON output and function calls.

Some questions about the documentation page of the project. #49

Closed svjack closed 5 months ago

svjack commented 5 months ago

I think we should add the documentation link "https://llama-cpp-agent.readthedocs.io/en/latest/" at the top of the README.md in the repo.

Also, some of the examples lack technical guidance, such as the difference between simple_function_calling.py and parallel_function_calling.py.
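To illustrate the distinction being asked about, here is a minimal sketch of the conceptual difference between the two examples. This is not the llama-cpp-agent API; the tool names, the JSON call format, and the `run_simple`/`run_parallel` helpers are all hypothetical, purely to show one-call-per-turn versus a batch of calls in one turn.

```python
import json

# Hypothetical tools for illustration only.
def get_weather(city):
    return f"sunny in {city}"

def get_time(city):
    return f"12:00 in {city}"

TOOLS = {"get_weather": get_weather, "get_time": get_time}

def run_simple(model_output):
    """simple_function_calling.py style (assumed): the model emits a
    single call object, and exactly one tool runs per model turn."""
    call = json.loads(model_output)  # {"function": ..., "arguments": ...}
    return TOOLS[call["function"]](**call["arguments"])

def run_parallel(model_output):
    """parallel_function_calling.py style (assumed): the model emits a
    list of calls that are all executed before the next model turn."""
    calls = json.loads(model_output)  # [{"function": ...}, ...]
    return [TOOLS[c["function"]](**c["arguments"]) for c in calls]

print(run_simple('{"function": "get_weather", "arguments": {"city": "Berlin"}}'))
# -> sunny in Berlin
print(run_parallel('[{"function": "get_weather", "arguments": {"city": "Berlin"}},'
                   ' {"function": "get_time", "arguments": {"city": "Berlin"}}]'))
# -> ['sunny in Berlin', '12:00 in Berlin']
```

In the parallel case the model only needs one generation to request several tools, which saves round trips when the calls are independent of each other.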

There should also be some discussion of the relatively independent parts of the project, such as hermes_2_pro_agent.py.

hermes_2_pro_agent.py seems to lack the parallel calling ability of parallel_function_calling.py, because it is solely a single invocation of a tool-calling LLM and lacks chaining ability. However, it can run fast thanks to the model's roughly 91% accuracy at parsing the function tool call format.
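To make the "chaining" point concrete, here is a hypothetical sketch (again, not the llama-cpp-agent API): a chaining agent feeds each tool result back into the conversation so the model can issue a follow-up call, whereas a single-shot agent stops after one tool invocation. The `fake_model` function stands in for the LLM and is invented for this example.

```python
# Stand-in for the LLM: decides the next tool call from the history.
def fake_model(history):
    if not any(msg.startswith("result:") for msg in history):
        return "get_weather"          # first turn: no results seen yet
    if len(history) < 4:
        return "get_time"             # second turn: follow-up call
    return None                       # done, no further calls

# Hypothetical zero-argument tools for illustration only.
TOOLS = {"get_weather": lambda: "sunny", "get_time": lambda: "12:00"}

def run_chained():
    """Loop: ask the model, run the tool, append the result to the
    history, and repeat until the model stops requesting tools."""
    history = ["user: weather and time?"]
    results = []
    while (name := fake_model(history)) is not None:
        out = TOOLS[name]()
        history += [f"call: {name}", f"result: {out}"]
        results.append(out)
    return results

print(run_chained())
# -> ['sunny', '12:00']
```

A single-shot agent would execute only the first iteration of this loop, which is why it is faster per request but cannot use one tool's output to decide the next call.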

Some background introduction would make this project more convenient for people to use.

Could you open a Discord server so we can improve the project together? 😊

Maximilian-Winter commented 5 months ago

@svjack I opened a Discord server: https://discord.gg/6tGznupZGX

Maximilian-Winter commented 5 months ago

@svjack I added some comments to simple_function_calling.py and parallel_function_calling.py to make the difference clearer.

Maximilian-Winter commented 5 months ago

You can use my version of the Hermes 2 Pro agent with parallel function calling and it works for the examples.