Maximilian-Winter / llama-cpp-agent

The llama-cpp-agent framework is a tool designed for easy interaction with Large Language Models (LLMs). It allows users to chat with LLMs, execute structured function calls, and get structured output. It also works with models not fine-tuned for JSON output and function calling.

Cleanup examples folder #26

Closed rhohndorf closed 4 months ago

rhohndorf commented 4 months ago

Not all examples are working for me yet, and in two cases I'm pretty sure that's because of a bug or change in the agent code. I feel the examples are more useful for newcomers now. I know the naming of the example folders is not very Pythonic (PEP 8), but I went with clarity over conformity here. I can change it if you don't like it, though.

Maximilian-Winter commented 4 months ago

Thank you for the PR! Yes, I changed a lot, and I'm currently working on some additional stuff. I will revisit the examples once I'm finished with what I'm working on.

rhohndorf commented 4 months ago

Okay, so what does this mean for this PR?

Maximilian-Winter commented 4 months ago

It would be great if you could continue it.

Maximilian-Winter commented 4 months ago

I will work with your code and fix any errors you can't fix.

rhohndorf commented 4 months ago

llama_cpp_agent.get_chat_response returns a generator now? Is this intended? All the examples would have to be adapted to that. Can you tell me how?
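The adaptation being asked about can be sketched generically. A minimal, hedged example: the name `get_chat_response` comes from the thread, but the dual return behavior (plain string vs. generator of tokens) is an assumption for illustration, not the library's documented contract, so the sketch uses stand-in functions instead of the real API.

```python
# Hedged sketch: normalizing a chat-response value that may be either a
# plain string or a generator yielding token chunks. The stand-in
# functions below are hypothetical; they only mimic the two return
# styles discussed in this thread.
from typing import Generator, Union


def normalize_response(response: Union[str, Generator[str, None, None]]) -> str:
    """Collapse a streaming (generator) response into one string,
    passing plain strings through unchanged."""
    if isinstance(response, str):
        return response
    return "".join(response)


def fake_streaming_response() -> Generator[str, None, None]:
    # Stand-in for a generator-returning get_chat_response.
    yield from ["Hello", ", ", "world"]


print(normalize_response("Hello, world"))             # plain-string case
print(normalize_response(fake_streaming_response()))  # generator case
```

With a wrapper like this, examples would not care which form the agent returns, which is one way to keep them working across API changes like the one reported here.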

rhohndorf commented 4 months ago

See this output of examples/01_Basics/chatbot_using_llama_cpp_server.py

[screenshot: error output from chatbot_using_llama_cpp_server.py]

Maximilian-Winter commented 4 months ago

Sorry, this was my fault. I have just fixed it.

Maximilian-Winter commented 4 months ago

@rhohndorf I have fixed the parallel function calling error and the generator stuff.

rhohndorf commented 4 months ago

@Maximilian-Winter Yep, everything works now. Please merge.