NousResearch / Hermes-Function-Calling

MIT License
464 stars 73 forks

Prompt format to follow + Multi tools for Hermes-2-Theta-Llama-3-8B with Ollama #26

Open aiseei opened 1 month ago

aiseei commented 1 month ago

Hi - great work and thanks for this source!

I am not clear on which prompt format to use with this model and Ollama:

1) the one on the model card at https://huggingface.co/NousResearch/Hermes-2-Theta-Llama-3-8B
2) the one in the examples: https://github.com/NousResearch/Hermes-Function-Calling/blob/main/examples/ollama-multiple-fn.ipynb
3) the one in https://github.com/NousResearch/Hermes-Function-Calling/blob/main/prompt_assets/sys_prompt.yml
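For reference, a minimal sketch of option 1, the format described on the Hermes model card: function signatures are advertised inside `<tools>` XML tags in the system prompt, and the model is instructed to emit calls inside `<tool_call>` tags. The `get_stock_price` signature below is illustrative, not from the repo.

```python
import json

# Illustrative OpenAI-style function signature (an assumption, not from the repo)
tool = {
    "type": "function",
    "function": {
        "name": "get_stock_price",
        "description": "Get the current stock price for a symbol",
        "parameters": {
            "type": "object",
            "properties": {"symbol": {"type": "string"}},
            "required": ["symbol"],
        },
    },
}

# System prompt paraphrasing the model card: tools go inside <tools></tools>,
# and the model is told to answer with <tool_call></tool_call> blocks.
system_prompt = (
    "You are a function calling AI model. You are provided with function "
    "signatures within <tools></tools> XML tags. For each function call, "
    "return a JSON object with the function name and arguments within "
    "<tool_call></tool_call> XML tags.\n"
    f"<tools>\n{json.dumps(tool)}\n</tools>"
)
```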

Any inputs will be greatly appreciated!

Thanks

kiiwee commented 1 month ago

I am currently trying to make tooling with Ollama work, but as mentioned, the model was trained with a tool role, which Ollama still doesn't support.

aiseei commented 1 month ago

@kiiwee - we are trying to make it work directly with llama-cpp server . https://github.com/ggerganov/llama.cpp/tree/master/examples/server

Have you had any experience with that?

kiiwee commented 1 month ago

@aiseei I managed to make it work with the llama-cpp-python lib by inserting the tool template into the conversation and feeding the tool's output back to the model: when the tool responds, wrap the result in the XML template and return it with a tool role. As for the llama.cpp server, I'm going to try function calling (not the OpenAI way) from a Node app soon, but I don't see why it wouldn't work.
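The feed-back step described above can be sketched as follows: wrap the tool's JSON result in Hermes-style `<tool_response>` tags and append it to the message list with a `tool` role before calling the model again. The helper names and the `get_weather` result are illustrative assumptions.

```python
import json

def format_tool_response(name, result):
    # Wrap the tool output in the <tool_response> XML template the
    # Hermes models were trained on (structure assumed from the repo's
    # prompt format; field names are illustrative).
    payload = json.dumps({"name": name, "content": result})
    return f"<tool_response>\n{payload}\n</tool_response>"

def append_tool_result(messages, name, result):
    # Return the tool output with a "tool" role, which llama-cpp-python
    # passes through even though Ollama rejects it.
    messages.append({"role": "tool", "content": format_tool_response(name, result)})
    return messages

messages = [{"role": "user", "content": "What's the weather in Paris?"}]
append_tool_result(messages, "get_weather", {"temp_c": 18})
```

The resulting `messages` list can then be passed back to the model (e.g. via llama-cpp-python's chat completion call) so it can compose its final answer from the tool output.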

I'm still quite puzzled why Ollama decided to strictly allow only three roles (system, assistant, and user).

aiseei commented 1 month ago

@kiiwee was it using this method ? https://github.com/abetlen/llama-cpp-python/blob/main/examples/notebooks/OpenHermesFunctionCalling.ipynb

kiiwee commented 1 month ago

@aiseei Check pull request #29. I think it should work the same with the Theta version. I used the Ollama example and added a callback to the model with the tool response, since I always needed that.
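The callback step above implies first extracting the model's `<tool_call>` from its raw output before the tool can be run. A minimal sketch, assuming Hermes-style output (the `output` string and tool name below are made up for illustration):

```python
import json
import re

def parse_tool_call(text):
    # Pull the JSON payload out of a Hermes-style <tool_call> block.
    m = re.search(r"<tool_call>\s*(\{.*?\})\s*</tool_call>", text, re.DOTALL)
    return json.loads(m.group(1)) if m else None

# Illustrative raw model output containing one tool call
output = (
    "<tool_call>\n"
    '{"name": "get_stock_price", "arguments": {"symbol": "NVDA"}}\n'
    "</tool_call>"
)
call = parse_tool_call(output)
# call["name"] and call["arguments"] can now be dispatched to the real
# function, and its result fed back with a tool role as described above.
```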