Open rickbeeloo opened 3 months ago
@rickbeeloo, sorry for the delayed response. Multiple choice is possible, and I've been thinking about how to implement it.
One easy option would be to build a workflow using the exact_string primitive: remove answers from the choice set as they're given, and keep running until all valid answers have been given.
I just pushed a huge update to this library. If you still need multiple choice, I can work on it. I intend to add the workflow as a reasoning workflow at some point.
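The loop described above could be sketched roughly as follows. This is purely illustrative: `ask_exact_string` is a hypothetical stand-in for llm_client's exact_string primitive (here stubbed to always pick the first remaining choice so the sketch runs), not the library's actual API.

```rust
// Hypothetical stand-in for an LLM call constrained to return
// exactly one string from `choices`. Stubbed here: picks the first.
fn ask_exact_string(question: &str, choices: &[String]) -> String {
    let _ = question;
    choices[0].clone()
}

// Repeatedly constrain the model to one of the remaining choices,
// remove each given answer, and stop once `n_answers` have been collected
// (or the choice set is exhausted).
fn multiple_choice(question: &str, choices: &[&str], n_answers: usize) -> Vec<String> {
    let mut remaining: Vec<String> = choices.iter().map(|s| s.to_string()).collect();
    let mut answers = Vec::new();
    while answers.len() < n_answers && !remaining.is_empty() {
        let picked = ask_exact_string(question, &remaining);
        remaining.retain(|c| c != &picked); // drop the answer that was just given
        answers.push(picked);
    }
    answers
}
```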
Hey @ShelbyJenkins! Gonna try it out :)
I have been getting quite a few errors when installing, and am now stuck at:

```
  --> src/llm_backends/llama_cpp/server.rs:3:12
   |
 3 |     gguf::{GgufLoader, GgufLoaderTrait},
   |            ^^^^^^^^^^  ^^^^^^^^^^^^^^^ no `GgufLoaderTrait` in `models::open_source_model::gguf`
   |            |
   |            no `GgufLoader` in `models::open_source_model::gguf`
   |            help: a similar name exists in the module: `LlmGgufLoader`

error[E0432]: unresolved imports `llm_utils::models::open_source_model::gguf::GgufLoader`, `llm_utils::models::open_source_model::gguf::GgufLoaderTrait`
  --> src/llm_backends/llama_cpp/mod.rs:17:12
   |
17 |     gguf::{GgufLoader, GgufLoaderTrait},
   |            ^^^^^^^^^^  ^^^^^^^^^^^^^^^ no `GgufLoaderTrait` in `models::open_source_model::gguf`
   |            |
   |            no `GgufLoader` in `models::open_source_model::gguf`
   |            help: a similar name exists in the module: `LlmGgufLoader`

error[E0432]: unresolved import `llm_utils::models::open_source_model::GgufLoaderTrait`
  --> src/prelude.rs:17:29
   |
17 |     open_source_model::{GgufLoaderTrait, HfTokenTrait, LlmPresetTrait},
   |                         ^^^^^^^^^^^^^^^ no `GgufLoaderTrait` in `models::open_source_model`
```
llm_utils is set to `llm_utils = { path = "../llm_utils" }`, but that path does not exist; changing it to the latest version, 0.8.0, gives the above errors. Any idea how to fix this?
Changing those imports in the code to `LlmGgufLoader` instead gives another bunch of errors, and it keeps going. I hope you can get it running, because the workflows look amazing!
Doh! Yes, this is a mistake on my part. I've been reworking some things on the backend and broke the version compatibility between llm_client and llm_utils.
A temporary fix is to use the newest version of llm_client from crates.io. I just checked, and the llm_client version on crates.io uses the correct version of llm_utils, so they should work together correctly.
In the future I'll make sure not to commit any breaking changes without updating both libraries at once.
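In Cargo.toml terms, the temporary fix amounts to swapping the local path dependency for the published crate. The version below is a placeholder, not a real release number; check crates.io for the current one.

```toml
[dependencies]
# Broken while the repos are out of sync: a local path checkout of llm_utils.
# llm_utils = { path = "../llm_utils" }

# Temporary fix: depend on the published llm_client, which pins a
# compatible llm_utils release.
llm_client = "x.y.z"  # placeholder; use the newest version on crates.io
```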
Hey Shelby, thanks for wrapping this all in a package!
I'm often working with multiple choice questions. From previous experiments with a variety of LLMs I noticed that it's much "easier" for the LLM to answer: "Is X part of Y", "Is X part of Z", etc. than answering "To which categories: [Y, Z] does X belong".
Would multiple choice also be possible with `llm_client`? Any thoughts on that? Have a good day!
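The decomposition described above ("Is X part of Y?" per category, rather than one multi-label question) can be sketched like this. It is a minimal illustration, not llm_client's API: `ask_boolean` is a hypothetical stand-in for an LLM call constrained to yes/no, stubbed here so the sketch runs.

```rust
// Hypothetical stand-in for an LLM call constrained to a yes/no answer.
// Stubbed: "answers yes" whenever the question mentions the category "Y".
fn ask_boolean(question: &str) -> bool {
    question.contains('Y')
}

// Ask one yes/no question per candidate category and collect the hits,
// instead of asking a single "To which categories does X belong?" question.
fn categories_for(item: &str, categories: &[&str]) -> Vec<String> {
    categories
        .iter()
        .filter(|cat| ask_boolean(&format!("Is {item} part of {cat}?")))
        .map(|c| c.to_string())
        .collect()
}
```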