Closed Yvonne-Aizawa closed 8 months ago
Hey, I don't think we need to support formats other than JSON, since we're using Serde. It doesn't matter to the user what format the data arrives in, as they would just access it through the Rust bindings this library provides.
I'm not sure I understand it right, but this is how Ollama works now (see the docs): it forces the LLM to return valid JSON. This was just added in Ollama 0.1.9. You still receive the data as a string (the response part), but the string will be a valid JSON object.
Example with JSON formatting on:
what is the weather like? {"message": "I'm just an AI, I don't have access to real-time weather information. However, I can tell you that the weather can vary depending on your location and time of year. Could you please provide me with more context or specify a location you are interested in?", "request_id": 123456}
Example with no formatting:
what is the weather like? I'm just an AI, I don't have real-time access to current weather conditions. However, I can tell you the typical weather patterns for a place if you provide me with the location. Please let me know the city or region you are interested in, and I will do my best to give you an idea of what the weather is like there.
In both cases the output is a string, but the first one can be parsed into a JSON object if needed.
Removed the parameter from the new() method and updated the code to reflect it. Altered the comment to better explain what the field does. Removed the test from generation.rs.
With Ollama 0.1.9 you can specify whether you want the output to be formatted. For now it only supports JSON, but when left empty it generates as normal.
I added a parameter to GenerationRequest to specify whether the output needs to be formatted as JSON. When it is None, the parameter is left out of the request and generation proceeds as normal. I made it an enum since it seems the format parameter might gain more types, and it would be easy to add new ones to it.