Can you add an optional "input" to Ollama Generate? That way I could chain prompt generation — using one step's output to build the next prompt — or take the output from Ollama Vision and ask Ollama Generate to do something with it.
To keep it simple, the input would just be a string. As soon as the string has been fully received, process it as the prompt.
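A minimal sketch of how such chaining might look, assuming the upstream node's output is spliced into the Generate node's prompt text. The `{input}` placeholder and `build_prompt` helper are hypothetical illustrations, not part of Ollama's actual API:

```python
def build_prompt(template: str, upstream_output: str) -> str:
    """Compose the final prompt from an optional upstream string.

    If the template contains the hypothetical `{input}` placeholder,
    splice the upstream text in there; otherwise append it at the end.
    """
    if "{input}" in template:
        return template.replace("{input}", upstream_output)
    return f"{template}\n\n{upstream_output}".strip()


# Example: feed a vision model's description into a follow-up prompt.
vision_output = "A red bicycle leaning against a brick wall."
template = "Write a one-sentence caption for this image: {input}"
prompt = build_prompt(template, vision_output)
print(prompt)
```

The composed `prompt` would then be sent to the Generate endpoint as usual (for instance via the official Python client's `ollama.generate(model=..., prompt=prompt)`), so the chaining stays a simple string hand-off.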