I get more than 80,000 words of output when I use the Ollama Vision node, which is unbelievable. The model I'm using is llama3.2-vision:11b, so I'm not sure whether the problem is with that model or something else.
This seems quite likely to keep happening, so I'd really encourage you to add a token limit option for this node.
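For reference, if I understand Ollama's REST API correctly, a cap like this could be passed through the `num_predict` option on a generate request. Here's a rough Python sketch; the URL, prompt, and limit are just placeholder values, not the node's actual implementation:

```python
import requests

# Minimal sketch: cap the number of tokens Ollama may generate
# by setting "num_predict" in the request options.
response = requests.post(
    "http://localhost:11434/api/generate",
    json={
        "model": "llama3.2-vision:11b",
        "prompt": "Describe this image.",
        "images": [],  # base64-encoded image data would go here
        "stream": False,
        "options": {"num_predict": 512},  # hard cap on generated tokens
    },
)
print(response.json()["response"])
```

If the node exposed that option (or a word/character limit on its side), runaway outputs like the one above could be cut off.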