langgenius / dify

Dify is an open-source LLM app development platform. Dify's intuitive interface combines AI workflow, RAG pipeline, agent capabilities, model management, observability features and more, letting you quickly go from prototype to production.
https://dify.ai

Using llama3.2-vision to upload local images results in an error. #10530

Closed · AAEE86 closed this issue 2 days ago

AAEE86 commented 2 days ago

Dify version

0.11.0

Cloud or Self Hosted

Self Hosted (Docker)

Steps to reproduce

The following error occurs when uploading images from a local file:

Query or prefix prompt is too long, you can reduce the prefix prompt, or shrink the max token, or switch to a llm with a larger token limit size.

There is no error when providing the image via URL instead.

✔️ Expected Behavior

No response

❌ Actual Behavior

No response

dosubot[bot] commented 2 days ago

I found a similar closed issue that might help you resolve the error you're encountering. That issue suggests the following fixes for the "Query or prefix prompt is too long" error:

  1. Reduce the prefix prompt: shorten the initial part of your prompt so it fits within the token limit.
  2. Lower the max tokens setting: decrease max tokens to leave more room for the prompt within the model's context window.
  3. Use a model with a larger context window: if possible, switch to a model that supports a larger token limit.

Ensure that the sum of the prompt tokens and the max tokens setting does not exceed the model's context size [1].

To continue talking to Dosu, mention @dosu.