landing-ai / vision-agent

Errors often occur during use #166

Open hmppt opened 1 month ago

hmppt commented 1 month ago

Could not extract JSON from the given str

[screenshot attached]

dillonalaird commented 1 month ago

Could you show me which version you are using? You can run `pip list | grep vision-agent` to find it. This was sometimes an issue in older versions, where the debugger's response would get cut off because it was too long. We made two changes: we added a retry (https://github.com/landing-ai/vision-agent/blob/main/vision_agent/agent/vision_agent.py#L406) and increased the max tokens (https://github.com/landing-ai/vision-agent/blob/main/vision_agent/lmm/lmm.py#L87).
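
Roughly, the retry wraps the JSON extraction like this (a minimal sketch; `extract_json` and the `llm` callable are illustrative stand-ins, not the actual vision-agent code):

```python
import json
import re

def extract_json(text: str) -> dict:
    """Pull the first {...} block out of an LLM reply and parse it."""
    match = re.search(r"\{.*\}", text, re.DOTALL)
    if match is None:
        raise ValueError(f"Could not extract JSON from the given str: {text!r}")
    return json.loads(match.group(0))

def call_with_retries(llm, prompt: str, max_retries: int = 3) -> dict:
    """Re-ask the model whenever its reply cannot be parsed as JSON."""
    last_err = None
    for _ in range(max_retries):
        try:
            return extract_json(llm(prompt))
        except (ValueError, json.JSONDecodeError) as err:
            last_err = err
    raise last_err
```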

hmppt commented 1 month ago

Of course. The version I'm using is 0.2.81, installed directly with `pip install vision-agent`.

dillonalaird commented 1 month ago

Hmmm, we've definitely had this issue before, but I haven't seen it since we made the fixes I mentioned. I'm having trouble reproducing it now (I just ran the example from your other issue post). Does it feel random, or does it happen with certain types of prompts? The case you posted is interesting because it looks like the LLM responded with the string:

```
.
Function input:
```

and it was parsed incorrectly, which is not the issue I mentioned above where the output JSON gets cut off.
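
To illustrate: a reply like that contains no JSON value at all, so a plain `json.loads` on it fails immediately (a minimal reproduction, assuming standard-library behavior rather than vision-agent's exact parser):

```python
import json

response = ".\nFunction input:\n"  # the stray reply quoted above

try:
    json.loads(response)
except json.JSONDecodeError as err:
    # Fails at char 0: there is no JSON value anywhere in the string.
    print(f"Could not extract JSON: {err}")
```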

hmppt commented 1 month ago

Is it possible this problem is caused by one of my custom functions returning an empty value when it runs?

dillonalaird commented 1 month ago

I don't think so, because it's coming from the debugger here (https://github.com/landing-ai/vision-agent/blob/main/vision_agent/agent/vision_agent.py#L406), which doesn't deal with the tools. Do you notice the error is always something strange like that string? I'm trying to think of how to reproduce it on my end.

hmppt commented 1 month ago

I modified the prompt and custom tools, and now I don’t have this problem anymore. Thanks for the answer. :)

hmppt commented 1 month ago

> Hmmm, we've definitely had this issue before, but I haven't seen it since we made the fixes I mentioned. I'm having trouble reproducing it now (I just ran the example from your other issue post). Does it feel random, or does it happen with certain types of prompts? The case you posted is interesting because it looks like the LLM responded with the string:
>
> ```
> .
> Function input:
> ```
>
> and it was parsed incorrectly, which is not the issue I mentioned above where the output JSON gets cut off.

I started having this problem again:

[screenshot attached]

It shows that the error occurs here:

[screenshot attached]

This may be because the string it receives cannot be converted to JSON (judging from the error, the string doesn't even contain a closing '}').
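
One way to tell the two failure modes apart (no JSON at all vs. JSON truncated before the closing brace) is to check brace balance before parsing. A rough sketch of that check, not what the library actually does:

```python
import json

def classify_json_failure(text: str) -> str:
    """Distinguish 'no JSON present' from 'JSON cut off mid-object'."""
    opens, closes = text.count("{"), text.count("}")
    if opens == 0:
        return "no JSON object in the response"
    if opens > closes:
        return "likely truncated: more '{' than '}' (missing closing brace)"
    try:
        json.loads(text[text.index("{"): text.rindex("}") + 1])
        return "JSON parses fine"
    except json.JSONDecodeError as err:
        return f"malformed JSON: {err}"

print(classify_json_failure(".\nFunction input:"))  # no JSON object
print(classify_json_failure('{"plan": "step 1'))    # likely truncated
```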

shankar-vision-eng commented 1 week ago

@dillonalaird, is this issue resolved?

dillonalaird commented 1 week ago

Hey @hmppt could you give it another test? I've added several updates that should help: