e2b-dev / fragments

Open-source Next.js template for building apps that are fully generated by AI. By E2B.
https://fragments.e2b.dev
Apache License 2.0

SyntaxError: JSON.parse: unexpected end of data #70

Open stoykovstoyk opened 1 week ago

stoykovstoyk commented 1 week ago

Hello and thank you for the great product.

I run into this problem when I try to use it with local Llama models.

At first it starts generating some code, but somewhere in the middle I receive this error:


```
Unhandled Runtime Error
SyntaxError: JSON.parse: unexpected end of data at line 1 column 1 of the JSON data
```

I tried many different models, but every time I get the same error.

I saw that in the models.ts file there is `if (providerId === 'fireworks') { return 'json' }`,

so I tried adding `if (providerId === 'ollama') { return 'json' }` as well,

but I still experience the same error.
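
For context, the change I made looks roughly like this. This is a sketch from memory: the helper name `getDefaultMode`, the `LLMModel` shape, and the `'auto'` default are my reconstruction of the code around that `if`, so they may differ from the actual file:

```typescript
// lib/models.ts (sketch; names reconstructed, may differ from the repo)
type LLMModel = { id: string; providerId: string }

export function getDefaultMode(model: LLMModel) {
  const { providerId } = model

  // existing special case already in the repo
  if (providerId === 'fireworks') {
    return 'json'
  }

  // my attempted fix: force JSON output mode for local Ollama models too
  if (providerId === 'ollama') {
    return 'json'
  }

  return 'auto' // assumed default; the real file may return something else
}
```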

Maybe the models do not respond in proper JSON format, or something else is going on.

I also tried adding "You should respond in JSON only" to prompts.ts, but it did not work.

What could be the reason?

Is anyone able to help with this issue?

Once again I want to express my deep appreciation for this great project, but running local models is a must-have for my use case, which is why I decided to open this issue.

linear[bot] commented 1 week ago

E2B-1009 SyntaxError: JSON.parse: unexpected end of data

mishushakov commented 1 week ago

Which model were you using?

stoykovstoyk commented 1 week ago

I tried mistral-large, mistral-nemo, and also llama3.2. None of them worked.

I saw on the Ollama website that JSON can be specified as the preferred format when calling the generate endpoint, like in this example:

```bash
curl http://localhost:11434/api/generate -d '{
  "model": "llama3.2",
  "prompt": "What color is the sky at different times of the day? Respond using JSON",
  "format": "json",
  "stream": false
}'
```

but I cannot find where to put this in the code base.
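
For what it's worth, here is the same request in TypeScript, just to show the parameter I mean. This is a raw call to the Ollama HTTP API, not the place where fragments actually talks to the model:

```typescript
// Sketch: raw Ollama /api/generate call with "format": "json",
// equivalent to the curl above. With stream: false, Ollama returns
// a single JSON body whose `response` field holds the model output.
async function generateJson(prompt: string): Promise<unknown> {
  const res = await fetch('http://localhost:11434/api/generate', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({
      model: 'llama3.2',
      prompt,
      format: 'json', // constrain the model output to valid JSON
      stream: false,
    }),
  })
  const data = await res.json()
  return JSON.parse(data.response)
}
```

What I cannot find is the spot in the fragments code where this `format` option would be passed through.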

mishushakov commented 1 week ago

Do you know the line number where the issue occurs? It might not be related to the LLM at all.