slilov opened this issue 2 months ago (status: Open)
The problem is that the LLM is not always capable of returning a correctly formatted JSON response. The LLM should return a JSON-compatible value and shouldn't include stray double quotes or other invalid characters. We are trying to fix this issue by double-checking the LLM response. Until the next release, you can use Anthropic, or try again and hope the LLM returns correct JSON.
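One possible alternative in the meantime: recent Ollama releases accept a `format: "json"` option on the `/api/chat` endpoint, which constrains the model to emit valid JSON. A minimal sketch of building such a request body — `buildOllamaJsonRequest` is a hypothetical helper, not part of Flux-Magic:

```javascript
// Sketch: build an Ollama /api/chat request body that asks for strict JSON
// output via the `format` option (supported by recent Ollama releases).
// This is an illustration, not the actual Flux-Magic code.
function buildOllamaJsonRequest(model, userPrompt) {
  return {
    model,                 // e.g. "gemma2"
    format: "json",        // constrains the model's reply to valid JSON
    stream: false,         // return one complete response instead of chunks
    messages: [{ role: "user", content: userPrompt }],
  };
}
```

The body can then be POSTed to `http://127.0.0.1:11434/api/chat` with any HTTP client; the reply's `message.content` should be parseable JSON.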
Thank you! I will try other LLMs in Ollama and hope they return valid JSON :)
Installed Flux-Magic as described, with no errors. I use the local Ollama service with the Gemma2 model and a local ComfyUI service. In the UI, when I prompt anything and press the "Magic!" button, this happens:
D:\Flux-Magic>node app.js
{"level":30,"time":1723554468111,"pid":19576,"hostname":"INET-220","msg":"Connecting to url: ws://127.0.0.1:8188/ws?clientId=7cb39106-c58a-4d36-9f38-228dec673c6b"}
Server running at http://localhost:3333
{"level":30,"time":1723554468133,"pid":19576,"hostname":"INET-220","msg":"Connection open"}
New workflow loaded, json is: flux.json, placeHolders are: width,height,batchSize,positive,unet_name,steps,seed
Initial stats loaded: { totalImagesGenerated: 0, models: {} }
A user connected
ollama chat started
SyntaxError: Unexpected token '`', "```json {""... is not valid JSON
    at JSON.parse ()
    at file:///D:/Flux-Magic/app.js:118:36
    at process.processTicksAndRejections (node:internal/process/task_queues:95:5)
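The error above means the model wrapped its JSON answer in a markdown code fence (```json ... ```), which `JSON.parse` rejects. A minimal workaround sketch, assuming the raw model reply is available as a string — `parseLlmJson` is a hypothetical helper, not the actual Flux-Magic code:

```javascript
// Sketch: tolerate LLM replies that wrap JSON in markdown code fences,
// e.g. ```json {"width": 1024} ``` — strip the fences, then parse.
function parseLlmJson(raw) {
  const cleaned = raw
    .trim()
    // drop a leading ```json (or bare ```) fence marker
    .replace(/^```(?:json)?\s*/i, "")
    // drop a trailing ``` fence marker
    .replace(/```\s*$/, "");
  return JSON.parse(cleaned);
}
```

Plain JSON passes through unchanged, so this can sit in front of the existing `JSON.parse` call as a defensive wrapper.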
I'm trying to generate images from text locally with Flux.Schnell using Flux-Magic, and I'm not very experienced with this stuff. Please help!