darshanbuildx opened this issue 6 days ago
Here Cole says it doesn't work with Chrome, only Firefox or Canary. I have been trying with Chrome for days, and I believe a lot of other newbies like me should know this up front in the README.
@DavidWalters123 Oh man, I can totally relate! I've been at it for days too. I must have cloned the repo at least 10 times and tried running it with and without Docker. Even when I selected the 2.5 Coder model in Ollama, it seemed to be working at times, but when I later checked my usage logs I realized I had actually been using the Sonnet model the whole time; the 2.5 Coder model never got used. Thanks so much for pointing that out about the browsers. I'll definitely give Firefox or Canary a shot!
@DavidWalters123 Thanks for the tip! I tried Canary as well, but unfortunately the Ollama models still aren't working for me. Only the Groq and OpenAI models, along with everything configured through API keys, seem to be functioning correctly. If anyone else has faced this issue or found a solution, I'd appreciate any advice!
Screenshots attached.
@DavidWalters123 It's strange: after selecting the model and waiting a few seconds, it did work in the Canary browser at first. After a while, though, it stopped working even when I reselected the model, waited, and typed a new prompt. It also only generates very basic code, and there's no preview available, since it doesn't start a dev server the way bolt.new does (npm run dev or similar); it only runs something like npm start. Has anyone experienced something similar or found a consistent fix?
@darshanbuildx Use 127.0.0.1:11434 as your base URL for Ollama. I was always getting an error with localhost because in the backend, for some reason, localhost was resolving to the IPv6 loopback (::1) instead of the IPv4 one (127.0.0.1), and Ollama only binds to the IPv4 loopback by default. Give it a shot, it worked for me.
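For anyone else hitting this, a quick way to confirm the IPv4/IPv6 theory is to probe both hostnames against Ollama's GET /api/tags endpoint, which lists the locally installed models. Here's a minimal sketch, assuming Node 18+ for the built-in fetch (the filename and script are just for illustration, not part of the repo):

```ts
// check-ollama.ts (hypothetical helper, not part of the repo)
// Probes Ollama's /api/tags endpoint via both hostnames to show
// whether localhost is resolving to the IPv6 loopback (::1).
// Run with something like: npx tsx check-ollama.ts

const bases = ["http://localhost:11434", "http://127.0.0.1:11434"];

for (const base of bases) {
  try {
    const res = await fetch(`${base}/api/tags`);
    const body = (await res.json()) as { models?: { name: string }[] };
    console.log(`${base} -> OK, ${body.models?.length ?? 0} model(s) installed`);
  } catch (err) {
    // Failing here on localhost while 127.0.0.1 succeeds means the name
    // resolved to ::1, which Ollama is not listening on by default.
    console.log(`${base} -> FAILED: ${(err as Error).message}`);
  }
}
```

If localhost fails while 127.0.0.1 succeeds, point the Ollama base URL in your .env at http://127.0.0.1:11434 (the variable should be OLLAMA_API_BASE_URL in this repo's .env.example, if I remember right).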
Describe the bug
I'm unable to install and run the Bolt.new-any-LLM project locally without Docker as per the instructions in the README. I followed all the setup steps, but I'm encountering an access violation error related to MiniflareCoreError [ERR_RUNTIME_FAILURE] when running pnpm run dev.
Please see the attached screenshot for the full terminal output.
Link to the Bolt URL that caused the error
Could not get URL.
Steps to reproduce
1. Verified the Node.js version against the package.json requirements.
2. Reinstalled node_modules and cleared pnpm-lock.yaml.
3. Updated Miniflare and Wrangler to stable versions.
4. Attempted running with admin privileges.
5. Attempted configuration tweaks as per the documentation.
Expected behavior
The project should install and run locally without any errors when following the setup instructions in the README. Running pnpm run dev should start the development server without encountering runtime or configuration errors.
Screen Recording / Screenshot
Platform
Additional context
No response