coleam00 / bolt.new-any-llm

Prompt, run, edit, and deploy full-stack web applications using any LLM you want!
https://bolt.new
MIT License

Using Ollama makes the response like any other AI model instead of populating the project #152

Open · ameetqyrus opened this issue 3 weeks ago

ameetqyrus commented 3 weeks ago

Describe the bug

The response appears in the chat like any other AI model's output instead of building the project.

Link to the Bolt URL that caused the error

http://localhost:5173

Steps to reproduce

  1. Set up Bolt
  2. Select Ollama and CodeLlama:7B
  3. Enter a prompt

Expected behavior

The project should be built instead of returning a general response like any other model.

Screen Recording / Screenshot

[two screenshots attached]

Platform

Additional context

No response

Archviz360 commented 3 weeks ago

I have the same problem here. It's also not saving the chat.

ElG0hary commented 3 weeks ago

Same here.

ElG0hary commented 3 weeks ago

> I have the same problem here. It's also not saving the chat.

It does save. If you noticed, it shows in the link as /chat1, /chat2, and so on.

Archviz360 commented 3 weeks ago

@ElG0hary Yes, I saw it. Now, how do I get it to create the main files in the editor? Right now it works like any chatbot for me, but it doesn't add content to the editor.

ElG0hary commented 3 weeks ago

This bug happens with Ollama only; try any other API provider and it works fine.
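
For reference, a minimal sketch of switching to another provider in a local bolt.new-any-llm checkout. This assumes the repo's usual `.env.example` / `.env.local` convention and pnpm setup; the exact variable name below is an assumption, so copy the real names from your own `.env.example`:

```sh
# Start from the repo's example env file (assumption: .env.example exists in the checkout)
cp .env.example .env.local

# Add a key for whichever provider you want to try, e.g. (hypothetical variable name):
# GROQ_API_KEY=your_key_here

# Restart the dev server so the new key is picked up
pnpm run dev
```

After restarting, the provider you configured should be selectable in the model dropdown instead of Ollama.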

Archviz360 commented 3 weeks ago

@ElG0hary Is there a free API without limits?

the0ckid82 commented 3 weeks ago

Somewhat of a workaround that works "most of the time": start your project, and right after you submit your first message, stop it. Click the Code button to open the editor and then resubmit your first message. Cole mentioned this in one of his YT videos.

ElG0hary commented 3 weeks ago

### Solved: Create a file named Modelfile (no file extension), then open it with Notepad and paste this:

```
FROM llama3.2:latest
PARAMETER num_ctx 32768
```

Change only the model name to match the one you are using. After that, open cmd and run `ollama create -f Modelfile bolt.new-Ollama`. Note: bolt.new-Ollama is the new model you are going to choose when using your local Bolt.new, so don't choose the regular llama3.2.

Reason it wasn't working to begin with: the difference in context window between Ollama and the other providers. Ollama defaults to a small context window (2048 tokens), so Bolt's long system prompt gets truncated and the model never sees the instructions that make it write project files.
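
A minimal sketch of the whole workflow on macOS/Linux (on Windows, save the two lines with Notepad as described above). The base model name and the bolt.new-Ollama tag are just the ones from this comment; substitute your own:

```sh
# Write the Modelfile with a larger context window
cat > Modelfile <<'EOF'
FROM llama3.2:latest
PARAMETER num_ctx 32768
EOF

# Build a derived model that inherits the base weights but overrides num_ctx
ollama create -f Modelfile bolt.new-Ollama

# Optional sanity check: print the model's parameters to confirm num_ctx is 32768
ollama show bolt.new-Ollama --parameters
```

Once created, bolt.new-Ollama should show up in Bolt's Ollama model dropdown; pick it instead of the base model.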

shaojialz commented 3 weeks ago

In my tests, the Deepseek and Mistral models have been running well on a local deployment of bolt.new, while other models have had various issues. Good luck to you.