open-webui / open-webui

User-friendly WebUI for LLMs (Formerly Ollama WebUI)
https://openwebui.com
MIT License

"ValueError: could not convert string to float: 'or start new chat, or edit the System Prompt, like " | Migration failed: 010_migrate_modelfiles_to_models #2765

Closed ThatCoffeeGuy closed 3 months ago

ThatCoffeeGuy commented 3 months ago

Bug Report

Description

Bug Summary: After updating to a newer version, open-webui stopped working because a data migration fails on startup. Restarting the container does not help; it just keeps logging the same error and crashing.

Steps to Reproduce: Download certain models/profiles the migration doesn't handle, then update so it tries to migrate them.

Expected Behavior: The migration should complete (or skip invalid entries) and the WebUI should start.

Actual Behavior: The migration fails and, instead of skipping the offending entry, the whole application crashes.

Environment

Reproduction Details

Confirmation:

Docker Container Logs:

Loading WEBUI_SECRET_KEY from file, not provided as an environment variable.
Loading WEBUI_SECRET_KEY from .webui_secret_key
Migration failed: 010_migrate_modelfiles_to_models
Traceback (most recent call last):
  File "/usr/local/lib/python3.11/site-packages/peewee_migrate/router.py", line 199, in run_one
    migrate(migrator, self.database, fake=fake)
  File "<string>", line 43, in migrate
  File "<string>", line 69, in migrate_modelfile_to_model
  File "/app/backend/utils/misc.py", line 162, in parse_ollama_modelfile
    value = float(value)
            ^^^^^^^^^^^^
ValueError: could not convert string to float: 'or start new chat, or edit the System Prompt, like `You are a dreamer, creating a beautiful advanced dreamy fantasy sentence ...`'
/app
Traceback (most recent call last):
  File "/usr/local/bin/uvicorn", line 8, in <module>
    sys.exit(main())
             ^^^^^^
  File "/usr/local/lib/python3.11/site-packages/click/core.py", line 1157, in __call__
    return self.main(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/click/core.py", line 1078, in main
    rv = self.invoke(ctx)
         ^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/click/core.py", line 1434, in invoke
    return ctx.invoke(self.callback, **ctx.params)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/click/core.py", line 783, in invoke
    return __callback(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/uvicorn/main.py", line 410, in main
    run(
  File "/usr/local/lib/python3.11/site-packages/uvicorn/main.py", line 578, in run
    server.run()
  File "/usr/local/lib/python3.11/site-packages/uvicorn/server.py", line 61, in run
    return asyncio.run(self.serve(sockets=sockets))
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/asyncio/runners.py", line 190, in run
    return runner.run(main)
           ^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/asyncio/runners.py", line 118, in run
    return self._loop.run_until_complete(task)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "uvloop/loop.pyx", line 1517, in uvloop.loop.Loop.run_until_complete
  File "/usr/local/lib/python3.11/site-packages/uvicorn/server.py", line 68, in serve
    config.load()
  File "/usr/local/lib/python3.11/site-packages/uvicorn/config.py", line 473, in load
    self.loaded_app = import_from_string(self.app)
                      ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/uvicorn/importer.py", line 21, in import_from_string
    module = importlib.import_module(module_str)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/importlib/__init__.py", line 126, in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "<frozen importlib._bootstrap>", line 1204, in _gcd_import
  File "<frozen importlib._bootstrap>", line 1176, in _find_and_load
  File "<frozen importlib._bootstrap>", line 1147, in _find_and_load_unlocked
  File "<frozen importlib._bootstrap>", line 690, in _load_unlocked
  File "<frozen importlib._bootstrap_external>", line 940, in exec_module
  File "<frozen importlib._bootstrap>", line 241, in _call_with_frames_removed
  File "/app/backend/main.py", line 23, in <module>
    from apps.ollama.main import app as ollama_app, get_all_models as get_ollama_models
  File "/app/backend/apps/ollama/main.py", line 34, in <module>
    from apps.webui.models.models import Models
  File "/app/backend/apps/webui/models/models.py", line 11, in <module>
    from apps.webui.internal.db import DB, JSONField
  File "/app/backend/apps/webui/internal/db.py", line 38, in <module>
    router.run()
  File "/usr/local/lib/python3.11/site-packages/peewee_migrate/router.py", line 229, in run
    done.append(self.run_one(mname, migrator, fake=fake, force=fake))
                ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/peewee_migrate/router.py", line 199, in run_one
    migrate(migrator, self.database, fake=fake)
  File "<string>", line 43, in migrate
  File "<string>", line 69, in migrate_modelfile_to_model
  File "/app/backend/utils/misc.py", line 162, in parse_ollama_modelfile
    value = float(value)
            ^^^^^^^^^^^^
ValueError: could not convert string to float: 'or start new chat, or edit the System Prompt, like `You are a dreamer, creating a beautiful advanced dreamy fantasy sentence ...`'
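For context, the last frame shows that parse_ollama_modelfile calls float() on the value it extracted from a PARAMETER line, so a parameter whose value is free text (here, leftover prompt text) raises ValueError and aborts the whole migration at import time. A minimal, simplified sketch of that code path (the function below is hypothetical and much simpler than the real parse_ollama_modelfile in backend/utils/misc.py) reproduces the same error:

# Simplified illustration of the failing code path, not the actual implementation.
def parse_parameters(modelfile_text: str) -> dict:
    params = {}
    for line in modelfile_text.splitlines():
        if line.upper().startswith("PARAMETER"):
            # e.g. "PARAMETER temperature 0.8"
            _, name, value = line.split(maxsplit=2)
            params[name] = float(value)  # raises ValueError on non-numeric values
    return params

# A parameter value that is free text instead of a number triggers the same error:
try:
    parse_parameters("PARAMETER stop or start new chat, or edit the System Prompt")
except ValueError as e:
    print(e)  # could not convert string to float: 'or start new chat, ...'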

Installation Method

docker pull ghcr.io/open-webui/open-webui:main
docker stop open-webui
docker rm open-webui
docker run -d -p 3000:8080 --add-host=127.0.0.1:host-gateway --network=host -v open-webui:/app/backend/data --name open-webui --restart always ghcr.io/open-webui/open-webui:main
tjbck commented 3 months ago

I'll modify the code to skip the modelfiles that cause errors so you won't have issues with the migration, but basically you had an invalid modelfile.
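Something along these lines, illustrative only and not the actual patch (the function names migrate_modelfiles and parse_parameter_value and the dict layout are made up for the sketch):

import logging

log = logging.getLogger(__name__)

def parse_parameter_value(value: str) -> float:
    # Mirrors the failing call in parse_ollama_modelfile:
    # non-numeric values raise ValueError.
    return float(value)

def migrate_modelfiles(modelfiles: list[dict]) -> list[dict]:
    migrated = []
    for mf in modelfiles:
        try:
            params = {k: parse_parameter_value(v) for k, v in mf.get("params", {}).items()}
            migrated.append({"name": mf["name"], "params": params})
        except ValueError as e:
            # Skip the broken modelfile instead of crashing the whole migration.
            log.warning("Skipping invalid modelfile %r: %s", mf.get("name"), e)
    return migrated

# Example: one valid and one invalid modelfile; only the valid one is migrated.
files = [
    {"name": "good", "params": {"temperature": "0.8"}},
    {"name": "bad", "params": {"stop": "or start new chat, or edit the System Prompt"}},
]
print(migrate_modelfiles(files))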

tjbck commented 3 months ago

Should be fixed, let me know if the issue persists!

ThatCoffeeGuy commented 3 months ago

Thank you! I pulled the new image and it seems to have fixed the issue; I can run the WebUI and access my history again. Thanks!