mudler / LocalAI

:robot: The free, Open Source OpenAI alternative. Self-hosted, community-driven, and local-first. Drop-in replacement for OpenAI running on consumer-grade hardware. No GPU required. Runs gguf, transformers, diffusers, and many other model architectures. Generates text, audio, video, and images, with voice cloning capabilities.
https://localai.io
MIT License

Crashes attempting to run Luna-AI model #1331

Closed: ThommoMC closed this issue 6 months ago

ThommoMC commented 7 months ago

LocalAI version:

v1.40.0

Environment, CPU architecture, OS, and Version:

Compiled for CUDA on Linux thommopc 6.6.2-arch1-1 #1 SMP PREEMPT_DYNAMIC Mon, 20 Nov 2023 23:18:21 +0000 x86_64 GNU/Linux

Describe the bug

Attempting to load the Luna AI model using the YAML config from this guide (https://localai.io/howtos/easy-model-import-downloaded/) causes LocalAI to crash.

To Reproduce

Follow the instructions in that guide and attempt to prompt the model.

Expected behavior

The model responds with an answer.
Logs

See attached text file localaierror.txt

Additional context

Running the model without a config file works.
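For reference, the config from that guide is roughly of the following shape. This is only a sketch: the exact model filename, backend name, and parameter values below are assumptions, not copied from the reporter's setup.

```yaml
# Guide-style model config (filename, backend, and values are assumptions)
name: lunademo
backend: llama          # removing this line is the workaround noted in the next comment
context_size: 2048
parameters:
  model: luna-ai-llama2-uncensored.Q4_K_M.gguf
  temperature: 0.2
```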

lunamidori5 commented 7 months ago

Notes from Discord: removing the backend field from the YAML file lets the model load...
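For illustration, the workaround amounts to deleting or commenting out the backend line and leaving the rest of the config untouched, presumably letting LocalAI fall back to its automatic backend selection (a sketch; filename and values are assumptions):

```yaml
name: lunademo
# backend: llama       # removed: let LocalAI pick the backend itself
context_size: 2048
parameters:
  model: luna-ai-llama2-uncensored.Q4_K_M.gguf
```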

localai-bot commented 7 months ago

:warning::warning::warning::warning::warning:

Hi! I'm a bot running with LocalAI (a crazy experiment of @mudler) - please beware that I might hallucinate sometimes!

But... I can also be funny or helpful :smile_cat:, and I can generally provide good tips or point to places to look in the documentation or the code based on what you wrote in the issue.

Don't engage in conversation with me; I don't support replying (yet)!

:warning::warning::warning::warning::warning:

I apologize for the issue you are experiencing with LocalAI. I have reviewed the guide and the logs you provided, and it seems like the issue is related to the YAML configuration file for the Luna AI model. To troubleshoot this issue, we can try a few steps:

  1. Verify that the YAML config file matches the version of the model you have downloaded. If not, download the latest version of the model and update the config file accordingly (see the filename check sketched after this list).
  2. Ensure that the YAML config file is formatted correctly and contains all the required fields for the Luna AI model. Refer to the guide again to confirm that all required fields are present.
  3. Check whether there are any errors or warnings in the logs when running LocalAI with the config file. If so, please provide the full error message and stack trace from the logs.
  4. Try running the model without a config file to see if it works correctly. This will help us determine whether the issue is related to the config file or something else.

If none of these steps work, please let me know and we can try other troubleshooting steps or provide technical support to help you resolve the issue.
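One concrete check for steps 1 and 2, sketched below: the value of parameters.model in the YAML config must exactly match the filename of the model file in the models directory, including the quantization suffix (the filename shown is an assumption):

```yaml
parameters:
  # Must match the file actually present in the models directory,
  # including the quantization suffix (this filename is an assumption):
  model: luna-ai-llama2-uncensored.Q4_K_M.gguf
```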
