wandaweb / InvokeAI-Sagemaker-Studio-Lab

Running InvokeAI on SageMaker Studio Lab
MIT License

Problem after adding ngrok #1

Open mohqwt0 opened 8 months ago

mohqwt0 commented 8 months ago

An exception has occurred: /tmp/invoke/models/core/convert/sd-vae-ft-mse is missing

== STARTUP ABORTED ==
One or more necessary files is missing from your InvokeAI root directory
Please rerun the configuration script to fix this problem.
From the launcher, selection option [7].
From the command line, activate the virtual environment and run "invokeai-configure --yes --skip-sd-weights"
(To skip this check completely, add "--ignore_missing_core_models" to your CLI args. Not installing these core models will prevent the loading of some or all .safetensors and .ckpt files. However, you can always come back and install these core models in the future.)
Press any key to continue...

Traceback (most recent call last):
  File "/home/studio-lab-user/InvokeAI/invokeai/backend/install/check_root.py", line 24, in check_invokeai_root
    assert path.exists(), f"{path} is missing"
AssertionError: /tmp/invoke/models/core/convert/sd-vae-ft-mse is missing

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/home/studio-lab-user/.conda/envs/invoke/bin/invokeai-web", line 8, in <module>
    sys.exit(invoke_api())
  File "/home/studio-lab-user/InvokeAI/invokeai/app/api_app.py", line 235, in invoke_api
    check_invokeai_root(app_config)  # note, may exit with an exception if root not set up
  File "/home/studio-lab-user/InvokeAI/invokeai/backend/install/check_root.py", line 40, in check_invokeai_root
    input("Press any key to continue...")
EOFError: EOF when reading a line
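For reference, the recovery command the error message suggests can be run from a Studio Lab terminal roughly like this. This is only a sketch: the conda environment name "invoke" is taken from the traceback path above, the root path from the assertion error, and pointing the configure script at that root via the INVOKEAI_ROOT environment variable is an assumption; adjust all three if your setup differs.

```bash
# Minimal sketch of the fix the error message suggests; env name and root path
# come from the traceback above and may differ in other setups.
conda activate invoke
export INVOKEAI_ROOT=/tmp/invoke   # root reported in the assertion error (assumed)
invokeai-configure --yes --skip-sd-weights
```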

wandaweb commented 8 months ago

Try running sh configure.sh again. Looks like it failed to download one of the support models. Does it show any errors?
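To confirm whether that support model actually came down before rerunning, one quick check is to list the core "convert" directory. The paths below are assumptions taken from the error message and from the --root argument in start.sh; use whichever root your setup points at.

```bash
# sd-vae-ft-mse should appear here after a successful configure run.
ls -la /tmp/invoke/models/core/convert/   # root reported in the error
ls -la ~/invokeai/models/core/convert/    # root used by start.sh
```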

Alternatively, you can edit the file "start.sh" and change the line "invokeai-web --root ~/invokeai &" to "invokeai-web --root ~/invokeai --ignore_missing_core_models &", as sketched below.
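For clarity, the before/after of that line in start.sh would look roughly like this (sketch based on the line quoted above; your start.sh may differ slightly):

```bash
# start.sh - original line
invokeai-web --root ~/invokeai &

# start.sh - edited line (starts despite the missing core model)
invokeai-web --root ~/invokeai --ignore_missing_core_models &
```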

This will allow the app to start despite the missing model. I was not able to reproduce the error, but I'll leave it open in case anyone else has the same issue.