microsoft / vscode-ai-toolkit

MIT License

Inference in AI project not working #82

Closed liorbalmas closed 1 day ago

liorbalmas commented 1 week ago

The project's README says that `python gradio_chat.py --baseonly` should start a local web server for inference, but the command fails with this error:

 torch_dtype = torch.<compute_dtype>  # Set the appropriate torch data type
                        ^
SyntaxError: invalid syntax

It seems like `compute_dtype` is an input parameter that should be provided by the user after the page and model are loaded.
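If the `SyntaxError` above comes from a leftover template token, a quick way to confirm is to scan the generated scripts for unresolved `<xxx_xxx>` placeholders. This is a hedged sketch; the file names (`gradio_chat.py`, `console_chat.py`) are the ones mentioned in this thread:

```python
import re
from pathlib import Path

def find_placeholders(text: str) -> list:
    """Return any unresolved "<xxx_xxx>" template tokens in the text."""
    return re.findall(r"<[a-z]+(?:_[a-z]+)+>", text)

# Scan the scripts this thread mentions, if they exist in the project.
for name in ("gradio_chat.py", "console_chat.py"):
    path = Path(name)
    if path.exists():
        hits = find_placeholders(path.read_text())
        print(name, "->", hits or "no unresolved placeholders")
```

Any non-empty result would indicate that the "Generate Project" step did not finish substituting its placeholders.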

swatDong commented 6 days ago

@liorbalmas - how was the project generated? Did you run into any issues during the "Generate Project" step? (The step looks like this:)

(screenshot)

There are several placeholders ("<xxx_xxx>") in the draft project, and they are replaced during the "Generate Project" step.
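The substitution described above can be sketched roughly like this. This is a hypothetical illustration of the mechanism, not the toolkit's actual implementation, and `bfloat16` is an assumed example value:

```python
import re

def fill_placeholders(text: str, values: dict) -> str:
    """Replace "<xxx_xxx>" placeholders with chosen values; any
    placeholder without a supplied value is left untouched."""
    return re.sub(r"<([a-z_]+)>", lambda m: values.get(m.group(1), m.group(0)), text)

draft = "torch_dtype = torch.<compute_dtype>  # Set the appropriate torch data type"
print(fill_placeholders(draft, {"compute_dtype": "bfloat16"}))
# torch_dtype = torch.bfloat16  # Set the appropriate torch data type
```

If generation is interrupted before this step completes, placeholders like `<compute_dtype>` survive into the scripts and produce exactly the `SyntaxError` reported above.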

liorbalmas commented 5 days ago

Yes, I went through that screen and it went smoothly. But the paths here do not exist (taken from console_chat.py):

(screenshot)

Additionally, the Python environment was not created in WSL. The README assumes it exists, since it says to run `conda activate [conda-env-name]`, but the environment was never created...
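A quick way to verify whether the setup actually created the environment is to list conda's environments and look for the expected name. This is a hedged sketch; the environment name is the README's `[conda-env-name]` placeholder, so any concrete name here is an assumption:

```python
import json
import shutil
import subprocess

def conda_env_exists(name: str) -> bool:
    """Check whether a conda environment with the given name exists,
    by parsing `conda env list --json`. Returns False if conda is not
    on PATH (e.g. in a fresh WSL shell)."""
    if shutil.which("conda") is None:
        return False  # conda is not available in this shell
    try:
        out = subprocess.run(
            ["conda", "env", "list", "--json"],
            capture_output=True, text=True, check=True,
        ).stdout
    except subprocess.CalledProcessError:
        return False
    envs = json.loads(out).get("envs", [])
    # Each entry is a path; the env name is the last path component.
    return any(p.replace("\\", "/").rstrip("/").split("/")[-1] == name for p in envs)
```

A `False` result for the README's environment name would confirm that `first_time_setup.sh` never got far enough to create it.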

liorbalmas commented 5 days ago

I am using Anaconda, installed directly under C:\. I gave my user full permissions on Anaconda's folder, and now the models are created (project generation takes much longer, and I can see it downloading them). When it finishes and I click the button to open the project, it opens in WSL as expected. So far so good, but the conda env was still not created during project creation.

When I try to run first_time_setup.sh manually, I get a networking issue:

(screenshot)

Eventually I get this message:

(screenshot)

Thanks

swatDong commented 5 days ago

Could you please run the VS Code command palette entry "AI Toolkit: Validate environment prerequisites" and check whether all prerequisites pass? Like:

(screenshot)

BTW, a quick search suggests it could also be a conda version issue, e.g.:

https://stackoverflow.com/questions/68507862/how-to-solve-condahttperror-http-000-connection-failed-error-in-wsl
https://stackoverflow.com/questions/67923183/miniconda-on-wsl2-ubuntu-20-04-fails-with-condahttperror-http-000-connection

liorbalmas commented 5 days ago

(screenshot)

liorbalmas commented 5 days ago

The issue seems to be that my WSL distro (Ubuntu) cannot access the internet. Restarting doesn't help. Trying to fix that...

It would have been helpful if the AI Toolkit project had checked that and displayed a clear error message.

Fixed that using this answer:

https://stackoverflow.com/a/63578387/2703411

first_time_setup.sh ran successfully.

Inference is running now.

Thank you for your help.
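For what it's worth, the prerequisite network check suggested above could be as small as trying to open a TCP connection to a host the setup needs. This is a minimal sketch; the default host and port are assumptions about what `first_time_setup.sh` reaches, not documented requirements:

```python
import socket

def check_network(host: str = "conda.anaconda.org", port: int = 443,
                  timeout: float = 5.0) -> bool:
    """Network prerequisite check: try to resolve the host and open a
    TCP connection. Returns False on DNS failure, timeout, or refusal,
    which covers the WSL 'cannot access the internet' case above."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False
```

Running this before the rest of setup would let the toolkit fail fast with a clear "no network access in WSL" message instead of a cryptic conda HTTP 000 error.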

swatDong commented 1 day ago

Thanks for the good suggestion to add a network check to the prerequisites validation. Added to the backlog.

Closing, as the original issue is resolved.