pawansharmaaaa / Lip_Wise

Orchestrating AI for stunning lip-synced videos. Effortless workflow, exceptional results, all in one place.

CUDA problem help please #16

Closed vikolaz closed 2 months ago

vikolaz commented 2 months ago

It gives me this error; I should already have the CUDA toolkit installed.

```
To create a public link, set `share=True` in `launch()`.
ERROR:    Exception in ASGI application
Traceback (most recent call last):
  File "D:\_A.I._2024\Lip_Wise-main\Lip_Wise-main\.lip-wise\lib\site-packages\uvicorn\protocols\http\httptools_impl.py", line 411, in run_asgi
    result = await app(  # type: ignore[func-returns-value]
  File "D:\_A.I._2024\Lip_Wise-main\Lip_Wise-main\.lip-wise\lib\site-packages\uvicorn\middleware\proxy_headers.py", line 69, in __call__
    return await self.app(scope, receive, send)
  File "D:\_A.I._2024\Lip_Wise-main\Lip_Wise-main\.lip-wise\lib\site-packages\fastapi\applications.py", line 1054, in __call__
    await super().__call__(scope, receive, send)
  File "D:\_A.I._2024\Lip_Wise-main\Lip_Wise-main\.lip-wise\lib\site-packages\starlette\applications.py", line 123, in __call__
    await self.middleware_stack(scope, receive, send)
  File "D:\_A.I._2024\Lip_Wise-main\Lip_Wise-main\.lip-wise\lib\site-packages\starlette\middleware\errors.py", line 186, in __call__
    raise exc
  File "D:\_A.I._2024\Lip_Wise-main\Lip_Wise-main\.lip-wise\lib\site-packages\starlette\middleware\errors.py", line 164, in __call__
    await self.app(scope, receive, _send)
  File "D:\_A.I._2024\Lip_Wise-main\Lip_Wise-main\.lip-wise\lib\site-packages\gradio\route_utils.py", line 707, in __call__
    await self.app(scope, receive, send)
  File "D:\_A.I._2024\Lip_Wise-main\Lip_Wise-main\.lip-wise\lib\site-packages\starlette\middleware\exceptions.py", line 65, in __call__
    await wrap_app_handling_exceptions(self.app, conn)(scope, receive, send)
  File "D:\_A.I._2024\Lip_Wise-main\Lip_Wise-main\.lip-wise\lib\site-packages\starlette\_exception_handler.py", line 64, in wrapped_app
    raise exc
  File "D:\_A.I._2024\Lip_Wise-main\Lip_Wise-main\.lip-wise\lib\site-packages\starlette\_exception_handler.py", line 53, in wrapped_app
    await app(scope, receive, sender)
  File "D:\_A.I._2024\Lip_Wise-main\Lip_Wise-main\.lip-wise\lib\site-packages\starlette\routing.py", line 756, in __call__
    await self.middleware_stack(scope, receive, send)
  File "D:\_A.I._2024\Lip_Wise-main\Lip_Wise-main\.lip-wise\lib\site-packages\starlette\routing.py", line 776, in app
    await route.handle(scope, receive, send)
  File "D:\_A.I._2024\Lip_Wise-main\Lip_Wise-main\.lip-wise\lib\site-packages\starlette\routing.py", line 297, in handle
    await self.app(scope, receive, send)
  File "D:\_A.I._2024\Lip_Wise-main\Lip_Wise-main\.lip-wise\lib\site-packages\starlette\routing.py", line 77, in app
    await wrap_app_handling_exceptions(app, request)(scope, receive, send)
  File "D:\_A.I._2024\Lip_Wise-main\Lip_Wise-main\.lip-wise\lib\site-packages\starlette\_exception_handler.py", line 64, in wrapped_app
    raise exc
  File "D:\_A.I._2024\Lip_Wise-main\Lip_Wise-main\.lip-wise\lib\site-packages\starlette\_exception_handler.py", line 53, in wrapped_app
    await app(scope, receive, sender)
  File "D:\_A.I._2024\Lip_Wise-main\Lip_Wise-main\.lip-wise\lib\site-packages\starlette\routing.py", line 75, in app
    await response(scope, receive, send)
  File "D:\_A.I._2024\Lip_Wise-main\Lip_Wise-main\.lip-wise\lib\site-packages\starlette\responses.py", line 352, in __call__
    await send(
  File "D:\_A.I._2024\Lip_Wise-main\Lip_Wise-main\.lip-wise\lib\site-packages\starlette\_exception_handler.py", line 50, in sender
    await send(message)
  File "D:\_A.I._2024\Lip_Wise-main\Lip_Wise-main\.lip-wise\lib\site-packages\starlette\_exception_handler.py", line 50, in sender
    await send(message)
  File "D:\_A.I._2024\Lip_Wise-main\Lip_Wise-main\.lip-wise\lib\site-packages\starlette\middleware\errors.py", line 161, in _send
    await send(message)
  File "D:\_A.I._2024\Lip_Wise-main\Lip_Wise-main\.lip-wise\lib\site-packages\uvicorn\protocols\http\httptools_impl.py", line 549, in send
    raise RuntimeError("Response content shorter than Content-Length")
RuntimeError: Response content shorter than Content-Length
Traceback (most recent call last):
  File "D:\_A.I._2024\Lip_Wise-main\Lip_Wise-main\.lip-wise\lib\site-packages\gradio\queueing.py", line 527, in process_events
    response = await route_utils.call_process_api(
  File "D:\_A.I._2024\Lip_Wise-main\Lip_Wise-main\.lip-wise\lib\site-packages\gradio\route_utils.py", line 270, in call_process_api
    output = await app.get_blocks().process_api(
  File "D:\_A.I._2024\Lip_Wise-main\Lip_Wise-main\.lip-wise\lib\site-packages\gradio\blocks.py", line 1847, in process_api
    result = await self.call_function(
  File "D:\_A.I._2024\Lip_Wise-main\Lip_Wise-main\.lip-wise\lib\site-packages\gradio\blocks.py", line 1433, in call_function
    prediction = await anyio.to_thread.run_sync(
  File "D:\_A.I._2024\Lip_Wise-main\Lip_Wise-main\.lip-wise\lib\site-packages\anyio\to_thread.py", line 56, in run_sync
    return await get_async_backend().run_sync_in_worker_thread(
  File "D:\_A.I._2024\Lip_Wise-main\Lip_Wise-main\.lip-wise\lib\site-packages\anyio\_backends\_asyncio.py", line 2144, in run_sync_in_worker_thread
    return await future
  File "D:\_A.I._2024\Lip_Wise-main\Lip_Wise-main\.lip-wise\lib\site-packages\anyio\_backends\_asyncio.py", line 851, in run
    result = context.run(func, *args)
  File "D:\_A.I._2024\Lip_Wise-main\Lip_Wise-main\.lip-wise\lib\site-packages\gradio\utils.py", line 805, in wrapper
    response = f(*args, **kwargs)
  File "D:\_A.I._2024\Lip_Wise-main\Lip_Wise-main\.lip-wise\lib\site-packages\torch\utils\_contextlib.py", line 115, in decorate_context
    return func(*args, **kwargs)
  File "D:\_A.I._2024\Lip_Wise-main\Lip_Wise-main\infer.py", line 58, in infer_image
    free_memory = torch.cuda.mem_get_info()[0]
  File "D:\_A.I._2024\Lip_Wise-main\Lip_Wise-main\.lip-wise\lib\site-packages\torch\cuda\memory.py", line 661, in mem_get_info
    device = torch.cuda.current_device()
  File "D:\_A.I._2024\Lip_Wise-main\Lip_Wise-main\.lip-wise\lib\site-packages\torch\cuda\__init__.py", line 769, in current_device
    _lazy_init()
  File "D:\_A.I._2024\Lip_Wise-main\Lip_Wise-main\.lip-wise\lib\site-packages\torch\cuda\__init__.py", line 289, in _lazy_init
    raise AssertionError("Torch not compiled with CUDA enabled")
AssertionError: Torch not compiled with CUDA enabled
```

pawansharmaaaa commented 2 months ago

Install the pytorch-gpu module from here https://pytorch.org/get-started/locally/
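For reference, the selector on that page generates the install command for you; choosing pip / Windows / CUDA 11.8 produces something along these lines (the exact index URL and CUDA version depend on your selection):

```
pip install torch torchvision torchaudio --index-url https://download.pytorch.org/whl/cu118
```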

vikolaz commented 2 months ago

Hi, thanks for the response. I've actually tried several times to install the pytorch-gpu module from https://pytorch.org/get-started/locally/ with `conda install pytorch torchvision torchaudio pytorch-cuda=12.1 -c pytorch -c nvidia`.

There is surely something I'm doing wrong; I'm not an expert. I don't know whether I ran the torch install command in the right place/folder.

If you can help further, that would be super.

Thank you!!

vikolaz commented 2 months ago

I've tried downgrading the NVIDIA CUDA Toolkit, as I had 12.1.

I have now installed the 11.8 version of the Toolkit and reinstalled the matching PyTorch.

This is what my system shows I have:

```
C:\> python -c "import torch; print('PyTorch version:', torch.__version__)"
PyTorch version: 2.3.0+cu118

C:\> python -c "import torchvision; print('Torchvision version:', torchvision.__version__)"
Torchvision version: 0.18.0+cu118

C:\> nvcc --version
nvcc: NVIDIA (R) Cuda compiler driver
Copyright (c) 2005-2022 NVIDIA Corporation
Built on Wed_Sep_21_10:41:10_Pacific_Daylight_Time_2022
Cuda compilation tools, release 11.8, V11.8.89
Build cuda_11.8.r11.8/compiler.31833905_0
```
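Note that these commands report on whichever `python` is first on the system `PATH`, while the app runs inside the `.lip-wise` virtual environment visible in the traceback, which can still contain a CPU-only wheel. One way to check the interpreter the app actually uses (activation path assumed from the traceback above):

```
D:\_A.I._2024\Lip_Wise-main\Lip_Wise-main> .lip-wise\Scripts\activate
(.lip-wise) D:\_A.I._2024\Lip_Wise-main\Lip_Wise-main> python -c "import torch; print(torch.__version__, torch.cuda.is_available())"
```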

pawansharmaaaa commented 2 months ago

Hello, I don't think this is a problem with your CUDA toolkit; it's an issue with the PyTorch installation. Did you run the setup script?

vikolaz commented 2 months ago

Yes, I did. I also tried deleting the folder and starting a fresh setup.

pawansharmaaaa commented 2 months ago

Do one thing: activate the environment and install the GPU build of PyTorch with pip.
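Concretely, that could look like the following, assuming the setup script created a venv named `.lip-wise` in the project folder (as the traceback suggests) and the CUDA 11.8 wheels are the target:

```
cd D:\_A.I._2024\Lip_Wise-main\Lip_Wise-main
.lip-wise\Scripts\activate
pip install --force-reinstall torch torchvision torchaudio --index-url https://download.pytorch.org/whl/cu118
```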

vikolaz commented 2 months ago

Yesss that solved the issue!!

Thank you very much