[Open] Issue opened by Sachin-Bhat, 1 month ago
Hello @Sachin-Bhat
Your RTX 3060 GPU might be running out of memory due to the high batch size during inference. To resolve this, try reducing the batch size using the --batch-size parameter.
For example, start with a batch size of 16:
python pdf_extract.py --pdf assets/examples/example.pdf --batch-size 16
If you still encounter memory issues, further reduce the batch size:
python pdf_extract.py --pdf assets/examples/example.pdf --batch-size 8
python pdf_extract.py --pdf assets/examples/example.pdf --batch-size 4
This should help manage the GPU memory usage better.
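For a script that has to run on machines with varying VRAM, the batch-size advice above can be automated with a retry loop that halves the batch size after each out-of-memory failure. The sketch below is hypothetical: `run_inference` stands in for the project's actual inference call, and `MemoryError` stands in for whatever OOM exception the framework raises (e.g. `torch.cuda.OutOfMemoryError` in PyTorch).

```python
# Hypothetical fallback loop: try batch sizes 16, 8, 4, ... until one fits.
# `run_inference` and the exception type are stand-ins, not this project's API.
def run_with_backoff(run_inference, batch_size=16, min_batch_size=1):
    """Retry inference with a halved batch size on each OOM failure."""
    while batch_size >= min_batch_size:
        try:
            return run_inference(batch_size)
        except MemoryError:  # stand-in for the framework's OOM exception
            batch_size //= 2
    raise RuntimeError("Out of memory even at the minimum batch size")


# Usage sketch with a fake inference call that only "fits" at batch size <= 4:
def fake_inference(bs):
    if bs > 4:
        raise MemoryError
    return bs

print(run_with_backoff(fake_inference))  # → 4
```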
Hey @wangbinDL
Thank you for the prompt response. I did try what you said; however, even setting the batch size to 1 did not fix the issue, so I am unsure whether this is a bug. Please let me know if there is anything else I can try to work around it.
Cheers, Sachin
You may need more than 8 GB of VRAM to run this project.
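One way to confirm whether a machine meets such a VRAM threshold before launching the script is a small guard like the one below. This is a hedged sketch that assumes PyTorch is installed (the project's actual requirements may differ); it simply returns False when PyTorch or a CUDA device is unavailable.

```python
# Hypothetical pre-flight check: does the first CUDA device have at least
# `min_gib` GiB of total memory? Assumes PyTorch; not part of this project.
def has_enough_vram(min_gib: float = 8.0) -> bool:
    try:
        import torch
    except ImportError:
        return False  # PyTorch not installed; cannot query the GPU this way
    if not torch.cuda.is_available():
        return False  # no usable CUDA device
    total_bytes = torch.cuda.get_device_properties(0).total_memory
    return total_bytes / 2**30 >= min_gib


print(has_enough_vram(8.0))
```

On an RTX 3060 laptop GPU (typically 6 GB) this would report False for an 8 GiB threshold, which matches the symptom in this thread.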
Hello there,
First, thank you for creating this tool. I am getting the following error with CUDA and was hoping to get some assistance in debugging it. I am running an RTX 3060 GPU on my laptop, and I am currently unsure whether my device meets the minimum requirements to run this project on the GPU.
I am running Artix Linux, in case the distro information is needed.
Once again any assistance on this would be greatly appreciated.
Cheers, Sachin