ying09 / TextFuseNet

A PyTorch implementation of "TextFuseNet: Scene Text Detection with Richer Fused Features".
MIT License

How to run inference on a multi-GPU machine? #64

Closed 25b3nk closed 2 years ago

25b3nk commented 3 years ago

I am trying to run inference on my machine with two GPUs, but the demo script uses only a single GPU and runs out of memory. Please let me know how to configure the demo script to run on a multi-GPU setup.

ying09 commented 2 years ago

Sorry, we only ran single-GPU inference in our research. If you want to run multi-GPU inference in your engineering project, you can refer to the training process and implement it yourself.
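
For reference, a minimal sketch of one way to do this (not the repo's own demo script): shard the test images across the available GPUs and run an independent predictor per device with `torch.multiprocessing`. This assumes the Detectron2-style `get_cfg`/`DefaultPredictor` API that TextFuseNet builds on; the config, weights, and image paths below are placeholders to adapt to your setup.

```python
# Hypothetical sketch: data-parallel inference by sharding images across GPUs,
# one process and one predictor per device. Paths are placeholders.
import glob

import cv2
import torch
import torch.multiprocessing as mp

from detectron2.config import get_cfg
from detectron2.engine import DefaultPredictor


def worker(gpu_id, shards, config_file, weights_file):
    # Build one predictor per process, pinned to its own GPU via MODEL.DEVICE.
    cfg = get_cfg()
    cfg.merge_from_file(config_file)
    cfg.MODEL.WEIGHTS = weights_file
    cfg.MODEL.DEVICE = f"cuda:{gpu_id}"
    predictor = DefaultPredictor(cfg)

    for path in shards[gpu_id]:
        image = cv2.imread(path)
        outputs = predictor(image)  # one image per forward pass, as in the demo script
        print(f"GPU {gpu_id}: {path} -> {len(outputs['instances'])} detections")


if __name__ == "__main__":
    # Placeholder paths: point these at your own config, checkpoint, and test images.
    config_file = "path/to/textfusenet_config.yaml"
    weights_file = "path/to/model_final.pth"
    images = sorted(glob.glob("path/to/test_images/*.jpg"))

    num_gpus = torch.cuda.device_count()
    shards = [images[i::num_gpus] for i in range(num_gpus)]
    mp.spawn(worker, args=(shards, config_file, weights_file), nprocs=num_gpus)
```

Each process holds its own copy of the model, so this splits the workload rather than a single forward pass; it avoids the single-GPU out-of-memory issue only if one model fits on one GPU.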
