msight-tech / research-charnet

CharNet: Convolutional Character Networks

speed up inference #22

Open Yakirbe opened 4 years ago

Yakirbe commented 4 years ago

Hello,

How can I speed up inference time for CharNet as run by tools/test_net.py? I tried to apply torch.multiprocessing, but it failed due to the lambdas in the model definition (they cannot be pickled, which multiprocessing requires).
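For reference, here is a toy repro of the failure (not the actual CharNet model or test_net.py, just an illustrative module with a lambda attribute, using the spawn start method):

```python
import torch
import torch.nn as nn
import torch.multiprocessing as mp


class ToyModel(nn.Module):
    """Stand-in for CharNet: a module that stores a lambda as an attribute."""
    def __init__(self):
        super().__init__()
        self.conv = nn.Conv2d(3, 8, 3, padding=1)
        # A lambda attribute like this is what breaks pickling.
        self.up = lambda x: nn.functional.interpolate(x, scale_factor=2)

    def forward(self, x):
        return self.up(self.conv(x))


def worker(model):
    with torch.no_grad():
        print(model(torch.randn(1, 3, 32, 32)).shape)


if __name__ == "__main__":
    mp.set_start_method("spawn", force=True)
    model = ToyModel().eval()
    p = mp.Process(target=worker, args=(model,))
    p.start()  # fails here with a pickling error pointing at the lambda
    p.join()
```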

Any pointers on how to run batched prediction with CharNet would help a lot here.

Thanks!

hongzhenwang commented 4 years ago

Replace the lambdas with calls to torch.nn.functional.
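Roughly like this (a minimal sketch, not the actual hourglass code; the module and channel names are made up):

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class WithLambdas(nn.Module):
    """Pattern that breaks pickling: lambdas stored on the module."""
    def __init__(self, channels):
        super().__init__()
        self.bn = nn.BatchNorm2d(channels)
        self.norm = lambda x: self.bn(x)
        self.up = lambda x: F.interpolate(x, scale_factor=2, mode="bilinear",
                                          align_corners=False)

    def forward(self, x):
        return self.up(self.norm(x))


class WithoutLambdas(nn.Module):
    """Same computation, but picklable: call torch.nn.functional in forward."""
    def __init__(self, channels):
        super().__init__()
        self.bn = nn.BatchNorm2d(channels)

    def forward(self, x):
        x = self.bn(x)
        return F.interpolate(x, scale_factor=2, mode="bilinear",
                             align_corners=False)
```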

Yakirbe commented 4 years ago

Hi @hongzhenwang, where exactly? I see two lambdas in the forward pass of the hourglass backbone: the first is for batch norm, and the other is for input upsampling.

Thanks!

Tetsujinfr commented 4 years ago

Any update on this question regarding inference speed-up? By the way, the runtime seems to scale linearly with the number of detected text instances. Is that something you are considering improving? Thanks a lot.
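In case it is useful while waiting: if the per-instance cost is dominated by running the recognition branch once per detected word (an assumption on my part, I have not profiled it), the usual remedy is to batch the crops instead of looping over them one at a time. A rough sketch with a hypothetical recognizer module:

```python
import torch


def recognize_in_batches(recognizer, crops, batch_size=32):
    # `recognizer` is any nn.Module that accepts a (B, C, H, W) tensor of
    # equally sized word crops; `crops` is a list of (C, H, W) tensors.
    outputs = []
    with torch.no_grad():
        for i in range(0, len(crops), batch_size):
            batch = torch.stack(crops[i:i + batch_size])
            outputs.append(recognizer(batch))
    return torch.cat(outputs)
```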