Closed — ShenZheng2000 closed this issue 1 year ago

ShenZheng2000:
Hi, authors!

Thanks for your excellent work. I would like to know how long it takes to train your model on a single GPU (e.g., an RTX 3090 Ti).

I would also like to know whether you have tested the inference speed and FLOPs on high-resolution (e.g., 2K) images. How does your computational efficiency compare with the previous state of the art?

Author:
Thanks! Training takes around one day with the training settings specified in the code. We didn't test computational efficiency since it was outside the scope of the paper, but it would be helpful if you ran such a test and reported the results here for future reference.
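For anyone who wants to run such a test, a minimal PyTorch timing sketch is below. The `Conv2d` stands in as a hypothetical placeholder for the repository's actual model, and the 2048×1080 input is an assumed 2K resolution; swap in the real model and input size to get meaningful numbers.

```python
import time
import torch

# Hypothetical placeholder model -- replace with the actual model from this repo.
model = torch.nn.Conv2d(3, 3, kernel_size=3, padding=1)
model.eval()

# Assumed 2K-resolution input (batch=1, RGB, 1080x2048).
x = torch.randn(1, 3, 1080, 2048)

with torch.no_grad():
    model(x)  # warm-up pass so one-time setup cost isn't measured
    start = time.perf_counter()
    y = model(x)
    elapsed = time.perf_counter() - start

print(f"inference time: {elapsed:.4f} s, output shape: {tuple(y.shape)}")
```

On GPU, remember to call `torch.cuda.synchronize()` before reading the clock, since CUDA kernels launch asynchronously; averaging over many runs after several warm-up passes gives more stable numbers. FLOPs can be counted with a third-party profiler such as `thop` or `fvcore`.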