FacePerceiver / FaRL

FaRL for Facial Representation Learning [Official, CVPR 2022]
https://arxiv.org/abs/2112.03109
MIT License

Weird time behaviour for face parsing #16

Closed: adarc8 closed this issue 1 year ago

adarc8 commented 1 year ago

I believe the JIT (just-in-time) model loading is causing some unusual behavior. The first batch takes around 2 seconds, the second batch takes around 20 seconds, but the third and subsequent batches only take about 0.1 seconds.

Is there any information available about this issue?
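For what it's worth, asynchronous CUDA execution can make per-batch timings misleading unless you synchronize around each forward pass. Here is a minimal timing sketch; `model` and `batches` are placeholders, not part of this repo's API:

```python
import time
import torch

def time_batches(model, batches, device="cuda"):
    """Measure wall-clock time per batch with explicit CUDA synchronization."""
    model.eval()
    timings = []
    with torch.no_grad():
        for i, batch in enumerate(batches):
            batch = batch.to(device)
            torch.cuda.synchronize()   # ensure prior GPU work has finished
            start = time.perf_counter()
            _ = model(batch)
            torch.cuda.synchronize()   # wait for this batch's kernels to complete
            timings.append(time.perf_counter() - start)
            print(f"batch {i}: {timings[-1]:.3f}s")
    return timings
```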

icech commented 1 year ago

Did you solve this problem?

adarc8 commented 1 year ago

@icech Yes, it was some torch setting that optimizes performance. I think it was torch.backends.cudnn.benchmark, but I can't find it in the repo, so I'm not sure. If you debug the code while the model loads, there is some torch call that is supposed to optimize the model, but it actually causes a large delay at the beginning (that setting is probably worthwhile when training for many epochs, not for a single inference run).
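If the culprit really is torch.backends.cudnn.benchmark, the trade-off is: with benchmark enabled, cuDNN autotunes convolution kernels the first time it sees each input shape, which costs time up front but speeds up later iterations. For a one-off inference run you can disable it, or keep it and absorb the cost with a warm-up pass. A sketch under that assumption; the model and input shape below are placeholders, not FaRL code:

```python
import torch

# Assumption: the early slowdown comes from cuDNN autotuning.
# For a single inference run, disabling benchmark mode avoids the up-front tuning cost.
torch.backends.cudnn.benchmark = False

# Alternatively, keep benchmark mode on and pay the cost once with a warm-up:
# torch.backends.cudnn.benchmark = True
# model = build_face_parser().cuda().eval()            # placeholder constructor
# dummy = torch.randn(1, 3, 448, 448, device="cuda")   # placeholder input shape
# with torch.no_grad():
#     for _ in range(3):        # a few warm-up passes trigger the autotuning
#         model(dummy)
# torch.cuda.synchronize()      # timings after this point reflect steady-state speed
```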