qAp / kgl_deepfake

Apache License 2.0

Limit batch size in inference kernel #2

Open JoshVarty opened 4 years ago

JoshVarty commented 4 years ago

Although it's unlikely, a given video may contain 50+ faces. To guard against this, we should ensure that our batch size never exceeds 128 during inference.
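A minimal sketch of one way to enforce this, assuming the face crops arrive as a list and the model accepts a batch of crops (the helper names `batched` and `predict_faces` and the constant `MAX_BATCH_SIZE` are hypothetical, not from this repo):

```python
# Hypothetical sketch: cap the inference batch size by splitting the
# face crops into chunks of at most MAX_BATCH_SIZE before calling the model.

MAX_BATCH_SIZE = 128

def batched(items, batch_size=MAX_BATCH_SIZE):
    """Yield successive slices of `items`, each with at most `batch_size` elements."""
    for start in range(0, len(items), batch_size):
        yield items[start:start + batch_size]

def predict_faces(faces, model):
    """Run `model` over all face crops without ever exceeding MAX_BATCH_SIZE."""
    predictions = []
    for batch in batched(faces):
        predictions.extend(model(batch))
    return predictions
```

Even if a video yields 50+ faces (or far more), each forward pass then sees at most 128 crops, keeping memory usage bounded.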