Closed thuzhaowang closed 4 years ago
That seems too slow, but we've never tested on that specific hardware, so it's hard to say. We are going to release faster versions of PackNet (probably in the upcoming week); they should be better suited for training under these conditions.
Thanks for your quick reply! I wonder if you have observed a large speed gap between PackNet and the ResNet-18 model in your experiments, or whether they have similar training speed?
There is a speed gap: PackNet has more parameters, and 3D convolutions are slower to compute. Still, it shouldn't be that large (I'd say about 4-5 times slower, based on what I am getting here).
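For intuition on the parameter side of that gap, here is a rough sketch of why a 3D convolution costs more than a 2D one. The layer sizes below are hypothetical, not PackNet's actual configuration:

```python
def conv2d_params(c_in, c_out, k):
    # weights only (no bias): one k x k spatial kernel per (in, out) channel pair
    return c_in * c_out * k * k

def conv3d_params(c_in, c_out, k):
    # a k x k x k kernel has k times the weights of a k x k one
    return c_in * c_out * k * k * k

# Hypothetical 64 -> 64 layer with 3x3 (resp. 3x3x3) kernels
p2d = conv2d_params(64, 64, 3)   # 36,864 weights
p3d = conv3d_params(64, 64, 3)   # 110,592 weights
print(p3d / p2d)                 # -> 3.0
```

And the compute cost grows even faster than the parameter count, since each 3D kernel is also slid over an extra depth dimension of the feature map.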
Thanks for sharing this wonderful work!
I'm trying to train the PackNet model on a single RTX 2080 card with batch_size=1 (due to limited memory), but each iteration (forward and backward) takes about 7 seconds, while DispNet only needs 0.2 s. Is this the expected training speed for the model, or is something wrong?
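One way to sanity-check that per-iteration number in isolation is to time the train step directly. A minimal sketch, where `step_fn` is a hypothetical callable that runs one forward + backward pass; on a GPU you would additionally need the framework's synchronize call before each clock read, since kernels launch asynchronously:

```python
import time

def time_iteration(step_fn, n_warmup=3, n_iters=10):
    """Return the average wall-clock seconds per call of step_fn.

    step_fn is a hypothetical train-step callable (forward + backward).
    Warm-up calls exclude one-off costs such as autotuning and caching.
    """
    for _ in range(n_warmup):
        step_fn()
    start = time.perf_counter()
    for _ in range(n_iters):
        step_fn()
    return (time.perf_counter() - start) / n_iters
```

Comparing this number for the PackNet step against the DispNet step on the same input separates model cost from data-loading cost.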