xingyizhou / pytorch-pose-hg-3d

PyTorch implementation for 3D human pose estimation
GNU General Public License v3.0

Training time question #46

Closed dihuangcode closed 5 years ago

dihuangcode commented 5 years ago

Hi, thanks for your awesome work! @xingyizhou I'm trying to reproduce your experiments. I rewrote part of your main function, but didn't change your dataset or model. I find training quite slow (more than two days on a single TITAN Xp for stage 2), so I wonder if I've made a mistake. Could you tell me how much time you spent training stage 2 and stage 3? :)

xingyizhou commented 5 years ago

Hi, thanks for your interest in our work! As I remember, one epoch of stage 2 (~8k iterations) takes about 30 minutes, so training should take about 15 hours. First, make sure you are using Python 2 with PyTorch 0.1.12 for this repo. I have observed some slowdown on higher versions without modification. If that doesn't help, can you measure the data loading time by modifying train.py with the following lines:

import time
  # xxx
  data_time, batch_time = AverageMeter(), AverageMeter()
  end = time.time()
  for i, batch in enumerate(data_loader):
    data_time.update(time.time() - end)
    # xxx (load data, forward, and optimize) 
    batch_time.update(time.time() - end)
    end = time.time()
    if not opt.hide_data_time:
      time_str = ' |Data {dt.avg:.3f}s({dt.val:.3f}s)' \
                 ' |Net {bt.avg:.3f}s'.format(dt=data_time, bt=batch_time)
    else:
      time_str = ''
    Bar.suffix = '{split}: [{0}][{1}/{2}] |Total {total:} |ETA {eta:}' \
                 ' |Loss {loss.avg:.5f} |Acc {Acc.avg:.4f}' \
                 '{time_str}'.format(epoch, i, nIters, total=bar.elapsed_td,
                                     eta=bar.eta_td, loss=Loss, Acc=Acc,
                                     split=split, time_str=time_str)
dihuangcode commented 5 years ago

Thanks for your reply! I'll try it :+1: