lanpa / tensorboardX

tensorboard for pytorch (and chainer, mxnet, numpy, ...)
https://tensorboardx.readthedocs.io/en/latest/tensorboard.html
MIT License

add_hparam creates subdirectory #519

Open andreaskoelsch opened 5 years ago

andreaskoelsch commented 5 years ago

add_hparam creates a subdirectory named after `time.time()`. In TensorBoard, this shows up as an individual run. Is this intended behavior? I think it would be much cleaner to put it in the same events file, or -- if that is not possible -- at least in the same directory as the actual run.

Having add_scalars produce different runs makes sense, I guess; different behavior could be implemented using Custom Scalars. But for HParams, I do not see why it should be a separate run.
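For context, the behavior being discussed can be sketched with a pure-stdlib mock (no tensorboardX needed). TensorBoard treats every directory that contains an events file as its own run, so an extra `time.time()`-named subdirectory with its own events file shows up as a separate run next to the parent. The function and file names below are illustrative, not the library's actual internals:

```python
import os
import tempfile
import time


def mock_add_hparams(log_dir):
    # Mimic the reported behavior: each call creates a str(time.time())
    # subdirectory and writes an events file inside it.
    sub = os.path.join(log_dir, str(time.time()))
    os.makedirs(sub)
    open(os.path.join(sub, "events.out.tfevents.mock"), "w").close()
    return sub


log_dir = tempfile.mkdtemp()
# The actual run's events file lives at the top level of log_dir.
open(os.path.join(log_dir, "events.out.tfevents.mock"), "w").close()

mock_add_hparams(log_dir)
time.sleep(0.01)  # ensure a distinct timestamp for the second call
mock_add_hparams(log_dir)

# TensorBoard's run discovery: one run per directory holding an events file.
runs = sorted(
    d for d, _, files in os.walk(log_dir)
    if any(f.startswith("events") for f in files)
)
print(len(runs))  # → 3 (the parent run plus one run per call)
```

This is why two calls on the same writer appear as three runs in the UI instead of one, which is the surprise raised in this issue.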

versatran01 commented 5 years ago

Yeah, I was also wondering why add_hparam creates a subdir on every call. It's not like I will change any hyper-parameters (like batch size) during a run. Would like to hear the reason behind this.

lanpa commented 4 years ago

https://github.com/pytorch/pytorch/issues/32651#issuecomment-579621209