Closed udion closed 6 years ago
Hi, I see one or two possible explanations: first, that the step
passed to the log_value
function was reset to zero at some point; second (not sure it's actually possible), that multiple runs are being recorded in the same folder, so their steps also start from one. You can switch to the relative and wall-time displays (grey buttons on the left) to check.
I checked, and the step is not becoming zero. However, the --logdir
that I passed does contain multiple events files; when I place only one file in a directory, it shows something that makes sense.
Thanks
Great that you solved it! Yes, this issue with multiple runs going into the same directory is something that bites me from time to time as well.
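A minimal sketch of the fix discussed above: give each training run its own events directory before logging, so steps from different runs never interleave in one folder. The timestamped naming convention and the `unique_run_dir` helper are illustrative assumptions, not part of tensorboard_logger itself; `configure` and `log_value` are its documented entry points.

```python
import os
import time

def unique_run_dir(base="runs"):
    # Hypothetical helper: create a fresh directory per run so that
    # event files from separate runs never mix in one --logdir folder.
    path = os.path.join(base, time.strftime("run-%Y%m%d-%H%M%S"))
    os.makedirs(path, exist_ok=True)
    return path

# Then point tensorboard_logger at that directory once per run:
#   from tensorboard_logger import configure, log_value
#   configure(unique_run_dir())
#   for step, loss in enumerate(losses):
#       log_value("loss", loss, step)
```

With one events file per directory, the steps on the x-axis increase monotonically and the backward-going lines disappear.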
Hi, I recorded values of a loss function and various components in PyTorch. When displaying the graph using
tensorboard_logger
, where the x-axis is steps, I get graphs going backward along the x-axis (i.e., at a particular step I have multiple values of the loss function). How is this even possible? What am I interpreting wrong? Please correct me if I am missing something.