pytorch / ignite

High-level library to help with training and evaluating neural networks in PyTorch flexibly and transparently.
https://pytorch-ignite.ai
BSD 3-Clause "New" or "Revised" License

How should I use global_step_from_engine #3257

Closed H4dr1en closed 4 weeks ago

H4dr1en commented 4 weeks ago

❓ Questions/Help/Support

Hi, I am currently logging plots during the validation step and I use the evaluator.state.epoch to identify the epoch.

This evaluator.state.epoch is always 1, but I want it to be the same as the trainer.state.epoch obviously. I am looking for a simple fix. I saw the function global_step_from_engine but it's unclear to me from the function docstring and its implementation, how I am supposed to use it. So my questions are:

  1. Can I use global_step_from_engine to fix my issue, if yes, how?
  2. If not, what is the proper way of doing it?

Thank you!

vfdev-5 commented 4 weeks ago

Hi @H4dr1en , you can take a look at this example in the docs: https://pytorch.org/ignite/v0.5.0.post2/generated/ignite.handlers.tensorboard_logger.html#ignite.handlers.tensorboard_logger.OutputHandler

I agree it may be helpful to update the docs of the method: https://pytorch.org/ignite/v0.5.0.post2/generated/ignite.handlers.global_step_from_engine.html
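Conceptually, global_step_from_engine(trainer) just builds a callable that, whenever a logger (or any handler) asks for the current global step, reads it from the trainer's state instead of the evaluator's. A stdlib-only sketch of that mechanism, using stub State/Engine classes rather than ignite's real implementation:

```python
# Stubs standing in for ignite's Engine/state; NOT the real ignite API.
class State:
    def __init__(self):
        self.epoch = 0

class Engine:
    def __init__(self):
        self.state = State()

def global_step_from_engine(engine):
    # Returns a transform that loggers call as (caller_engine, event_name);
    # it ignores both arguments and reads the step from the wrapped engine.
    def transform(_caller_engine, _event_name):
        return engine.state.epoch
    return transform

trainer = Engine()
evaluator = Engine()
trainer.state.epoch = 7
evaluator.state.epoch = 1  # the evaluator restarts at epoch 1 on every run

step_fn = global_step_from_engine(trainer)
print(step_fn(evaluator, "EPOCH_COMPLETED"))  # 7, the trainer's epoch
```

This is why the TensorBoard OutputHandler in the linked docs takes a global_step_transform argument: it lets validation metrics be plotted against the trainer's step rather than the evaluator's always-restarting one.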

H4dr1en commented 4 weeks ago

Hi @vfdev-5 ,

It is still not clear to me how I can use it; currently I am doing:

def draw_confidences(engine):
    # .... draw figure
    plt.savefig(f"confidences_iter_{engine.state.epoch}")

evaluator.add_event_handler(Events.EPOCH_COMPLETED, draw_confidences)

With global_step_from_engine, if I understand correctly it would become:

def draw_confidences(engine, global_step_transform):
    # .... draw figure
    # Not clear to me how I can get the epoch from global_step_transform here
    epoch = global_step_transform()
    plt.savefig(f"confidences_iter_{epoch}")

evaluator.add_event_handler(Events.EPOCH_COMPLETED, draw_confidences, global_step_from_engine(trainer, Events.EPOCH_COMPLETED))

Thanks in advance 👍

vfdev-5 commented 4 weeks ago

In this case, you can directly pass the trainer to the handler:

def draw_confidences(evaluator, trainer):
    # .... draw figure
    # read the epoch from the trainer, not from the evaluator
    epoch = trainer.state.epoch
    plt.savefig(f"confidences_iter_{epoch}")

evaluator.add_event_handler(Events.EPOCH_COMPLETED, draw_confidences, trainer)

Let me know if this does (or does not) work for your use case.

H4dr1en commented 4 weeks ago

I see - yes that would work 👍

What's the use case of global_step_from_engine then? If I understood correctly, one can always pass the engine (in this case, the trainer) to the handler and read its epoch property from that reference later, right?

vfdev-5 commented 4 weeks ago

What's the use case of global_step_from_engine then?

Typical usage is with loggers like TensorBoard logger (https://pytorch.org/ignite/v0.5.0.post2/generated/ignite.handlers.tensorboard_logger.html#ignite.handlers.tensorboard_logger.OutputHandler)

If I understood correctly, one can always pass the engine (in this case, the trainer) to store its reference to later get the epoch property, or?

Yes, that's correct. In your case you manually crafted the handler, so you can either keep a reference to the trainer or accept a global_step_transform function, as OutputHandler does, to fetch the step value.
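To illustrate the second option, here is a sketch of a hand-written handler that accepts a global_step_transform and calls it the same way ignite's OutputHandler does, i.e. as transform(engine, event_name). The State/Engine stubs below are for illustration only, not ignite's real classes:

```python
# Stubs for illustration; NOT ignite's real classes.
class State:
    def __init__(self):
        self.epoch = 0

class Engine:
    def __init__(self):
        self.state = State()

def global_step_from_engine(engine):
    # Wraps another engine so the step is read from it, not from the caller.
    def transform(_caller_engine, _event_name):
        return engine.state.epoch
    return transform

trainer = Engine()
evaluator = Engine()
trainer.state.epoch = 3

def draw_confidences(engine, global_step_transform):
    # Call the transform the way OutputHandler does: (engine, event_name).
    epoch = global_step_transform(engine, "EPOCH_COMPLETED")
    return f"confidences_iter_{epoch}"  # stand-in for plt.savefig(...)

step_from_trainer = global_step_from_engine(trainer)
print(draw_confidences(evaluator, step_from_trainer))  # confidences_iter_3
```

Either approach yields the trainer's epoch; the transform version just matches the convention the built-in logger handlers already use.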