Lightning-AI / pytorch-lightning

Pretrain, finetune and deploy AI models on multiple GPUs, TPUs with zero code changes.
https://lightning.ai
Apache License 2.0

Simple documentation of automatic optimization with learning rate scheduler #16564

Open turian opened 1 year ago

turian commented 1 year ago

📚 Documentation

The documentation for the learning rate scheduler is unclear. It focuses on manual optimization, and automatic optimization is buried in a note without a code example. Additionally, the documentation for configure_optimizers is not linked, and that section has no examples of configure_optimizers usage.

This has caused confusion in the past, for example:

I suggest the section begin with a simple code example using automatic optimization, explaining what happens in all four cases of frequency and interval being defined or undefined. Alternatively, the section linked above should say: "For a simple example of learning rate scheduling with automatic optimization, see the documentation for [configure_optimizers](https://pytorch-lightning.readthedocs.io/en/stable/common/lightning_module.html#configure-optimizers)"
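Something along these lines would work as a starting point (a minimal sketch; the model, optimizer, and scheduler here are placeholders):

```python
import torch
from torch import nn
import pytorch_lightning as pl


class LitModel(pl.LightningModule):
    def __init__(self):
        super().__init__()
        self.layer = nn.Linear(32, 2)

    def training_step(self, batch, batch_idx):
        x, y = batch
        loss = nn.functional.cross_entropy(self.layer(x), y)
        # With automatic optimization (the default), Lightning calls
        # optimizer.step() and scheduler.step() for us.
        return loss

    def configure_optimizers(self):
        optimizer = torch.optim.Adam(self.parameters(), lr=1e-3)
        scheduler = torch.optim.lr_scheduler.StepLR(optimizer, step_size=10)
        return {
            "optimizer": optimizer,
            "lr_scheduler": {
                "scheduler": scheduler,
                # "interval": "epoch" steps the scheduler once per epoch;
                # "interval": "step" steps it once per optimizer step.
                "interval": "epoch",
                # "frequency": N steps the scheduler every N intervals.
                "frequency": 1,
            },
        }
```

When "interval" or "frequency" is omitted, the defaults ("interval": "epoch", "frequency": 1) apply, which covers the cases where one or both are undefined.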

cc @borda

Borda commented 1 year ago

I think you have a good point, and any docs improvements are very welcome!

pranith7 commented 11 months ago

Hey @Borda, can I work on this? Thank you

Borda commented 11 months ago

> Hey @Borda, can I work on this? Thank you

Sure, go ahead, and thank you :)

Bhavay-2001 commented 3 months ago

Hi @Borda, is this issue still open? I would like to work on it. Thanks

Bhavay-2001 commented 3 months ago

Hi @Borda, could you give me some quick guidance on how to proceed? Where should I start looking? Thanks

nlgranger commented 1 week ago

Also, the manual optimization example (automatic_optimization = False) does not mention that the scheduler must be stepped manually. One could be led to think this is done automatically elsewhere.
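For illustration, a minimal sketch of manual optimization where the scheduler is stepped explicitly (the model, optimizer, and scheduler are placeholders):

```python
import torch
from torch import nn
import pytorch_lightning as pl


class LitModel(pl.LightningModule):
    def __init__(self):
        super().__init__()
        self.layer = nn.Linear(32, 2)
        # Opt out of automatic optimization.
        self.automatic_optimization = False

    def training_step(self, batch, batch_idx):
        opt = self.optimizers()
        x, y = batch
        loss = nn.functional.cross_entropy(self.layer(x), y)
        opt.zero_grad()
        self.manual_backward(loss)
        opt.step()

    def on_train_epoch_end(self):
        # With manual optimization, Lightning does NOT step the scheduler;
        # it must be stepped explicitly.
        sch = self.lr_schedulers()
        sch.step()

    def configure_optimizers(self):
        optimizer = torch.optim.Adam(self.parameters(), lr=1e-3)
        scheduler = torch.optim.lr_scheduler.StepLR(optimizer, step_size=10)
        return [optimizer], [scheduler]
```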