daifeng2016 / End-to-end-CD-for-VHR-satellite-image

The project aims to contribute to the geoscience community.

learning rate scheduler strategy #13

Open · songkq opened this issue 4 years ago

songkq commented 4 years ago

Hey, as it is said in the paper, the learning rate drops after every 5 epochs. How should I set the learning rate scheduler strategy, something like "Poly LR" or "Step LR"?

During the training process, the Adam optimizer with a learning rate of 1 × 10^−4 is applied. Based on the GPU memory, the batch size is set to 8 for 15 epochs, and the learning rate drops after every 5 epochs.
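
For reference, that description maps most naturally onto a step schedule rather than a poly one. Below is a minimal PyTorch sketch, assuming `torch.optim.lr_scheduler.StepLR`; the decay factor (`gamma=0.1`) is an assumption, since the paper only says the learning rate "drops" every 5 epochs without giving a factor, and the `nn.Conv2d` model is a placeholder for the actual change-detection network.

```python
import torch
from torch import nn, optim
from torch.optim.lr_scheduler import StepLR

# Placeholder model standing in for the change-detection network.
model = nn.Conv2d(6, 2, kernel_size=3, padding=1)

# Adam with lr = 1e-4, as stated in the paper.
optimizer = optim.Adam(model.parameters(), lr=1e-4)

# Step LR: drop the learning rate every 5 epochs.
# gamma=0.1 is an assumed decay factor, not given in the paper.
scheduler = StepLR(optimizer, step_size=5, gamma=0.1)

for epoch in range(15):  # 15 epochs; batch size 8 would be set on the DataLoader
    # ... training loop over batches goes here ...
    scheduler.step()  # step once per epoch so the LR drops after epochs 5 and 10
```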