novioleo / FOTS

implement FOTS and apply it on real scenario.

Adjusting Training Learning Rate #13

Closed RyanRTJJ closed 5 years ago

RyanRTJJ commented 5 years ago

Hi @novioleo,

My training seems to be going well; I can see the model performing better and better. Here's one of the better examples at 100 epochs (image: COCO_train2014_000000000081).

But my learning rate is dropping quite fast. This is my config:

    "lr_scheduler_type": "ExponentialLR",
    "lr_scheduler_freq": 50,
    "lr_scheduler": {
            "gamma": 0.94
    },

I was expecting the learning rate to drop from 1e-4 to 9e-5 (a decay of 1e-5) after 50 epochs, but by 100 epochs it had gone down to about 2.1e-07. Did I misunderstand something somewhere?
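For reference, if the scheduler multiplies the learning rate by gamma on every epoch rather than once per lr_scheduler_freq epochs, that would explain the number, since 1e-4 * 0.94**100 ≈ 2.1e-07. A minimal sketch of that behavior with stock PyTorch (my assumption about what the training loop does, not this repo's actual code):

    import torch

    # Stand-ins for the real model/optimizer, just to show the decay curve.
    model = torch.nn.Linear(4, 2)
    optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
    scheduler = torch.optim.lr_scheduler.ExponentialLR(optimizer, gamma=0.94)

    for epoch in range(100):
        # ... one epoch of training would go here ...
        scheduler.step()  # lr *= 0.94 on every call, not once per 50 epochs

    print(optimizer.param_groups[0]["lr"])  # ~1e-4 * 0.94**100 ≈ 2.1e-07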

Is there a way to resume from a checkpoint.pth.tar but with a different learning rate and decay?
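If there's no built-in way, a hedged workaround sketch (the checkpoint key names below are assumptions on my part, not this repo's confirmed format) is to load the checkpoint and then overwrite the optimizer's learning rate by hand:

    import torch

    # Stand-ins for the real model/optimizer; the key names "state_dict"
    # and "optimizer" are assumed, not checked against this repo's checkpoints.
    model = torch.nn.Linear(4, 2)
    optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)

    checkpoint = torch.load("checkpoint.pth.tar")
    model.load_state_dict(checkpoint["state_dict"])
    optimizer.load_state_dict(checkpoint["optimizer"])

    for group in optimizer.param_groups:
        group["lr"] = 1e-4  # new learning rate, overriding the saved one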

What's your recommended learning rate and decay? My training set is 12,000+ images and my validation set is 2,000+ images.

novioleo commented 5 years ago

@RyanRTJJ you can try this:

  "lr_scheduler_type": "StepLR",
  "lr_scheduler_freq": 100,
  "lr_scheduler": {
    "gamma": 0.9,
    "step_size": 100
  },

For the initial learning rate I suggest 1e-4, with a decay of 1e-5.
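A minimal sketch of how that schedule behaves with stock PyTorch, assuming the trainer steps the scheduler once per epoch and reading "decay 1e-5" as the optimizer's weight decay (both assumptions on my part):

    import torch

    model = torch.nn.Linear(4, 2)  # stand-in for the real FOTS model
    optimizer = torch.optim.Adam(model.parameters(), lr=1e-4, weight_decay=1e-5)
    scheduler = torch.optim.lr_scheduler.StepLR(optimizer, step_size=100, gamma=0.9)

    for epoch in range(200):
        # ... one epoch of training would go here ...
        scheduler.step()  # lr *= 0.9 only once every 100 epochs

    print(optimizer.param_groups[0]["lr"])  # 1e-4 * 0.9**2 = 8.1e-05

Unlike ExponentialLR, StepLR holds the learning rate flat at 1e-4 for the first 100 epochs, then drops it to 9e-5, and so on, rather than shrinking it every single epoch.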

RyanRTJJ commented 5 years ago

Okay, thank you. I'll give it a try sometime.

novioleo commented 5 years ago

@RyanRTJJ could you share your recent attempts with me?