arthurdouillard / incremental_learning.pytorch

A collection of incremental learning paper implementations including PODNet (ECCV20) and Ghost (CVPR-W21).
MIT License
383 stars 60 forks

Can anyone get the result of LwM? #19

Closed JoyHuYY1412 closed 3 years ago

JoyHuYY1412 commented 4 years ago

I ran `python -minclearn --model lwm --increment 20 -memory 0`, and in `lwm.py` I changed `self._attention_config = args.get("attention_config", {"factor": 1})` and set `self._distillation_config["factor"] = 1`.

However, the result is very low. Could anyone give me some advice?
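For context, the `factor` being tuned here scales LwM's attention-distillation term, which penalizes the L1 distance between the old (teacher) and new (student) models' attention maps. Below is a minimal sketch of that loss in PyTorch; the function name and the exact normalization are my assumptions for illustration, not the repo's actual code (LwM's paper derives its maps via Grad-CAM, which is omitted here):

```python
import torch
import torch.nn.functional as F

def attention_distillation_loss(student_maps, teacher_maps, factor=1.0):
    # Hypothetical sketch of an LwM-style L_AD term: L1 distance between
    # L2-normalized attention maps of shape (batch, H, W).
    s = F.normalize(student_maps.flatten(1), p=2, dim=1)
    t = F.normalize(teacher_maps.flatten(1), p=2, dim=1)
    return factor * torch.abs(s - t).sum(dim=1).mean()
```

With `factor=1` as in the comment above, this term is weighted equally with the other losses; the paper's reported results are sensitive to this weighting.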

arthurdouillard commented 4 years ago

I've never been able to reproduce the results of LwM. And to be honest, I have some doubts about the paper's results, as they seem very high for a method that uses no rehearsal memory.