shivamsaboo17 / Overcoming-Catastrophic-forgetting-in-Neural-Networks

Elastic weight consolidation technique for incremental learning.

nas #4

Open nassimcnn opened 4 years ago

nassimcnn commented 4 years ago

What is the crit argument in ewc = ElasticWeightConsolidation(model, crit, lr=0.01, weight=0.1), and how do I build it?

ThomasAtlantis commented 10 months ago

It's short for criterion, i.e. the basic task loss function. As shown in demo.ipynb, you can build a cross-entropy loss for classification tasks: crit = nn.CrossEntropyLoss().
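For reference, here is a minimal sketch of how that argument would be wired up. The constructor signature ElasticWeightConsolidation(model, crit, lr=0.01, weight=0.1) comes from the question above; the import path and the small classifier model are assumptions made only for illustration and may differ from the repo's demo.

```python
import torch.nn as nn
# Import path is an assumption; adjust to wherever the class lives in this repo.
from elastic_weight_consolidation import ElasticWeightConsolidation

# A small classifier used purely for illustration (hypothetical architecture).
model = nn.Sequential(
    nn.Flatten(),
    nn.Linear(28 * 28, 100),
    nn.ReLU(),
    nn.Linear(100, 10),
)

# crit is just the ordinary per-task loss; cross entropy is the usual
# choice for classification.
crit = nn.CrossEntropyLoss()

# Pass it as the second argument, as in the snippet from the question.
ewc = ElasticWeightConsolidation(model, crit, lr=0.01, weight=0.1)
```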