titipata / allennlp-tutorial

Tutorial on AllenNLP library with demo "which journal to submit paper?"
http://bleen.seas.upenn.edu:8000/

How to use regularizer in this tutorial #3

Open kevinling0218 opened 4 years ago

kevinling0218 commented 4 years ago

I know this may look a bit lame, but it seems difficult to understand how the regularizer works in this case, as all the examples I saw online have regularizer = None.

What should I do if I want to add regularization to the LSTM layer or feedforward layer?

Many thanks

titipata commented 4 years ago

Hi @kevinling0218, I think you can add the sum of squares of your parameters to the loss (see this discussion).
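
Here is a minimal sketch of that idea in plain PyTorch (not the tutorial's exact model; the class and parameter names below are hypothetical): the L2 penalty on the LSTM and feedforward weights is computed inside forward and added to the cross-entropy loss.

```python
# Sketch only: a hypothetical classifier (embeddings -> LSTM -> feedforward)
# that adds an L2 penalty on selected weights to its loss.
import torch
import torch.nn as nn


class JournalClassifier(nn.Module):
    def __init__(self, vocab_size, embed_dim, hidden_dim, num_classes, l2_alpha=0.01):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, embed_dim)
        self.lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
        self.feedforward = nn.Linear(hidden_dim, num_classes)
        self.l2_alpha = l2_alpha  # regularization strength (assumed value)

    def forward(self, tokens, labels=None):
        embedded = self.embedding(tokens)
        _, (hidden, _) = self.lstm(embedded)
        logits = self.feedforward(hidden[-1])

        output = {"logits": logits}
        if labels is not None:
            loss = nn.functional.cross_entropy(logits, labels)
            # L2 regularization: sum of squared weights of the LSTM and
            # feedforward layers, added to the loss.
            l2_penalty = sum(
                (param ** 2).sum()
                for name, param in self.named_parameters()
                if "weight" in name and ("lstm" in name or "feedforward" in name)
            )
            output["loss"] = loss + self.l2_alpha * l2_penalty
        return output
```

AllenNLP also ships a `RegularizerApplicator` (in `allennlp.nn.regularizers`) that matches parameter names by regex and applies an `L2Regularizer`; if you pass it as the model's `regularizer` argument, the trainer adds the penalty to the loss for you. The exact usage depends on your AllenNLP version, so check the docs for the release you have installed.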