howardhsu / BERT-for-RRC-ABSA

code for our NAACL 2019 paper: "BERT Post-Training for Review Reading Comprehension and Aspect-based Sentiment Analysis"
Apache License 2.0
455 stars 110 forks

Questions about “L1-normed probe” #27

Open zjj0266 opened 2 years ago

zjj0266 commented 2 years ago

Hello @howardhsu ,

This is a very nice repo. After reading the paper "Understanding pretrained BERT4ABSA", I'm interested in the "probe". Could you please give some description of the "L1-normed probe" mentioned in the sentence "We train an L1-normed probe on hidden states of aspects and non-aspect words from both domains"?
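
For context, a common reading of "L1-normed probe" (this is my assumption, not a statement of the paper's exact setup) is a linear classifier with an L1 penalty on its weights, trained on frozen hidden states to predict aspect vs. non-aspect tokens; the L1 penalty drives most weights to zero, so the surviving dimensions indicate where the aspect information lives. A minimal sketch on synthetic hidden states (all shapes and data here are illustrative, not from the repo):

```python
# Hedged sketch: an L1-penalized linear probe on frozen "hidden states".
# The states below are synthetic stand-ins; in the paper's setting they
# would be BERT hidden vectors for aspect and non-aspect words.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
hidden_dim = 768                      # BERT-base hidden size
n_tokens = 200

# Fake hidden states: aspect tokens get a small mean shift so the
# probe has a signal to find.
labels = rng.integers(0, 2, size=n_tokens)          # 1 = aspect word
states = rng.normal(size=(n_tokens, hidden_dim))
states[labels == 1] += 0.5

# L1-penalized probe: sparsity reveals which dimensions carry the signal.
probe = LogisticRegression(penalty="l1", solver="liblinear", C=0.1)
probe.fit(states, labels)

n_nonzero = np.count_nonzero(probe.coef_)
print(f"nonzero probe weights: {n_nonzero}/{hidden_dim}")
```

The point of the L1 norm here is interpretability: with an L2 penalty every dimension gets a small weight, while L1 keeps only a sparse subset, making it easier to compare what the probe relies on across the two domains.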