Closed ElaheA90 closed 1 year ago
Hello,
Those were exploratory scripts that we either didn't follow through on or that didn't work; I don't remember exactly which was the case. One thing I do remember is that the RNN solution had the highest sensitivity, and that was what we were aiming for in PEPPER.
I found that you have code for an attention-based Seq2Seq model (Seq2Seq_atn.py) for PEPPER, but it seems that you did not use it. Is there a specific reason you are not using the attention-based one?