macournoyer / neuralconvo

Neural conversational model in Torch
777 stars 346 forks

Perplexity calculation #67

Open BranaHi opened 7 years ago

BranaHi commented 7 years ago

I'm wondering whether the perplexity calculation is correct. You compute the total loss for each batch, divide it by the (average) sequence length of that batch, and at the end average over all batches. As far as I understand, the sum of all batch losses and the total number of target tokens should instead be accumulated separately and divided only after the epoch is done. Could you explain why you calculate it this way? Is it just an approximation? Also, the perplexities computed this way seem rather small. Thanks!
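To make the difference concrete, here is a minimal sketch (in Python rather than the repo's Torch/Lua, and with made-up per-batch numbers) contrasting the two ways of aggregating: averaging the per-batch average losses, versus accumulating total loss and total target-token count over the whole epoch and dividing once at the end.

```python
import math

# Hypothetical per-batch statistics: (summed NLL over all target tokens, token count).
# These numbers are illustrative only, not from the actual model.
batches = [(120.0, 40), (200.0, 50), (90.0, 45)]

# Variant questioned in the issue: average each batch's mean loss, then
# average those means over batches (ignores that batches differ in size).
mean_of_batch_means = sum(loss / tokens for loss, tokens in batches) / len(batches)
ppl_batch_averaged = math.exp(mean_of_batch_means)

# Token-weighted variant: accumulate total loss and total target tokens
# across the epoch, divide once, then exponentiate.
total_loss = sum(loss for loss, _ in batches)
total_tokens = sum(tokens for _, tokens in batches)
ppl_token_weighted = math.exp(total_loss / total_tokens)

print(ppl_batch_averaged, ppl_token_weighted)
```

The two results agree only when every batch contains the same number of target tokens; with variable-length sequences the batch-averaged version is a biased approximation of the token-weighted (corpus-level) perplexity.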