jerryji1993 / DNABERT

DNABERT: pre-trained Bidirectional Encoder Representations from Transformers model for DNA-language in genome
https://doi.org/10.1093/bioinformatics/btab083
Apache License 2.0

pre-training perplexity #74

Open gianfilippo opened 2 years ago

gianfilippo commented 2 years ago

Hi,

I am pre-training your model and would like to see the perplexity plot/values from your pre-training stage, if possible, so I can gauge how my run is doing. If the pre-training perplexity history is unavailable, the final perplexity would also be fine. I would expect that, given the same model, I should reach a comparable perplexity.

Thanks

phmin commented 6 months ago

May I know the final perplexity result? I would like to get an idea of the approximate value. Thank you.
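For reference, perplexity in masked-language-model pre-training is typically reported as the exponential of the per-token cross-entropy loss logged at evaluation time. Below is a minimal sketch of that conversion; the `eval_loss` value of 1.2 is a hypothetical placeholder, not a number from the DNABERT pre-training runs.

```python
import math

def perplexity_from_loss(eval_loss: float) -> float:
    """Convert a per-token masked-LM cross-entropy loss to perplexity."""
    return math.exp(eval_loss)

# Hypothetical example: an evaluation loss of 1.2 corresponds to
# a perplexity of exp(1.2) ~= 3.32.
print(perplexity_from_loss(1.2))
```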