lopuhin / transformer-lm

Transformer language model (GPT-2) with sentencepiece tokenizer
164 stars · 47 forks

Nucleus sampling (top-p) #22

Closed knok closed 4 years ago

knok commented 4 years ago

The original GPT-2 implementation includes nucleus sampling (top-p). This PR adds it; the code is adapted from huggingface/transformers.

ref: https://github.com/openai/gpt-2-output-dataset/issues/5
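For context, nucleus (top-p) sampling keeps the smallest set of tokens whose cumulative probability exceeds a threshold `p` and masks the rest before sampling. A minimal NumPy sketch of the idea (the function name and the `>` threshold convention follow the huggingface/transformers filtering helper, but this is an illustrative reimplementation, not the PR's code):

```python
import numpy as np

def top_p_filter(logits, top_p=0.9):
    """Nucleus (top-p) filtering: keep the smallest set of tokens whose
    cumulative probability exceeds top_p; set the rest to -inf."""
    # softmax over logits (shifted for numerical stability)
    probs = np.exp(logits - logits.max())
    probs /= probs.sum()
    # sort token indices by probability, descending
    order = np.argsort(probs)[::-1]
    cumulative = np.cumsum(probs[order])
    # mark tokens past the point where cumulative probability exceeds top_p;
    # shift right by one so the token that crosses the threshold is kept
    cutoff = cumulative > top_p
    cutoff[1:] = cutoff[:-1].copy()
    cutoff[0] = False
    filtered = logits.astype(float).copy()
    filtered[order[cutoff]] = -np.inf
    return filtered
```

Sampling then proceeds as usual (e.g. softmax over the filtered logits and a categorical draw), so only the high-probability "nucleus" of the distribution can be selected.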

lopuhin commented 4 years ago

Thanks for the PR @knok, it looks great on an initial look. I'd be eager to play with it and check it in more detail next week.

awfulcooking commented 4 years ago

hey, friendly bump! thanks @lopuhin @knok

lopuhin commented 4 years ago

merged in #27, thanks @knok