pytorch-labs / gpt-fast

Simple and efficient PyTorch-native transformer text generation in <1000 LOC of Python.
BSD 3-Clause "New" or "Revised" License

Inference on a dataset instead of an individual prompt #77

Closed yafehlis closed 6 months ago

yafehlis commented 6 months ago

Hi: May I ask the best way to run inference on a dataset (such as the HumanEval dataset) rather than on an individual prompt? I want the model to be loaded once and reused across different prompts.
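A minimal sketch of the intended pattern: pay the model-loading cost once, then reuse the model for every prompt in the dataset. The `load_model` and `generate` helpers here are hypothetical placeholders, not gpt-fast's actual API:

```python
# Hypothetical sketch: amortize model loading across many prompts.
# load_model() and generate() are placeholder stand-ins, NOT gpt-fast's API.

def load_model():
    # Stand-in for an expensive one-time checkpoint load.
    return {"name": "dummy-model"}

def generate(model, prompt, num_samples=1):
    # Stand-in for autoregressive generation; returns num_samples
    # completions for the same prompt.
    return [f"{prompt} -> completion {i}" for i in range(num_samples)]

model = load_model()  # load once up front...

prompts = ["def add(a, b):", "def fib(n):"]  # e.g. tasks from a dataset
# ...then reuse the loaded model for every prompt.
results = {p: generate(model, p, num_samples=2) for p in prompts}
```

The same loop structure applies regardless of how generation is implemented; only the loading and generation calls would be swapped for the real ones.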

Also, what is "num_samples" in the arguments?

Thanks, Yao Fehlis (AMD)