minimaxir / aitextgen

A robust Python tool for text-based AI training and generation using GPT-2.
https://docs.aitextgen.io
MIT License

Support for Parallel CPU execution (distributed parallelization) #46

Open mmagithub opened 4 years ago

mmagithub commented 4 years ago

Hi, just wondering whether aitextgen fine-tuning supports multi-threaded/multi-CPU (parallel/distributed) training. I understand that execution is much faster on GPUs, but when a GPU is not available, can the same performance be achieved with many CPUs? I tried running the code twice, doubling the number of CPUs, but saw no increase in performance.

Thanks
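(A minimal sanity check for the "no speedup with more CPUs" observation, assuming the standard PyTorch backend that aitextgen runs on: CPU training speed is governed by PyTorch's intra-op thread count, so extra cores only help if PyTorch is told to use them. These are plain `torch` calls, not aitextgen-specific options.)

```python
import torch

# Threads PyTorch currently uses for CPU ops; often defaults to fewer
# than the machine's core count depending on the environment.
print(torch.get_num_threads())

# Example: raise the thread count to match the physical cores available.
torch.set_num_threads(8)
```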

minimaxir commented 4 years ago

This is a ways out.

You can hack it together using Ray, but there aren't any promises.
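(A rough sketch of what the Ray hack could look like: running several independent fine-tuning jobs, e.g. different hyperparameters or data shards, as separate CPU workers. This parallelizes separate runs rather than splitting a single training job across CPUs. The Ray calls are standard; the aitextgen `train()` arguments shown are illustrative and may need adjusting to the current API.)

```python
import ray
from aitextgen import aitextgen

ray.init(num_cpus=4)  # one worker process per requested CPU

@ray.remote(num_cpus=1)
def finetune(train_file: str, output_dir: str, steps: int = 500) -> str:
    # Each worker loads its own copy of the default GPT-2 124M model
    # and fine-tunes it on the given file.
    ai = aitextgen()
    ai.train(train_file, output_dir=output_dir, num_steps=steps)
    return output_dir

# Launch four independent runs in parallel, one per CPU worker.
jobs = [finetune.remote("input.txt", f"trained_model_{i}") for i in range(4)]
print(ray.get(jobs))
```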