Closed caichuang0415 closed 8 months ago
For DiLu, all inference and training are conducted through the OpenAI API, so there's no need for local GPU resources for inference or training. Please follow the instructions in the README for guidance on how to use the API effectively.
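For anyone landing on this thread, a minimal sketch of what an API-only setup like this looks like (the model name, prompts, and function names below are placeholders for illustration, not DiLu's actual code; assumes the official `openai` Python package):

```python
import os

# DiLu-style reasoning runs entirely through remote API calls, so the
# only local work is assembling the chat messages -- no GPU involved.
def build_messages(observation: str) -> list[dict]:
    # Hypothetical prompt; DiLu's real prompts live in its repo.
    return [
        {"role": "system", "content": "You are a driving assistant."},
        {"role": "user", "content": observation},
    ]

def run_inference(observation: str) -> str:
    messages = build_messages(observation)
    # The network call only happens if a key is configured.
    if not os.environ.get("OPENAI_API_KEY"):
        return "(no API key set; skipping remote call)"
    from openai import OpenAI  # pip install openai
    client = OpenAI()
    resp = client.chat.completions.create(
        model="gpt-3.5-turbo",  # placeholder model name
        messages=messages,
    )
    return resp.choices[0].message.content

if __name__ == "__main__":
    print(run_inference("Ego vehicle at 20 m/s; lead vehicle braking."))
```

The point is that the heavy lifting happens on OpenAI's side; locally you only pay for API tokens and wall-clock time per request, not GPU hours.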
OK, thanks for your reply. I will read it in detail.
Thanks for sharing your code! But I want to know: which GPUs, and how many, do I need for inference and training? And how much time does it take?