bytedance / dplm

Official Implementation of DPLM (ICML'24) - Diffusion Language Models Are Versatile Protein Learners
https://bytedance.github.io/dplm/
Apache License 2.0

Question regarding the time taken for training and the type of compute used #14

Closed: haresh121 closed this issue 1 month ago

haresh121 commented 1 month ago

Hi, can you please answer the following questions:

  1. How long does it take to train DPLM on the datasets mentioned?
  2. What are the memory requirements, and what type of GPU did you use for training? (A small measurement sketch follows below to clarify what I mean by memory.)
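
To clarify question 2, by memory requirements I roughly mean the peak allocated GPU memory during a training step. Here is a minimal PyTorch sketch of how I would measure that; it is illustrative only and not from the DPLM codebase, and `model`, `optimizer`, and `batch` are placeholders:

```python
import torch

def train_step_with_memory_log(model, optimizer, batch, device="cuda"):
    """Run one training step and report peak allocated GPU memory.

    Illustrative sketch: `model`, `optimizer`, and `batch` are placeholders,
    not objects from the DPLM codebase.
    """
    # Reset the peak-memory counter so we measure only this step.
    torch.cuda.reset_peak_memory_stats(device)

    optimizer.zero_grad()
    loss = model(**batch)   # placeholder: assumes the model returns a scalar loss
    loss.backward()
    optimizer.step()

    # Peak memory allocated by tensors on this device since the reset, in GiB.
    peak_gib = torch.cuda.max_memory_allocated(device) / 1024 ** 3
    print(f"peak GPU memory this step: {peak_gib:.2f} GiB")
    return loss.item()
```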
wxy-nlp commented 1 month ago

Hi @haresh121, sorry for the late reply. The time and memory costs during training are as follows: 1) DPLM 150M: