Neph0s / LMKE

Code for the paper 'Language Models as Knowledge Embeddings'

Question about training LMKE #8

Closed handsomelys closed 1 year ago

handsomelys commented 1 year ago

Hello. I have a question about the GPUs you used for training in your experiments. Also, how long did it take to train the model on WN18RR and FB15K-237? I would like to follow your work, but I am not sure whether my resources can support it.

Neph0s commented 1 year ago

Every experiment is conducted on a single RTX 3090. Typically, most experiments can be finished in 2-3 days.
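If it helps to judge whether your own hardware is comparable, here is a minimal PyTorch sketch (not part of the LMKE codebase; it assumes a standard PyTorch CUDA setup) that reports the visible GPU and its memory. An RTX 3090 has 24 GB of VRAM, so a card with noticeably less may require a smaller batch size.

```python
import torch

if torch.cuda.is_available():
    # Inspect the first visible CUDA device.
    props = torch.cuda.get_device_properties(0)
    total_gb = props.total_memory / 1024 ** 3
    print(f"GPU: {props.name}, memory: {total_gb:.1f} GB")
else:
    print("No CUDA device found; training on CPU would be impractical.")
```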