dfdazac / blp

"Inductive Entity Representations from Text via Link Prediction" @ The Web Conference 2021
MIT License

Is the pre-trained model for Wikidata5M available? #4

Closed xhh232018 closed 2 years ago

xhh232018 commented 2 years ago

Hello Daniel, I am very impressed by your work. I am trying to reproduce the link prediction experiment on Wikidata5M, but I found that the full training process takes nearly 5 days on my RTX A5000 GPU, which is quite time-consuming. Therefore, I am wondering if your pre-trained model is open-sourced.

dfdazac commented 2 years ago

Hi, thank you for your interest! I'm currently organizing all the files to upload pretrained models to the repository for all datasets. I'm hoping this will be ready in the next week.

dfdazac commented 2 years ago

@xhh232018 I have uploaded pretrained models for most of the model and dataset combinations. Please see this updated section of the readme.
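
Once a checkpoint has been downloaded, a quick way to verify it is to load it with PyTorch and list its parameters. This is only a minimal sketch: the file name below is a placeholder, and it assumes the checkpoint stores a flat state dict; the actual file names and contents are those listed in the readme section linked above.

```python
import torch

# Placeholder file name; substitute the checkpoint actually downloaded
# from the repository's pretrained-models section.
checkpoint = torch.load("blp-wikidata5m.pt", map_location="cpu")

# If the checkpoint is a flat state dict, print each parameter tensor
# and its shape to confirm the file downloaded correctly.
for name, tensor in checkpoint.items():
    print(name, tuple(tensor.shape))
```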