Manajit89 closed this issue 4 years ago.
Not familiar with bert-as-service but this (https://github.com/naver/biobert-pretrained/issues/7) might help.
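For readers who hit the same wall: the usual culprit discussed in that issue is a checkpoint-naming mismatch. A minimal sketch of the workaround, assuming the BioBERT archive ships its checkpoint as `model.ckpt-1000000.*` while bert-as-service expects `bert_model.ckpt.*` (both prefixes are assumptions; check the actual file names in your unpacked directory):

```python
import os
import shutil

# BioBERT releases ship their TensorFlow checkpoint under a step-numbered
# prefix, while bert-as-service looks for bert_model.ckpt.* alongside
# bert_config.json and vocab.txt. Copying the three checkpoint files under
# the expected prefix usually reconciles the two. The default prefixes below
# are assumptions -- verify them against your download.
def rename_checkpoint(model_dir,
                      src_prefix="model.ckpt-1000000",
                      dst_prefix="bert_model.ckpt"):
    """Copy the three TF checkpoint files under the prefix the server expects."""
    for suffix in (".index", ".meta", ".data-00000-of-00001"):
        src = os.path.join(model_dir, src_prefix + suffix)
        if os.path.exists(src):
            shutil.copyfile(src, os.path.join(model_dir, dst_prefix + suffix))
```

After the copy, pointing `bert-serving-start -model_dir` at the directory should pick up the renamed checkpoint.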
Thank you very much for this. Not sure how I missed that.
On 26 Feb 2020, at 12:54 PM, Jinhyuk Lee <notifications@github.com> wrote:
> Not familiar with bert-as-service but this (https://github.com/naver/biobert-pretrained/issues/7) might help.
Hi!
First of all, let me thank you for the pre-trained BERT model, which is a life saver. For my latest project, I would like to get ELMo-like contextual embeddings from BioBERT. To reduce the complexity barrier, I resorted to bert-as-service (https://github.com/hanxiao/bert-as-service), which reportedly accepts pre-trained models. However, when I try to run it I encounter strange errors, and I wonder whether the TensorFlow version is to blame. According to the bert-as-service FAQ (https://bert-as-service.readthedocs.io/en/latest/section/faq.html#can-i-use-my-own-fine-tuned-bert-model), one only needs to provide the checkpoint, the vocab file, and bert_config.json, all of which are present in the BioBERT directory. Despite that, I see the following errors:
Am I missing something? Is there an easier alternative for getting the embeddings without having to set up the whole code ecosystem?
Thanks in advance, Manajit
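As a quick sanity check for the setup described above, the three artifacts bert-as-service asks for can be verified before launching the server. A minimal sketch; the checkpoint index name used here is an assumption and may differ per BioBERT release:

```python
import os

# Artifacts bert-as-service expects in the model directory, per its FAQ.
# "bert_model.ckpt.index" stands in for the checkpoint and is an assumption;
# match it to the actual checkpoint prefix in your download.
REQUIRED = ("bert_config.json", "vocab.txt", "bert_model.ckpt.index")

def missing_files(model_dir):
    """Return the required files that are absent from model_dir."""
    return [name for name in REQUIRED
            if not os.path.exists(os.path.join(model_dir, name))]
```

If the returned list is empty, the directory at least satisfies the documented requirements, and remaining errors are more likely version-related.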