QizhiPei / BioT5

BioT5 (EMNLP 2023) and BioT5+ (ACL 2024 Findings)
https://arxiv.org/abs/2310.07276
MIT License

Add Model Config Files to HuggingFace #2

Closed — Moocember closed this issue 11 months ago

Moocember commented 11 months ago

Thank you very much for adding this repo.

When you uploaded to HuggingFace, only the model weights were included, not the config files for the tokenizer or model architecture (there is only one file here): https://huggingface.co/QizhiPei/BioT5/tree/main/pretrained

Could you please add the config files to HuggingFace as well? For an example, see the base model you used (T5), whose repository contains key files such as `tokenizer_config.json` and `config.json`: https://huggingface.co/google/t5-v1_1-base/tree/main

Doing this isn't hard: all you have to do is call HuggingFace's `save_pretrained` method with `push_to_hub` set to `True`, i.e. `model.save_pretrained("QizhiPei/BioT5/pretrained", push_to_hub=True)`. Docs: https://huggingface.co/docs/transformers/add_new_model#transformers.PreTrainedModel.save_pretrained
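As a minimal sketch of the suggestion above (the local checkpoint path and the T5 model/tokenizer classes are assumptions, not confirmed details of the BioT5 setup), re-saving or pushing the model and tokenizer through `transformers` uploads the config files (`config.json`, `tokenizer_config.json`, etc.) alongside the weights:

```python
"""Sketch: push a fine-tuned T5-style model plus its configs to the HF Hub.

Assumes the `transformers` library is installed, the caller is logged in
(`huggingface-cli login`), and a local checkpoint exists at CHECKPOINT_DIR.
"""

REPO_ID = "QizhiPei/BioT5"
CHECKPOINT_DIR = "path/to/local/biot5/checkpoint"  # hypothetical path


def push_model_and_tokenizer(repo_id: str, checkpoint_dir: str) -> None:
    # Imported lazily so the module can be read/tested without transformers.
    from transformers import T5ForConditionalGeneration, T5Tokenizer

    model = T5ForConditionalGeneration.from_pretrained(checkpoint_dir)
    tokenizer = T5Tokenizer.from_pretrained(checkpoint_dir)

    # push_to_hub uploads the weights together with config.json;
    # pushing the tokenizer adds tokenizer_config.json and the vocab files.
    model.push_to_hub(repo_id)
    tokenizer.push_to_hub(repo_id)


# To run for real: push_model_and_tokenizer(REPO_ID, CHECKPOINT_DIR)
```

Equivalently, `model.save_pretrained(local_dir, push_to_hub=True)` writes the same files locally and mirrors them to the Hub in one call.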

Thanks again!

QizhiPei commented 11 months ago

Thank you for pointing that out, and for the clear instructions! I have followed your suggestion and added the necessary config files to the HuggingFace repository for the BioT5 model. The links in the README have also been updated.

Moocember commented 11 months ago

Thank you! I genuinely believe the work you've done here will save lives!

Do you have any plans to apply BioT5 to solve real world problems?

QizhiPei commented 11 months ago

> Thank you! I genuinely believe the work you've done here will save lives!
>
> Do you have any plans to apply BioT5 to solve real world problems?

Thank you for your interest in BioT5. At present, we do not have the resources to conduct wet-lab experiments. However, we appreciate the suggestion and may pursue real-world applications in the future.