epfLLM / meditron

Meditron is a suite of open-source medical Large Language Models (LLMs).
https://huggingface.co/epfl-llm
Apache License 2.0
1.85k stars · 169 forks

Are you planning to release fine-tuned models? #10

Closed · anowak closed this 10 months ago

anowak commented 10 months ago

Thank you for this great work and very detailed paper! In the paper, you write:

> MEDITRON models (7B and 70B) with and without fine-tuning to the public to ensure access for real-world evaluation and to facilitate similar efforts in other domains.

Should we expect fine-tuned models to be released soon?

eric11eca commented 10 months ago

Hi! Thank you for your interest in our work!

Yes, we are currently finishing some documentation on the fine-tuned models. We will release them soon afterward. There will be three fine-tuned models (all 70B):

  1. PubMedQA with CoT
  2. MedMCQA with CoT
  3. MedQA with CoT

Note that they are all task-specific, not instruction-tuned.

anowak commented 10 months ago

Thank you, I will stay tuned!

seanxuu commented 8 months ago

> Hi! Thank you for your interest in our work!
>
> Yes, we are currently finishing some documentation on the fine-tuned models. We will release them soon afterward. There will be three fine-tuned models (all 70B):
>
>   1. PubMedQA with CoT
>   2. MedMCQA with CoT
>   3. MedQA with CoT
>
> Note that they are all task-specific, not instruction-tuned.

Do you have a specific timetable about releasing the models?