microsoft / LLaVA-Med

Large Language-and-Vision Assistant for Biomedicine, built towards multimodal GPT-4 level capabilities.

confusion about the provided model download #25

Open WindMarx opened 7 months ago

WindMarx commented 7 months ago

Thank you very much for making your code publicly available! But I have a question: which model is llava_med_in_text_60k_ckpt2_delta.zip the checkpoint of? Also, the PMC articles are too large to download; can you provide a zip file of just your images?

vanmosti commented 7 months ago

From what I understand, you do not need to download the PMC articles in order to use the model. You need to obtain the LLaMA weights, combine them with the llava_med_in_text_60k_delta weights into the llava_med_in_text_60k weights, and from there you can serve the model (though I personally haven't been able to, due to a pesky error).
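The merge step described above is, at its core, an elementwise addition of the delta onto the base weights over the model's state dict (the repo ships a script for this; see the README). A minimal sketch of the idea, using plain lists in place of real tensors, with the `apply_delta` name chosen here for illustration:

```python
def apply_delta(base_state, delta_state):
    """Reconstruct target weights as target = base + delta, elementwise.

    Parameters absent from the base model (e.g. the multimodal projector
    LLaVA-Med adds on top of LLaMA) are taken from the delta as-is.
    This illustrates the idea behind delta-weight merging, not the
    official script's exact implementation.
    """
    merged = {}
    for name, delta in delta_state.items():
        if name in base_state:
            # Elementwise sum for parameters shared with the base model.
            merged[name] = [b + d for b, d in zip(base_state[name], delta)]
        else:
            # New parameters are stored whole in the delta.
            merged[name] = delta
    return merged
```

If the base weights are not exactly the checkpoint the delta was computed against, the sums reconstruct the wrong model, which is one plausible cause of degraded accuracy.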

WindMarx commented 7 months ago

> From what I understand, you do not need to download the PMC articles in order to use the model. You need to obtain the LLaMA weights, combine them with the llava_med_in_text_60k_delta weights into the llava_med_in_text_60k weights, and from there you can serve the model (though I personally haven't been able to, due to a pesky error).

In fact, I have followed the official instructions and applied the delta weights to LLaMA, but the accuracy on Slake is only 40%. I'm considering training from scratch instead of using the provided delta weights.

atultiwari commented 7 months ago

> In fact, I have followed the official instructions and applied the delta weights to LLaMA, but the accuracy on Slake is only 40%. I'm considering training from scratch instead of using the provided delta weights.

Hi @WindMarx, I am Dr. Atul Tiwari, a medical doctor and pathologist. I have been trying to make this work for a very long time, but I keep getting one error after another. Could you please share the commands you used to make it work, perhaps in a Google Colab notebook? That would be a great help. Thank you. Regards, Dr. Atul

OHaiYo-lzy commented 7 months ago

@WindMarx Hey, can you tell me which exact LLaMA weights you merged with? I find many weights in the LLaMA model zoos, so I am wondering which one to use. Great thanks!

M3Dade commented 7 months ago

@WindMarx

> In fact, I have followed the official instructions and applied the delta weights to LLaMA, but the accuracy on Slake is only 40%. I'm considering training from scratch instead of using the provided delta weights.

Hello, can you tell me how to evaluate on Slake? I'm having trouble getting the Slake dataset because it's not available on Hugging Face. Also, when running the run_eval.py script on VQA-RAD, the terminal outputs: FileNotFoundError: [Errno 2] No such file or directory: 'candidate.json'.

WindMarx commented 7 months ago

> Hi @WindMarx, I am Dr. Atul Tiwari, a medical doctor and pathologist. I have been trying to make this work for a very long time, but I keep getting one error after another. Could you please share the commands you used to make it work, perhaps in a Google Colab notebook? That would be a great help. Thank you. Regards, Dr. Atul

I only used the scripts from the official documentation and the provided code, so I'm sorry I can't share my own commands; you can follow the instructions in README.md.

WindMarx commented 7 months ago

> @WindMarx Hey, can you tell me which exact LLaMA weights you merged with? I find many weights in the LLaMA model zoos, so I am wondering which one to use. Great thanks!

I used llama-7b.
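Since many differently-packaged LLaMA checkpoints circulate, one cheap sanity check before merging is to confirm that the local directory really holds the 7B variant. This sketch assumes a Hugging Face-format checkpoint directory containing a config.json (the `looks_like_llama_7b` helper is hypothetical, not from the repo):

```python
import json
from pathlib import Path

# Known LLaMA-7B architecture hyperparameters.
LLAMA_7B = {
    "hidden_size": 4096,
    "num_hidden_layers": 32,
    "num_attention_heads": 32,
}

def looks_like_llama_7b(model_dir):
    """Return True if the checkpoint's config.json matches LLaMA-7B.

    A sketch only: it checks the architecture hyperparameters, not the
    weight values themselves.
    """
    cfg = json.loads(Path(model_dir, "config.json").read_text())
    return all(cfg.get(k) == v for k, v in LLAMA_7B.items())
```

A checkpoint that fails this check (for example, a 13B model with hidden_size 5120) cannot be the base the 7B delta expects.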

M3Dade commented 7 months ago

> I only used the scripts from the official documentation and the provided code, so I'm sorry I can't share my own commands; you can follow the instructions in README.md.

@WindMarx Thank you for your reply. Do you use candidate.json when evaluating on the Slake dataset? If so, how do you get it?

WindMarx commented 7 months ago

> @WindMarx Thank you for your reply. Do you use candidate.json when evaluating on the Slake dataset? If so, how do you get it?

Actually, I do not use candidate.json. I think it may have something to do with the candidate answers mentioned in the paper, but nobody except the authors knows what candidate.json contains.
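For what it's worth, the closed (e.g. yes/no) questions in Slake and VQA-RAD can be scored without any candidate file by normalized exact match between generated and reference answers. A hedged sketch of that approach (not the official metric in run_eval.py, which may normalize differently):

```python
def closed_question_accuracy(predictions, references):
    """Normalized exact-match accuracy for closed (e.g. yes/no) VQA answers.

    Compares each prediction to its reference after stripping whitespace
    and lowercasing. One common scoring convention, shown here as an
    illustration; not taken from the LLaVA-Med codebase.
    """
    if not references:
        return 0.0
    correct = sum(
        p.strip().lower() == r.strip().lower()
        for p, r in zip(predictions, references)
    )
    return correct / len(references)
```

Open-ended questions need a softer metric (e.g. token recall against the reference), which is where a fixed candidate set would otherwise come in.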

zhongzee commented 6 months ago

@WindMarx Hello, if you don't use candidate.json, how do you evaluate? Can you share the code? In addition, the LLaMA-7B weights are no longer available for download. Would it be possible to share them, or to share the weights after adding the delta? Thank you very much.