ShellRedia / SAM-OCTA


The weights after pretraining on OCTA datasets #1

Open hwei-hw opened 1 year ago

hwei-hw commented 1 year ago

Thanks for your interesting and valuable work. Could you provide the model weights from the OCTA pretraining? Otherwise, I would need to pre-train it again.

Thanks for your work, and looking forward to your reply!

ShellRedia commented 1 year ago

Thank you for your kind words and interest in our work. We appreciate your support and enthusiasm for our research.

Regarding your request for the OCTA pretraining model weights: we are sorry to let you know that, at present, the paper associated with SAM-OCTA has not been accepted by any journal or conference. Given the current status of the work and the ongoing review process, we are unable to disclose or provide the pretraining parameters publicly.


(This paper has not been accepted yet, so we cannot provide the pretrained parameters for now. Our sincere apologies!)

hwei-hw commented 1 year ago

Thanks for your kind reply, and I hope to hear good news about your paper soon!

ShellRedia commented 10 months ago

The pretrained weights have been updated. The link has been added at the end of the README.


(The pretrained weights have been updated; the link has been added at the end of the README.)

ORlGlN commented 9 months ago

Can you upload it to Google Drive too? Thanks!