FoundationVision / GLEE

[CVPR2024 Highlight] GLEE: General Object Foundation Model for Images and Videos at Scale
https://glee-vision.github.io/
MIT License

The training code #11

Open liuxingbin opened 7 months ago

liuxingbin commented 7 months ago

Hi, thanks for the solid work. Could you let me know when you'll release the training code?

wjf5203 commented 7 months ago

Hi~ Thanks for your attention! The training code and models will be released this week!

liuxingbin commented 7 months ago

Thanks for your reply. I am looking forward to the training code. Best,


wjf5203 commented 7 months ago

Hi~ Thank you for your patience. The training code and the image-level inference script have been released. I will continue to update the inference scripts for the other results in the paper.

liuxingbin commented 7 months ago

> Hi~ Thanks for your attention! The training code and models will be released this week!

Hi, thanks for the update. I am wondering if there are any tips for fine-tuning on my own dataset?

wjf5203 commented 7 months ago

> Hi~ Thanks for your attention! The training code and models will be released this week!
>
> Hi, thanks for the update. I am wondering if there are any tips for fine-tuning on my own dataset?

I'm glad to hear that you are trying to fine-tune GLEE on your own dataset. The general steps are:

1. Process the newly added data into the standard COCO Detection or RefCOCO format.
2. Refer to the builtin.py file, which contains numerous datasets newly registered on top of Detectron2 and can be used as references (see the sketch below).
3. For the newly incorporated data, define its task name and update several places in the code: the corresponding category names, the number of categories (here, here and here), and the newly added denoising embedding.
4. It's best to find an existing dataset similar to your own, check its routing path, and then set up a config accordingly.
5. I recommend loading the -joint.pth weights for fine-tuning.
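To make step 2 more concrete, here is a minimal sketch of how a COCO-format dataset is typically registered on top of Detectron2. This is not GLEE's actual builtin.py code: the dataset name, file paths, and category list are hypothetical placeholders, and GLEE may wrap registration in its own helpers, so treat the entries in builtin.py as the authoritative reference.

```python
# Minimal sketch (assumption: plain Detectron2 registration, not GLEE's exact helpers).
# Dataset names, paths, and categories below are hypothetical placeholders.
from detectron2.data import MetadataCatalog
from detectron2.data.datasets import register_coco_instances

MY_CLASSES = ["class_a", "class_b"]  # replace with your own category names

# Register train/val splits so they can be referenced by name in the config.
register_coco_instances(
    "my_dataset_train",                            # name used in cfg.DATASETS.TRAIN
    {},                                            # extra metadata (optional)
    "datasets/my_dataset/annotations/train.json",  # COCO-style annotation file
    "datasets/my_dataset/images/train",            # image root
)
register_coco_instances(
    "my_dataset_val",
    {},
    "datasets/my_dataset/annotations/val.json",
    "datasets/my_dataset/images/val",
)

# Attach the category names; they must match the categories in the annotation JSON.
for split in ("my_dataset_train", "my_dataset_val"):
    MetadataCatalog.get(split).thing_classes = MY_CLASSES
```

After registration, you would point the training dataset entry in your config at the new name and initialize from the -joint.pth checkpoint (e.g. via MODEL.WEIGHTS in a Detectron2-style config); the exact config keys depend on GLEE's configs, so check an existing config for a dataset similar to yours.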

muengsuaengsuai commented 6 months ago

@wjf5203 Thanks for the solid work. When do you plan to release the fine-tuning scripts?

Hiutin commented 6 months ago

Is there a benchmark for training efficiency? Thanks.

CrazyBrick commented 4 months ago

> Hi~ Thanks for your attention! The training code and models will be released this week!
>
> Hi, thanks for the update. I am wondering if there are any tips for fine-tuning on my own dataset?

@liuxingbin Hi, have you tried it successfully? Could you share your scripts?