WongKinYiu / yolov7

Implementation of paper - YOLOv7: Trainable bag-of-freebies sets new state-of-the-art for real-time object detectors
GNU General Public License v3.0

We want to submit a PR about Source-Free compression training on YOLOv7, is that ok? #381

Open leiqing1 opened 1 year ago

leiqing1 commented 1 year ago

Hello, we have implemented a Source-Free compression training function, and the gains on YOLOv7 are as follows. We would like to submit a PR, is that ok?

| Model | Method | Input size | mAP<sup>val</sup> 0.5:0.95 | Latency FP32 (ms) | Latency FP16 (ms) | Latency INT8 (ms) |
| --- | --- | --- | --- | --- | --- | --- |
| YOLOv7 | Base | 640×640 | 51.1 | 26.84 | 7.44 | - |
| YOLOv7 | KL offline quantization | 640×640 | 50.2 | - | - | 4.55 |
| YOLOv7 | Quantization-aware distillation training | 640×640 | 50.8 | - | - | 4.55 |

We use knowledge distillation and PACT quantization to improve the compression efficiency of YOLOv7. The knowledge distillation technique can automatically add training logic to an AI model. First, it loads the inference model file specified by the user and copies the model in memory; the copy serves as the teacher in knowledge distillation, and the original model as the student. Then the model structure is automatically analyzed to find a layer suitable for attaching a distillation loss, usually the last layer with trainable parameters. Finally, the teacher supervises the sparse or quantized training of the original model through that distillation loss. The process is shown in the figure below.
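To make the two ingredients above concrete, here is a minimal numpy sketch of a PACT-style activation quantizer (clip to a learnable bound `alpha`, then uniformly quantize) and a soft-label distillation loss (temperature-scaled KL divergence between teacher and student outputs). The function names and signatures are illustrative, not the API of the actual PR; a real implementation would make `alpha` trainable and attach the loss inside the training graph.

```python
import numpy as np

def pact_quantize(x, alpha, bits=8):
    """PACT: clip activations to [0, alpha], then uniformly quantize
    to 2**bits - 1 levels. In training, alpha is a learned parameter."""
    clipped = np.clip(x, 0.0, alpha)
    scale = alpha / (2 ** bits - 1)
    return np.round(clipped / scale) * scale

def distillation_loss(student_logits, teacher_logits, temperature=2.0):
    """Soft-label distillation: KL(teacher || student) on
    temperature-softened distributions, scaled by T^2."""
    def softmax(z):
        e = np.exp(z - z.max(axis=-1, keepdims=True))
        return e / e.sum(axis=-1, keepdims=True)
    p_t = softmax(teacher_logits / temperature)
    p_s = softmax(student_logits / temperature)
    return float(np.sum(p_t * (np.log(p_t) - np.log(p_s))) * temperature ** 2)
```

During quantization-aware training the student's activations pass through `pact_quantize` while the full-precision teacher supervises it via `distillation_loss`, which is what lets the INT8 model recover most of the baseline mAP.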

Errol-golang commented 1 year ago

Do you have any plans to open source your code about distillation?

leiqing1 commented 1 year ago

> Do you have any plans to open source your code about distillation?

Hi @Errol-golang , we plan to open the distillation code to the YOLOv7 repo in the next two weeks.

Errol-golang commented 1 year ago

> Do you have any plans to open source your code about distillation?

> Hi @Errol-golang , we plan to open the distillation code to the YOLOv7 repo in the next two weeks.

Thanks. I will keep an eye on it.

leiqing1 commented 1 year ago

> Do you have any plans to open source your code about distillation?

> Hi @Errol-golang , we plan to open the distillation code to the YOLOv7 repo in the next two weeks.

> Thanks. I will keep an eye on it.

Hi @Errol-golang , we have submitted a PR to the YOLOv7 repo and are waiting for @WongKinYiu's review. https://github.com/WongKinYiu/yolov7/pull/612

pytholic commented 1 year ago

@leiqing1 Hi. Can you guide me on compressing the yolo-w6-pose model?