Open KIMGEONUNG opened 1 month ago
It looks cool. Code available?
Maybe next week — I will provide the code soon.
@cyy2427, @IceClear
A minimal implementation of depth-skip pruning for StableSR is now available at the link below: https://github.com/KIMGEONUNG/StableSR_Depth-skip?tab=readme-ov-file
I am reaching out to share some intriguing findings from a recent compression experiment. Our results demonstrate a significant reduction in model size without compromising quality. In brief, we discovered that restoration models derived from large T2I models seldom utilize the coarse layers of the UNet. By simply removing the network blocks beyond a predetermined depth in the skip-connection setup, we observed minimal impact on the results. Specifically, for StableSR, only depth level 9, which uses 60% of the parameters, is required to achieve high-quality restoration. Here are the quantitative results on the DIV2K test set:

![image](https://github.com/IceClear/StableSR/assets/32098205/fc2af603-91a6-4ffc-abd1-0c4f89e30dd8)
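For anyone curious how depth-skip pruning works in principle, here is a minimal toy sketch (not the actual StableSR code — the class, channel sizes, and `max_depth` parameter are all illustrative assumptions). The idea is that in a U-Net, truncating all encoder/decoder blocks deeper than a chosen depth still leaves a valid network, because the deepest remaining skip connection can serve as the bottleneck:

```python
import torch
import torch.nn as nn

class TinyUNet(nn.Module):
    """Toy U-Net with symmetric skip connections (illustrative only).

    Depth-skip pruning: blocks deeper than `max_depth` are simply not
    constructed, and the deepest remaining encoder feature doubles as
    the bottleneck via its skip connection.
    """
    def __init__(self, channels=8, depth=4, max_depth=None):
        super().__init__()
        self.depth = min(max_depth, depth) if max_depth is not None else depth
        # one encoder conv per retained depth level
        self.enc = nn.ModuleList(
            nn.Conv2d(channels, channels, 3, padding=1)
            for _ in range(self.depth))
        # decoder convs take [upstream features, skip] concatenated
        self.dec = nn.ModuleList(
            nn.Conv2d(2 * channels, channels, 3, padding=1)
            for _ in range(self.depth))

    def forward(self, x):
        skips = []
        for blk in self.enc:
            x = torch.relu(blk(x))
            skips.append(x)
        # pruned case: the deepest skip is reused as the bottleneck input
        for blk in self.dec:
            x = torch.relu(blk(torch.cat([x, skips.pop()], dim=1)))
        return x

full = TinyUNet(channels=8, depth=4)
pruned = TinyUNet(channels=8, depth=4, max_depth=2)
n_full = sum(p.numel() for p in full.parameters())
n_pruned = sum(p.numel() for p in pruned.parameters())
print(f"pruned model keeps {n_pruned / n_full:.0%} of the parameters")
```

The pruned model still maps an input image to an output of the same shape; only the deep (coarse) levels are gone, mirroring the observation that restoration barely uses them.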
You can find more details about our work at the following link: https://arxiv.org/abs/2401.17547