Closed xiao-11 closed 6 months ago
Hi @xiao-11, thank you for your interest. It seems the link provided may be incorrect. You can find the configuration file here https://github.com/LeapLabTHU/FLatten-Transformer/blob/master/cfgs/flatten_pvt_t.yaml.
Thank you~
Thank you for your work on how to utilize linear attention. However, the config link you provided is invalid, e.g. https://github.com/LeapLabTHU/cfgs/flatten_pvt_t.yaml. Could you update it?