OFA-Sys / Chinese-CLIP

Chinese version of CLIP which achieves Chinese cross-modal retrieval and representation generation.
MIT License

The context length to use; all baseline models use 52 as the context length #343


qiuchen001 commented 1 month ago

https://github.com/OFA-Sys/Chinese-CLIP/blob/85f3fa3639e207d0b76f69a401105cad5d509593/cn_clip/clip/utils.py#L131 says: "The context length to use; all baseline models use 52 as the context length." Which models do the "baseline models" refer to here, and is there any way to increase the context length?
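For context, here is a minimal sketch of how `context_length` shows up when calling the library, based on the usage shown in the repo README (the model name and example text are illustrative). Whether the released checkpoints can actually handle sequences longer than 52 tokens depends on how the text encoder's positional embeddings were trained, which is exactly what this question is asking the maintainers to clarify.

```python
import torch
import cn_clip.clip as clip
from cn_clip.clip import load_from_name

device = "cuda" if torch.cuda.is_available() else "cpu"

# Load one of the released Chinese-CLIP checkpoints (name is illustrative).
model, preprocess = load_from_name("ViT-B-16", device=device, download_root="./")
model.eval()

# tokenize() pads or truncates each text to `context_length` tokens;
# the default of 52 is what the docstring at utils.py#L131 refers to.
texts = clip.tokenize(["一段较长的中文描述文本"], context_length=52).to(device)

with torch.no_grad():
    text_features = model.encode_text(texts)

# NOTE (assumption): passing a larger context_length only changes how the input
# is padded/truncated. Whether the text encoder can consume longer sequences
# without fine-tuning depends on its positional-embedding size in the released
# checkpoints, so lengths beyond 52 may require additional training.
```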