This repository is built upon mmocr 0.4.0.
The NightTime-ArT dataset, collected from ArT, can be downloaded from here.
The code is based on mmocr. Please first install mmcv-full and mmocr following the official guidelines (mmocr).
Please follow the mmocr official guidelines to prepare the datasets accordingly.
Configure the dataset paths in ocrclip/configs/_base_/det_datasets.
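For reference, here is a minimal sketch of a dataset config in the mmocr 0.4.0 det_datasets style; the dataset type, data_root, and annotation file names below are illustrative placeholders rather than this repository's actual settings:

```python
# Hypothetical example in the style of mmocr 0.4.0 det_datasets configs.
# Adjust data_root and annotation file names to match your dataset layout.
dataset_type = 'IcdarDataset'
data_root = 'data/icdar2015'  # placeholder path

train = dict(
    type=dataset_type,
    ann_file=f'{data_root}/instances_training.json',
    img_prefix=f'{data_root}/imgs',
    pipeline=None)

test = dict(
    type=dataset_type,
    ann_file=f'{data_root}/instances_test.json',
    img_prefix=f'{data_root}/imgs',
    pipeline=None)

train_list = [train]
test_list = [test]
```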
Download the CLIP pretrained model RN50.pt and save it to the pretrained folder, then point the model config to it:

```python
model = dict(
    pretrained='xxx/ocrclip/pretrained/RN50.pt',
)
```
To pretrain the TCM model on SynthText/Synth150k, please configure the corresponding dataset path, then run:
```bash
bash dist_train.sh configs/textdet/xxnet/xxx.py 8
```
To finetune the TCM model from a pretrained checkpoint, set load_from in the config to the pretrained checkpoint path (see the sketch below), then run:

```bash
bash dist_train.sh configs/textdet/xxnet/xxx.py 8
```
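A minimal sketch of the load_from setting, assuming the standard mmcv config convention; the checkpoint path is a hypothetical placeholder:

```python
# In configs/textdet/xxnet/xxx.py: initialize training from the pretrained TCM checkpoint.
# The path below is a placeholder; point it to your own checkpoint.
load_from = 'pretrained/tcm_pretrain.pth'
```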
To evaluate the performance with a checkpoint, run:

```bash
bash dist_test.sh configs/textdet/xxnet/xxx.py /path/to/checkpoint 1 --eval hmean-iou
```
| Method | Data | F-measure | Model |
|---|---|---|---|
| TCM-DB | TD | 88.8% | config weights |
| TCM-DB | IC15 | 88.8% | config weights |
| TCM-DB | CTW | 85.1% | config |
| TCM-DB | TT | 85.9% | config |
Please refer to the spotter folder for more details.
Please refer to the rotated_object_detection folder for more details.
If you find this project helpful for your research, please consider citing our papers:
```bibtex
@inproceedings{Yu2023TurningAC,
  title={Turning a CLIP Model into a Scene Text Detector},
  author={Wenwen Yu and Yuliang Liu and Wei Hua and Deqiang Jiang and Bo Ren and Xiang Bai},
  booktitle={IEEE Conference on Computer Vision and Pattern Recognition},
  year={2023}
}

@article{Yu2024TurningAC,
  title={Turning a CLIP Model into a Scene Text Spotter},
  author={Wenwen Yu and Yuliang Liu and Xingkui Zhu and Haoyu Cao and Xing Sun and Xiang Bai},
  journal={IEEE Transactions on Pattern Analysis and Machine Intelligence},
  year={2024}
}
```
This project is under the CC-BY-NC 4.0 license. See LICENSE for more details.
This project is partially based on MMOCR, CLIP, MMRotate, DenseCLIP, AdelaiDet, Deformable-DETR, and TESTR. Thanks for their great work.