You do not need the pickled data. Simply replace

    with open('gt_util_synthtext_seglink.pkl', 'rb') as f:
        gt_util = pickle.load(f)

with

    gt_util = GTUtility('data/SynthText/', polygon=True)
The gt_util_synthtext_seglink.pkl file is only there to speed up parsing of the dataset. It is serialized in datasets.ipynb and does not contain the image data itself, only filenames, bounding boxes and so on. See also #1 ...
Thanks for the quick reply. Did you get the SynthText dataset from here: http://www.robots.ox.ac.uk/~vgg/data/scenetext/?
Yes!
Alright, thanks!
Could someone who has generated the .pkl file open a PR with it? I'm on a limited-bandwidth network and would like to run the end2end notebook.
Hello,
Can you please upload the pickled dataset that you used for training SegLink?
It would be great if we could just run the code first and then work through the pipeline. I am asking because I am finding it difficult to understand how the data is prepared for training SegLink. I have trained object detectors before, but I think I am missing a step when it comes to training text detectors, so inspecting your data and playing with it would definitely help me understand the pipeline better.
Thanks in advance.