Hello,
Thanks for bringing this up! Here are a few clarifications regarding the pretrained and finetuned checkpoints:
I noticed that I initially forgot to include the charset from the HWDB dataset (CASIA v2). I've now added the missing charset to the corresponding folder. HWDB.py (CASIA v2) was already written to look for this charset, but the file itself was missing:
self.data = pickle.load(open(os.path.join(datasets_path, 'HWDB', 'data.pkl'), "rb"))
self.charset = self.data['charset']
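If it helps with debugging, here is a minimal sketch for checking that data.pkl actually contains the charset and for comparing its size against the classifier head of a checkpoint. The data.pkl layout follows the snippet above; the checkpoint filename and the assumption that it is saved as a flat state_dict are mine, so adapt them to your files.

```python
import os
import pickle

import torch

# Paths below are assumptions -- adjust to your local layout.
datasets_path = "data"
ckpt_path = "hwdb_finetuned.pth"

# Load the preprocessed HWDB pickle (same call as in HWDB.py) and report the charset size.
data = pickle.load(open(os.path.join(datasets_path, "HWDB", "data.pkl"), "rb"))
charset = data["charset"]
print("charset size:", len(charset))

# Load the checkpoint (assumed to be a flat state_dict; unwrap it first if it is
# stored under a key such as "model") and find the last 2-D weight, which is
# typically the classifier head whose output size should match the charset.
state_dict = torch.load(ckpt_path, map_location="cpu")
last_name, last_shape = None, None
for name, tensor in state_dict.items():
    if hasattr(tensor, "ndim") and tensor.ndim == 2:
        last_name, last_shape = name, tuple(tensor.shape)
print("last 2-D weight:", last_name, last_shape)
```

If the two numbers differ, the checkpoint was most likely trained against a different charset than the one currently shipped in the data folder.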
Please let me know if you need further assistance.
I am confused: the output dimension in the finetuned HWDB checkpoint is 2704, but the charset in /data/HWDB_v1/charset.pkl contains 7356 characters. Could you please help me with this discrepancy? Thanks.