PC-god opened 3 years ago
Hi, I can compute the XML outputs too, and I have TEDS code if you need it; I can share it with you. But I want to ask: are the results you get on the UNLV test data like this?
These JPGs are at /TabStructNet-master/trained_model/tab/result_jpg/AR_1024.jpg and AR_1025.jpg. Do you get the same results as me? I'm wondering whether I did something wrong.
Can you please try to increase the DETECTION_NMS_THRESHOLD value to 0.5 or higher and test again?
Thank you very much! I changed the DETECTION_NMS_THRESHOLD value to 0.5 and now I get all the bboxes! This helps me a lot! May I ask you another question? I tried to use two GPUs to finetune the model on my own dataset by setting GPU_COUNT=2 in tabnet.py, but I got an error that looks like the batch size doesn't match the code. If I instead set IMAGES_PER_GPU=2 (or 4) it works. Could you please help me use more than one GPU?
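For context, a minimal sketch of why changing GPU_COUNT alone can break things (assumed attribute names, modeled on the Matterport Mask R-CNN config style that TabStructNet appears to build on; the exact names in tabnet.py may differ): the effective batch size is derived from both GPU_COUNT and IMAGES_PER_GPU, so any tensor shape written against the old batch size fails when only GPU_COUNT changes.

```python
class Config:
    """Sketch of a Mask-R-CNN-style config (assumed names)."""
    GPU_COUNT = 1
    IMAGES_PER_GPU = 1
    DETECTION_NMS_THRESHOLD = 0.3  # raising to 0.5+ keeps more boxes

    def __init__(self):
        # Effective batch size seen by the data pipeline: doubling
        # GPU_COUNT doubles this, which is what trips shape checks
        # that assumed the old value.
        self.BATCH_SIZE = self.IMAGES_PER_GPU * self.GPU_COUNT


class TwoGPUConfig(Config):
    GPU_COUNT = 2       # BATCH_SIZE becomes 2 even though
    IMAGES_PER_GPU = 1  # each GPU still holds one image


cfg = TwoGPUConfig()
print(cfg.BATCH_SIZE)  # 2
```

So setting IMAGES_PER_GPU=2 with one GPU and setting GPU_COUNT=2 with one image per GPU both yield BATCH_SIZE=2; the error suggests some part of the code handles the former but not the latter.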
Hi, I will try and figure this out and will update you.
Thank you very much! Looking forward to your reply~~
@PC-god @wangwen-whu Hi, are there any comments explaining how to evaluate the output XML files? Thanks for your help.
Hi @wangwen-whu, can you please share the code to generate the TEDS metric?
Hi, the TEDS code is mentioned here: https://github.com/ibm-aur-nlp/PubTabNet/tree/master/src . It needs HTML-style input. Next month I will upload a conversion script in my work, the WTW dataset. You can use it too if you need it.
@wangwen-whu, thank you very much!
Hi @wangwen-whu, were you able to solve the issue with training on two GPUs? My GPU has only 8 GB, so two images won't fit and I can't set IMAGES_PER_GPU=2, but I do have access to multiple 8 GB GPUs. I am now stuck at the same problem you mentioned earlier.
Thanks for your project. With the latest commit I can compute the XML outputs, but when I try to calculate the TEDS metric I can't find the corresponding function. If you have written it, please point me to its location. Thanks very much.