sem-tab-challenge / 2022

SemTab 2022: Semantic Web Challenge on Tabular Data to Knowledge Graph Matching
https://sem-tab-challenge.github.io/2022/

Groundtruth for GitTables_SemTab_2022 **_targets.csv #2

Open TommyDzh opened 1 month ago

TommyDzh commented 1 month ago

I appreciate your efforts in organizing the competition. However, I can't find the ground truth for GitTables_SemTab_2022 (e.g., for dbpedia_property_targets.csv). How can I evaluate performance on the test set? Thanks.

vefthym commented 1 month ago

Hi! Thanks for your interest. Please check also the SemTab 2022 webpage (https://sem-tab-challenge.github.io/2022/), where all datasets are listed. The data files that you are looking for are here: https://github.com/sem-tab-challenge/2022/blob/main/datasets/GitTables_SemTab_2022_dbpedia_dataset.zip

Let us know if that is not what you were looking for.

TommyDzh commented 1 month ago

Thank you for your response. However, in https://github.com/sem-tab-challenge/2022/blob/main/datasets/GitTables_SemTab_2022_dbpedia_dataset.zip, dbpedia_property_train.csv provides both the target columns and the ground truth, whereas dbpedia_property_targets.csv provides only the target columns without the ground truth. How can I evaluate a method on dbpedia_property_targets.csv?

madelonhulsebos commented 1 month ago

Hi @TommyDzh,

Thanks for reaching out. The dataset with ground truth was published on Zenodo after the challenge ended. You can find it here: https://zenodo.org/records/5706316. Let me know if anything is missing or unclear.
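In case it helps, here is a minimal sketch of how the targets could be scored once both the ground-truth file and a predictions file are in hand. The column layout (table identifier, column index, property URI) and the absence of a header row are assumptions, not confirmed from the actual files, so please check the CSVs before relying on this:

```python
import csv

def load_annotations(path):
    """Read (table_id, column_index) -> property_uri from a CSV.
    Assumed headerless layout: table_id, column_index, property_uri
    (hypothetical; verify against the actual dataset files)."""
    ann = {}
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.reader(f):
            if len(row) >= 3:
                ann[(row[0].strip(), row[1].strip())] = row[2].strip()
    return ann

def score(gt, pred):
    """Micro precision/recall/F1 over the annotated targets:
    precision = correct / predicted, recall = correct / targets."""
    correct = sum(1 for key, uri in pred.items() if gt.get(key) == uri)
    precision = correct / len(pred) if pred else 0.0
    recall = correct / len(gt) if gt else 0.0
    f1 = (2 * precision * recall / (precision + recall)
          if precision + recall else 0.0)
    return precision, recall, f1
```

Usage would be something like `score(load_annotations("gt.csv"), load_annotations("pred.csv"))`. Note this only evaluates exact URI matches; the official SemTab scorer may apply additional normalization.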

Good luck!

Madelon