HamedBabaei / LLMs4OL

LLMs4OL: Large Language Models for Ontology Learning
MIT License

Dataset Help #54

Open KeyLKey opened 11 months ago

KeyLKey commented 11 months ago

It's a great honor to see your masterpiece, but now I'm facing difficulties. Could you provide the nci_entities.json data file? Thank you very much.

igorcouto commented 11 months ago

+1 on seeing the dataset and getting better instructions. I received error messages with everything I tried.

HamedBabaei commented 11 months ago

Dear @KeyLKey, thanks for the comment. I will add more information on how to create the data! Unfortunately, due to the license of the UMLS datasets, we may not be able to share them; however, we can provide the details of how to create them yourself.

HamedBabaei commented 11 months ago

Dear @igorcouto, thanks for the comment. Can you share the error messages with me so that I can check what the issue might be and fix it? Thanks.

KeyLKey commented 11 months ago

Dear author, could you tell me which data file to download? Is it the UMLS Metathesaurus Full Subset or the UMLS Semantic Network files? The former decompresses to 27.1 GB. I am very eager to build a dataset like yours. Thank you very much!

HamedBabaei commented 11 months ago

Hi @KeyLKey, for UMLS you need to download the umls-2022AB-metathesaurus-full.zip file and follow the instructions for creating the Task A, B, and C datasets in the notebook TaskA/notebooks/umls-dataset-preprations_for_TaskABC.ipynb (available in the repository).

You will build datasets for MEDCIN, NCI, and SNOMEDCT_US.
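For orientation, the sketch below shows the kind of per-source filtering such a build involves. It is not the repository's notebook: the field positions follow the standard pipe-delimited MRCONSO.RRF layout from the UMLS documentation, while the file paths, the English-language filter, and the output file name are placeholder assumptions. The notebook above remains the authoritative recipe.

    # Illustrative sketch only (not the repository code): collect one English term
    # per concept for each of the three source vocabularies in MRCONSO.RRF.
    # MRCONSO.RRF is pipe-delimited: field 0 = CUI, 1 = LAT, 11 = SAB, 14 = STR.
    import json

    SOURCES = {"MEDCIN", "NCI", "SNOMEDCT_US"}
    entities = {sab: {} for sab in SOURCES}

    with open("2022AB/META/MRCONSO.RRF", encoding="utf-8") as f:  # path is an assumption
        for line in f:
            fields = line.rstrip("\n").split("|")
            cui, lang, sab, term = fields[0], fields[1], fields[11], fields[14]
            if sab in SOURCES and lang == "ENG":
                entities[sab].setdefault(cui, term)  # keep the first English string per CUI

    # Placeholder output; the notebook defines the real nci_entities.json schema.
    with open("nci_entities_sketch.json", "w", encoding="utf-8") as out:
        json.dump(entities["NCI"], out, indent=2)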

More details will follow, but for Task A you need to run only the last part of TaskA/build_entity_dataset.py, which is as follows:

    # Build the UMLS-based entity datasets (BaseConfig, dataset_builder, and
    # DataWriter come from the script's own imports).
    config = BaseConfig(version=3).get_args(kb_name="umls")
    umls_builder = dataset_builder(config=config)
    dataset_json, dataset_stats = umls_builder.build()
    # Write each knowledge base's entities and statistics to the paths configured
    # for that KB (e.g. nci_entities.json for NCI).
    for kb in list(dataset_json.keys()):
        DataWriter.write_json(data=dataset_json[kb],
                              path=BaseConfig(version=3).get_args(kb_name=kb.lower()).entity_path)
        DataWriter.write_json(data=dataset_stats[kb],
                              path=BaseConfig(version=3).get_args(kb_name=kb.lower()).dataset_stats)

You also need to look at TaskA/configuration/config.py to make sure the right paths are configured for creating the dataset.
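As a quick, hypothetical sanity check (not part of the script), you can resolve the config for one knowledge base and print the paths before building, with BaseConfig imported as in TaskA/build_entity_dataset.py:

    # Hypothetical check: print where the entity file and statistics will be
    # written, using the same attributes as the snippet above.
    config = BaseConfig(version=3).get_args(kb_name="nci")
    print(config.entity_path)    # expected location of nci_entities.json
    print(config.dataset_stats)  # expected location of the dataset statistics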

For Task B you need to run the following scripts in order (please also check those scripts so that only UMLS is used; a hedged driver sketch follows the Task C list below):

1. build_hierarchy.py
2. build_datasets.py
3. train_test_split.py

And for Task C please run only the following scripts:

1. build_datasets.py
2. train_test_split.py
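If it helps, here is a hypothetical driver (not part of the repository) that runs the Task B scripts in order from their task directory and stops on the first failure; for Task C, drop build_hierarchy.py from the list.

    # Hypothetical runner: execute the pipeline scripts sequentially; check=True
    # raises an error if any step fails.
    import subprocess

    for script in ["build_hierarchy.py", "build_datasets.py", "train_test_split.py"]:
        subprocess.run(["python", script], check=True)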

I hope this helps. Good luck!

tage384 commented 11 months ago

Dear author, I'm trying to build nci_entities.json following your method, but found that UMLS_entity_types_with_levels.tsv is missing. May I ask what went wrong? Thank you very much!