JulianSampels / OntoMatch

A new ontology matcher.
GNU General Public License v3.0

run_matcher.py fails #19

Open FelixFrizzy opened 12 hours ago

FelixFrizzy commented 12 hours ago

I'm trying to execute OntoMatch, but it fails.

Steps to reproduce:

Run run_matcher.py with the default dataset (I didn't include my own). This time the failure doesn't seem to be related to missing or faulty Python packages, but rather to a problem in the code base itself. Do you have any idea how to fix it? It would be great if you kept the issue open until it's fixed, since I'm interested in running OntoMatch even after OAEI has finished.

Verbalizing:   0%|          | 0/71 [00:00<?, ?item/s]
['translate Graph to English: <H> event <R> is parent of <T> scientificevent <H> scientificevent <R> is parent of <T> individualpresentation <H> individualpresentation <R> is parent of <T> contributedtalk  <H> event <R> is parent of <T> socialevent <H> socialevent <R> is parent of <T> conferencetrip ']
ERROR VERBALISING translate Graph to English: <H> event <R> is parent of <T> scientificevent <H> scientificevent <R> is parent of <T> individualpresentation <H> individualpresentation <R> is parent of <T> contributedtalk  <H> event <R> is parent of <T> socialevent <H> socialevent <R> is parent of <T> conferencetrip 
Traceback (most recent call last):
  File "...OntoMatch/src/run_matcher.py", line 362, in <module>
    main()
  File "...OntoMatch/src/run_matcher.py", line 223, in main
    tripleVerbalizer.verbaliseFile(tripleFilePath, tripleVerbalizedFilePath)
  File "...OntoMatch/src/verbalizer/tripleVerbalizer.py", line 33, in verbaliseFile
    verbalised_text = verbalise(triples, verb_module)
                      ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "...OntoMatch/src/verbalizer/tripleVerbalizer.py", line 16, in verbalise
    return verbModule.verbalise(ans)
           ^^^^^^^^^^^^^^^^^^^^^^^^^
  File "...OntoMatch/src/verbalizer/verbalisation_module.py", line 143, in verbalise
    return self.verbalise_sentence(input)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "...OntoMatch/src/verbalizer/verbalisation_module.py", line 103, in verbalise_sentence
    gen_output = self.__generate_verbalisations_from_inputs(inputs)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "...OntoMatch/src/verbalizer/verbalisation_module.py", line 46, in __generate_verbalisations_from_inputs
    gen_output = self.g2t_module.model.generate(
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/username/.virtualenvs/ontomatch/lib/python3.12/site-packages/torch/utils/_contextlib.py", line 115, in decorate_context
    return func(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^
  File "/Users/username/.virtualenvs/ontomatch/lib/python3.12/site-packages/transformers/generation/utils.py", line 1713, in generate
    self._prepare_special_tokens(generation_config, kwargs_has_attention_mask, device=device)
  File "/Users/username/.virtualenvs/ontomatch/lib/python3.12/site-packages/transformers/generation/utils.py", line 1556, in _prepare_special_tokens
    raise ValueError(
ValueError: `decoder_start_token_id` or `bos_token_id` has to be defined for encoder-decoder generation.
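
For what it's worth, the same ValueError can be triggered outside OntoMatch whenever an encoder-decoder model's config defines neither decoder_start_token_id nor bos_token_id. A minimal standalone sketch (the T5 model below is only an assumption for illustration, not necessarily the checkpoint OntoMatch loads):

import torch
from transformers import T5Config, T5ForConditionalGeneration

# Leave both special token ids unset, which is exactly the situation
# _prepare_special_tokens complains about.
config = T5Config(decoder_start_token_id=None, bos_token_id=None)
model = T5ForConditionalGeneration(config)  # randomly initialised; weights don't matter here

input_ids = torch.ones((1, 4), dtype=torch.long)
# On recent transformers versions this raises:
# ValueError: `decoder_start_token_id` or `bos_token_id` has to be defined for encoder-decoder generation.
model.generate(input_ids)
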
JulianSampels commented 5 hours ago

Perhaps this is caused by a different version of the transformers package. Which version are you using? Please run pip show transformers.
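
If it does turn out to be a newer transformers release, a possible stopgap (untested, just a sketch) would be to pass decoder_start_token_id explicitly to generate(); for T5-style models the decoder is conventionally started from the pad token. Shown here on a standalone model ("t5-small" is only a placeholder), the same keyword could be added to the g2t_module.model.generate(...) call in verbalisation_module.py:

import torch
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

# Placeholder checkpoint; not necessarily the graph-to-text model OntoMatch uses.
tokenizer = AutoTokenizer.from_pretrained("t5-small")
model = AutoModelForSeq2SeqLM.from_pretrained("t5-small")

inputs = tokenizer(
    "translate Graph to English: <H> event <R> is parent of <T> scientificevent",
    return_tensors="pt",
)

# Supplying decoder_start_token_id explicitly satisfies the check in
# _prepare_special_tokens even when the loaded config leaves it undefined.
gen_output = model.generate(
    **inputs,
    decoder_start_token_id=model.config.pad_token_id,
    max_new_tokens=64,
)
print(tokenizer.decode(gen_output[0], skip_special_tokens=True))
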