Alibaba-NLP / ACE

[ACL-IJCNLP 2021] Automated Concatenation of Embeddings for Structured Prediction

Questions about reproduction of Aspect Term Extraction #35

Closed: YanZehong closed this issue 2 years ago

YanZehong commented 2 years ago

Hi Wang Xinyu, I am a graduate student at NUS. Thank you for sharing such a valuable study and its source code. I am currently trying to reproduce the results for aspect term extraction. I followed the instructions for Named Entity Recognition and referred to several files (such as bert-en-ner-finetune.yaml, en-bert-extract.yaml, and conll_03_english.yaml), but I still find it difficult to write a runnable config file for aspect extraction (targets: ast). The guidance is incomplete and leaves me confused.

For example, why do I need to provide a fine-tuned model name, such as 'model_name: en-bert_10epoch_32batch_0.00005lr_10000lrrate_en_monolingual_nocrf_fast_sentbatch_sentloss_finetune_saving_nodev_newner4', when I have not trained a fine-tuned model yet? In addition, some keywords are not explained, such as 'processors' and 'teachers' under 'upos', as well as 'anneal_factor' and 'interpolation'. I tried to find more information in the flair/datasets.py file, but I am unfamiliar with that code style and found it hard to get started.

I would appreciate it if you could share related instructions or files for aspect term extraction. I look forward to your reply. Best wishes
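For context, the configs discussed in this thread are flair-style YAML files. The sketch below is only an illustration of the kind of skeleton being asked about: the key names mirror the NER configs mentioned above, while every value (corpus identifier, model name, hyperparameters) is a placeholder and should be taken from the official aspect-extraction configs provided in the reply that follows.

```yaml
# Illustrative skeleton only; all values are placeholders, not a working config.
targets: ast                 # task selector; 'ast' = aspect term extraction
ast:
  Corpus: SEMEVAL-14-LAP     # placeholder corpus identifier
model_name: my_ast_run       # name of the output directory for this training run
train:
  learning_rate: 0.1         # placeholder hyperparameters
  mini_batch_size: 32
  max_epochs: 150
  anneal_factor: 2           # flair-style learning-rate decay factor
```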

wangxinyu0922 commented 2 years ago

Hi YanZehong, I have uploaded four configs for aspect term extraction (semeval15_res.yaml, semeval14_lap.yaml, semeval14_res.yaml, semeval16.yaml). You need to train the fine-tuned embeddings first and then change the path in each config to point to the corresponding fine-tuned embeddings (I have also uploaded the config files for the fine-tuning).
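Concretely, the change described here is to point the embedding entry of each aspect-term-extraction config at the model directory produced by the fine-tuning run. A minimal sketch of what that edit might look like inside, for example, semeval14_lap.yaml is shown below; the embedding key name, options, and path are assumptions and should match whatever the uploaded configs actually use.

```yaml
embeddings:
  TransformerWordEmbeddings-0:                  # assumed key name for the fine-tuned transformer embedding
    model: resources/taggers/my-finetuned-bert  # placeholder: directory saved by the fine-tuning run
    layers: '-1'                                # assumed embedding options, copied from the NER-style configs
    pooling_operation: first
```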

Regarding the parts you found confusing:

YanZehong commented 2 years ago

Thank you so much. That works for me :)