Closed: YanZehong closed this issue 2 years ago
Hi YanZehong, I have uploaded four configs for aspect term extraction (semeval15_res.yaml, semeval14_lap.yaml, semeval14_res.yaml, semeval16.yaml). You need to train the fine-tuned embeddings and modify the path to point to the corresponding fine-tuned embeddings (I have also uploaded the config files for the fine-tuning).
For the parts you found confusing: model_name is the name under which the trained model is stored. As for processors and teachers, just ignore them; the code does not read these parts. Only the parts listed in the Guide are important.

Thank you so much. That works for me :)
Hi Wang Xinyu, I am a graduate student from NUS. Thanks for sharing such a valuable study and its source code. I am now trying to reproduce the results for aspect term extraction. I followed the instructions for Named Entity Recognition and referred to several files (such as bert-en-ner-finetune.yaml, en-bert-extract.yaml, conll_03_english.yaml, etc.), but I still find it difficult to write an executable config file for aspect extraction (targets: ast). The guidance is incomplete and leaves me confused. For example, why do I need to provide a fine-tuned model name when I haven't trained a fine-tuned model yet, e.g. 'model_name: en-bert_10epoch_32batch_0.00005lr_10000lrrate_en_monolingual_nocrf_fast_sentbatch_sentloss_finetune_saving_nodev_newner4'? In addition, some keywords are not explained, such as 'processors' and 'teachers' in 'upos', 'anneal_factor', and 'interpolation'. I tried to find information about them in the flair/datasets.py file, but I am unfamiliar with this code style and found it hard to get started.

I would appreciate it if you could share instructions or files related to aspect term extraction. Looking forward to your reply. Best wishes