easonnie / combine-FEVER-NSMN

This repository provides the implementation for the paper "Combining Fact Extraction and Verification with Neural Semantic Matching Networks".

Exception while reloading tokenizer dictionary #4

Closed. sylviawangfr closed this issue 5 years ago.

sylviawangfr commented 5 years ago

Java version: openjdk 1.8.0.191.

While running `python src/pipeline/auto_pipeline.py`, an exception is raised at the "Reload tokenizer dictionary" step. Console logs:

```
100%|███████████████████████████████████████████████████████| 597040/597040 [47:48<00:00, 208.13it/s]
Reload tokenizer dictionary
Load Jsonl: /home/chenglab/sylvia/fever/combine-FEVER-NSMN/results/pipeline_r_aaai_doc/2019_03_02_17:43:58_r/t_shared_task_dev.jsonl
19998it [00:00, 30181.29it/s]
Traceback (most recent call last):
  File "/home/chenglab/anaconda3/lib/python3.6/site-packages/pexpect/expect.py", line 111, in expect_loop
    incoming = spawn.read_nonblocking(spawn.maxread, timeout)
  File "/home/chenglab/anaconda3/lib/python3.6/site-packages/pexpect/pty_spawn.py", line 482, in read_nonblocking
    raise TIMEOUT('Timeout exceeded.')
pexpect.exceptions.TIMEOUT: Timeout exceeded.

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "src/pipeline/auto_pipeline.py", line 633, in <module>
    steps=default_steps)
  File "src/pipeline/auto_pipeline.py", line 230, in pipeline
    method=doc_retrieval_method, top_k=100)
  File "src/pipeline/auto_pipeline.py", line 550, in first_doc_retrieval
    retri_object.instance.sample_answer_with_priority(d_list, top_k=top_k)
  File "/home/chenglab/sylvia/fever/combine-FEVER-NSMN/src/chaonan_src/doc_retrieval_experiment.py", line 129, in sample_answer_with_priority
    self.item_rb.first_only_rules(item)
  File "/home/chenglab/sylvia/fever/combine-FEVER-NSMN/src/chaonan_src/_doc_retrieval/item_rules_spiral.py", line 191, in <lambda>
    return lambda x: self.initialize_item(x)\
  File "/home/chenglab/sylvia/fever/combine-FEVER-NSMN/src/chaonan_src/_doc_retrieval/item_rules_spiral.py", line 34, in exact_match_rule
    self.get_token_lemma_from_claim(item['claim'])
  File "/home/chenglab/sylvia/fever/combine-FEVER-NSMN/src/chaonan_src/_doc_retrieval/item_rules.py", line 196, in get_token_lemma_from_claim
    claim_tok_r = self.tokenizer.tokenize(claim_norm)
  File "/home/chenglab/sylvia/fever/combine-FEVER-NSMN/dep_packages/DrQA/drqa_yixin/tokenizers/corenlp_tokenizer.py", line 96, in tokenize
    self.corenlp.expect_exact('NLP>', searchwindowsize=100)
  File "/home/chenglab/anaconda3/lib/python3.6/site-packages/pexpect/spawnbase.py", line 418, in expect_exact
    return exp.expect_loop(timeout)
  File "/home/chenglab/anaconda3/lib/python3.6/site-packages/pexpect/expect.py", line 119, in expect_loop
    return self.timeout(e)
  File "/home/chenglab/anaconda3/lib/python3.6/site-packages/pexpect/expect.py", line 82, in timeout
    raise TIMEOUT(msg)
pexpect.exceptions.TIMEOUT: Timeout exceeded.
<pexpect.pty_spawn.spawn object at 0x7fdf20c2b198>
```
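For context, the DrQA-based tokenizer drives a long-running CoreNLP process through pexpect and waits for its interactive `NLP>` prompt; the `TIMEOUT` above means that prompt never arrived within the allotted time, for example because the JVM was still loading models or was swapping. Below is a minimal sketch of that pattern, not the repo's actual `corenlp_tokenizer.py` code; the CoreNLP class path, memory flag, and timeout value are placeholders.

```python
import pexpect

# Hypothetical command line; the real one is assembled inside
# dep_packages/DrQA/drqa_yixin/tokenizers/corenlp_tokenizer.py.
CORENLP_CMD = ('java -mx4g -cp "/path/to/corenlp/*" '
               'edu.stanford.nlp.pipeline.StanfordCoreNLP '
               '-annotators tokenize,ssplit,pos,lemma,ner')

# Spawn CoreNLP as an interactive child process (timeout is illustrative).
corenlp = pexpect.spawn('/bin/bash', ['-c', CORENLP_CMD],
                        maxread=100000, timeout=60)

try:
    # Block until CoreNLP prints its interactive prompt. If the JVM is
    # still loading models, or is swapping on a low-memory machine, this
    # raises pexpect.exceptions.TIMEOUT, which is exactly the error in
    # the traceback above.
    corenlp.expect_exact('NLP>', searchwindowsize=100)
    print('CoreNLP is ready')
except pexpect.exceptions.TIMEOUT:
    print('CoreNLP did not become ready within the timeout')
finally:
    corenlp.close()
```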

sylviawangfr commented 5 years ago

Moved to another machine with more RAM; the issue disappeared.
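That outcome is consistent with the CoreNLP JVM being starved for memory: on a low-RAM host it can take longer than pexpect's timeout to start up or to answer a request. If moving machines is not an option, one possible workaround (an assumption, not a change made in this repo) is to wait longer and retry on `TIMEOUT` around the `expect_exact` call in `corenlp_tokenizer.py`. A sketch with a hypothetical helper:

```python
import pexpect

def expect_prompt_with_retry(child, prompt='NLP>', retries=3, timeout=120):
    """Wait for the CoreNLP prompt, retrying on TIMEOUT.

    `child` is assumed to be the pexpect.spawn handle created in
    corenlp_tokenizer.py; the retry count and timeout are illustrative.
    """
    for attempt in range(retries):
        try:
            child.expect_exact(prompt, timeout=timeout)
            return
        except pexpect.exceptions.TIMEOUT:
            # On a memory-starved machine the JVM may simply be slow;
            # give it another window before failing for good.
            print(f'Timed out waiting for {prompt!r} '
                  f'(attempt {attempt + 1}/{retries})')
    raise pexpect.exceptions.TIMEOUT(
        f'CoreNLP prompt {prompt!r} not seen after {retries} attempts')
```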