Now I have changed my environment to the following, but the error still exists: CUDA 10.1, PyTorch 1.6.0, Java OpenJDK 1.8.
Have you unzipped the file? Or maybe just try the absolute path?
I solved this problem. It was caused by the default port 9000 already being occupied.
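For anyone who wants to confirm this is the cause before digging further, here is a minimal sketch (the `port_in_use` helper is illustrative, not part of rat-sql) that checks whether something is already listening on the CoreNLP default port:

```python
# Minimal sketch (not from rat-sql): check whether the CoreNLP default
# port 9000 is already taken before running preprocessing.
import socket

def port_in_use(port, host="localhost"):
    # connect_ex returns 0 when something is already listening on the port
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
        return s.connect_ex((host, port)) == 0

if port_in_use(9000):
    print("Port 9000 is occupied; free it (e.g. `lsof -i :9000`) "
          "or start CoreNLP on another port.")
else:
    print("Port 9000 is free.")
```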
When I run "python run.py preprocess experiments/spider-bert-run.jsonnet", I get the following error:
WARNING <class 'ratsql.models.enc_dec.EncDecModel.Preproc'>: superfluous {'name': 'EncDec'}
DB connections: 100%|██████████| 166/166 [00:00<00:00, 358.94it/s]
train section: 0%| | 0/8659 [00:00<?, ?it/s]
WARNING: CoreNLP connection timeout. Recreating the server...
train section: 0%| | 0/8659 [00:30<?, ?it/s]
Traceback (most recent call last):
File "/home/fsx/model/rat-sql/ratsql/resources/corenlp.py", line 28, in annotate
result = self.client.annotate(text, annotators, output_format, properties)
File "/home/fsx/anaconda3/envs/fsxtorch/lib/python3.8/site-packages/corenlp/client.py", line 225, in annotate
r = self._request(text.encode('utf-8'), properties)
File "/home/fsx/anaconda3/envs/fsxtorch/lib/python3.8/site-packages/corenlp/client.py", line 178, in _request
self.ensure_alive()
File "/home/fsx/anaconda3/envs/fsxtorch/lib/python3.8/site-packages/corenlp/client.py", line 119, in ensure_alive
raise PermanentlyFailedException("Timed out waiting for service to come alive.")
corenlp.client.PermanentlyFailedException: Timed out waiting for service to come alive.
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "run.py", line 108, in <module>
main()
File "run.py", line 73, in main
preprocess.main(preprocess_config)
File "/home/fsx/model/rat-sql/ratsql/commands/preprocess.py", line 53, in main
preprocessor.preprocess()
File "/home/fsx/model/rat-sql/ratsql/commands/preprocess.py", line 32, in preprocess
to_add, validation_info = self.model_preproc.validate_item(item, section)
File "/home/fsx/model/rat-sql/ratsql/models/enc_dec.py", line 36, in validate_item
enc_result, enc_info = self.enc_preproc.validate_item(item, section)
File "/home/fsx/model/rat-sql/ratsql/models/spider/spider_enc.py", line 722, in validate_item
preproc_schema = self._preprocess_schema(item.schema)
File "/home/fsx/model/rat-sql/ratsql/models/spider/spider_enc.py", line 735, in _preprocess_schema
result = preprocess_schema_uncached(schema, self._tokenize,
File "/home/fsx/model/rat-sql/ratsql/models/spider/spider_enc.py", line 78, in preprocess_schema_uncached
r.normalized_column_names.append(Bertokens(col_toks))
File "/home/fsx/model/rat-sql/ratsql/models/spider/spider_enc.py", line 555, in init
self.normalize_toks()
File "/home/fsx/model/rat-sql/ratsql/models/spider/spider_enc.py", line 606, in normalize_toks
ann = corenlp.annotate(tok, annotators=['tokenize', 'ssplit', 'lemma'])
File "/home/fsx/model/rat-sql/ratsql/resources/corenlp.py", line 46, in annotate
return _singleton.annotate(text, annotators, output_format, properties)
File "/home/fsx/model/rat-sql/ratsql/resources/corenlp.py", line 34, in annotate
result = self.client.annotate(text, annotators, output_format, properties)
File "/home/fsx/anaconda3/envs/fsxtorch/lib/python3.8/site-packages/corenlp/client.py", line 225, in annotate
r = self._request(text.encode('utf-8'), properties)
File "/home/fsx/anaconda3/envs/fsxtorch/lib/python3.8/site-packages/corenlp/client.py", line 178, in _request
self.ensure_alive()
File "/home/fsx/anaconda3/envs/fsxtorch/lib/python3.8/site-packages/corenlp/client.py", line 119, in ensure_alive
raise PermanentlyFailedException("Timed out waiting for service to come alive.")
corenlp.client.PermanentlyFailedException: Timed out waiting for service to come alive.
And I have already downloaded CoreNLP from http://nlp.stanford.edu/software/stanford-corenlp-full-2018-10-05.zip and put it into os.path.join(os.path.dirname(__file__), '../../third_party/stanford-corenlp-full-2018-10-05').
The environment is: Ubuntu 18.04 LTS 64-bit, Python 3.8.11, CUDA 11.1, PyTorch 1.8.0, Java OpenJDK 11.0.11.
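To rule out the unzip / path issue suggested above, a quick sanity check along these lines can help. The hard-coded path is an assumption that mirrors the location described in this report (run from the rat-sql repo root), so adjust it to your checkout:

```python
# Minimal sketch (path assumed from the report above): confirm the CoreNLP
# distribution was actually unzipped into the expected third_party directory
# and that the main jar is present. Run from the rat-sql repo root.
import glob
import os

corenlp_home = os.path.abspath(
    os.path.join("ratsql", "resources", "..", "..",
                 "third_party", "stanford-corenlp-full-2018-10-05"))

print("Directory exists:", os.path.isdir(corenlp_home))
print("Jars found:", glob.glob(os.path.join(corenlp_home, "stanford-corenlp-*.jar")))
```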