Bill-ai closed this issue 4 years ago
Start a local stanford-corenlp service.
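For reference, a local CoreNLP server is usually started with the stock Java command. A minimal sketch of launching it from Python; the install directory, memory flag, port, and timeout below are assumptions, so adjust them to your setup:

```python
import os
import subprocess

# Sketch: build the standard command to launch a local Stanford CoreNLP
# server. CORENLP_HOME, -mx4g, the port, and the timeout are assumptions.
CORENLP_HOME = os.path.expanduser("~/stanford-corenlp")  # assumed install dir

cmd = [
    "java", "-mx4g",
    "-cp", os.path.join(CORENLP_HOME, "*"),
    "edu.stanford.nlp.pipeline.StanfordCoreNLPServer",
    "-port", "9000",
    "-timeout", "15000",
]
print(" ".join(cmd))
# subprocess.Popen(cmd)  # uncomment once CORENLP_HOME points at the jars
```

If the client later times out "waiting for service to come alive", the usual causes are the server not running on the expected port or the classpath not pointing at the CoreNLP jars.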
### Thanks, I started the server and this popped up. I know there is already an issue open for this, but I didn't understand your answer. Can you please explain it to me in a simpler way?
Traceback (most recent call last):
File "train.py", line 837, in <module>
PS: I'm calling the infer function below the input, but you mentioned that the infer API is not working and that I should use test instead? Please correct me if I'm wrong.
Best, Bill
Prepared the data as the README describes.
Took the same example from the README file and pasted it into ctable.tables.json in the data_and_model folder.
python3 train.py
BERT-type: uncased_L-12_H-768_A-12
Batch_size = 32
BERT parameters:
learning rate: 1e-05
Fine-tune BERT: False
vocab size: 30522
hidden_size: 768
num_hidden_layer: 12
num_attention_heads: 12
hidden_act: gelu
intermediate_size: 3072
hidden_dropout_prob: 0.1
attention_probs_dropout_prob: 0.1
max_position_embeddings: 512
type_vocab_size: 2
initializer_range: 0.02
Load pre-trained parameters.
Seq-to-SQL: the number of final BERT layers to be used: 2
Seq-to-SQL: the size of hidden dimension = 100
Seq-to-SQL: LSTM encoding layer size = 2
Seq-to-SQL: dropout rate = 0.3
Seq-to-SQL: learning rate = 0.001
Type question: what
Traceback (most recent call last):
File "train.py", line 837, in <module>
beam_size=1, show_table=False, show_answer_only=True
File "train.py", line 653, in infer
hds1 = tb1['header']
KeyError: 'header'
I had a look into train.py; the infer function has a variable hds1 which accesses 'header'. Am I doing something wrong?
You have to debug and recode it.
I tried the repo from scratch but still get the same error. In ctable.tables.json I added the same sample data as input on the first line. Can you please help me out here? I don't understand: if there is no header in your sample input data, how can infer still access it through tb1['header']? How is that possible?
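The KeyError suggests the line pasted into ctable.tables.json is a question/SQL record rather than a table record. In WikiSQL-style data, each table record carries a "header" key, which is exactly what infer reads as tb1['header']. A minimal sketch of such a record; the id, column names, and rows below are illustrative, not from the repo:

```python
import json

# Minimal WikiSQL-style table record. The "header" key is what
# infer() reads as tb1['header']; all values here are illustrative.
table = {
    "id": "my_table_1",  # assumed table id
    "header": ["Player", "Team", "Points"],
    "types": ["text", "text", "real"],
    "rows": [
        ["Alice", "Red", 12.0],
        ["Bob", "Blue", 7.5],
    ],
}

# One JSON object per line (jsonl-style file).
with open("ctable.tables.json", "w") as f:
    f.write(json.dumps(table) + "\n")

# Reload and verify the key infer() needs is present.
with open("ctable.tables.json") as f:
    tb1 = json.loads(f.readline())
assert "header" in tb1
print(tb1["header"])  # → ['Player', 'Team', 'Points']
```

If the first line of your file came from the questions/SQL sample instead, it would have no "header" key, and tb1['header'] raises exactly the KeyError shown above.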
The table data is here:
https://drive.google.com/file/d/1iJvsf38f16el58H4NPINQ7uzal5-V4v4
My sample data is the questions and SQLs; the table data is not uploaded.
Okay, thanks for the clarity. But the thing is, I'm actually doing inference, so I don't need the questions and SQLs, I guess? I have already downloaded the table data file; I see some table and jsonl files, plus the uncased vocab and model files.
Can you tell me what I should insert into the ctable.tables.json file to run main.py for inference? Or, as a quicker path, how can I convert a CSV file so that running main.py produces queries directly? How do I preprocess the CSV file, and what changes do I need in main.py? Or send me a file that I can run directly for inference.
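On the CSV question: one way to get a table record out of a CSV is to treat the first row as the column names and guess a numeric/text type per column. A rough sketch, not the repo's preprocessing; the file names, table id, and the crude type inference are all assumptions:

```python
import csv
import json

def csv_to_table(csv_path, table_id):
    """Convert a CSV (first row = column names) into a WikiSQL-style
    table record. Type inference is a crude guess: 'real' if every
    value in the column parses as a float, otherwise 'text'."""
    with open(csv_path, newline="") as f:
        reader = list(csv.reader(f))
    header, rows = reader[0], reader[1:]

    def col_type(i):
        try:
            for r in rows:
                float(r[i])
            return "real"
        except ValueError:
            return "text"

    return {
        "id": table_id,
        "header": header,
        "types": [col_type(i) for i in range(len(header))],
        "rows": rows,
    }

# Example usage with a throwaway CSV (names are illustrative).
with open("demo.csv", "w", newline="") as f:
    f.write("Player,Points\nAlice,12\nBob,7\n")

entry = csv_to_table("demo.csv", "demo_1")
with open("ctable.tables.json", "w") as f:
    f.write(json.dumps(entry) + "\n")  # one record per line
print(entry["header"])  # → ['Player', 'Points']
```

Whether main.py accepts such a record as-is depends on the repo's loader, so check the fields it actually reads.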
I'm sorry for the long conversation. Can I email you up?
Hi, I am also facing the same issue. Is there any update on this? Can you share the steps to test inference?
Inference is not supported now.
corenlp.client.PermanentlyFailedException: Timed out waiting for service to come alive. The model asked for "Type question"; when I typed a question, the above error popped up. Do you know why this might be happening? Please help.
PS: I'm calling the infer function. I changed the args --do_train to False, --do_infer to True, --infer_loop to True, and --EG to True, and in the infer block I changed show_table to True and show_answer_only to True.
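The flag changes described above can also be passed on the command line instead of editing defaults. A sketch assuming these are plain argparse boolean flags; the flag names come from the comment above, but their exact definitions in train.py may differ:

```python
import argparse

# Sketch of the flag setup implied above (store_true is an assumption;
# the repo may define these flags differently).
parser = argparse.ArgumentParser()
parser.add_argument("--do_train", action="store_true")
parser.add_argument("--do_infer", action="store_true")
parser.add_argument("--infer_loop", action="store_true")
parser.add_argument("--EG", action="store_true")

# Equivalent of "do_train False, do_infer/infer_loop/EG True":
args = parser.parse_args(["--do_infer", "--infer_loop", "--EG"])
print(args.do_train, args.do_infer)  # → False True
```

With store_true flags, simply omitting --do_train leaves it False, so no source edits are needed for that combination.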
Thanks, Bill