salesforce / TabularSemanticParsing

Translating natural language questions to a structured query language
https://arxiv.org/abs/2012.12627
BSD 3-Clause "New" or "Revised" License

Training error #3

Closed: duyvuleo closed this issue 3 years ago

duyvuleo commented 3 years ago

Hi,

I followed the steps to train a model on the Spider dataset and ran into the following error:

./experiment-bridge.sh configs/bridge/spider-bridge-bert-large.sh --train 0
run CUDA_VISIBLE_DEVICES=0 python3 -m src.experiments     --train     --data_dir data/spider     --db_dir data/spider/database     --dataset_name spider     --question_split          --question_only               --denormalize_sql               --table_shuffling     --use_lstm_encoder     --use_meta_data_encoding          --sql_consistency_check          --use_picklist     --anchor_text_match_threshold 0.85               --top_k_picklist_matches 2     --process_sql_in_execution_order               --num_random_tables_added 0                         --save_best_model_only     --schema_augmentation_factor 1          --data_augmentation_factor 1          --vocab_min_freq 0     --text_vocab_min_freq 0     --program_vocab_min_freq 0     --num_values_per_field 0     --max_in_seq_len 512     --max_out_seq_len 60     --model bridge     --num_steps 100000     --curriculum_interval 0     --num_peek_steps 1000     --num_accumulation_steps 2     --train_batch_size 16     --dev_batch_size 24     --encoder_input_dim 1024     --encoder_hidden_dim 400     --decoder_input_dim 400     --num_rnn_layers 1     --num_const_attn_layers 0     --emb_dropout_rate 0.3     --pretrained_lm_dropout_rate 0     --rnn_layer_dropout_rate 0     --rnn_weight_dropout_rate 0     --cross_attn_dropout_rate 0     --cross_attn_num_heads 8     --res_input_dropout_rate 0.2     --res_layer_dropout_rate 0     --ff_input_dropout_rate 0.4     --ff_hidden_dropout_rate 0.0     --pretrained_transformer bert-large-uncased          --bert_finetune_rate 0.00006     --learning_rate 0.0005     --learning_rate_scheduler inverse-square     --trans_learning_rate_scheduler inverse-square     --warmup_init_lr 0.0005     --warmup_init_ft_lr 0.00003     --num_warmup_steps 4000     --grad_norm 0.3     --decoding_algorithm beam-search     --beam_size 16     --bs_alpha 1.05     --gpu 0
./experiment-bridge.sh: line 205: CUDA_VISIBLE_DEVICES=0: command not found
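
(The shell error above appears unrelated to training itself: the run command echoed by experiment-bridge.sh apparently contains CUDA_VISIBLE_DEVICES=0 as a separate word, so bash treats it as the command to run. I edited the script before re-running; exporting the variable in the calling shell, or setting it from Python before CUDA is initialized, should be equivalent. A minimal sketch of the latter, assuming it runs before any tensors are moved to the GPU:)

import os

# Hedged workaround sketch, equivalent to exporting CUDA_VISIBLE_DEVICES=0 in the
# calling shell: restrict the visible GPUs before torch initializes CUDA.
os.environ['CUDA_VISIBLE_DEVICES'] = '0'
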
(pyvenv37-oda-text2sql-tabular-semantic-parsing) [vchoang@oda2-vm-gpu3-4-ad3-03 TabularSemanticParsing]$ vim ./experiment-bridge.sh
(pyvenv37-oda-text2sql-tabular-semantic-parsing) [vchoang@oda2-vm-gpu3-4-ad3-03 TabularSemanticParsing]$ ./experiment-bridge.sh configs/bridge/spider-bridge-bert-large.sh --train 0
run python3 -m src.experiments     --train     --data_dir data/spider     --db_dir data/spider/database     --dataset_name spider     --question_split          --question_only               --denormalize_sql               --table_shuffling     --use_lstm_encoder     --use_meta_data_encoding          --sql_consistency_check          --use_picklist     --anchor_text_match_threshold 0.85               --top_k_picklist_matches 2     --process_sql_in_execution_order               --num_random_tables_added 0                         --save_best_model_only     --schema_augmentation_factor 1          --data_augmentation_factor 1          --vocab_min_freq 0     --text_vocab_min_freq 0     --program_vocab_min_freq 0     --num_values_per_field 0     --max_in_seq_len 512     --max_out_seq_len 60     --model bridge     --num_steps 100000     --curriculum_interval 0     --num_peek_steps 1000     --num_accumulation_steps 2     --train_batch_size 16     --dev_batch_size 24     --encoder_input_dim 1024     --encoder_hidden_dim 400     --decoder_input_dim 400     --num_rnn_layers 1     --num_const_attn_layers 0     --emb_dropout_rate 0.3     --pretrained_lm_dropout_rate 0     --rnn_layer_dropout_rate 0     --rnn_weight_dropout_rate 0     --cross_attn_dropout_rate 0     --cross_attn_num_heads 8     --res_input_dropout_rate 0.2     --res_layer_dropout_rate 0     --ff_input_dropout_rate 0.4     --ff_hidden_dropout_rate 0.0     --pretrained_transformer bert-large-uncased          --bert_finetune_rate 0.00006     --learning_rate 0.0005     --learning_rate_scheduler inverse-square     --trans_learning_rate_scheduler inverse-square     --warmup_init_lr 0.0005     --warmup_init_ft_lr 0.00003     --num_warmup_steps 4000     --grad_norm 0.3     --decoding_algorithm beam-search     --beam_size 16     --bs_alpha 1.05     --gpu 0
Model directory created: /mnt/shared_ad3_mt1/vchoang/works/projects/oda/text2sql/code/TabularSemanticParsing/model/spider.bridge.lstm.meta.ts.ppl-0.85.2.dn.eo.feat.bert-large-uncased.xavier-1024-400-400-16-2-0.0005-inv-sqr-0.0005-4000-6e-05-inv-sqr-3e-05-4000-0.3-0.3-0.0-0.0-1-8-0.0-0.0-res-0.2-0.0-ff-0.4-0.0.210111-233401.kvzx
Visualization directory created: /mnt/shared_ad3_mt1/vchoang/works/projects/oda/text2sql/code/TabularSemanticParsing/viz/spider.bridge.lstm.meta.ts.ppl-0.85.2.dn.eo.feat.bert-large-uncased.xavier-1024-400-400-16-2-0.0005-inv-sqr-0.0005-4000-6e-05-inv-sqr-3e-05-4000-0.3-0.3-0.0-0.0-1-8-0.0-0.0-res-0.2-0.0-ff-0.4-0.0.210111-233401.kvzx
* text vocab size = 30522
* program vocab size = 99

pretrained_transformer = bert-large-uncased
fix_pretrained_transformer_parameters = False

bridge module created
execution order restoration cache copied
source: data/spider/dev.eo.pred.restored.pkl
dest: /mnt/shared_ad3_mt1/vchoang/works/projects/oda/text2sql/code/TabularSemanticParsing/model/spider.bridge.lstm.meta.ts.ppl-0.85.2.dn.eo.feat.bert-large-uncased.xavier-1024-400-400-16-2-0.0005-inv-sqr-0.0005-4000-6e-05-inv-sqr-3e-05-4000-0.3-0.3-0.0-0.0-1-8-0.0-0.0-res-0.2-0.0-ff-0.4-0.0.210111-233401.kvzx/dev.eo.pred.restored.pkl

loading preprocessed data: data/spider/spider.bridge.question-split.ppl-0.85.2.dn.eo.bert.pkl
8659 training examples loaded
1034 dev examples loaded
Model initialization (xavier)
--------------------------
encoder_embeddings.trans_parameters.embeddings.word_embeddings.weight (skipped)
encoder_embeddings.trans_parameters.embeddings.position_embeddings.weight (skipped)
encoder_embeddings.trans_parameters.embeddings.token_type_embeddings.weight (skipped)
encoder_embeddings.trans_parameters.embeddings.LayerNorm.weight (skipped)
encoder_embeddings.trans_parameters.embeddings.LayerNorm.bias (skipped)
encoder_embeddings.trans_parameters.encoder.layer.0.attention.self.query.weight (skipped)
encoder_embeddings.trans_parameters.encoder.layer.0.attention.self.query.bias (skipped)
encoder_embeddings.trans_parameters.encoder.layer.0.attention.self.key.weight (skipped)
encoder_embeddings.trans_parameters.encoder.layer.0.attention.self.key.bias (skipped)
encoder_embeddings.trans_parameters.encoder.layer.0.attention.self.value.weight (skipped)
encoder_embeddings.trans_parameters.encoder.layer.0.attention.self.value.bias (skipped)
encoder_embeddings.trans_parameters.encoder.layer.0.attention.output.dense.weight (skipped)
encoder_embeddings.trans_parameters.encoder.layer.0.attention.output.dense.bias (skipped)
encoder_embeddings.trans_parameters.encoder.layer.0.attention.output.LayerNorm.weight (skipped)
encoder_embeddings.trans_parameters.encoder.layer.0.attention.output.LayerNorm.bias (skipped)
encoder_embeddings.trans_parameters.encoder.layer.0.intermediate.dense.weight (skipped)
encoder_embeddings.trans_parameters.encoder.layer.0.intermediate.dense.bias (skipped)
encoder_embeddings.trans_parameters.encoder.layer.0.output.dense.weight (skipped)
encoder_embeddings.trans_parameters.encoder.layer.0.output.dense.bias (skipped)
encoder_embeddings.trans_parameters.encoder.layer.0.output.LayerNorm.weight (skipped)
encoder_embeddings.trans_parameters.encoder.layer.0.output.LayerNorm.bias (skipped)
encoder_embeddings.trans_parameters.encoder.layer.1.attention.self.query.weight (skipped)
encoder_embeddings.trans_parameters.encoder.layer.1.attention.self.query.bias (skipped)
encoder_embeddings.trans_parameters.encoder.layer.1.attention.self.key.weight (skipped)
encoder_embeddings.trans_parameters.encoder.layer.1.attention.self.key.bias (skipped)
encoder_embeddings.trans_parameters.encoder.layer.1.attention.self.value.weight (skipped)
encoder_embeddings.trans_parameters.encoder.layer.1.attention.self.value.bias (skipped)
encoder_embeddings.trans_parameters.encoder.layer.1.attention.output.dense.weight (skipped)
encoder_embeddings.trans_parameters.encoder.layer.1.attention.output.dense.bias (skipped)
encoder_embeddings.trans_parameters.encoder.layer.1.attention.output.LayerNorm.weight (skipped)
encoder_embeddings.trans_parameters.encoder.layer.1.attention.output.LayerNorm.bias (skipped)
encoder_embeddings.trans_parameters.encoder.layer.1.intermediate.dense.weight (skipped)
encoder_embeddings.trans_parameters.encoder.layer.1.intermediate.dense.bias (skipped)
encoder_embeddings.trans_parameters.encoder.layer.1.output.dense.weight (skipped)
encoder_embeddings.trans_parameters.encoder.layer.1.output.dense.bias (skipped)
encoder_embeddings.trans_parameters.encoder.layer.1.output.LayerNorm.weight (skipped)
encoder_embeddings.trans_parameters.encoder.layer.1.output.LayerNorm.bias (skipped)
encoder_embeddings.trans_parameters.encoder.layer.2.attention.self.query.weight (skipped)
encoder_embeddings.trans_parameters.encoder.layer.2.attention.self.query.bias (skipped)
encoder_embeddings.trans_parameters.encoder.layer.2.attention.self.key.weight (skipped)
encoder_embeddings.trans_parameters.encoder.layer.2.attention.self.key.bias (skipped)
encoder_embeddings.trans_parameters.encoder.layer.2.attention.self.value.weight (skipped)
encoder_embeddings.trans_parameters.encoder.layer.2.attention.self.value.bias (skipped)
encoder_embeddings.trans_parameters.encoder.layer.2.attention.output.dense.weight (skipped)
encoder_embeddings.trans_parameters.encoder.layer.2.attention.output.dense.bias (skipped)
encoder_embeddings.trans_parameters.encoder.layer.2.attention.output.LayerNorm.weight (skipped)
encoder_embeddings.trans_parameters.encoder.layer.2.attention.output.LayerNorm.bias (skipped)
encoder_embeddings.trans_parameters.encoder.layer.2.intermediate.dense.weight (skipped)
encoder_embeddings.trans_parameters.encoder.layer.2.intermediate.dense.bias (skipped)
encoder_embeddings.trans_parameters.encoder.layer.2.output.dense.weight (skipped)
encoder_embeddings.trans_parameters.encoder.layer.2.output.dense.bias (skipped)
encoder_embeddings.trans_parameters.encoder.layer.2.output.LayerNorm.weight (skipped)
encoder_embeddings.trans_parameters.encoder.layer.2.output.LayerNorm.bias (skipped)
encoder_embeddings.trans_parameters.encoder.layer.3.attention.self.query.weight (skipped)
encoder_embeddings.trans_parameters.encoder.layer.3.attention.self.query.bias (skipped)
encoder_embeddings.trans_parameters.encoder.layer.3.attention.self.key.weight (skipped)
encoder_embeddings.trans_parameters.encoder.layer.3.attention.self.key.bias (skipped)
encoder_embeddings.trans_parameters.encoder.layer.3.attention.self.value.weight (skipped)
encoder_embeddings.trans_parameters.encoder.layer.3.attention.self.value.bias (skipped)
encoder_embeddings.trans_parameters.encoder.layer.3.attention.output.dense.weight (skipped)
encoder_embeddings.trans_parameters.encoder.layer.3.attention.output.dense.bias (skipped)
encoder_embeddings.trans_parameters.encoder.layer.3.attention.output.LayerNorm.weight (skipped)
encoder_embeddings.trans_parameters.encoder.layer.3.attention.output.LayerNorm.bias (skipped)
encoder_embeddings.trans_parameters.encoder.layer.3.intermediate.dense.weight (skipped)
encoder_embeddings.trans_parameters.encoder.layer.3.intermediate.dense.bias (skipped)
encoder_embeddings.trans_parameters.encoder.layer.3.output.dense.weight (skipped)
encoder_embeddings.trans_parameters.encoder.layer.3.output.dense.bias (skipped)
encoder_embeddings.trans_parameters.encoder.layer.3.output.LayerNorm.weight (skipped)
encoder_embeddings.trans_parameters.encoder.layer.3.output.LayerNorm.bias (skipped)
encoder_embeddings.trans_parameters.encoder.layer.4.attention.self.query.weight (skipped)
encoder_embeddings.trans_parameters.encoder.layer.4.attention.self.query.bias (skipped)
encoder_embeddings.trans_parameters.encoder.layer.4.attention.self.key.weight (skipped)
encoder_embeddings.trans_parameters.encoder.layer.4.attention.self.key.bias (skipped)
encoder_embeddings.trans_parameters.encoder.layer.4.attention.self.value.weight (skipped)
encoder_embeddings.trans_parameters.encoder.layer.4.attention.self.value.bias (skipped)
encoder_embeddings.trans_parameters.encoder.layer.4.attention.output.dense.weight (skipped)
encoder_embeddings.trans_parameters.encoder.layer.4.attention.output.dense.bias (skipped)
encoder_embeddings.trans_parameters.encoder.layer.4.attention.output.LayerNorm.weight (skipped)
encoder_embeddings.trans_parameters.encoder.layer.4.attention.output.LayerNorm.bias (skipped)
encoder_embeddings.trans_parameters.encoder.layer.4.intermediate.dense.weight (skipped)
encoder_embeddings.trans_parameters.encoder.layer.4.intermediate.dense.bias (skipped)
encoder_embeddings.trans_parameters.encoder.layer.4.output.dense.weight (skipped)
encoder_embeddings.trans_parameters.encoder.layer.4.output.dense.bias (skipped)
encoder_embeddings.trans_parameters.encoder.layer.4.output.LayerNorm.weight (skipped)
encoder_embeddings.trans_parameters.encoder.layer.4.output.LayerNorm.bias (skipped)
encoder_embeddings.trans_parameters.encoder.layer.5.attention.self.query.weight (skipped)
encoder_embeddings.trans_parameters.encoder.layer.5.attention.self.query.bias (skipped)
encoder_embeddings.trans_parameters.encoder.layer.5.attention.self.key.weight (skipped)
encoder_embeddings.trans_parameters.encoder.layer.5.attention.self.key.bias (skipped)
encoder_embeddings.trans_parameters.encoder.layer.5.attention.self.value.weight (skipped)
encoder_embeddings.trans_parameters.encoder.layer.5.attention.self.value.bias (skipped)
encoder_embeddings.trans_parameters.encoder.layer.5.attention.output.dense.weight (skipped)
encoder_embeddings.trans_parameters.encoder.layer.5.attention.output.dense.bias (skipped)
encoder_embeddings.trans_parameters.encoder.layer.5.attention.output.LayerNorm.weight (skipped)
encoder_embeddings.trans_parameters.encoder.layer.5.attention.output.LayerNorm.bias (skipped)
encoder_embeddings.trans_parameters.encoder.layer.5.intermediate.dense.weight (skipped)
encoder_embeddings.trans_parameters.encoder.layer.5.intermediate.dense.bias (skipped)
encoder_embeddings.trans_parameters.encoder.layer.5.output.dense.weight (skipped)
encoder_embeddings.trans_parameters.encoder.layer.5.output.dense.bias (skipped)
encoder_embeddings.trans_parameters.encoder.layer.5.output.LayerNorm.weight (skipped)
encoder_embeddings.trans_parameters.encoder.layer.5.output.LayerNorm.bias (skipped)
encoder_embeddings.trans_parameters.encoder.layer.6.attention.self.query.weight (skipped)
encoder_embeddings.trans_parameters.encoder.layer.6.attention.self.query.bias (skipped)
encoder_embeddings.trans_parameters.encoder.layer.6.attention.self.key.weight (skipped)
encoder_embeddings.trans_parameters.encoder.layer.6.attention.self.key.bias (skipped)
encoder_embeddings.trans_parameters.encoder.layer.6.attention.self.value.weight (skipped)
encoder_embeddings.trans_parameters.encoder.layer.6.attention.self.value.bias (skipped)
encoder_embeddings.trans_parameters.encoder.layer.6.attention.output.dense.weight (skipped)
encoder_embeddings.trans_parameters.encoder.layer.6.attention.output.dense.bias (skipped)
encoder_embeddings.trans_parameters.encoder.layer.6.attention.output.LayerNorm.weight (skipped)
encoder_embeddings.trans_parameters.encoder.layer.6.attention.output.LayerNorm.bias (skipped)
encoder_embeddings.trans_parameters.encoder.layer.6.intermediate.dense.weight (skipped)
encoder_embeddings.trans_parameters.encoder.layer.6.intermediate.dense.bias (skipped)
encoder_embeddings.trans_parameters.encoder.layer.6.output.dense.weight (skipped)
encoder_embeddings.trans_parameters.encoder.layer.6.output.dense.bias (skipped)
encoder_embeddings.trans_parameters.encoder.layer.6.output.LayerNorm.weight (skipped)
encoder_embeddings.trans_parameters.encoder.layer.6.output.LayerNorm.bias (skipped)
encoder_embeddings.trans_parameters.encoder.layer.7.attention.self.query.weight (skipped)
encoder_embeddings.trans_parameters.encoder.layer.7.attention.self.query.bias (skipped)
encoder_embeddings.trans_parameters.encoder.layer.7.attention.self.key.weight (skipped)
encoder_embeddings.trans_parameters.encoder.layer.7.attention.self.key.bias (skipped)
encoder_embeddings.trans_parameters.encoder.layer.7.attention.self.value.weight (skipped)
encoder_embeddings.trans_parameters.encoder.layer.7.attention.self.value.bias (skipped)
encoder_embeddings.trans_parameters.encoder.layer.7.attention.output.dense.weight (skipped)
encoder_embeddings.trans_parameters.encoder.layer.7.attention.output.dense.bias (skipped)
encoder_embeddings.trans_parameters.encoder.layer.7.attention.output.LayerNorm.weight (skipped)
encoder_embeddings.trans_parameters.encoder.layer.7.attention.output.LayerNorm.bias (skipped)
encoder_embeddings.trans_parameters.encoder.layer.7.intermediate.dense.weight (skipped)
encoder_embeddings.trans_parameters.encoder.layer.7.intermediate.dense.bias (skipped)
encoder_embeddings.trans_parameters.encoder.layer.7.output.dense.weight (skipped)
encoder_embeddings.trans_parameters.encoder.layer.7.output.dense.bias (skipped)
encoder_embeddings.trans_parameters.encoder.layer.7.output.LayerNorm.weight (skipped)
encoder_embeddings.trans_parameters.encoder.layer.7.output.LayerNorm.bias (skipped)
encoder_embeddings.trans_parameters.encoder.layer.8.attention.self.query.weight (skipped)
encoder_embeddings.trans_parameters.encoder.layer.8.attention.self.query.bias (skipped)
encoder_embeddings.trans_parameters.encoder.layer.8.attention.self.key.weight (skipped)
encoder_embeddings.trans_parameters.encoder.layer.8.attention.self.key.bias (skipped)
encoder_embeddings.trans_parameters.encoder.layer.8.attention.self.value.weight (skipped)
encoder_embeddings.trans_parameters.encoder.layer.8.attention.self.value.bias (skipped)
encoder_embeddings.trans_parameters.encoder.layer.8.attention.output.dense.weight (skipped)
encoder_embeddings.trans_parameters.encoder.layer.8.attention.output.dense.bias (skipped)
encoder_embeddings.trans_parameters.encoder.layer.8.attention.output.LayerNorm.weight (skipped)
encoder_embeddings.trans_parameters.encoder.layer.8.attention.output.LayerNorm.bias (skipped)
encoder_embeddings.trans_parameters.encoder.layer.8.intermediate.dense.weight (skipped)
encoder_embeddings.trans_parameters.encoder.layer.8.intermediate.dense.bias (skipped)
encoder_embeddings.trans_parameters.encoder.layer.8.output.dense.weight (skipped)
encoder_embeddings.trans_parameters.encoder.layer.8.output.dense.bias (skipped)
encoder_embeddings.trans_parameters.encoder.layer.8.output.LayerNorm.weight (skipped)
encoder_embeddings.trans_parameters.encoder.layer.8.output.LayerNorm.bias (skipped)
encoder_embeddings.trans_parameters.encoder.layer.9.attention.self.query.weight (skipped)
encoder_embeddings.trans_parameters.encoder.layer.9.attention.self.query.bias (skipped)
encoder_embeddings.trans_parameters.encoder.layer.9.attention.self.key.weight (skipped)
encoder_embeddings.trans_parameters.encoder.layer.9.attention.self.key.bias (skipped)
encoder_embeddings.trans_parameters.encoder.layer.9.attention.self.value.weight (skipped)
encoder_embeddings.trans_parameters.encoder.layer.9.attention.self.value.bias (skipped)
encoder_embeddings.trans_parameters.encoder.layer.9.attention.output.dense.weight (skipped)
encoder_embeddings.trans_parameters.encoder.layer.9.attention.output.dense.bias (skipped)
encoder_embeddings.trans_parameters.encoder.layer.9.attention.output.LayerNorm.weight (skipped)
encoder_embeddings.trans_parameters.encoder.layer.9.attention.output.LayerNorm.bias (skipped)
encoder_embeddings.trans_parameters.encoder.layer.9.intermediate.dense.weight (skipped)
encoder_embeddings.trans_parameters.encoder.layer.9.intermediate.dense.bias (skipped)
encoder_embeddings.trans_parameters.encoder.layer.9.output.dense.weight (skipped)
encoder_embeddings.trans_parameters.encoder.layer.9.output.dense.bias (skipped)
encoder_embeddings.trans_parameters.encoder.layer.9.output.LayerNorm.weight (skipped)
encoder_embeddings.trans_parameters.encoder.layer.9.output.LayerNorm.bias (skipped)
encoder_embeddings.trans_parameters.encoder.layer.10.attention.self.query.weight (skipped)
encoder_embeddings.trans_parameters.encoder.layer.10.attention.self.query.bias (skipped)
encoder_embeddings.trans_parameters.encoder.layer.10.attention.self.key.weight (skipped)
encoder_embeddings.trans_parameters.encoder.layer.10.attention.self.key.bias (skipped)
encoder_embeddings.trans_parameters.encoder.layer.10.attention.self.value.weight (skipped)
encoder_embeddings.trans_parameters.encoder.layer.10.attention.self.value.bias (skipped)
encoder_embeddings.trans_parameters.encoder.layer.10.attention.output.dense.weight (skipped)
encoder_embeddings.trans_parameters.encoder.layer.10.attention.output.dense.bias (skipped)
encoder_embeddings.trans_parameters.encoder.layer.10.attention.output.LayerNorm.weight (skipped)
encoder_embeddings.trans_parameters.encoder.layer.10.attention.output.LayerNorm.bias (skipped)
encoder_embeddings.trans_parameters.encoder.layer.10.intermediate.dense.weight (skipped)
encoder_embeddings.trans_parameters.encoder.layer.10.intermediate.dense.bias (skipped)
encoder_embeddings.trans_parameters.encoder.layer.10.output.dense.weight (skipped)
encoder_embeddings.trans_parameters.encoder.layer.10.output.dense.bias (skipped)
encoder_embeddings.trans_parameters.encoder.layer.10.output.LayerNorm.weight (skipped)
encoder_embeddings.trans_parameters.encoder.layer.10.output.LayerNorm.bias (skipped)
encoder_embeddings.trans_parameters.encoder.layer.11.attention.self.query.weight (skipped)
encoder_embeddings.trans_parameters.encoder.layer.11.attention.self.query.bias (skipped)
encoder_embeddings.trans_parameters.encoder.layer.11.attention.self.key.weight (skipped)
encoder_embeddings.trans_parameters.encoder.layer.11.attention.self.key.bias (skipped)
encoder_embeddings.trans_parameters.encoder.layer.11.attention.self.value.weight (skipped)
encoder_embeddings.trans_parameters.encoder.layer.11.attention.self.value.bias (skipped)
encoder_embeddings.trans_parameters.encoder.layer.11.attention.output.dense.weight (skipped)
encoder_embeddings.trans_parameters.encoder.layer.11.attention.output.dense.bias (skipped)
encoder_embeddings.trans_parameters.encoder.layer.11.attention.output.LayerNorm.weight (skipped)
encoder_embeddings.trans_parameters.encoder.layer.11.attention.output.LayerNorm.bias (skipped)
encoder_embeddings.trans_parameters.encoder.layer.11.intermediate.dense.weight (skipped)
encoder_embeddings.trans_parameters.encoder.layer.11.intermediate.dense.bias (skipped)
encoder_embeddings.trans_parameters.encoder.layer.11.output.dense.weight (skipped)
encoder_embeddings.trans_parameters.encoder.layer.11.output.dense.bias (skipped)
encoder_embeddings.trans_parameters.encoder.layer.11.output.LayerNorm.weight (skipped)
encoder_embeddings.trans_parameters.encoder.layer.11.output.LayerNorm.bias (skipped)
encoder_embeddings.trans_parameters.encoder.layer.12.attention.self.query.weight (skipped)
encoder_embeddings.trans_parameters.encoder.layer.12.attention.self.query.bias (skipped)
encoder_embeddings.trans_parameters.encoder.layer.12.attention.self.key.weight (skipped)
encoder_embeddings.trans_parameters.encoder.layer.12.attention.self.key.bias (skipped)
encoder_embeddings.trans_parameters.encoder.layer.12.attention.self.value.weight (skipped)
encoder_embeddings.trans_parameters.encoder.layer.12.attention.self.value.bias (skipped)
encoder_embeddings.trans_parameters.encoder.layer.12.attention.output.dense.weight (skipped)
encoder_embeddings.trans_parameters.encoder.layer.12.attention.output.dense.bias (skipped)
encoder_embeddings.trans_parameters.encoder.layer.12.attention.output.LayerNorm.weight (skipped)
encoder_embeddings.trans_parameters.encoder.layer.12.attention.output.LayerNorm.bias (skipped)
encoder_embeddings.trans_parameters.encoder.layer.12.intermediate.dense.weight (skipped)
encoder_embeddings.trans_parameters.encoder.layer.12.intermediate.dense.bias (skipped)
encoder_embeddings.trans_parameters.encoder.layer.12.output.dense.weight (skipped)
encoder_embeddings.trans_parameters.encoder.layer.12.output.dense.bias (skipped)
encoder_embeddings.trans_parameters.encoder.layer.12.output.LayerNorm.weight (skipped)
encoder_embeddings.trans_parameters.encoder.layer.12.output.LayerNorm.bias (skipped)
encoder_embeddings.trans_parameters.encoder.layer.13.attention.self.query.weight (skipped)
encoder_embeddings.trans_parameters.encoder.layer.13.attention.self.query.bias (skipped)
encoder_embeddings.trans_parameters.encoder.layer.13.attention.self.key.weight (skipped)
encoder_embeddings.trans_parameters.encoder.layer.13.attention.self.key.bias (skipped)
encoder_embeddings.trans_parameters.encoder.layer.13.attention.self.value.weight (skipped)
encoder_embeddings.trans_parameters.encoder.layer.13.attention.self.value.bias (skipped)
encoder_embeddings.trans_parameters.encoder.layer.13.attention.output.dense.weight (skipped)
encoder_embeddings.trans_parameters.encoder.layer.13.attention.output.dense.bias (skipped)
encoder_embeddings.trans_parameters.encoder.layer.13.attention.output.LayerNorm.weight (skipped)
encoder_embeddings.trans_parameters.encoder.layer.13.attention.output.LayerNorm.bias (skipped)
encoder_embeddings.trans_parameters.encoder.layer.13.intermediate.dense.weight (skipped)
encoder_embeddings.trans_parameters.encoder.layer.13.intermediate.dense.bias (skipped)
encoder_embeddings.trans_parameters.encoder.layer.13.output.dense.weight (skipped)
encoder_embeddings.trans_parameters.encoder.layer.13.output.dense.bias (skipped)
encoder_embeddings.trans_parameters.encoder.layer.13.output.LayerNorm.weight (skipped)
encoder_embeddings.trans_parameters.encoder.layer.13.output.LayerNorm.bias (skipped)
encoder_embeddings.trans_parameters.encoder.layer.14.attention.self.query.weight (skipped)
encoder_embeddings.trans_parameters.encoder.layer.14.attention.self.query.bias (skipped)
encoder_embeddings.trans_parameters.encoder.layer.14.attention.self.key.weight (skipped)
encoder_embeddings.trans_parameters.encoder.layer.14.attention.self.key.bias (skipped)
encoder_embeddings.trans_parameters.encoder.layer.14.attention.self.value.weight (skipped)
encoder_embeddings.trans_parameters.encoder.layer.14.attention.self.value.bias (skipped)
encoder_embeddings.trans_parameters.encoder.layer.14.attention.output.dense.weight (skipped)
encoder_embeddings.trans_parameters.encoder.layer.14.attention.output.dense.bias (skipped)
encoder_embeddings.trans_parameters.encoder.layer.14.attention.output.LayerNorm.weight (skipped)
encoder_embeddings.trans_parameters.encoder.layer.14.attention.output.LayerNorm.bias (skipped)
encoder_embeddings.trans_parameters.encoder.layer.14.intermediate.dense.weight (skipped)
encoder_embeddings.trans_parameters.encoder.layer.14.intermediate.dense.bias (skipped)
encoder_embeddings.trans_parameters.encoder.layer.14.output.dense.weight (skipped)
encoder_embeddings.trans_parameters.encoder.layer.14.output.dense.bias (skipped)
encoder_embeddings.trans_parameters.encoder.layer.14.output.LayerNorm.weight (skipped)
encoder_embeddings.trans_parameters.encoder.layer.14.output.LayerNorm.bias (skipped)
encoder_embeddings.trans_parameters.encoder.layer.15.attention.self.query.weight (skipped)
encoder_embeddings.trans_parameters.encoder.layer.15.attention.self.query.bias (skipped)
encoder_embeddings.trans_parameters.encoder.layer.15.attention.self.key.weight (skipped)
encoder_embeddings.trans_parameters.encoder.layer.15.attention.self.key.bias (skipped)
encoder_embeddings.trans_parameters.encoder.layer.15.attention.self.value.weight (skipped)
encoder_embeddings.trans_parameters.encoder.layer.15.attention.self.value.bias (skipped)
encoder_embeddings.trans_parameters.encoder.layer.15.attention.output.dense.weight (skipped)
encoder_embeddings.trans_parameters.encoder.layer.15.attention.output.dense.bias (skipped)
encoder_embeddings.trans_parameters.encoder.layer.15.attention.output.LayerNorm.weight (skipped)
encoder_embeddings.trans_parameters.encoder.layer.15.attention.output.LayerNorm.bias (skipped)
encoder_embeddings.trans_parameters.encoder.layer.15.intermediate.dense.weight (skipped)
encoder_embeddings.trans_parameters.encoder.layer.15.intermediate.dense.bias (skipped)
encoder_embeddings.trans_parameters.encoder.layer.15.output.dense.weight (skipped)
encoder_embeddings.trans_parameters.encoder.layer.15.output.dense.bias (skipped)
encoder_embeddings.trans_parameters.encoder.layer.15.output.LayerNorm.weight (skipped)
encoder_embeddings.trans_parameters.encoder.layer.15.output.LayerNorm.bias (skipped)
encoder_embeddings.trans_parameters.encoder.layer.16.attention.self.query.weight (skipped)
encoder_embeddings.trans_parameters.encoder.layer.16.attention.self.query.bias (skipped)
encoder_embeddings.trans_parameters.encoder.layer.16.attention.self.key.weight (skipped)
encoder_embeddings.trans_parameters.encoder.layer.16.attention.self.key.bias (skipped)
encoder_embeddings.trans_parameters.encoder.layer.16.attention.self.value.weight (skipped)
encoder_embeddings.trans_parameters.encoder.layer.16.attention.self.value.bias (skipped)
encoder_embeddings.trans_parameters.encoder.layer.16.attention.output.dense.weight (skipped)
encoder_embeddings.trans_parameters.encoder.layer.16.attention.output.dense.bias (skipped)
encoder_embeddings.trans_parameters.encoder.layer.16.attention.output.LayerNorm.weight (skipped)
encoder_embeddings.trans_parameters.encoder.layer.16.attention.output.LayerNorm.bias (skipped)
encoder_embeddings.trans_parameters.encoder.layer.16.intermediate.dense.weight (skipped)
encoder_embeddings.trans_parameters.encoder.layer.16.intermediate.dense.bias (skipped)
encoder_embeddings.trans_parameters.encoder.layer.16.output.dense.weight (skipped)
encoder_embeddings.trans_parameters.encoder.layer.16.output.dense.bias (skipped)
encoder_embeddings.trans_parameters.encoder.layer.16.output.LayerNorm.weight (skipped)
encoder_embeddings.trans_parameters.encoder.layer.16.output.LayerNorm.bias (skipped)
encoder_embeddings.trans_parameters.encoder.layer.17.attention.self.query.weight (skipped)
encoder_embeddings.trans_parameters.encoder.layer.17.attention.self.query.bias (skipped)
encoder_embeddings.trans_parameters.encoder.layer.17.attention.self.key.weight (skipped)
encoder_embeddings.trans_parameters.encoder.layer.17.attention.self.key.bias (skipped)
encoder_embeddings.trans_parameters.encoder.layer.17.attention.self.value.weight (skipped)
encoder_embeddings.trans_parameters.encoder.layer.17.attention.self.value.bias (skipped)
encoder_embeddings.trans_parameters.encoder.layer.17.attention.output.dense.weight (skipped)
encoder_embeddings.trans_parameters.encoder.layer.17.attention.output.dense.bias (skipped)
encoder_embeddings.trans_parameters.encoder.layer.17.attention.output.LayerNorm.weight (skipped)
encoder_embeddings.trans_parameters.encoder.layer.17.attention.output.LayerNorm.bias (skipped)
encoder_embeddings.trans_parameters.encoder.layer.17.intermediate.dense.weight (skipped)
encoder_embeddings.trans_parameters.encoder.layer.17.intermediate.dense.bias (skipped)
encoder_embeddings.trans_parameters.encoder.layer.17.output.dense.weight (skipped)
encoder_embeddings.trans_parameters.encoder.layer.17.output.dense.bias (skipped)
encoder_embeddings.trans_parameters.encoder.layer.17.output.LayerNorm.weight (skipped)
encoder_embeddings.trans_parameters.encoder.layer.17.output.LayerNorm.bias (skipped)
encoder_embeddings.trans_parameters.encoder.layer.18.attention.self.query.weight (skipped)
encoder_embeddings.trans_parameters.encoder.layer.18.attention.self.query.bias (skipped)
encoder_embeddings.trans_parameters.encoder.layer.18.attention.self.key.weight (skipped)
encoder_embeddings.trans_parameters.encoder.layer.18.attention.self.key.bias (skipped)
encoder_embeddings.trans_parameters.encoder.layer.18.attention.self.value.weight (skipped)
encoder_embeddings.trans_parameters.encoder.layer.18.attention.self.value.bias (skipped)
encoder_embeddings.trans_parameters.encoder.layer.18.attention.output.dense.weight (skipped)
encoder_embeddings.trans_parameters.encoder.layer.18.attention.output.dense.bias (skipped)
encoder_embeddings.trans_parameters.encoder.layer.18.attention.output.LayerNorm.weight (skipped)
encoder_embeddings.trans_parameters.encoder.layer.18.attention.output.LayerNorm.bias (skipped)
encoder_embeddings.trans_parameters.encoder.layer.18.intermediate.dense.weight (skipped)
encoder_embeddings.trans_parameters.encoder.layer.18.intermediate.dense.bias (skipped)
encoder_embeddings.trans_parameters.encoder.layer.18.output.dense.weight (skipped)
encoder_embeddings.trans_parameters.encoder.layer.18.output.dense.bias (skipped)
encoder_embeddings.trans_parameters.encoder.layer.18.output.LayerNorm.weight (skipped)
encoder_embeddings.trans_parameters.encoder.layer.18.output.LayerNorm.bias (skipped)
encoder_embeddings.trans_parameters.encoder.layer.19.attention.self.query.weight (skipped)
encoder_embeddings.trans_parameters.encoder.layer.19.attention.self.query.bias (skipped)
encoder_embeddings.trans_parameters.encoder.layer.19.attention.self.key.weight (skipped)
encoder_embeddings.trans_parameters.encoder.layer.19.attention.self.key.bias (skipped)
encoder_embeddings.trans_parameters.encoder.layer.19.attention.self.value.weight (skipped)
encoder_embeddings.trans_parameters.encoder.layer.19.attention.self.value.bias (skipped)
encoder_embeddings.trans_parameters.encoder.layer.19.attention.output.dense.weight (skipped)
encoder_embeddings.trans_parameters.encoder.layer.19.attention.output.dense.bias (skipped)
encoder_embeddings.trans_parameters.encoder.layer.19.attention.output.LayerNorm.weight (skipped)
encoder_embeddings.trans_parameters.encoder.layer.19.attention.output.LayerNorm.bias (skipped)
encoder_embeddings.trans_parameters.encoder.layer.19.intermediate.dense.weight (skipped)
encoder_embeddings.trans_parameters.encoder.layer.19.intermediate.dense.bias (skipped)
encoder_embeddings.trans_parameters.encoder.layer.19.output.dense.weight (skipped)
encoder_embeddings.trans_parameters.encoder.layer.19.output.dense.bias (skipped)
encoder_embeddings.trans_parameters.encoder.layer.19.output.LayerNorm.weight (skipped)
encoder_embeddings.trans_parameters.encoder.layer.19.output.LayerNorm.bias (skipped)
encoder_embeddings.trans_parameters.encoder.layer.20.attention.self.query.weight (skipped)
encoder_embeddings.trans_parameters.encoder.layer.20.attention.self.query.bias (skipped)
encoder_embeddings.trans_parameters.encoder.layer.20.attention.self.key.weight (skipped)
encoder_embeddings.trans_parameters.encoder.layer.20.attention.self.key.bias (skipped)
encoder_embeddings.trans_parameters.encoder.layer.20.attention.self.value.weight (skipped)
encoder_embeddings.trans_parameters.encoder.layer.20.attention.self.value.bias (skipped)
encoder_embeddings.trans_parameters.encoder.layer.20.attention.output.dense.weight (skipped)
encoder_embeddings.trans_parameters.encoder.layer.20.attention.output.dense.bias (skipped)
encoder_embeddings.trans_parameters.encoder.layer.20.attention.output.LayerNorm.weight (skipped)
encoder_embeddings.trans_parameters.encoder.layer.20.attention.output.LayerNorm.bias (skipped)
encoder_embeddings.trans_parameters.encoder.layer.20.intermediate.dense.weight (skipped)
encoder_embeddings.trans_parameters.encoder.layer.20.intermediate.dense.bias (skipped)
encoder_embeddings.trans_parameters.encoder.layer.20.output.dense.weight (skipped)
encoder_embeddings.trans_parameters.encoder.layer.20.output.dense.bias (skipped)
encoder_embeddings.trans_parameters.encoder.layer.20.output.LayerNorm.weight (skipped)
encoder_embeddings.trans_parameters.encoder.layer.20.output.LayerNorm.bias (skipped)
encoder_embeddings.trans_parameters.encoder.layer.21.attention.self.query.weight (skipped)
encoder_embeddings.trans_parameters.encoder.layer.21.attention.self.query.bias (skipped)
encoder_embeddings.trans_parameters.encoder.layer.21.attention.self.key.weight (skipped)
encoder_embeddings.trans_parameters.encoder.layer.21.attention.self.key.bias (skipped)
encoder_embeddings.trans_parameters.encoder.layer.21.attention.self.value.weight (skipped)
encoder_embeddings.trans_parameters.encoder.layer.21.attention.self.value.bias (skipped)
encoder_embeddings.trans_parameters.encoder.layer.21.attention.output.dense.weight (skipped)
encoder_embeddings.trans_parameters.encoder.layer.21.attention.output.dense.bias (skipped)
encoder_embeddings.trans_parameters.encoder.layer.21.attention.output.LayerNorm.weight (skipped)
encoder_embeddings.trans_parameters.encoder.layer.21.attention.output.LayerNorm.bias (skipped)
encoder_embeddings.trans_parameters.encoder.layer.21.intermediate.dense.weight (skipped)
encoder_embeddings.trans_parameters.encoder.layer.21.intermediate.dense.bias (skipped)
encoder_embeddings.trans_parameters.encoder.layer.21.output.dense.weight (skipped)
encoder_embeddings.trans_parameters.encoder.layer.21.output.dense.bias (skipped)
encoder_embeddings.trans_parameters.encoder.layer.21.output.LayerNorm.weight (skipped)
encoder_embeddings.trans_parameters.encoder.layer.21.output.LayerNorm.bias (skipped)
encoder_embeddings.trans_parameters.encoder.layer.22.attention.self.query.weight (skipped)
encoder_embeddings.trans_parameters.encoder.layer.22.attention.self.query.bias (skipped)
encoder_embeddings.trans_parameters.encoder.layer.22.attention.self.key.weight (skipped)
encoder_embeddings.trans_parameters.encoder.layer.22.attention.self.key.bias (skipped)
encoder_embeddings.trans_parameters.encoder.layer.22.attention.self.value.weight (skipped)
encoder_embeddings.trans_parameters.encoder.layer.22.attention.self.value.bias (skipped)
encoder_embeddings.trans_parameters.encoder.layer.22.attention.output.dense.weight (skipped)
encoder_embeddings.trans_parameters.encoder.layer.22.attention.output.dense.bias (skipped)
encoder_embeddings.trans_parameters.encoder.layer.22.attention.output.LayerNorm.weight (skipped)
encoder_embeddings.trans_parameters.encoder.layer.22.attention.output.LayerNorm.bias (skipped)
encoder_embeddings.trans_parameters.encoder.layer.22.intermediate.dense.weight (skipped)
encoder_embeddings.trans_parameters.encoder.layer.22.intermediate.dense.bias (skipped)
encoder_embeddings.trans_parameters.encoder.layer.22.output.dense.weight (skipped)
encoder_embeddings.trans_parameters.encoder.layer.22.output.dense.bias (skipped)
encoder_embeddings.trans_parameters.encoder.layer.22.output.LayerNorm.weight (skipped)
encoder_embeddings.trans_parameters.encoder.layer.22.output.LayerNorm.bias (skipped)
encoder_embeddings.trans_parameters.encoder.layer.23.attention.self.query.weight (skipped)
encoder_embeddings.trans_parameters.encoder.layer.23.attention.self.query.bias (skipped)
encoder_embeddings.trans_parameters.encoder.layer.23.attention.self.key.weight (skipped)
encoder_embeddings.trans_parameters.encoder.layer.23.attention.self.key.bias (skipped)
encoder_embeddings.trans_parameters.encoder.layer.23.attention.self.value.weight (skipped)
encoder_embeddings.trans_parameters.encoder.layer.23.attention.self.value.bias (skipped)
encoder_embeddings.trans_parameters.encoder.layer.23.attention.output.dense.weight (skipped)
encoder_embeddings.trans_parameters.encoder.layer.23.attention.output.dense.bias (skipped)
encoder_embeddings.trans_parameters.encoder.layer.23.attention.output.LayerNorm.weight (skipped)
encoder_embeddings.trans_parameters.encoder.layer.23.attention.output.LayerNorm.bias (skipped)
encoder_embeddings.trans_parameters.encoder.layer.23.intermediate.dense.weight (skipped)
encoder_embeddings.trans_parameters.encoder.layer.23.intermediate.dense.bias (skipped)
encoder_embeddings.trans_parameters.encoder.layer.23.output.dense.weight (skipped)
encoder_embeddings.trans_parameters.encoder.layer.23.output.dense.bias (skipped)
encoder_embeddings.trans_parameters.encoder.layer.23.output.LayerNorm.weight (skipped)
encoder_embeddings.trans_parameters.encoder.layer.23.output.LayerNorm.bias (skipped)
encoder_embeddings.trans_parameters.pooler.dense.weight (skipped)
encoder_embeddings.trans_parameters.pooler.dense.bias (skipped)
decoder_embeddings.embeddings.weight done
encoder.bilstm_encoder.rnn.rnn.rnn.weight_ih_l0 done
encoder.bilstm_encoder.rnn.rnn.rnn.weight_hh_l0 done
encoder.bilstm_encoder.rnn.rnn.rnn.bias_ih_l0
encoder.bilstm_encoder.rnn.rnn.rnn.bias_hh_l0
encoder.bilstm_encoder.rnn.rnn.rnn.weight_ih_l0_reverse done
encoder.bilstm_encoder.rnn.rnn.rnn.weight_hh_l0_reverse done
encoder.bilstm_encoder.rnn.rnn.rnn.bias_ih_l0_reverse
encoder.bilstm_encoder.rnn.rnn.rnn.bias_hh_l0_reverse
encoder.text_encoder.rnn.rnn.rnn.weight_ih_l0 done
encoder.text_encoder.rnn.rnn.rnn.weight_hh_l0 done
encoder.text_encoder.rnn.rnn.rnn.bias_ih_l0
encoder.text_encoder.rnn.rnn.rnn.bias_hh_l0
encoder.text_encoder.rnn.rnn.rnn.weight_ih_l0_reverse done
encoder.text_encoder.rnn.rnn.rnn.weight_hh_l0_reverse done
encoder.text_encoder.rnn.rnn.rnn.bias_ih_l0_reverse
encoder.text_encoder.rnn.rnn.rnn.bias_hh_l0_reverse
encoder.schema_encoder.primary_key_embeddings.embeddings.weight done
encoder.schema_encoder.foreign_key_embeddings.embeddings.weight done
encoder.schema_encoder.field_type_embeddings.embeddings.weight done
encoder.schema_encoder.feature_fusion_layer.linear1.weight done
encoder.schema_encoder.feature_fusion_layer.linear1.bias
encoder.schema_encoder.feature_fusion_layer.linear2.weight done
encoder.schema_encoder.feature_fusion_layer.linear2.bias
encoder.schema_encoder.field_table_fusion_layer.res_feed_forward.mdl_with_residual_connection.mdl.linear1.weight done
encoder.schema_encoder.field_table_fusion_layer.res_feed_forward.mdl_with_residual_connection.mdl.linear1.bias
encoder.schema_encoder.field_table_fusion_layer.res_feed_forward.mdl_with_residual_connection.mdl.linear2.weight done
encoder.schema_encoder.field_table_fusion_layer.res_feed_forward.mdl_with_residual_connection.mdl.linear2.bias
decoder.out.linear.weight done
decoder.out.linear.bias
decoder.rnn.rnn.weight_ih_l0 done
decoder.rnn.rnn.weight_hh_l0 done
decoder.rnn.rnn.bias_ih_l0
decoder.rnn.rnn.bias_hh_l0
decoder.attn.wq.weight done
decoder.attn.wk.weight done
decoder.attn.wv.weight done
decoder.attn.wo.weight done
decoder.attn_combine.linear1.weight done
decoder.attn_combine.linear1.bias
decoder.pointer_switch.project.linear1.weight done
decoder.pointer_switch.project.linear1.bias
--------------------------

Model Parameters
--------------------------
mdl.decoder_embeddings.embeddings.weight 39600 requires_grad=True
mdl.encoder.bilstm_encoder.rnn.rnn.rnn.weight_ih_l0 819200 requires_grad=True
mdl.encoder.bilstm_encoder.rnn.rnn.rnn.weight_hh_l0 160000 requires_grad=True
mdl.encoder.bilstm_encoder.rnn.rnn.rnn.bias_ih_l0 800 requires_grad=True
mdl.encoder.bilstm_encoder.rnn.rnn.rnn.bias_hh_l0 800 requires_grad=True
mdl.encoder.bilstm_encoder.rnn.rnn.rnn.weight_ih_l0_reverse 819200 requires_grad=True
mdl.encoder.bilstm_encoder.rnn.rnn.rnn.weight_hh_l0_reverse 160000 requires_grad=True
mdl.encoder.bilstm_encoder.rnn.rnn.rnn.bias_ih_l0_reverse 800 requires_grad=True
mdl.encoder.bilstm_encoder.rnn.rnn.rnn.bias_hh_l0_reverse 800 requires_grad=True
mdl.encoder.text_encoder.rnn.rnn.rnn.weight_ih_l0 320000 requires_grad=True
mdl.encoder.text_encoder.rnn.rnn.rnn.weight_hh_l0 160000 requires_grad=True
mdl.encoder.text_encoder.rnn.rnn.rnn.bias_ih_l0 800 requires_grad=True
mdl.encoder.text_encoder.rnn.rnn.rnn.bias_hh_l0 800 requires_grad=True
mdl.encoder.text_encoder.rnn.rnn.rnn.weight_ih_l0_reverse 320000 requires_grad=True
mdl.encoder.text_encoder.rnn.rnn.rnn.weight_hh_l0_reverse 160000 requires_grad=True
mdl.encoder.text_encoder.rnn.rnn.rnn.bias_ih_l0_reverse 800 requires_grad=True
mdl.encoder.text_encoder.rnn.rnn.rnn.bias_hh_l0_reverse 800 requires_grad=True
mdl.encoder.schema_encoder.primary_key_embeddings.embeddings.weight 800 requires_grad=True
mdl.encoder.schema_encoder.foreign_key_embeddings.embeddings.weight 800 requires_grad=True
mdl.encoder.schema_encoder.field_type_embeddings.embeddings.weight 2400 requires_grad=True
mdl.encoder.schema_encoder.feature_fusion_layer.linear1.weight 640000 requires_grad=True
mdl.encoder.schema_encoder.feature_fusion_layer.linear1.bias 400 requires_grad=True
mdl.encoder.schema_encoder.feature_fusion_layer.linear2.weight 160000 requires_grad=True
mdl.encoder.schema_encoder.feature_fusion_layer.linear2.bias 400 requires_grad=True
mdl.encoder.schema_encoder.field_table_fusion_layer.layer_norm.gamma 400 requires_grad=True
mdl.encoder.schema_encoder.field_table_fusion_layer.layer_norm.beta 400 requires_grad=True
mdl.encoder.schema_encoder.field_table_fusion_layer.res_feed_forward.mdl_with_residual_connection.mdl.linear1.weight 160000 requires_grad=True
mdl.encoder.schema_encoder.field_table_fusion_layer.res_feed_forward.mdl_with_residual_connection.mdl.linear1.bias 400 requires_grad=True
mdl.encoder.schema_encoder.field_table_fusion_layer.res_feed_forward.mdl_with_residual_connection.mdl.linear2.weight 160000 requires_grad=True
mdl.encoder.schema_encoder.field_table_fusion_layer.res_feed_forward.mdl_with_residual_connection.mdl.linear2.bias 400 requires_grad=True
mdl.encoder.schema_encoder.field_table_fusion_layer.res_feed_forward.layer_norm.gamma 400 requires_grad=True
mdl.encoder.schema_encoder.field_table_fusion_layer.res_feed_forward.layer_norm.beta 400 requires_grad=True
mdl.decoder.out.linear.weight 39600 requires_grad=True
mdl.decoder.out.linear.bias 99 requires_grad=True
mdl.decoder.rnn.rnn.weight_ih_l0 1280000 requires_grad=True
mdl.decoder.rnn.rnn.weight_hh_l0 640000 requires_grad=True
mdl.decoder.rnn.rnn.bias_ih_l0 1600 requires_grad=True
mdl.decoder.rnn.rnn.bias_hh_l0 1600 requires_grad=True
mdl.decoder.attn.wq.weight 160000 requires_grad=True
mdl.decoder.attn.wk.weight 160000 requires_grad=True
mdl.decoder.attn.wv.weight 160000 requires_grad=True
mdl.decoder.attn.wo.weight 160000 requires_grad=True
mdl.decoder.attn_combine.linear1.weight 320000 requires_grad=True
mdl.decoder.attn_combine.linear1.bias 400 requires_grad=True
mdl.decoder.pointer_switch.project.linear1.weight 800 requires_grad=True
mdl.decoder.pointer_switch.project.linear1.bias 1 requires_grad=True
Total # parameters = 342157588
--------------------------

wandb: Tracking run with wandb version 0.8.30
wandb: Wandb version 0.10.13 is available!  To upgrade, please run:
wandb:  $ pip install wandb --upgrade
wandb: Run data is saved locally in wandb/run-20210111_233422-17hmk4jb
wandb: Syncing run spider.bridge.lstm.meta.ts.ppl-0.85.2.dn.eo.feat.bert-large-uncased.xavier-1024-400-400-16-2-0.0005-inv-sqr-0.0005-4000-6e-05-inv-sqr-3e-05-4000-0.3-0.3-0.0-0.0-1-8-0.0-0.0-res-0.2-0.0-ff-0.4-0.0.210111-233422.pomb
wandb: ⭐️ View project at https://app.wandb.ai/duyvuleo/smore-spider-group--final
wandb: 🚀 View run at https://app.wandb.ai/duyvuleo/smore-spider-group--final/runs/17hmk4jb
wandb: Run `wandb off` to turn off syncing.

  0%|                                                                                                                 | 1/2000 [00:01<45:32,  1.37s/it]
Traceback (most recent call last):
  File "/usr/local/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/local/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "/mnt/shared_ad3_mt1/vchoang/works/projects/oda/text2sql/code/TabularSemanticParsing/src/experiments.py", line 407, in <module>
    run_experiment(args)
  File "/mnt/shared_ad3_mt1/vchoang/works/projects/oda/text2sql/code/TabularSemanticParsing/src/experiments.py", line 392, in run_experiment
    train(sp)
  File "/mnt/shared_ad3_mt1/vchoang/works/projects/oda/text2sql/code/TabularSemanticParsing/src/experiments.py", line 63, in train
    sp.run_train(train_data, dev_data)
  File "/mnt/shared_ad3_mt1/vchoang/works/projects/oda/text2sql/code/TabularSemanticParsing/src/common/learn_framework.py", line 207, in run_train
    formatted_batch = self.format_batch(mini_batch)
  File "/mnt/shared_ad3_mt1/vchoang/works/projects/oda/text2sql/code/TabularSemanticParsing/src/semantic_parser/learn_framework.py", line 430, in format_batch
    num_included_nodes=num_included_nodes)
  File "/mnt/shared_ad3_mt1/vchoang/works/projects/oda/text2sql/code/TabularSemanticParsing/src/data_processor/vectorizers.py", line 102, in vectorize_field_ptr_out
    if schema_pos < num_included_nodes:
TypeError: '<' not supported between instances of 'NoneType' and 'int'

wandb: Waiting for W&B process to finish, PID 21092
wandb: Program failed with code 1. Press ctrl-c to abort syncing.
wandb: Run summary:
wandb:      learning_rate/spider 0.0005
wandb:                  _runtime 44.21112084388733
wandb:                _timestamp 1610408083.642099
wandb:                     _step 1
wandb:   fine_tuning_rate/spider 3.00075e-05
wandb: Syncing files in wandb/run-20210111_233422-17hmk4jb:
wandb:   code/src/experiments.py
wandb: plus 7 W&B file(s) and 1 media file(s)
wandb:
wandb: Synced spider.bridge.lstm.meta.ts.ppl-0.85.2.dn.eo.feat.bert-large-uncased.xavier-1024-400-400-16-2-0.0005-inv-sqr-0.0005-4000-6e-05-inv-sqr-3e-05-4000-0.3-0.3-0.0-0.0-1-8-0.0-0.0-res-0.2-0.0-ff-0.4-0.0.210111-233422.pomb: https://app.wandb.ai/duyvuleo/smore-spider-group--final/runs/17hmk4jb
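
For what it's worth, the immediate cause of the crash is that vectorize_field_ptr_out receives schema_pos = None for at least one field, and Python 3 cannot order None against an int. Below is a standalone sketch (not the project's code; the names schema_pos and num_included_nodes are taken from the traceback) that reproduces the error type and shows the kind of guard that would point at the offending example instead of crashing; why schema_pos ends up as None here is the part I do not understand.

# Standalone sketch, not the project's code: reproduce the error type from the
# traceback and guard against it so the bad example is surfaced explicitly.
def check_schema_pos(schema_pos, num_included_nodes):
    if schema_pos is None:
        # Hypothetical guard: in the real vectorizer this would identify which
        # field/example never had its schema position resolved.
        raise ValueError('field has no schema position (got None)')
    return schema_pos < num_included_nodes

print(check_schema_pos(3, 50))   # True: the normal case
try:
    print(None < 50)             # the comparison the traceback hits
except TypeError as err:
    print(err)                   # '<' not supported between instances of 'NoneType' and 'int'
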

Do you know the reason for this error? Thanks!

TheurgicDuke771 commented 3 years ago

Maybe the same as #1, please check if that helps.

todpole3 commented 3 years ago

@ #1