Closed · allanj closed this issue 2 years ago
I ran the following script to reproduce the SVAMP results with RoBERTa. I didn't change the code:
```
python3 -m src.main -mode train -gpu 0 -embedding roberta -emb_name roberta-base -hidden_size 384 -depth 2 -lr 0.0002 -emb_lr 8e-4 -batch_size 8 -epochs 50 -dataset mawps-asdiv-a_svamp -no-full_cv -run_name run_svamp
```
but got the following error:
```
2021-11-10 01:41:07,779 | INFO | main.py: 420 : main() :: Experiment Name: run_svamp
2021-11-10 01:41:07,780 | DEBUG | main.py: 421 : main() :: Created Relevant Directories
2021-11-10 01:41:07,780 | INFO | main.py: 423 : main() :: Loading Data...
Transfer numbers...
2021-11-10 01:41:09,253 | DEBUG | main.py: 429 : main() :: Data Loaded...
2021-11-10 01:41:09,254 | DEBUG | main.py: 431 : main() :: Number of Training Examples: 3138
2021-11-10 01:41:09,254 | DEBUG | main.py: 432 : main() :: Number of Testing Examples: 1000
2021-11-10 01:41:09,254 | DEBUG | main.py: 433 : main() :: Extra Numbers: ['0.01', '12.0', '1.0', '100.0', '0.1', '0.5', '3.0', '4.0', '7.0']
2021-11-10 01:41:09,254 | DEBUG | main.py: 434 : main() :: Maximum Number of Numbers: 7
2021-11-10 01:41:09,254 | INFO | main.py: 437 : main() :: Creating Vocab...
keep_words 4068 / 4068 = 1.0000
2021-11-10 01:41:10,316 | DEBUG | pre_data.py: 611 : prepare_data() :: Indexed 4071 words in input language, 21 words in output
2021-11-10 01:41:10,377 | WARNING | helper.py: 160 : get_latest_checkpoint() :: No Checkpoints Found
2021-11-10 01:41:10,378 | DEBUG | main.py: 460 : main() :: Vocab saved at models/run_svamp/vocab1.p
2021-11-10 01:41:10,378 | DEBUG | main.py: 472 : main() :: Config File Saved
2021-11-10 01:41:10,379 | INFO | main.py: 474 : main() :: Initializing Models...
Some weights of the model checkpoint at roberta-base were not used when initializing RobertaModel: ['lm_head.layer_norm.bias', 'lm_head.layer_norm.weight', 'lm_head.bias', 'lm_head.dense.weight', 'lm_head.dense.bias', 'lm_head.decoder.weight']
- This IS expected if you are initializing RobertaModel from the checkpoint of a model trained on another task or with another architecture (e.g. initializing a BertForSequenceClassification model from a BertForPreTraining model).
- This IS NOT expected if you are initializing RobertaModel from the checkpoint of a model that you expect to be exactly identical (initializing a BertForSequenceClassification model from a BertForSequenceClassification model).
2021-11-10 01:41:21,871 | DEBUG | main.py: 491 : main() :: Models Initialized
2021-11-10 01:41:21,871 | INFO | main.py: 492 : main() :: Initializing Optimizers...
2021-11-10 01:41:21,872 | DEBUG | main.py: 500 : main() :: Optimizers Initialized
2021-11-10 01:41:21,872 | INFO | main.py: 501 : main() :: Initializing Schedulers...
2021-11-10 01:41:21,873 | DEBUG | main.py: 509 : main() :: Schedulers Initialized
2021-11-10 01:41:21,873 | INFO | main.py: 511 : main() :: Loading Models on GPU 0...
2021-11-10 01:41:29,035 | DEBUG | main.py: 521 : main() :: Models loaded on GPU 0
2021-11-10 01:41:29,035 | INFO | main.py: 529 : main() :: Starting Training Procedure
2021-11-10 01:41:32,693 | INFO | logger.py: 33 : print_log() :: Epoch: 1
Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "/opt/tiger/intro/SVAMP/code/graph2tree/src/main.py", line 805, in <module>
    main()
  File "/opt/tiger/intro/SVAMP/code/graph2tree/src/main.py", line 545, in main
    num_pos_batches[idx], graph_batches[idx])
  File "/opt/tiger/intro/SVAMP/code/graph2tree/src/train_and_evaluate.py", line 704, in train_tree
    input_seq1 = input_seq1.transpose(0,1)
AttributeError: 'str' object has no attribute 'transpose'
```
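The final `AttributeError` just means a plain Python string reached `train_tree`, which expects a tensor with a `.transpose()` method. A minimal illustration of the same failure mode (this says nothing about *why* the string got there, only what the error means):

```python
# Calling .transpose(0, 1) on a str raises exactly the error in the log:
# strings have no transpose method, unlike torch tensors.
try:
    "some problem text".transpose(0, 1)
except AttributeError as e:
    print(e)  # 'str' object has no attribute 'transpose'
```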
Sorry, I think I was using the wrong transformers version.
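Since the root cause turned out to be a transformers version mismatch, a quick sanity check before launching a long training run can save time. The sketch below is an assumption, not part of the repo; the `REQUIRED` pin is a hypothetical placeholder, so take the real value from the repo's requirements.txt (this thread does not quote it):

```python
# Check the installed transformers version before training. A mismatched
# version can change what the tokenizer returns and trigger errors like
# the AttributeError above.
import importlib.metadata

REQUIRED = "2.11.0"  # hypothetical placeholder; use the repo's actual pin


def version_tuple(v):
    """'4.12.3.dev0' -> (4, 12, 3): keep leading numeric components only."""
    parts = []
    for p in v.split("."):
        if not p.isdigit():
            break
        parts.append(int(p))
    return tuple(parts)


def check_transformers(required=REQUIRED):
    try:
        installed = importlib.metadata.version("transformers")
    except importlib.metadata.PackageNotFoundError:
        return "transformers is not installed"
    if version_tuple(installed) != version_tuple(required):
        return f"transformers {installed} found, {required} expected"
    return "ok"
```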