daiquocnguyen / CapsE

A Capsule Network-based Embedding Model for Knowledge Graph Completion and Search Personalization (NAACL 2019)
Apache License 2.0

some question #5

Closed 1780041410 closed 5 years ago

1780041410 commented 5 years ago

Hello, when I run the following commands, an error occurs (see attached screenshot).

1780041410 commented 5 years ago

Could you help me? Thanks.

1780041410 commented 5 years ago

The first command is: `python CapsE.py --embedding_dim 100 --num_epochs 31 --num_filters 50 --learning_rate 0.0001 --name FB15k-237 --useConstantInit --savedEpochs 30 --model_name fb15k237`

The second (the one in the screenshot above) is: `python evalCapsE.py --embedding_dim 100 --num_filters 50 --name FB15k-237 --useConstantInit --model_index 30 --model_name fb15k237 --num_splits 8 --decode`

datquocnguyen commented 5 years ago

See: https://github.com/daiquocnguyen/CapsE/blob/master/command.sh. You have to run the commands in command.sh before running `python evalCapsE.py --embedding_dim 100 --num_filters 50 --name FB15k-237 --useConstantInit --model_index 30 --model_name fb15k237 --num_splits 8 --decode`.

zysforever commented 4 years ago

> See: https://github.com/daiquocnguyen/CapsE/blob/master/command.sh. You have to run the commands in command.sh before running `python evalCapsE.py --embedding_dim 100 --num_filters 50 --name FB15k-237 --useConstantInit --model_index 30 --model_name fb15k237 --num_splits 8 --decode`.

After I run `python CapsE.py --embedding_dim 100 --num_epochs 31 --num_filters 50 --learning_rate 0.0001 --name FB15k-237 --useConstantInit --savedEpochs 30 --model_name fb15k237`, should I run `python evalCapsE.py --embedding_dim 100 --num_filters 50 --name FB15k-237 --useConstantInit --model_index 30 --model_name fb15k237 --num_splits 8 --testIdx 0` for each `testIdx` from 0 up to 7, and then run `python evalCapsE.py --embedding_dim 100 --num_filters 50 --name FB15k-237 --useConstantInit --model_index 30 --model_name fb15k237 --num_splits 8 --decode` for the final results? Is that the right way to do it? Looking forward to your reply.
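For reference, the split-then-decode sequence asked about above can be sketched as a small shell loop (a dry-run sketch that only prints the commands; it assumes the fb15k237 model has already been trained with `CapsE.py` and follows the flag set used in this thread rather than the exact contents of command.sh — remove the `echo` to actually execute):

```shell
#!/bin/sh
# Base evaluation command shared by every split (flags taken from the thread).
BASE="python evalCapsE.py --embedding_dim 100 --num_filters 50 --name FB15k-237 --useConstantInit --model_index 30 --model_name fb15k237 --num_splits 8"

# Score each of the 8 test splits in turn (these could also run in parallel,
# one per GPU, which is why the test set is split in the first place).
for testIdx in 0 1 2 3 4 5 6 7; do
  echo "$BASE --testIdx $testIdx"
done

# Finally aggregate the per-split scores into the reported metrics.
echo "$BASE --decode"
```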