auspicious3000 opened this issue 4 years ago
Why not install conda in your Docker image? curl -LO https://repo.anaconda.com/miniconda/Miniconda3-py37_4.8.3-Linux-x86_64.sh
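In case it helps, here is a minimal sketch of how Miniconda could be installed inside a Docker build using that installer; the /opt/conda prefix is my own choice, not something specified in this thread:

```bash
# Download the Miniconda installer referenced above and install it
# non-interactively; /opt/conda is an assumed prefix, any writable path works.
curl -LO https://repo.anaconda.com/miniconda/Miniconda3-py37_4.8.3-Linux-x86_64.sh
bash Miniconda3-py37_4.8.3-Linux-x86_64.sh -b -p /opt/conda
rm Miniconda3-py37_4.8.3-Linux-x86_64.sh

# Put conda's python/pip first on PATH for the rest of the build.
export PATH=/opt/conda/bin:$PATH
conda --version
```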
It's a long story. I have many other packages installed in the same image, and I have used it for development since the very beginning. If I switch to conda, I have to move all my dependencies into conda, and a few of them don't work well with conda.
So, in this case, should I install a standalone MKL separately?
Try pip install mkl, and set the path to your MKL correctly: https://github.com/Tencent/TurboTransformers/blob/f2d66bc12f0b904328372f472f6379aba50007cc/cmake/FindMKL.cmake#L42
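A rough sketch of what that could look like; it assumes (not confirmed in this thread) that the pip wheels place the MKL libraries and headers under the Python installation prefix, which can then be exposed to FindMKL.cmake via MKLROOT:

```bash
# MKL runtime libraries and headers via pip; the "mkl" and "mkl-include"
# package names come from this thread, "mkl-devel" is my addition and may be
# needed for the unversioned .so symlinks used at link time.
pip install mkl mkl-include mkl-devel

# Assumption: the wheels install into <python prefix>/lib and <python prefix>/include.
export MKLROOT="$(python -c 'import sys; print(sys.prefix)')"
ls "$MKLROOT"/lib | grep -i mkl   # sanity check before invoking cmake
```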
After hours of struggling with strange errors, I finally got the following output. It is compiled with gcc/g++ 6.5.0 and running on a V100 GPU. Do these numbers look reasonable to you? Somehow they are generally larger than the ones I get from your pre-built Docker image.
Test project /tmp/build
    Start  1: tt_core_test
 1/24 Test  #1: tt_core_test .............................   Passed    2.44 sec
    Start  2: tt_kernels_test
 2/24 Test  #2: tt_kernels_test ..........................   Passed   40.54 sec
    Start  3: albert_attention_test
 3/24 Test  #3: albert_attention_test ....................   Passed   12.68 sec
    Start  4: albert_embedding_test
 4/24 Test  #4: albert_embedding_test ....................   Passed    9.20 sec
    Start  5: albert_layer_test
 5/24 Test  #5: albert_layer_test ........................   Passed   10.87 sec
    Start  6: albert_model_test
 6/24 Test  #6: albert_model_test ........................   Passed   51.84 sec
    Start  7: bert_attention_test
 7/24 Test  #7: bert_attention_test ......................   Passed    8.67 sec
    Start  8: bert_embedding_test
 8/24 Test  #8: bert_embedding_test ......................   Passed   15.00 sec
    Start  9: bert_encoder_test
 9/24 Test  #9: bert_encoder_test ........................   Passed   10.27 sec
    Start 10: bert_intermediate_test
10/24 Test #10: bert_intermediate_test ...................   Passed    9.45 sec
    Start 11: bert_layer_test
11/24 Test #11: bert_layer_test ..........................   Passed    8.87 sec
    Start 12: bert_model_test
12/24 Test #12: bert_model_test ..........................   Passed   12.34 sec
    Start 13: bert_output_test
13/24 Test #13: bert_output_test .........................   Passed    8.70 sec
    Start 14: bert_pooler_test
14/24 Test #14: bert_pooler_test .........................   Passed    7.75 sec
    Start 15: decoder_multi_headed_attention_test
15/24 Test #15: decoder_multi_headed_attention_test ......   Passed   10.19 sec
    Start 16: decoder_transformer_decoder_layer_test
16/24 Test #16: decoder_transformer_decoder_layer_test ...   Passed   22.29 sec
    Start 17: gpt2_model_test
17/24 Test #17: gpt2_model_test ..........................   Passed   24.12 sec
    Start 18: positionwise_ffn_test
18/24 Test #18: positionwise_ffn_test ....................   Passed    9.30 sec
    Start 19: qbert_intermediate_test
19/24 Test #19: qbert_intermediate_test ..................   Passed    5.75 sec
    Start 20: qbert_layer_test
20/24 Test #20: qbert_layer_test .........................   Passed    6.76 sec
    Start 21: qbert_output_test
21/24 Test #21: qbert_output_test ........................   Passed    5.68 sec
    Start 22: roberta_model_test
22/24 Test #22: roberta_model_test .......................   Passed   15.94 sec
    Start 23: sequence_pool_test
23/24 Test #23: sequence_pool_test .......................   Passed    7.60 sec
    Start 24: tensor_conversion_test
24/24 Test #24: tensor_conversion_test ...................   Passed    7.22 sec
100% tests passed, 0 tests failed out of 24
Total Test time (real) = 323.49 sec
Congratulations, it works! Could you please share your Dockerfile or some operation logs with us?
Sure. I can send the Dockerfile to your Gmail.
@auspicious3000 Could you please provide the Dockerfile for your setup without conda? I tried hard to make it work, but the image seems to be invalid (it was created long ago), even with conda.
Could you please send the Dockerfile to lyriccoder@gmail.com or post it here?
@lyriccoder I installed most things manually after starting the container, so unfortunately those steps are not reflected in the Dockerfile.
Could you please provide everything you have, and I will add it? I mean both the Dockerfile and the commands you ran manually.
@lyriccoder Unfortunately, it's been a long time, so I don't have the commands anymore. The Dockerfile is just a simple base image.
I tried to install TurboTransformers into my own Docker image. There is no conda in my image; I installed packages directly with apt-get and pip. However, when executing sh tools/build_and_run_unittests.sh $PWD -DWITH_GPU=ON, the build got stuck on finding MKL. After some debugging, I found that $MKLROOT is empty and the /opt directory is also empty. When building my Docker image, I used Dockerfile_release.gpu as a reference and installed mkl-include and the other necessary packages with pip.
Was there anything I missed when building the image?
Thanks in advance for your time and help!
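For anyone hitting the same empty $MKLROOT: a rough sketch of one way to locate the pip-installed MKL files and point the build at them, assuming (as in the sketch above) that they land under the Python prefix:

```bash
# Locate the pip-installed MKL files (assumption: they live under the Python
# prefix; adjust PREFIX if your image puts them somewhere else).
PREFIX="$(python -c 'import sys; print(sys.prefix)')"
find "$PREFIX" -name 'libmkl_rt*' -o -name 'mkl.h' 2>/dev/null

# Point the build at that location and retry the unit tests.
export MKLROOT="$PREFIX"
sh tools/build_and_run_unittests.sh $PWD -DWITH_GPU=ON
```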