NVIDIA / TensorRT-LLM

TensorRT-LLM provides users with an easy-to-use Python API to define Large Language Models (LLMs) and build TensorRT engines that contain state-of-the-art optimizations to perform inference efficiently on NVIDIA GPUs. TensorRT-LLM also contains components to create Python and C++ runtimes that execute those TensorRT engines.
https://nvidia.github.io/TensorRT-LLM
Apache License 2.0

Failure to run BLIP2-T5 with tensor parallelism on multiple A30 GPUs #1141

Open · Titanpku opened this issue 9 months ago

Titanpku commented 9 months ago

System Info

Who can help?

No response

Information

Tasks

Reproduction

The following command hangs when run with tensor parallelism across two GPUs:

mpirun --allow-run-as-root -np 2 python run.py \
    --blip2_encoder \
    --max_new_tokens 30 \
    --input_text "Question: which city is this? Answer:" \
    --hf_model_dir tmp/hf_models/${MODEL_NAME} \
    --visual_engine_dir visual_engines/${MODEL_NAME} \
    --llm_engine_dir trt_engines/${MODEL_NAME}/2-gpu/bfloat16/tp2

The same invocation completes on a single GPU (TP=1):

python run.py \
    --blip2_encoder \
    --max_new_tokens 30 \
    --input_text "Question: which city is this? Answer:" \
    --hf_model_dir tmp/hf_models/${MODEL_NAME} \
    --visual_engine_dir visual_engines/${MODEL_NAME} \
    --llm_engine_dir trt_engines/${MODEL_NAME}/1-gpu/bfloat16/tp1
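As a first debugging step (not part of the original report), a minimal mpi4py sanity check can confirm that both mpirun ranks start and can synchronize on this machine before TensorRT-LLM is involved; the script name mpi_check.py is just a placeholder:

# Hypothetical sanity check: verify both mpirun ranks launch and reach a barrier.
# Run with: mpirun --allow-run-as-root -np 2 python mpi_check.py
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank = comm.Get_rank()   # rank of this process (0 or 1 for -np 2)
size = comm.Get_size()   # total number of ranks launched by mpirun
print(f"rank {rank} of {size} started", flush=True)

comm.Barrier()           # if this never returns, the MPI setup itself is at fault
print(f"rank {rank} passed the barrier", flush=True)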

Expected behavior

The TP=2 run should finish and produce output, just as the single-GPU (TP=1) run does:

python run.py \
    --blip2_encoder \
    --max_new_tokens 30 \
    --input_text "Question: which city is this? Answer:" \
    --hf_model_dir tmp/hf_models/${MODEL_NAME} \
    --visual_engine_dir visual_engines/${MODEL_NAME} \
    --llm_engine_dir trt_engines/${MODEL_NAME}/1-gpu/bfloat16/tp1

actual behavior

The TP=2 run hangs:

mpirun --allow-run-as-root -np 2 python run.py \
    --blip2_encoder \
    --max_new_tokens 30 \
    --input_text "Question: which city is this? Answer:" \
    --hf_model_dir tmp/hf_models/${MODEL_NAME} \
    --visual_engine_dir visual_engines/${MODEL_NAME} \
    --llm_engine_dir trt_engines/${MODEL_NAME}/2-gpu/bfloat16/tp2

additional notes

Same as the Reproduction section: the two-GPU (TP=2) mpirun command hangs, while the single-GPU (TP=1) command finishes normally.
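One way to narrow the hang down (again an assumption, not something from the report) is to check that the two A30s can exchange data over NCCL, since a TP=2 hang can come from the GPU communication layer rather than from the engines themselves. The sketch below uses PyTorch's distributed API; nccl_check.py and the localhost rendezvous address are placeholders for a single-node setup:

# Hypothetical NCCL check: one GPU per MPI rank, single all_reduce across both A30s.
# Run with: mpirun --allow-run-as-root -np 2 python nccl_check.py
import os
import torch
import torch.distributed as dist
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank, world_size = comm.Get_rank(), comm.Get_size()

os.environ.setdefault("MASTER_ADDR", "127.0.0.1")  # assumed single-node setup
os.environ.setdefault("MASTER_PORT", "29500")
dist.init_process_group("nccl", rank=rank, world_size=world_size)

torch.cuda.set_device(rank)                       # bind one GPU per rank
x = torch.ones(1, device="cuda") * (rank + 1)
dist.all_reduce(x)                                # hangs here if NCCL/P2P is broken
print(f"rank {rank}: all_reduce sum = {x.item()}", flush=True)
dist.destroy_process_group()

If the all_reduce also hangs, setting NCCL_DEBUG=INFO in the environment usually shows which transport NCCL chose and where it stalls.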

hello-11 commented 1 week ago

@Titanpku Do you still have this issue? If not, we will close it soon.