triton-inference-server / server

The Triton Inference Server provides an optimized cloud and edge inferencing solution.
https://docs.nvidia.com/deeplearning/triton-inference-server/user-guide/docs/index.html
BSD 3-Clause "New" or "Revised" License

Model Analyzer Fails to Connect to Triton Server ([StatusCode.UNAVAILABLE] failed to connect to all addresses) #7813

Open goudemaoningsir opened 6 days ago

goudemaoningsir commented 6 days ago

I encountered an issue while using the model-analyzer tool with Triton Server. The profiler fails to connect to the server and raises the following exception:

Traceback (most recent call last):
  File "/usr/local/bin/model-analyzer", line 8, in <module>
    sys.exit(main())
  File "/usr/local/lib/python3.10/dist-packages/model_analyzer/entrypoint.py", line 278, in main
    analyzer.profile(
  File "/usr/local/lib/python3.10/dist-packages/model_analyzer/analyzer.py", line 130, in profile
    self._get_server_only_metrics(client, gpus)
  File "/usr/local/lib/python3.10/dist-packages/model_analyzer/analyzer.py", line 229, in _get_server_only_metrics
    client.wait_for_server_ready(
  File "/usr/local/lib/python3.10/dist-packages/model_analyzer/triton/client/client.py", line 72, in wait_for_server_ready
    raise TritonModelAnalyzerException(e)
model_analyzer.model_analyzer_exceptions.TritonModelAnalyzerException: [StatusCode.UNAVAILABLE] failed to connect to all addresses; last error: UNKNOWN: ipv4:127.0.0.1:8001: Failed to connect to remote host: Timeout occurred: FD Shutdown

The command I ran:

model-analyzer profile \
    --model-repository=/mnt/models \
    --profile-models=densenet_onnx \
    --output-model-repository-path=results
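Not a fix, but a quick sanity check that may help narrow this down: the traceback shows the client timing out on 127.0.0.1:8001, which is Triton's default gRPC port (HTTP is 8000, metrics 8002). Before re-running model-analyzer, you can confirm whether anything is actually listening there. This is a minimal sketch using only the standard library; the host and port values simply mirror the error message above:

```python
import socket

def port_open(host: str, port: int, timeout: float = 3.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

if __name__ == "__main__":
    # 127.0.0.1:8001 is the endpoint from the traceback above.
    if port_open("127.0.0.1", 8001):
        print("Triton gRPC port is reachable")
    else:
        print("Nothing reachable on 127.0.0.1:8001 -- is tritonserver running?")
```

If the port is closed, the likely causes are that tritonserver was never started, exited early (check its logs), or is running in a separate container whose port 8001 is not published to the host where model-analyzer runs.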