NVIDIA-AI-IOT / deepstream_parallel_inference_app

A project demonstrating how to use nvmetamux to run multiple models in parallel.

ERROR: <link_element_to_tee_src_pad:40>: Failed to get src pad from tee #7

Closed challengesll closed 9 months ago

challengesll commented 9 months ago

When pulling RTSP video streams, adding more than 16 sources causes a failure to link to the tee. The error message is shown above. What is the reason for this?

challengesll commented 9 months ago

The macro "MAX_PRIMARY_GIE_BINS" in the deepstream_parallel_infer.h file defines the maximum number of supported video streams, which explains the failure when more sources are added.