NVIDIA-AI-IOT / deepstream_parallel_inference_app
A project demonstrating how to use nvmetamux to run multiple models in parallel.
82 stars · 20 forks
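For context on what the repository demonstrates, the parallel-model topology can be sketched as a GStreamer pipeline: one batched stream is fanned out through a tee into several nvinfer branches, and gst-nvdsmetamux merges the branch metadata back into a single stream. This is a rough sketch only, assuming a DeepStream installation; the config file paths (model1_config.txt, model2_config.txt, metamux_config.txt) and the media URI are hypothetical placeholders, and the actual app builds a more elaborate pipeline programmatically rather than with gst-launch-1.0.

```shell
# Hypothetical two-branch parallel-inference sketch (placeholder configs).
# Decoded frames are batched by nvstreammux, duplicated by tee into two
# nvinfer branches, and the per-branch metadata is fused by nvdsmetamux.
gst-launch-1.0 \
  uridecodebin uri=file:///path/to/video.mp4 ! mux.sink_0 \
  nvstreammux name=mux batch-size=1 width=1280 height=720 ! tee name=t \
  t. ! queue ! nvinfer config-file-path=model1_config.txt ! metamux.sink_0 \
  t. ! queue ! nvinfer config-file-path=model2_config.txt ! metamux.sink_1 \
  nvdsmetamux name=metamux config-file=metamux_config.txt ! \
  nvdsosd ! nveglglessink
```

Several of the issues below (#7 on tee src pads, #6 on tiling, #2 on fusing object metadata) concern exactly these fan-out and metamux stages.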
Issues
#8 nvds_obj_enc_process save wrong source frame image — challengesll, opened 3 months ago, 0 comments
#7 ERROR: <link_element_to_tee_src_pad:40>: Failed to get src pad from tee — challengesll, closed 7 months ago, 1 comment
#6 Is there a way of getting parallel inferences to appear on the same tile? — lolitsjoey, opened 10 months ago, 1 comment
#5 How to relate the correct source_id with model — lmw0320, opened 11 months ago, 0 comments
#4 libnvds_osd.so — prince0310, opened 12 months ago, 0 comments
#3 Error found when run the code — lmw0320, opened 1 year ago, 6 comments
#2 Use gst-nvdsmetamux to fuse 2 obj ids meta from nvpreprocessed src — OsamaRyees, opened 1 year ago, 0 comments
#1 Dynamic pipelines — micuentadecasa, opened 1 year ago, 2 comments