marcoslucianops / DeepStream-Yolo

NVIDIA DeepStream SDK 7.0 / 6.4 / 6.3 / 6.2 / 6.1.1 / 6.1 / 6.0.1 / 6.0 / 5.1 implementation for YOLO models
MIT License

Urgent : How to save the prediction into txt file from Deepstream config file #542

Closed ashray21 closed 3 weeks ago

ashray21 commented 1 month ago

Deepstream Version: 6.3

My app is working correctly, but I want to save the model's predictions for each frame into a label.txt file, or in any metadata format from which I can get `label class_name confidence x1 y1 x2 y2`. I want to save the output of both models. Is it possible?

My config file looks like this:

```ini
[application]
enable-perf-measurement=1
perf-measurement-interval-sec=5

[tiled-display]
enable=0
rows=1
columns=1
width=1280
height=720
gpu-id=0
nvbuf-memory-type=0

[source0]
enable=1
type=3
uri=file:///opt/nvidia/deepstream/deepstream-6.3/samples/streams/test.mp4
num-sources=1
gpu-id=0
cudadec-memtype=0

[sink1]
enable=1
type=3
container=1
codec=1
enc-type=0
sync=0
bitrate=4000000
profile=0
output-file=output.mp4
source-id=0

[osd]
enable=1
gpu-id=0
border-width=5
text-size=15
text-color=1;1;1;1;
text-bg-color=0.3;0.3;0.3;1
font=Serif
show-clock=0
clock-x-offset=800
clock-y-offset=820
clock-text-size=12
clock-color=1;0;0;0
nvbuf-memory-type=0

[streammux]
gpu-id=0
live-source=0
batch-size=1
batched-push-timeout=40000
width=1920
height=1080
enable-padding=0
nvbuf-memory-type=0

[primary-gie]
enable=1
gpu-id=0
gie-unique-id=1
nvbuf-memory-type=0
config-file=config_infer_primary_yoloV8.txt

[secondary-gie0]
enable=1
gpu-id=0
gie-unique-id=2
nvbuf-memory-type=0
operate-on-gie-id=1
operate-on-class-ids=4
config-file=config_infer_char_yoloV8.txt

[tests]
file-loop=0
```

ashray21 commented 3 weeks ago

I solved this problem by setting the `gie-kitti-output-dir` property in the `[application]` section of the config. With it set, deepstream-app writes one KITTI-format `.txt` file per frame into that directory.
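As a follow-up for anyone reading later: once `gie-kitti-output-dir` is set (e.g. `gie-kitti-output-dir=/tmp/kitti` under `[application]`, where the directory must already exist), the per-frame files use the KITTI label layout: class name first, the bounding box as fields 5 to 8 (`x1 y1 x2 y2`), and, in the deepstream-app builds I have seen, the detector confidence appended as a final field. The exact field count can vary by DeepStream version, so treat this parser as a sketch that you should check against your own output files:

```python
from pathlib import Path

def parse_kitti_line(line):
    """Parse one KITTI-format detection line written by deepstream-app.

    Typical layout (field count may vary by DeepStream version):
      class_name 0.0 0 0.0 x1 y1 x2 y2 0.0 ... [confidence]
    """
    fields = line.split()
    label = fields[0]
    # Fields 5-8 (0-based 4:8) are the bbox corners in pixels.
    x1, y1, x2, y2 = (float(v) for v in fields[4:8])
    # Some builds append confidence as a 16th field; older ones
    # write only the 15 standard KITTI fields, so fall back to None.
    confidence = float(fields[15]) if len(fields) >= 16 else None
    return label, confidence, x1, y1, x2, y2

def parse_kitti_dir(kitti_dir):
    """Collect detections for every frame file in gie-kitti-output-dir."""
    results = {}
    for path in sorted(Path(kitti_dir).glob("*.txt")):
        lines = [l for l in path.read_text().splitlines() if l.strip()]
        results[path.stem] = [parse_kitti_line(l) for l in lines]
    return results
```

Note that this dumps whatever detections are in the frame metadata; if you need the secondary model's output associated per-object, you may still have to inspect which class labels came from which `gie-unique-id` yourself.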