Closed — @neilyoung closed this issue 1 year ago
Hi @neilyoung , to narrow down whether the slow-down on your system is coming from the camera interface or the display interface, you can try:
If I were to consider a Rust implementation, which library would have to be imported, and do the libs support C interfacing?
The core jetson-inference library is C++ - this is what the Python bindings use internally. You can see these used in the C++ samples and documentation:
The C++ inferencing code lives under: https://github.com/dusty-nv/jetson-inference/tree/master/c
Thanks for your (as always) prompt and accurate answer. This video thing is a very good idea. I will try that now
OK, 15 fps on a small video, 30 fps w/o inference.
Video is just about 22 MB. Shall I upload it here for your test?
OK, at least a screenshot here
The Jetson is powered via the barrel jack and cooled by a fan. jtop is running alongside. The video is small - 15 fps...
No idea...
I am now rendering the output to RTP:
display = jetson.utils.videoOutput("rtp://192.168.188.24:1234")
On my MacBook on the same network I'm receiving it using GST 1.18
gst-launch-1.0 -v udpsrc port=1234 caps = "application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264, payload=(int)96" ! rtph264depay ! decodebin ! videoconvert ! fpsdisplaysink
The frame rate is disappointingly low (about 5.6 fps) with an MP4 file as input.
My next attempt was to drop the display entirely and try with camera and file as input:
import time

frames = 0
start = time.time()
while display.IsStreaming():
    img = camera.Capture()
    detections = net.Detect(img)
    frames += 1
    now = time.time()
    delta = now - start
    if delta >= 1:
        fps = frames / delta
        start = now
        frames = 0
        print("FPS {0:.2f}".format(fps))
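The interval bookkeeping in that loop can be factored into a small, hardware-independent helper. A sketch (`FpsMeter` is a name made up here, not part of jetson.utils; the clock is injectable so it can be tested without a camera):

```python
import time


class FpsMeter:
    """Measures frames per second over rolling intervals of at least
    `interval` seconds, mirroring the loop logic above."""

    def __init__(self, interval=1.0, clock=time.time):
        self.interval = interval
        self.clock = clock          # injectable for testing
        self.start = clock()
        self.frames = 0

    def tick(self):
        """Call once per processed frame; returns the FPS once an
        interval has elapsed, otherwise None."""
        self.frames += 1
        now = self.clock()
        delta = now - self.start
        if delta >= self.interval:
            fps = self.frames / delta
            self.start = now
            self.frames = 0
            return fps
        return None
```

In the capture loop this reduces to `fps = meter.tick()` followed by a print whenever the result is not `None`.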
To summarize:
Input | Output | Inference | FPS |
---|---|---|---|
USB Logitech H.264 640 x 480 | Display | Yes | 15 |
USB Logitech H.264 640 x 480 | Display | No | 15 |
USB Logitech H.264 640 x 480 | No | Yes | 15 |
USB Logitech H.264 640 x 480 | No | No | 15 |
File | No | No | 30 |
File | No | Yes | 22 |
File | Display | No | 29 |
File | Display | Yes | 19 |
So with camera input it makes no difference whether the display is on or off, or whether inference is on or off: only 15 fps is ever achievable. This in itself is a bummer...
With a file as input I only reach 30 fps when the display is off and inference is off. Turning the display on costs only a few fps, both with inference off and with it on.
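Reading the file rows of the table as per-frame times makes the costs easier to compare (plain arithmetic, no Jetson required):

```python
def frame_ms(fps):
    """Per-frame time in milliseconds for a given frame rate."""
    return 1000.0 / fps

# File input, no display: 30 fps without inference vs. 22 fps with it.
inference_cost = frame_ms(22) - frame_ms(30)   # ~12 ms of inference per frame

# Camera input is pinned at 15 fps regardless of settings:
camera_budget = frame_ms(15)                   # ~66.7 ms between frames

print(f"inference adds ~{inference_cost:.1f} ms/frame")
print(f"camera delivers a frame only every ~{camera_budget:.1f} ms")
```

This illustrates why the camera rows never change: at 15 fps the capture side hands over a frame only every ~67 ms, far more than the ~12 ms inference adds, so the bottleneck sits upstream of the network.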
I don't want to be petty and say I don't see that 24 fps inference rate - it is close. But what the hell is wrong with the USB?
And to make it more confusing: all the file results render at max 15 fps with display and inference if I use the bird video instead of the cantina video...
All in all, this is not reliable.
EDIT: To be more specific: These are the results with another video:
Input | Output | Inference | FPS |
---|---|---|---|
File | No | No | 30 |
File | No | Yes | 22 |
File | Display | No | 29 |
File | Display | Yes | 16 |
Hmm strange - I wonder if it is related to the codec or resolution of that video?
When I stream detectnet.py via RTP here (to an Ubuntu PC over 5GHz WiFi), I get 22-24FPS.
I'm also using 5 GHz... But what concerns me more is the poor USB performance...
FWIW, it could just be my code (although I run a C920 @ 30FPS here with video-viewer). If the lights are dim, the camera slows down due to auto-exposure, but my guess is your lights aren't dim.
If you just run the camera through a standalone GStreamer pipeline, do you get full framerate then?
Hmm. My lights are dimmed. And to be honest, there has literally been no sun over here for weeks :)
I was able to confirm the poor results with this pipeline:
gst-launch-1.0 -v v4l2src device=/dev/video0 num-buffers=300 ! \
"video/x-raw,width=640,framerate=30/1" ! \
videorate ! "video/x-raw,framerate=30/1" ! \
jpegenc ! avimux ! filesink location=output.avi
300 frames should be on disk in 10 secs. Instead...
Got EOS from element "pipeline0".
Execution ended after 0:00:20.242976585
Having repeated the test right now with LIGHTS FULLY ON:
Execution ended after 0:00:13.978658301
which is about 23 fps... Hahahah :) How funny...
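For reference, the effective capture rate in these runs is just num-buffers divided by wall-clock time (this ignores pipeline start-up, so the real camera rate is slightly higher than the raw quotient):

```python
def effective_fps(num_buffers, elapsed_s):
    """Average frame rate over a gst-launch run that captured
    num_buffers frames in elapsed_s seconds of wall-clock time."""
    return num_buffers / elapsed_s

print(effective_fps(300, 20.242976585))  # dim lights: ~14.8 fps
print(effective_fps(300, 13.978658301))  # lights on:  ~21.5 fps
```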
Will get three new ESP cams on Saturday. Let's see then.
Thanks for your patience and extraordinary support.
:laughing: yea, the dim lights got me scratching my head before too!
@neilyoung can you tell me how to turn the lights fully on?
Dusty and I were talking about the ambient room lighting.
emmmm...
Hi Dusty,
I'm still fighting my frame rate issues. It is really like in Shrek 1: "By night one way, by day another". Meaning: most of the time the detection sample code manages just 15 fps @ 640 x 480. If I then use video-viewer to check the fps, I get the same - just a lame 15 fps from a Logitech C920, single camera. This is the code, slightly modified:
I have no idea anymore why I don't get more - at least the 24 fps promised by the inference engine...
Would you mind running this with your Logitech and telling me your results?