-
Hi,
You say a feed-forward inference pass on this model is very fast--2ms on a Titan Xp. Do you think that speed would hold up on a recent Android device (like a Google Pixel 4)? I'm looking to do real-t…
-
**Describe the bug**
I am deploying a model using ONNXRuntime on CPU. It is in an environment with a hard real-time budget of 20ms. The average inference time of the model is ~2ms. I have observed th…
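With a hard 20ms budget, the mean is usually less informative than the tail. A minimal timing-harness sketch (the `measure_latency_ms` helper and its parameters are my own, not part of the ONNX Runtime API); wrap the actual `sess.run(None, inputs)` call in the callable you pass in:

```python
import time
import statistics

def measure_latency_ms(run_fn, warmup=50, iters=1000):
    """Time repeated calls to run_fn; return (mean, p99) latency in ms."""
    for _ in range(warmup):  # discard warmup iterations (caches, lazy init)
        run_fn()
    times = []
    for _ in range(iters):
        t0 = time.perf_counter()
        run_fn()
        times.append((time.perf_counter() - t0) * 1e3)
    times.sort()
    p99 = times[min(iters - 1, int(0.99 * iters))]
    return statistics.mean(times), p99
```

If the p99 approaches 20ms while the mean stays near 2ms, the budget is being blown by outliers (GC pauses, thread scheduling, allocator behavior) rather than by the model itself.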
-
**Describe the bug**
Not sure if this is a bug or a property of the architecture.
When performing inference, the current implementation consumes more memory than the size of the model checkpoint.
In experime…
-
I have some questions:
1. How do you load your model for real-time inference?
2. Can we use this with our own custom dataset?
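On question 1, the usual pattern (a hedged PyTorch sketch; `TinyNet` is a placeholder, not a class from this repo) is to build and load the model once at startup, switch to eval mode, and then reuse it for every incoming frame:

```python
import torch
import torch.nn as nn

class TinyNet(nn.Module):  # stand-in for the repository's real model class
    def __init__(self):
        super().__init__()
        self.fc = nn.Linear(4, 2)

    def forward(self, x):
        return self.fc(x)

model = TinyNet()
# In a real deployment you would restore trained weights here, e.g.:
# model.load_state_dict(torch.load("checkpoint.pt", map_location="cpu"))
model.eval()  # disable dropout / batch-norm updates for inference
with torch.inference_mode():  # skip autograd bookkeeping for speed
    out = model(torch.randn(1, 4))
print(out.shape)  # torch.Size([1, 2])
```

The key point for real time is that construction and weight loading happen once, outside the per-frame loop; only the forward pass runs per frame.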
-
On my system with an RTX 3080 8GB and a Ryzen 9 6900HX, I get around 4 FPS for feature detection (input images at 1280x720 resolution). Is there any way to increase the inference speed?
Btw, amazing…
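Not the project author, but one generic lever is to shrink the input before detection: at 1280x720, a 2x downscale cuts the pixel count 4x. A naive stride-slicing sketch (assuming frames are NumPy arrays; `cv2.resize` with proper interpolation is usually preferable in practice):

```python
import numpy as np

def downscale(img, factor=2):
    """Naive downscale by keeping every factor-th pixel in each dimension."""
    return img[::factor, ::factor]

frame = np.zeros((720, 1280, 3), dtype=np.uint8)  # dummy 720p frame
small = downscale(frame)
print(small.shape)  # (360, 640, 3)
```

Whether detection quality survives the downscale depends on the model, so it is worth measuring accuracy alongside FPS.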
-
I have many questions about the MultiPoseNet paper, especially regarding its inference time.
What do you think about these questions? Can you help me?
In the paper's abstract, "the fastest real…
-
### Search before asking
- [X] I have searched the Ultralytics YOLO [issues](https://github.com/ultralytics/ultralytics/issues) and found no similar bug report.
### Ultralytics YOLO Component
Expo…
-
Introduce interactivity for bar plots using e.g. the plotly library.
The aim is to offer user interaction via input and output probability distributions while updating the model (computing inference) in real-…
-
I have used mmdeploy to get the TensorRT model. How can I do real-time inference with a camera? Could you give me some simple examples?
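Not mmdeploy's official example, but the camera side is usually a plain OpenCV capture loop; a minimal sketch where `detector` stands in for whatever inference callable mmdeploy gives you for the TensorRT model:

```python
def run_camera_loop(detector, camera_id=0):
    """Read frames from a webcam and run `detector` on each one.

    `detector` is a placeholder: any callable taking a BGR frame and
    returning detection results. Press 'q' in the window to quit.
    """
    import cv2  # imported lazily so the sketch stays self-contained

    cap = cv2.VideoCapture(camera_id)
    try:
        while True:
            ok, frame = cap.read()
            if not ok:  # camera unplugged or stream ended
                break
            results = detector(frame)  # run the TensorRT model on this frame
            # draw `results` onto `frame` here before displaying
            cv2.imshow("detections", frame)
            if cv2.waitKey(1) & 0xFF == ord("q"):
                break
    finally:
        cap.release()
        cv2.destroyAllWindows()
```

The loop structure is the same regardless of backend; only the `detector` call changes between ONNX Runtime, TensorRT, or a raw PyTorch model.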
-
Thanks for your code; it is helpful for my personal project.
I want to know if this works with WebGL and a webcam.