A fast, easy-to-use, production-ready inference server for computer vision supporting deployment of many popular model architectures and fine-tuned models.
Could you please provide more information about the problem? Does the drop happen immediately, or is it faster at the beginning?
What hardware are you running the code on?
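To gather that information, one option is to measure throughput directly in the `on_prediction` callback. The sketch below is not part of the `inference` library; `FPSLoggingSink` is a hypothetical helper that wraps an existing sink (such as `render_boxes`) and prints the observed FPS every N frames, which would show whether throughput is low from the start or degrades over time.

```python
import time


class FPSLoggingSink:
    """Hypothetical diagnostic sink: wraps another sink and prints
    observed throughput every `report_every` frames."""

    def __init__(self, inner_sink=None, report_every=30):
        self.inner_sink = inner_sink      # e.g. render_boxes, or None
        self.report_every = report_every
        self.frame_count = 0
        self.window_start = time.monotonic()
        self.last_fps = None              # most recently measured FPS

    def __call__(self, predictions, video_frame):
        # Forward the frame to the wrapped sink, if any.
        if self.inner_sink is not None:
            self.inner_sink(predictions, video_frame)
        self.frame_count += 1
        # Every `report_every` frames, compute FPS over that window.
        if self.frame_count % self.report_every == 0:
            now = time.monotonic()
            self.last_fps = self.report_every / (now - self.window_start)
            self.window_start = now
            print(f"throughput: {self.last_fps:.1f} FPS")
```

Assuming the pipeline setup from the question, you would pass `on_prediction=FPSLoggingSink(render_boxes)` to `InferencePipeline.init` and watch the printed numbers as the stream runs.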
Question
I'm trying to run the code provided in the InferencePipeline documentation:
```python
from inference import InferencePipeline
from inference.core.interfaces.stream.sinks import render_boxes

pipeline = InferencePipeline.init(
    model_id="yolov8n-640",
    video_reference=0,
    on_prediction=render_boxes,
)

pipeline.start()
pipeline.join()
```
The thing is, when I run the code, the FPS drops to about 1.5. This also happens when I use my RTSP camera and my webcam, and it also happens when I run it on my other PC. I'm using the latest version of the inference library.
Additional
No response