roboflow / inference
A fast, easy-to-use, production-ready inference server for computer vision supporting deployment of many popular model architectures and fine-tuned models.
https://inference.roboflow.com
License: Other · 1.13k stars · 84 forks
Issues (sorted: newest first)
#404 · Remove parametrization of fields names · grzegorz-roboflow · closed 1 week ago · 0 comments
#403 · Add link to Roboflow licensing · capjamesg · closed 1 month ago · 0 comments
#402 · Store prediction type in sv.Detections, remove from blocks outputs · grzegorz-roboflow · closed 1 month ago · 0 comments
#401 · Keypoints kind · grzegorz-roboflow · opened 1 month ago · 0 comments
#400 · Create Universal Query Language · PawelPeczek-Roboflow · closed 1 month ago · 0 comments
#399 · Paligemma Workflows Block · hansent · opened 1 month ago · 9 comments
#398 · Fix workflows specification URL and other docs updates · SolomonLake · closed 1 month ago · 0 comments
#397 · Bugfix paligemma-client · probicheaux · closed 1 month ago · 6 comments
#396 · Paligemma · probicheaux · closed 1 month ago · 0 comments
#395 · Document source_id parameter of VideoFrame · sberan · closed 1 month ago · 0 comments
#394 · Is support for Jetpack 4.5.0 still needed · PawelPeczek-Roboflow · opened 1 month ago · 0 comments
#393 · Fix/release 0.10.0 Jetson 4.5 build · PawelPeczek-Roboflow · closed 1 month ago · 0 comments
#392 · Sv detections in core blocks · grzegorz-roboflow · closed 1 month ago · 0 comments
#391 · Add `image_id` to images metadata · grzegorz-roboflow · opened 1 month ago · 0 comments
#390 · CropBlock - ensure parent_id of detection and cropped image match · grzegorz-roboflow · opened 1 month ago · 0 comments
#389 · ImportError · ChristopherMarais · opened 1 month ago · 3 comments
#388 · Release `v0.10.0` · PawelPeczek-Roboflow · closed 1 month ago · 1 comment
#387 · Fix instance segmentation batching · grzegorz-roboflow · closed 1 month ago · 0 comments
#386 · Apply fixes before deployment into hosted platform · PawelPeczek-Roboflow · closed 1 month ago · 0 comments
#385 · Inference Docs Overhaul · LinasKo · closed 1 month ago · 1 comment
#384 · Need to skip Yoloact inst-seg tests for the parallel GPU inference server · bigbitbus · closed 1 month ago · 3 comments
#383 · If I try to make an inference with a local RTSP, I still need to run API_KEY and server, right? · YoungjaeDev · closed 1 month ago · 2 comments
#382 · Bump next from 13.5.6 to 14.2.3 in /inference/landing · dependabot[bot] · closed 1 month ago · 0 comments
#381 · Make workflows Apache 2.0 · PawelPeczek-Roboflow · closed 1 month ago · 1 comment
#380 · Workflows block `DetectionOffsetBlock` is not preserving parent coordinates · PawelPeczek-Roboflow · opened 2 months ago · 0 comments
#379 · Added the github runner instead of the custom runner · bigbitbus · closed 1 month ago · 4 comments
#378 · Add aliases for keypoint models · LinasKo · closed 2 months ago · 0 comments
#377 · Jetson 4.6.1: opset 17 not supported for onnxruntime-gpu 1.11.0 · TomasBooneHogent · opened 2 months ago · 9 comments
#376 · Refactor `inference.core.models.base.Model.infer_from_request` to always return `List[InferenceResponse]` · grzegorz-roboflow · opened 2 months ago · 0 comments
#375 · Update CITATION.cff · capjamesg · closed 2 months ago · 0 comments
#374 · pin fastapi version since 0.111.0 results in failed inference installation · grzegorz-roboflow · closed 2 months ago · 0 comments
#373 · current constellation of dependencies inconsistently breaking installation process · grzegorz-roboflow · closed 2 months ago · 2 comments
#372 · Docs/add keypoint detection · LinasKo · closed 2 months ago · 3 comments
#371 · Fix metrics middleware attr error (num_errors set on wrong model manager) · probicheaux · closed 2 months ago · 0 comments
#370 · Failed to install inference package ! Error "AttributeError: module 'pkgutil' has no attribute 'ImpImporter'" · sridharreddyu · closed 2 months ago · 5 comments
#369 · Allows for custom providers to be passed · Qfc9 · opened 2 months ago · 7 comments
#368 · Lock Grounding DINO package version to 0.2.0 · skylargivens · closed 2 months ago · 0 comments
#367 · Upped skypilot version · bigbitbus · closed 2 months ago · 0 comments
#366 · add api key fallback for model monitoring · hansent · closed 2 months ago · 1 comment
#365 · Run Multiple Models all together. · amankumarchagti · closed 1 month ago · 10 comments
#363 · Downgrade transformers to avoid faulty release of that package · PawelPeczek-Roboflow · closed 2 months ago · 1 comment
#362 · Can add, delete, update stream realtime for only one pipeline has been run before? · duckaivn · opened 2 months ago · 1 comment
#361 · inference.core.exceptions.ModelArtefactError: Unable to load ONNX session. Cause: [ONNXRuntimeError] : 7 : INVALID_PROTOBUF : Load model from /tmp/cache\chinese-calligraphy-recognition-sl0eb/2\best.onnx failed:Protobuf parsing failed. · Sui-25 · closed 2 months ago · 6 comments
#360 · `feature/restructure_workflows_steps_and_docs` · SkalskiP · closed 2 months ago · 0 comments
#359 · Skip encode image as jpeg if no-resize is specified · PacificDou · closed 2 months ago · 1 comment
#358 · Bugfix for gaze detection (batch request) · PacificDou · closed 2 months ago · 0 comments
#357 · Minor docs update, API key in InferenceHTTPClient · LinasKo · closed 2 months ago · 0 comments
#356 · Can't run inference on AWS Lambda · DominiquePaul · opened 2 months ago · 1 comment
#355 · Error Running cogvlm model on Self-Hosted GPU Server with Roboflow Inference (Transformer Version) · YoungjaeDev · opened 2 months ago · 7 comments
#354 · Improve benchmark output; fix exception handling · grzegorz-roboflow · closed 2 months ago · 0 comments