-
Hi @wanghao9610,
Congrats on this work! Niels here from the open-source team at Hugging Face. It's great to see you're releasing models on HF :) However, a few remarks:
## Download stats
I also…
-
Hi folks!
Grounding DINO is now available in the Transformers library, enabling easy inference in a few lines of code.
Here's how to use it:
```python
from transformers import AutoProcessor,…
```
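Since the snippet above is cut off, here is a fuller sketch of the same Transformers workflow. The checkpoint id `IDEA-Research/grounding-dino-tiny` and the blank stand-in image are assumptions for illustration, not part of the original post:

```python
from PIL import Image
import torch
from transformers import AutoProcessor, AutoModelForZeroShotObjectDetection

# Small public checkpoint, assumed here for illustration
model_id = "IDEA-Research/grounding-dino-tiny"
processor = AutoProcessor.from_pretrained(model_id)
model = AutoModelForZeroShotObjectDetection.from_pretrained(model_id)

image = Image.new("RGB", (640, 480))  # stand-in image; use a real photo
text = "a cat. a dog."  # Grounding DINO expects lowercase queries ending in "."

inputs = processor(images=image, text=text, return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# Convert raw logits to per-image dicts of scores, labels, and pixel boxes
results = processor.post_process_grounded_object_detection(
    outputs, inputs.input_ids, target_sizes=[image.size[::-1]]
)
print(results[0]["scores"], results[0]["boxes"])
```

On a blank image the score list will typically be empty; with a real photo, each entry pairs a detected box with the text query that grounded it.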
-
## 🚀 Feature
Currently, the project uses `GroundingDINO` as the visual grounding model, which is the best-performing model on several benchmark datasets
![current benchmarks for zero-shot object dete…
-
### Issues Policy acknowledgement
- [X] I have read and agree to submit bug reports in accordance with the [issues policy](https://www.github.com/mlflow/mlflow/blob/master/ISSUE_POLICY.md)
### Where…
-
Thanks for this video! I tried to use this code:
!python clip_object_tracker.py --source /content/zero-shot-object-tracking/data/video/Turkey.mp4 --url https://detect.roboflow.com/video-track/2 …
-
Hello @ebenbaruch,
The annotation files cannot be downloaded from the link mentioned in Zero-Shot Object Detection. Could you provide a download link for the annotation files?
Thanks,
Strawberry
-
Hi team,
Can AutoGPTQ be used for 8-bit quantization of the OWL-ViT model for zero-shot object detection tasks?
Your feedback would be helpful.
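One caveat worth noting: AutoGPTQ primarily targets autoregressive language models, so for an encoder-style detector like OWL-ViT a simpler route to 8-bit weights is PyTorch dynamic quantization. A minimal, self-contained sketch on a stand-in projection head (the layer sizes are made up for illustration, not taken from OWL-ViT):

```python
import torch
import torch.nn as nn

# Hypothetical stand-in for a detector head; the 768/512 sizes are illustrative
model = nn.Sequential(nn.Linear(768, 768), nn.ReLU(), nn.Linear(768, 512))

# Rewrite nn.Linear layers to use int8 weights, keeping the same call interface
quantized = torch.ao.quantization.quantize_dynamic(
    model, {nn.Linear}, dtype=torch.qint8
)

x = torch.randn(1, 768)
print(quantized(x).shape)  # torch.Size([1, 512])
```

The quantized module accepts the same inputs as the original, so it can drop into an existing inference path; whether the accuracy loss is acceptable for zero-shot detection has to be measured on your own data.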
-
Hi! مرحبا (Hello)! السلام عليكم (Peace be upon you)!
Let's bring the documentation to all the Arabic-speaking community 🌏 (currently 0 out of 267 complete)
Would you want to translate? Please follow the 🤗 [TRANSLATING guid…
-
#ECCV2018 submission
Institute: CSIRO
URL: https://arxiv.org/pdf/1803.06051.pdf
Keyword: Zero-shot tagging, Object detection
#Growing CSIRO #Advancing CSIRO
-
Namespace(agnostic_nms=False, api_key=None, augment=False, cfg='/content/zero-shot-object-tracking/models/yolov5s.yaml', classes=None, confidence=0.4, detection_engine='yolov5', device='', exist_ok=Fa…