roboflow / inference

A fast, easy-to-use, production-ready inference server for computer vision supporting deployment of many popular model architectures and fine-tuned models.
https://inference.roboflow.com

Discussion: How to structure LLM, LMM blocks regarding structured usage of outputs in other blocks #417

Open PawelPeczek-Roboflow opened 4 months ago

PawelPeczek-Roboflow commented 4 months ago

Question

Some LLMs/LMMs can produce output in a specific format - for instance, bounding box detections. We currently have the LMM block and the LMMForClassification block, the latter created solely to produce structured output compatible with other blocks. This approach does not scale; we should decide how to handle structured LLM/LMM outputs in general.
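One direction this could take: instead of a dedicated block per output type, a single parsing step could validate raw LMM text against an expected schema and emit a detections-like structure for downstream blocks. The sketch below is illustrative only - the `Detection` dataclass, the JSON schema, and the function name are assumptions, not the actual inference API.

```python
import json
from dataclasses import dataclass
from typing import List


# Hypothetical structured-output parser: turns raw LMM text into a
# detections-like structure that other workflow blocks could consume.
# The schema and names here are illustrative, not the inference API.
@dataclass
class Detection:
    class_name: str
    confidence: float
    x_min: float
    y_min: float
    x_max: float
    y_max: float


def parse_lmm_detections(raw_output: str) -> List[Detection]:
    """Parse an LMM's JSON response into structured detections.

    Expects a JSON list of objects shaped like:
    {"class": "dog", "confidence": 0.9, "bbox": [x_min, y_min, x_max, y_max]}
    Malformed entries are skipped rather than failing the whole block.
    """
    try:
        items = json.loads(raw_output)
    except json.JSONDecodeError:
        return []
    detections = []
    for item in items:
        try:
            x_min, y_min, x_max, y_max = item["bbox"]
            detections.append(Detection(
                class_name=str(item["class"]),
                confidence=float(item["confidence"]),
                x_min=float(x_min), y_min=float(y_min),
                x_max=float(x_max), y_max=float(y_max),
            ))
        except (KeyError, TypeError, ValueError):
            # Skip entries that don't match the expected schema.
            continue
    return detections
```

With such a parser, one generic LMM block plus a declared output schema could replace per-purpose blocks like LMMForClassification, at the cost of handling partially invalid model output (here: silently dropping bad entries).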

Initial idea:
