-
Thanks for adding support to VLM.
I was using [this](https://github.com/stanfordnlp/dspy/blob/main/examples/vlm/mmmu.ipynb) notebook. I tried it with `Qwen2-VL-7B-Instruct` and `Llama-3.2-11B-Vision-…
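In case it helps others reproduce this, here is a minimal sketch of how one might wire dspy to an open-weight VLM served behind an OpenAI-compatible endpoint; the model name, port, and signature fields below are placeholders I chose, not the notebook's exact setup.

```python
import dspy

# Assumes an OpenAI-compatible server (e.g. vLLM) is already serving the model
# at http://localhost:8000/v1; model name and port are placeholders.
lm = dspy.LM(
    "openai/Qwen/Qwen2-VL-7B-Instruct",
    api_base="http://localhost:8000/v1",
    api_key="EMPTY",
)
dspy.configure(lm=lm)


class VQA(dspy.Signature):
    """Answer a question about an image."""

    image: dspy.Image = dspy.InputField()
    question: str = dspy.InputField()
    answer: str = dspy.OutputField()


predict = dspy.Predict(VQA)
result = predict(
    image=dspy.Image.from_url("https://example.com/sample.png"),  # placeholder URL
    question="What is shown in the image?",
)
print(result.answer)
```

Routing through an OpenAI-compatible server keeps the dspy code unchanged when swapping between hosted and local models.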
-
Thanks for adding VLM support to textgrad.
This doc describes how to use textgrad to do automatic prompt optimization for [`gpt-4o`](https://github.com/zou-group/textgrad/blob/main/examples/notebooks/Tutorial-Mul…
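To make the flow concrete, here is a minimal sketch of the optimization loop the tutorial builds on; the `MultimodalLLMCall` usage, role descriptions, and evaluation instruction are my assumptions based on the public quickstart, not a copy of the linked notebook. For prompt optimization, the trainable variable would be the system prompt rather than the response.

```python
import httpx
import textgrad as tg
from textgrad.autograd import MultimodalLLMCall

# gpt-4o serves both as the forward model and the backward (gradient) engine here.
tg.set_backward_engine("gpt-4o", override=True)

# Placeholder image URL; the Variable holds raw image bytes.
image_bytes = httpx.get("https://example.com/sample.png").content
image = tg.Variable(
    image_bytes,
    role_description="image to answer a question about",
    requires_grad=False,
)
question = tg.Variable(
    "What is unusual about this image?",
    role_description="question about the image",
    requires_grad=False,
)

# Forward pass: the response is the variable we refine.
response = MultimodalLLMCall("gpt-4o")([image, question])
response.set_role_description("concise answer to the question about the image")

# Textual "loss" and optimizer, PyTorch-style.
loss_fn = tg.TextLoss("Critique the answer: is it correct, specific, and concise?")
optimizer = tg.TGD(parameters=[response])

loss = loss_fn(response)  # textual critique
loss.backward()           # textual gradients
optimizer.step()          # revised response
print(response.value)
```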
-
Dear author and peers,
I have one important question: how do you use the constraints output by the VLM to generate the exact cost-calculation code? In the paper it is said that the …
-
Use a model server like [ollama](https://github.com/ollama/ollama) or [vllm](https://docs.vllm.ai/en/latest/getting_started/quickstart.html#offline-batched-inference); a minimal sketch follows below.
Two advantages:
1. it works …
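For what it's worth, the setup can be as small as the sketch below: serve the model with vLLM's OpenAI-compatible server and query it with the stock `openai` client (model name, port, and image URL are placeholders).

```python
# Start the server first, e.g.:
#   vllm serve Qwen/Qwen2-VL-7B-Instruct --port 8000
from openai import OpenAI

# Point the standard OpenAI client at the local endpoint.
client = OpenAI(base_url="http://localhost:8000/v1", api_key="EMPTY")

response = client.chat.completions.create(
    model="Qwen/Qwen2-VL-7B-Instruct",
    messages=[{
        "role": "user",
        "content": [
            {"type": "text", "text": "Describe this image."},
            # Placeholder image URL
            {"type": "image_url", "image_url": {"url": "https://example.com/sample.png"}},
        ],
    }],
)
print(response.choices[0].message.content)
```

Any library that speaks the OpenAI chat format can then reuse the same base URL.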
-
## Summary
### Acceptance Criteria:
```[tasklist]
- [ ] Officially on contract
- [ ] Implementation kick-off
```
## Dependencies
All implementation teams to deliver documentation
## Risk…
-
```
Epoch:[0/19](0/991) loss:10.057 lr:0.0000010 epoch_Time:46.0min:
…
-
### Describe the issue as clearly as possible:
I tried to run the Cookbook ["Receipt Data Extraction with VLMs"](https://github.com/dottxt-ai/outlines/blob/b55d31463cb6ed38fc0109e018f53ce0cdafbe19/…
-
Hi Daniel,
Thank you so much for releasing such an awesome VLM fine-tuning notebook to the public!
I was really excited, tried the notebook out, and ran into the following error:
![image](https:/…