huggingface / transformers

🤗 Transformers: State-of-the-art Machine Learning for Pytorch, TensorFlow, and JAX.
https://huggingface.co/transformers
Apache License 2.0

Fix paligemma detection inference #31587

Closed · molbap closed this 3 months ago

molbap commented 3 months ago

What does this PR do?

Fixes #31425

hf hub discussion: https://huggingface.co/google/paligemma-3b-mix-448/discussions/6
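For reference, a minimal detection-style inference sketch against the checkpoint from the linked hub discussion. The image URL, prompt, and decoding details below are illustrative assumptions, not the exact reproduction from #31425:

```python
# Minimal PaliGemma detection-style inference sketch (illustrative; the image URL
# and "detect" prompt are placeholders, not the exact repro from #31425).
import requests
import torch
from PIL import Image
from transformers import AutoProcessor, PaliGemmaForConditionalGeneration

model_id = "google/paligemma-3b-mix-448"  # checkpoint from the linked hub discussion
processor = AutoProcessor.from_pretrained(model_id)
model = PaliGemmaForConditionalGeneration.from_pretrained(model_id).eval()

url = "https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/pipeline-cat-chonk.jpeg"
image = Image.open(requests.get(url, stream=True).raw)

# On the -mix checkpoints, a "detect <object>" prompt is expected to return
# <loc....> tokens describing bounding boxes, followed by the label.
inputs = processor(text="detect cat", images=image, return_tensors="pt").to(model.device)
input_len = inputs["input_ids"].shape[-1]

with torch.no_grad():
    generated = model.generate(**inputs, max_new_tokens=32, do_sample=False)

print(processor.decode(generated[0][input_len:], skip_special_tokens=True))
```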

Who can review?

@pcuenca @ArthurZucker

molbap commented 3 months ago

cc @zucchini-nlp, a generation bugfix you might want to look at

HuggingFaceDocBuilderDev commented 3 months ago

The docs for this PR live here. All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update.

edmondja commented 2 months ago

Is this fix already integrated into the latest release (https://github.com/huggingface/transformers/releases)? I am not sure the release notes are supposed to be exhaustive @ArthurZucker (PS: thanks for representing the MVA)

ArthurZucker commented 2 months ago

Hey!

[screenshot]

This has been in the last 2 patch releases! The release notes for 4.42.0 should have it!
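A quick local check, assuming (per the comments in this thread, not independently verified against the changelog) that the fix is included from v4.42.0 onward:

```python
# Sanity-check the locally installed transformers version against the 4.42.0
# threshold mentioned above (assumed cutoff, taken from this thread).
import transformers
from packaging import version  # packaging ships as a transformers dependency

installed = version.parse(transformers.__version__)
print(f"installed transformers: {installed}")
print("should include the PaliGemma detection fix:", installed >= version.parse("4.42.0"))
```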

ArthurZucker commented 2 months ago

😉 Nice to run into an alumnus!

edmondja commented 2 months ago

> Hey! This has been in the last 2 patch releases! The release notes for 4.42.0 should have it!

I really don't see it: there is no mention of "paligemma", "slow", or "extended" on the page I sent for "v4.42.0: Gemma 2, RTDETR, InstructBLIP, LLAVa Next, New Model Adder", but maybe I'm just blind. (Yes, it is nice, especially when you see the success of some people like you.)

So to conclude, maybe I just don't see it, and if that's the case please accept my apologies for wasting your time. Either way, thank you for confirming that the fix was added. All the best.

ArthurZucker commented 2 months ago

It is at the bottom:

[screenshot of the bottom of the release notes]

and absolutely no worries, we know this was quite important for everyone! 🤗