huggingface / transformers

🤗 Transformers: State-of-the-art Machine Learning for Pytorch, TensorFlow, and JAX.
https://huggingface.co/transformers
Apache License 2.0

enable low-precision pipeline #31625

Closed: jiqing-feng closed this 1 week ago

jiqing-feng commented 3 months ago

Hi @amyeroberts @Narsil.

As mentioned in the previous PR #31444, this enables low-precision pipelines by converting the outputs to float(). I followed the code here. Do you mind taking a review? Thx!
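As a rough illustration of the scenario this PR targets (the checkpoint name below is just an example, not one from the thread), a pipeline loaded in bfloat16 should post-process cleanly end to end:

```python
import torch
from transformers import pipeline

# Load a pipeline with half-precision weights; before this fix, NumPy-based
# post-processing could fail on bf16 tensors (see the discussion below).
pipe = pipeline(
    "text-classification",
    model="distilbert-base-uncased-finetuned-sst-2-english",
    torch_dtype=torch.bfloat16,
)
print(pipe("Low-precision pipelines are great!"))
```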

aliencaocao commented 3 months ago

Related to https://github.com/huggingface/transformers/pull/31342, but I don't quite get your changes: what exactly do they fix? When I tested all the pipelines in fp16, none of them had issues outputting logits.

jiqing-feng commented 2 months ago

> Related to #31342, but I don't quite get your changes: what exactly do they fix? When I tested all the pipelines in fp16, none of them had issues outputting logits.

Hi @aliencaocao, FP16 works fine, but BF16 is not supported by NumPy, as you can see: [screenshot]
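For context, here is a minimal reproduction of the NumPy limitation described above (the exact error text may vary across PyTorch versions):

```python
import torch

t = torch.randn(2, 3, dtype=torch.bfloat16)

# NumPy has no bfloat16 dtype, so converting directly fails:
#   t.numpy()  # raises TypeError: Got unsupported ScalarType BFloat16

# Upcasting to float32 first, as this PR does, works fine:
arr = t.float().numpy()
print(arr.dtype)  # float32
```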

jiqing-feng commented 2 months ago

Hi @SunMarc. I have addressed all your comments; please take another look. BTW, the failing tests are due to a connection error, not related to my changes.

jiqing-feng commented 2 months ago

Hi @amyeroberts, would you please review this PR? Thx!

HuggingFaceDocBuilderDev commented 2 months ago

The docs for this PR live here. All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update.

jiqing-feng commented 2 months ago

Hi @amyeroberts. All tests pass now; do you mind taking a review and merging it? Thx!

jiqing-feng commented 2 months ago

Hi @amyeroberts. This PR should be ready to merge; please take a look, thx!

amyeroberts commented 2 months ago

Hi @jiqing-feng, thanks for opening this PR! I'll get to reviewing it soon, but it will likely be in a few days.

jiqing-feng commented 2 months ago

Hi @amyeroberts. Do you mind reviewing this PR? Thx.

jiqing-feng commented 1 month ago

Hi @amyeroberts. Do you think it could be merged?

yao-matrix commented 1 month ago

@amyeroberts, could you help review this PR? Thx.

yao-matrix commented 2 weeks ago

@SunMarc, do you have a suggestion for anyone else from HF who could help review and merge this PR? It seems @amyeroberts has had no bandwidth for this over the past several months. Thx.

LysandreJik commented 1 week ago

Sorry for the delay @yao-matrix!

@Rocketknight1 is managing the pipeline; Matt, would you mind reviewing this PR when you have a second?

jiqing-feng commented 1 week ago

> Overall this PR seems good to me! However, I prefer `if outputs.dtype in (torch.bfloat16, torch.float16)` rather than `if outputs.dtype == torch.bfloat16`, so we can catch float16 as well.
>
> Other than that, I'm happy with it!

Hi @Rocketknight1, thanks for your review; I have addressed your comments.
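For illustration, a minimal sketch of the guard the review suggests (the helper name and surrounding code are hypothetical, not the PR's actual diff):

```python
import torch

def tensor_to_numpy(outputs: torch.Tensor):
    # Catch both half-precision dtypes: NumPy cannot represent bfloat16 at
    # all, and upcasting float16 to float32 is also safe for post-processing.
    if outputs.dtype in (torch.bfloat16, torch.float16):
        outputs = outputs.float()
    return outputs.numpy()
```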

jiqing-feng commented 1 week ago

Hi @Rocketknight1, do you mind re-running the tests? Thx.

Rocketknight1 commented 1 week ago

Hi @jiqing-feng, another PR at #33554 touched the same files. I'm sorry - I didn't realize that it was doing the same thing as this PR! That PR has been merged, so I've merged its code into this branch to avoid conflicts.

Rocketknight1 commented 1 week ago

@jiqing-feng tests pass now and this looks good - are you okay for us to merge it?

Also cc @LysandreJik for final review, but no rush!

jiqing-feng commented 1 week ago

> @jiqing-feng tests pass now and this looks good - are you okay for us to merge it?

Yes, please.