huggingface / optimum-habana

Easy and lightning fast training of 🤗 Transformers on Habana Gaudi processor (HPU)
Apache License 2.0

Pr1280 fix #1504

Open · Luca-Calabria opened 2 days ago

Luca-Calabria commented 2 days ago

What does this PR do?

Fixes the style issues from https://github.com/huggingface/optimum-habana/pull/1280. I can't modify that PR directly, but we can merge this fork with the fixes instead.

Luca-Calabria commented 2 days ago

@libinta @emascarenhas @nedo99 this is a PR to replace the original https://github.com/huggingface/optimum-habana/pull/1280 with the style fixes applied.

github-actions[bot] commented 1 day ago

The code quality check failed, please run make style.
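For reference, a minimal sketch of how the style check is usually addressed locally; the "[quality]" pip extra and the make quality target are assumptions to verify against the repository's setup.py and Makefile.

```sh
# Sketch of the local workflow to satisfy the code quality check.
# The "[quality]" extra and the "make quality" target are assumptions;
# check setup.py and the Makefile for the exact names.
pip install -e ".[quality]"   # install the formatting/linting tools (extra name assumed)
make style                    # auto-format the code base
make quality                  # verify nothing is left to fix (assumed target)
```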

HuggingFaceDocBuilderDev commented 1 day ago

The docs for this PR live here. All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update.

regisss commented 1 day ago

@Luca-Calabria It seems there is still an issue with the code format.

Can you also add a test here, adding the model output there, please?

Luca-Calabria commented 1 day ago

> @Luca-Calabria It seems there is still an issue with the code format.
>
> Can you also add a test here, adding the model output there, please?

@regisss I fixed the format issue and added a test case for Gemma2. I also rebased onto the main branch.
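A sketch of the rebase-and-test steps described above; the remote name "upstream", the test file path, and the keyword filter are assumptions and may differ from what was actually used.

```sh
# Rebase the branch onto the latest main (remote name "upstream" assumed)
git fetch upstream
git rebase upstream/main

# Run only the new Gemma2 test case locally (file path and -k filter assumed)
python -m pytest tests/test_text_generation_example.py -k "gemma2" -v
```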