This PR contains the following updates:

| Package | Change |
| --- | --- |
| transformers | `==4.46.0` -> `==4.46.1` |

### Release Notes

huggingface/transformers (transformers)
### [`v4.46.1`](https://redirect.github.com/huggingface/transformers/releases/tag/v4.46.1): Patch release v4.46.1
[Compare Source](https://redirect.github.com/huggingface/transformers/compare/v4.46.0...v4.46.1)
##### Patch release v4.46.1
This is mostly for `fx` and `onnx` issues!
- Fix regression loading dtype [#34409](https://redirect.github.com/huggingface/transformers/issues/34409) by [@SunMarc](https://redirect.github.com/SunMarc)
- LLaVa: latency issues [#34460](https://redirect.github.com/huggingface/transformers/issues/34460) by [@zucchini-nlp](https://redirect.github.com/zucchini-nlp)
- Fix pix2struct [#34374](https://redirect.github.com/huggingface/transformers/issues/34374) by [@IlyasMoutawwakil](https://redirect.github.com/IlyasMoutawwakil)
- Fix onnx non-exposable inplace aten op [#34376](https://redirect.github.com/huggingface/transformers/issues/34376) by [@IlyasMoutawwakil](https://redirect.github.com/IlyasMoutawwakil)
- Fix torch.fx issue related to the new `loss_kwargs` keyword argument [#34380](https://redirect.github.com/huggingface/transformers/issues/34380) by [@michaelbenayoun](https://redirect.github.com/michaelbenayoun)
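
As a minimal sketch (not part of the upstream release notes), two of the regressions listed above can be spot-checked after upgrading: the dtype-loading fix ([#34409](https://redirect.github.com/huggingface/transformers/issues/34409)) and the `torch.fx` fix ([#34380](https://redirect.github.com/huggingface/transformers/issues/34380)). The model id below is a placeholder chosen for illustration, and the snippet only relies on the public `from_pretrained` and `symbolic_trace` APIs:

```python
# Hedged spot-check for two of the regressions fixed in v4.46.1.
# The model id is a placeholder/assumption; any small causal-LM checkpoint works.
import torch
from transformers import AutoModelForCausalLM
from transformers.utils.fx import symbolic_trace

model_id = "hf-internal-testing/tiny-random-LlamaForCausalLM"  # placeholder

# #34409: torch_dtype="auto" should load the dtype stored with the checkpoint
# instead of silently falling back to float32.
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype="auto")
print("loaded dtype:", model.dtype)

# An explicit dtype request should also be honoured.
model_fp16 = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype=torch.float16)
assert model_fp16.dtype == torch.float16

# #34380: symbolic tracing should succeed again now that the new
# `loss_kwargs` keyword argument is handled by the fx tracer.
traced = symbolic_trace(model, input_names=["input_ids", "attention_mask"])
print(type(traced))
```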
### Configuration
📅 Schedule: Branch creation - At any time (no schedule defined), Automerge - At any time (no schedule defined).
🚦 Automerge: Enabled.
♻ Rebasing: Whenever PR is behind base branch, or you tick the rebase/retry checkbox.
🔕 Ignore: Close this PR and you won't be reminded about this update again.
- [ ] If you want to rebase/retry this PR, check this box
This PR was generated by Mend Renovate. View the repository job log.