dependabot[bot] closed this 1 month ago
OK, I won't notify you again about this release, but will get in touch when a new version is available. If you'd rather skip all updates until the next major or minor version, let me know by commenting `@dependabot ignore this major version` or `@dependabot ignore this minor version`. You can also ignore all major, minor, or patch releases for a dependency by adding an `ignore` condition with the desired `update_types` to your config file.
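The `ignore` condition mentioned above goes in `.github/dependabot.yml`. A minimal sketch, assuming a pip ecosystem scanning the repository root with a weekly schedule (the directory and schedule here are placeholder assumptions, not taken from this repository):

```yaml
# .github/dependabot.yml -- sketch only; directory and schedule are assumptions
version: 2
updates:
  - package-ecosystem: "pip"
    directory: "/"
    schedule:
      interval: "weekly"
    ignore:
      # Skip patch releases of transformers; other documented update-types
      # are "version-update:semver-major" and "version-update:semver-minor".
      - dependency-name: "transformers"
        update-types: ["version-update:semver-patch"]
```

With this in place, Dependabot would still open PRs for minor and major releases of `transformers` but stay silent on patch bumps.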
If you change your mind, just re-open this PR and I'll resolve any conflicts on it.
Bumps transformers from 4.39.3 to 4.42.4.
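In a project that pins the dependency in `requirements.txt` (this PR's actual manifest file isn't shown here, so the filename is an assumption), the bump amounts to a one-line change:

```diff
# hypothetical requirements.txt diff illustrating the version bump
-transformers==4.39.3
+transformers==4.42.4
```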
Release notes
Sourced from transformers's releases.
... (truncated)
Commits
- fc35907 v4.42.4
- e002fcd [Gemma2] Support FA2 softcapping (#31887)
- 2e43416 [ConvertSlow] make sure the order is preserved for addedtokens (#31902)
- c43fd9d Fixes to alternating SWA layers in Gemma2 (#31775)
- 0be998b Requires for torch.tensor before casting (#31755)
- b7ee1e8 v4.42.3
- da50b41 Gemma capping is a must for big models (#31698)
- 086c74e v4.42.2
- 8691867 Fix Gemma2 4d attention mask (#31674)
- 7edc993 don't zero out the attention_mask when using sliding window with flash attent...

Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`.

Dependabot commands and options
You can trigger Dependabot actions by commenting on this PR:

- `@dependabot rebase` will rebase this PR
- `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it
- `@dependabot merge` will merge this PR after your CI passes on it
- `@dependabot squash and merge` will squash and merge this PR after your CI passes on it
- `@dependabot cancel merge` will cancel a previously requested merge and block automerging
- `@dependabot reopen` will reopen this PR if it is closed
- `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually
- `@dependabot show