hangyav / textLSP

Language server for text spell and grammar check with various tools.
GNU General Public License v3.0

build(deps): bump the python-packages group with 4 updates #40

Closed. dependabot[bot] closed this pull request 3 weeks ago.

dependabot[bot] commented 1 month ago

Bumps the python-packages group with 4 updates: language-tool-python, openai, ollama and transformers.

Updates language-tool-python from 2.8 to 2.8.1

Commits


Updates openai from 1.37.1 to 1.43.0

Release notes

Sourced from openai's releases.

v1.43.0

1.43.0 (2024-08-29)

Full Changelog: v1.42.0...v1.43.0

Features

  • api: add file search result details to run steps (#1681) (f5449c0)
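
A minimal sketch of what this feature enables, assuming the Assistants API beta client; the IDs are placeholders and the exact include path is an assumption here, not something stated in the quoted notes:

# hedged sketch: request file search result contents when retrieving a run step
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

step = client.beta.threads.runs.steps.retrieve(
    step_id="step_abc123",       # placeholder IDs
    thread_id="thread_abc123",
    run_id="run_abc123",
    include=["step_details.tool_calls[*].file_search.results[*].content"],  # assumed include path
)
print(step.step_details)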

v1.42.0

1.42.0 (2024-08-20)

Full Changelog: v1.41.1...v1.42.0

Features

  • parsing: add support for pydantic dataclasses (#1655) (101bee9)
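
A rough sketch of what this adds, assuming the client's structured-output parse helper and an API key in the environment; the model name and prompt are illustrative, not from the quoted notes:

# hedged sketch: a pydantic dataclass used as the response format for parsing
from pydantic.dataclasses import dataclass
from openai import OpenAI


@dataclass
class CalendarEvent:
    name: str
    date: str
    participants: list[str]


client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment
completion = client.beta.chat.completions.parse(
    model="gpt-4o-2024-08-06",  # illustrative model name
    messages=[{"role": "user", "content": "Alice and Bob meet on Friday."}],
    response_format=CalendarEvent,
)
print(completion.choices[0].message.parsed)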

Chores

v1.41.1

1.41.1 (2024-08-19)

Full Changelog: v1.41.0...v1.41.1

Bug Fixes

Chores

  • client: fix parsing union responses when non-json is returned (#1665) (822c37d)

v1.41.0

1.41.0 (2024-08-16)

Full Changelog: v1.40.8...v1.41.0

Features

  • client: add uploads.upload_file helper (aae079d)

v1.40.8

1.40.8 (2024-08-15)

Full Changelog: v1.40.7...v1.40.8

... (truncated)

Changelog

Sourced from openai's changelog.

1.43.0 (2024-08-29)

Full Changelog: v1.42.0...v1.43.0

Features

  • api: add file search result details to run steps (#1681) (f5449c0)

1.42.0 (2024-08-20)

Full Changelog: v1.41.1...v1.42.0

Features

  • parsing: add support for pydantic dataclasses (#1655) (101bee9)

Chores

1.41.1 (2024-08-19)

Full Changelog: v1.41.0...v1.41.1

Bug Fixes

Chores

  • client: fix parsing union responses when non-json is returned (#1665) (822c37d)

1.41.0 (2024-08-16)

Full Changelog: v1.40.8...v1.41.0

Features

  • client: add uploads.upload_file helper (aae079d)

1.40.8 (2024-08-15)

Full Changelog: v1.40.7...v1.40.8

Chores

... (truncated)

Commits


Updates ollama from 0.3.1 to 0.3.2

Release notes

Sourced from ollama's releases.

v0.3.2

What's Changed

New Contributors

Full Changelog: https://github.com/ollama/ollama-python/compare/v0.3.1...v0.3.2

Commits
  • d98f646 IPv6 support (#262)
  • 981015c Merge pull request #261 from ollama/dependabot/pip/ruff-0.6.2
  • 9c34d81 Bump ruff from 0.5.5 to 0.6.2
  • 9f2832d Merge pull request #260 from ollama/dependabot/pip/pytest-asyncio-0.24.0
  • e220e46 Merge pull request #252 from ollama/dependabot/pip/pytest-httpserver-1.1.0
  • dfdeb7c Add URL path to client URL in Client._parse_host() (#170)
  • 9e6726e Bump pytest-asyncio from 0.23.8 to 0.24.0
  • 10d0ff2 Bump pytest-httpserver from 1.0.12 to 1.1.0
  • See full diff in compare view
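
Relating to the IPv6 (#262) and URL-path (#170) commits above, a minimal usage sketch, assuming a local Ollama server and an already-pulled model (host addresses and the model name are illustrative):

from ollama import Client

client = Client(host="http://[::1]:11434")  # IPv6 loopback host, per #262
# client = Client(host="http://example.com:11434/ollama")  # host with a URL path, kept per #170

response = client.chat(
    model="llama3",  # illustrative model name
    messages=[{"role": "user", "content": "Say hello."}],
)
print(response["message"]["content"])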


Updates transformers from 4.43.3 to 4.44.2

Release notes

Sourced from transformers's releases.

Release v4.44.2

Patch release v4.44.2, mostly 2 regressions that were not caught for Jamba and for processors!

Patch release v4.44.1

Here are the different fixes, mostly Gemma2 context length, nits here and there, and generation issues

Full Changelog: https://github.com/huggingface/transformers/compare/v4.44.0...v4.44.1

Release v4.44.0: End to end compile generation!!! Gemma2 (with assisted decoding), Codestral (Mistral for code), Nemotron, Efficient SFT training, CPU Offloaded KVCache, torch export for static cache

This release comes a bit early in our cycle because we wanted to ship important and requested models along with improved performance for everyone!

All of these are included with examples in the awesome https://github.com/huggingface/local-gemma repository! 🎈 We tried to share examples of what is now possible with all the shipped features! Kudos to @gante, @sanchit-gandhi and @xenova

💥 End-to-end generation compile

Generate: end-to-end compilation #30788 by @gante: model.generate now supports compiling! There are a few limitations, but here is a small snippet:

from transformers import AutoModelForCausalLM, AutoTokenizer
import torch
import copy

model = AutoModelForCausalLM.from_pretrained(
    "meta-llama/Meta-Llama-3.1-8B", torch_dtype=torch.bfloat16, device_map="auto"
)
tokenizer = AutoTokenizer.from_pretrained("meta-llama/Meta-Llama-3.1-8B")

# compile generate
compiled_generate = torch.compile(model.generate, fullgraph=True, mode="reduce-overhead")

# compiled generate does NOT accept parameterization except a) model inputs b) a generation config
generation_config = copy.deepcopy(model.generation_config)
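
The quoted snippet is cut off at this point. As a rough usage sketch (not part of the quoted notes; the prompt and pad-token handling are assumptions), the compiled function would then be called with tokenized inputs plus the generation config, matching the limitation noted above:

# usage sketch: only model inputs and a generation config are passed
generation_config.pad_token_id = tokenizer.eos_token_id  # assumption: model has no pad token
model_inputs = tokenizer(["Write a haiku about compilers."], return_tensors="pt").to(model.device)
output_ids = compiled_generate(**model_inputs, generation_config=generation_config)
print(tokenizer.batch_decode(output_ids, skip_special_tokens=True)[0])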

... (truncated)

Commits


Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting @dependabot rebase.


Dependabot commands and options
You can trigger Dependabot actions by commenting on this PR:

  • `@dependabot rebase` will rebase this PR
  • `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it
  • `@dependabot merge` will merge this PR after your CI passes on it
  • `@dependabot squash and merge` will squash and merge this PR after your CI passes on it
  • `@dependabot cancel merge` will cancel a previously requested merge and block automerging
  • `@dependabot reopen` will reopen this PR if it is closed
  • `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually
  • `@dependabot show <dependency name> ignore conditions` will show all of the ignore conditions of the specified dependency
  • `@dependabot ignore <dependency name> major version` will close this group update PR and stop Dependabot creating any more for the specific dependency's major version (unless you unignore this specific dependency's major version or upgrade to it yourself)
  • `@dependabot ignore <dependency name> minor version` will close this group update PR and stop Dependabot creating any more for the specific dependency's minor version (unless you unignore this specific dependency's minor version or upgrade to it yourself)
  • `@dependabot ignore <dependency name>` will close this group update PR and stop Dependabot creating any more for the specific dependency (unless you unignore this specific dependency or upgrade to it yourself)
  • `@dependabot unignore <dependency name>` will remove all of the ignore conditions of the specified dependency
  • `@dependabot unignore <dependency name> <ignore condition>` will remove the ignore condition of the specified dependency and ignore conditions