hyperonym / basaran

Basaran is an open-source alternative to the OpenAI text completion API. It provides a compatible streaming API for your Hugging Face Transformers-based text generation models.
MIT License

build(deps): update huggingface-hub requirement from ~=0.14.1 to ~=0.15.1 #201

Closed: dependabot[bot] closed this pull request 1 year ago

dependabot[bot] commented 1 year ago

Updates the requirements on huggingface-hub to permit the latest version.
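
For context, ~=0.15.1 is a compatible-release specifier: it permits any 0.15.x release at or above 0.15.1 and excludes 0.16.0. A quick check with the packaging library, shown only to illustrate the pin's semantics (not part of this PR):

>>> from packaging.specifiers import SpecifierSet
>>> spec = SpecifierSet("~=0.15.1")  # the pin proposed by this PR
>>> "0.15.1" in spec
True
>>> "0.15.9" in spec
True
>>> "0.16.0" in spec
False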

Release notes

Sourced from huggingface-hub's releases.

v0.15.1: InferenceClient and background uploads!

InferenceClient

We introduce InferenceClient, a new client to run inference on the Hub. The objective is to:

  • support both InferenceAPI and Inference Endpoints services in a single client.
  • offer a nice interface with:
    • 1 method per task (e.g. summary = client.summarization("this is a long text"))
    • 1 default model per task (i.e. easy to prototype)
    • explicit and documented parameters
    • convenient binary inputs (from url, path, file-like object,...)
  • be flexible and support custom requests if needed

Check out the Inference guide to get a complete overview.

>>> from huggingface_hub import InferenceClient
>>> client = InferenceClient()

>>> image = client.text_to_image("An astronaut riding a horse on the moon.")
>>> image.save("astronaut.png")

>>> client.image_classification("https://upload.wikimedia.org/wikipedia/commons/thumb/4/43/Cute_dog.jpg/320px-Cute_dog.jpg")
[{'score': 0.9779096841812134, 'label': 'Blenheim spaniel'}, ...]
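
As a further illustration of the per-task methods and explicit parameters described above, a minimal sketch (the model id and input text are illustrative assumptions; any summarization model on the Hub would work):

>>> from huggingface_hub import InferenceClient

>>> # Pin an explicit model instead of relying on the task default (model id is an assumption)
>>> client = InferenceClient(model="facebook/bart-large-cnn")
>>> summary = client.summarization("Basaran provides an OpenAI-compatible streaming API for text generation models served with Hugging Face Transformers.")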

The short-term goal is to add support for more tasks (here is the current list), especially text-generation, and to handle asyncio calls. The mid-term goal is to deprecate and replace InferenceAPI.

Non-blocking uploads

It is now possible to run HfApi calls in the background! The goal is to make it easier to upload files periodically without blocking the main thread during training. This was previously possible when using Repository but is now available for HTTP-based methods like upload_file, upload_folder and create_commit. If run_as_future=True is passed:

  • the job is queued in a background thread. Only 1 worker is spawned to ensure no race condition. The goal is NOT to speed up a process by parallelizing concurrent calls to the Hub.
  • a Future object is returned to check the job status
  • the main thread is not interrupted, even if an exception occurs during the upload

In addition to this parameter, a run_as_future(...) method is available to queue any other calls to the Hub. More details in this guide.

>>> from huggingface_hub import HfApi

>>> api = HfApi()
>>> api.upload_file(...)  # takes Xs
# URL to upload file

>>> future = api.upload_file(..., run_as_future=True)  # instant
>>> future.result()  # wait until complete
# URL to upload file
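
The run_as_future(...) method mentioned above can queue any other HfApi call on the same background worker; a minimal sketch (requires a valid token, shown only as an illustration):

>>> future = api.run_as_future(api.whoami)  # returns a concurrent.futures.Future immediately
>>> future.result()  # blocks only when the result is actually needed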

... (truncated)

Commits


Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`.


Dependabot commands and options
You can trigger Dependabot actions by commenting on this PR:

  • `@dependabot rebase` will rebase this PR
  • `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it
  • `@dependabot merge` will merge this PR after your CI passes on it
  • `@dependabot squash and merge` will squash and merge this PR after your CI passes on it
  • `@dependabot cancel merge` will cancel a previously requested merge and block automerging
  • `@dependabot reopen` will reopen this PR if it is closed
  • `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually
  • `@dependabot ignore this major version` will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)
  • `@dependabot ignore this minor version` will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)
  • `@dependabot ignore this dependency` will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)