allenai / cached_path

A file utility for accessing both local and remote files through a unified interface.
https://cached-path.readthedocs.io/
Apache License 2.0

Update huggingface-hub requirement from <0.17.0,>=0.8.1 to >=0.8.1,<0.18.0 #193

Closed: dependabot[bot] closed this pull request 1 year ago

dependabot[bot] commented 1 year ago

Updates the requirements on huggingface-hub to permit the latest version.
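The change widens the upper bound from `<0.17.0` to `<0.18.0`, which is exactly what lets the new v0.17.0 release satisfy the constraint. A minimal pure-Python sketch of that comparison (real resolvers use full PEP 440 semantics via `packaging.specifiers`; this simplification only handles plain `X.Y.Z` versions):

```python
# Why the widened specifier admits v0.17.0: the lower bound is
# inclusive (>=) and the upper bound is exclusive (<), so 0.17.0
# fails the old "<0.17.0" bound but passes the new "<0.18.0" one.

def parse(version):
    """Turn 'X.Y.Z' into a tuple of ints so versions compare correctly."""
    return tuple(int(part) for part in version.split("."))

def satisfies(version, lower, upper):
    """True if lower <= version < upper."""
    return parse(lower) <= parse(version) < parse(upper)

# Old constraint >=0.8.1,<0.17.0 rejects the new release:
print(satisfies("0.17.0", "0.8.1", "0.17.0"))  # False
# New constraint >=0.8.1,<0.18.0 admits it:
print(satisfies("0.17.0", "0.8.1", "0.18.0"))  # True
```

Note that naive string comparison would get this wrong (`"0.8.1" > "0.17.0"` lexicographically), which is why the sketch compares integer tuples.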

Release notes

Sourced from huggingface-hub's releases.

v0.17.0: Inference, CLI and Space API

InferenceClient

All tasks are now supported! :boom:

Thanks to a massive community effort, all inference tasks are now supported in InferenceClient. Newly added tasks are:

Documentation, including examples, for each of these tasks can be found in this table.

All those methods also support async mode using AsyncInferenceClient.
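The practical payoff of the async variant is issuing several inference calls concurrently. The sketch below only illustrates that pattern: `FakeClient` and its `text_classification` method are offline stand-ins for `AsyncInferenceClient` (which needs network access), not the real API.

```python
import asyncio

# Sketch of the async usage pattern. FakeClient stands in for
# huggingface_hub.AsyncInferenceClient so the example runs offline;
# only the concurrency shape (awaiting gathered calls) is the point.

class FakeClient:
    async def text_classification(self, text):
        await asyncio.sleep(0)  # pretend network latency
        return {"label": "POSITIVE" if "good" in text else "NEGATIVE"}

async def classify_all(texts):
    client = FakeClient()
    # Fire all requests concurrently instead of awaiting them one by one.
    return await asyncio.gather(
        *(client.text_classification(t) for t in texts)
    )

results = asyncio.run(classify_all(["good movie", "bad movie"]))
print(results)  # [{'label': 'POSITIVE'}, {'label': 'NEGATIVE'}]
```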

Get InferenceAPI status

It can be useful to know which models are (or are not) currently available on the Inference API service. This release introduces two new helpers:

  1. list_deployed_models aims to help users discover which models are currently deployed, listed by task.
  2. get_model_status aims to get the status of a specific model. That's useful if you already know which model you want to use.

Those two helpers are only available for the Inference API, not Inference Endpoints (or any other provider).

>>> from huggingface_hub import InferenceClient
>>> client = InferenceClient()

Discover zero-shot-classification models currently deployed

>>> models = client.list_deployed_models()
>>> models["zero-shot-classification"]
['Narsil/deberta-large-mnli-zero-cls', 'facebook/bart-large-mnli', ...]

Get status for a specific model

>>> client.get_model_status("bigcode/starcoder")
ModelStatus(loaded=True, state='Loaded', compute_type='gpu', framework='text-generation-inference')
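The returned status object exposes fields you can branch on, for example to call only models that are already warm. A small offline sketch, using a namedtuple to stand in for the real `ModelStatus` (the field names mirror the repr shown above):

```python
from collections import namedtuple

# Stand-in for huggingface_hub's ModelStatus so the sketch runs offline.
ModelStatus = namedtuple("ModelStatus", "loaded state compute_type framework")

def pick_ready(statuses):
    """Return the model ids whose status reports the model as loaded."""
    return [model_id for model_id, s in statuses.items() if s.loaded]

statuses = {
    "bigcode/starcoder": ModelStatus(
        True, "Loaded", "gpu", "text-generation-inference"
    ),
    # Hypothetical model id, for illustration only:
    "some-org/cold-model": ModelStatus(False, "Loadable", "cpu", "transformers"),
}
print(pick_ready(statuses))  # ['bigcode/starcoder']
```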

... (truncated)

Commits
  • 7d82cd9 Release: v0.17.0
  • b28a7af Merget branch 'main' into v0.17-release
  • 8a803c8 Minor corrections to the inference guide (#1649)
  • 7bdec81 Release: v0.17.0.rc0
  • 3734d74 Add example for get_model_status
  • 681815c Add list_deployed_models to inference client (#1622)
  • 80558b3 Return whole response from feature extraction endpoint instead of assuming it...
  • a86cb6f update i18n template
  • 52f7e3a shorter is better
  • adb5193 Add documentation for modelcard Metadata. Resolves (#1448) (#1631)
  • Additional commits viewable in compare view


Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting @dependabot rebase.


Dependabot commands and options
You can trigger Dependabot actions by commenting on this PR:

  • `@dependabot rebase` will rebase this PR
  • `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it
  • `@dependabot merge` will merge this PR after your CI passes on it
  • `@dependabot squash and merge` will squash and merge this PR after your CI passes on it
  • `@dependabot cancel merge` will cancel a previously requested merge and block automerging
  • `@dependabot reopen` will reopen this PR if it is closed
  • `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually
  • `@dependabot show ignore conditions` will show all of the ignore conditions of the specified dependency
  • `@dependabot ignore this major version` will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)
  • `@dependabot ignore this minor version` will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)
  • `@dependabot ignore this dependency` will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)