
build(deps): bump huggingface-hub from 0.30.2 to 0.31.2 #189


Closed
dependabot[bot] wants to merge 1 commit from the dependabot/pip/huggingface-hub-0.31.2 branch.

Conversation

dependabot[bot] (Contributor) commented on behalf of GitHub on May 14, 2025

Bumps huggingface-hub from 0.30.2 to 0.31.2.

Release notes

Sourced from huggingface-hub's releases.

[v0.31.2] Hot-fix: make hf-xet optional again and bump the min version of the package

Patch release to make hf-xet optional. More context in #3079 and #3078.

Full Changelog: huggingface/huggingface_hub@v0.31.1...v0.31.2
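
Because hf-xet is optional again in this release, downstream code should not assume the accelerator is importable. A minimal sketch, assuming the optional PyPI package hf-xet installs an importable hf_xet module and that huggingface_hub falls back to regular HTTP downloads when it is missing:

# Check for the optional hf-xet accelerator without importing it outright.
# Assumption: the hf-xet package exposes a module named "hf_xet".
import importlib.util

if importlib.util.find_spec("hf_xet") is None:
    print("hf-xet not installed; falling back to standard downloads")
else:
    print("hf-xet available; Xet-backed downloads can be used")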

[v0.31.0] LoRAs with Inference Providers, auto mode for provider selection, embeddings models and more

🧑‍🎨 Introducing LoRAs with fal.ai and Replicate providers

We're introducing blazingly fast LoRA inference powered by fal.ai and Replicate through Hugging Face Inference Providers! You can use any compatible LoRA available on the Hugging Face Hub and get generations at lightning fast speed ⚡

from huggingface_hub import InferenceClient

client = InferenceClient(provider="fal-ai")  # or provider="replicate"

# output is a PIL.Image object
image = client.text_to_image(
    "a boy and a girl looking out of a window with a cat perched on the window sill. There is a bicycle parked in front of them and a plant with flowers to the right side of the image. The wall behind them is visible in the background.",
    model="openfree/flux-chatgpt-ghibli-lora",
)
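
Since the call returns a PIL.Image object, the result can be saved or previewed directly. A short usage sketch (the output filename below is illustrative):

# Persist and preview the generated image using standard PIL.Image methods.
image.save("lora_output.png")
image.show()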

⚙️ auto mode for provider selection

You can now automatically select a provider for a model using auto mode — it will pick the first available provider based on your preferred order set in https://hf.co/settings/inference-providers.

from huggingface_hub import InferenceClient

# will select the first provider available for the model, sorted by your order.
client = InferenceClient(provider="auto")

completion = client.chat.completions.create(
    model="Qwen/Qwen3-235B-A22B",
    messages=[
        {
            "role": "user",
            "content": "What is the capital of France?"
        }
    ],
)

print(completion.choices[0].message)

⚠️ Note: This is now the default value for the provider argument. Previously, the default was hf-inference, so this change may be a breaking one if you're not specifying the provider name when initializing InferenceClient or AsyncInferenceClient.
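
If you relied on the previous default, you can keep the old behavior by passing the provider explicitly. A minimal sketch using the pre-0.31.0 default named in the note above:

from huggingface_hub import InferenceClient

# Explicitly select the Hugging Face Inference API (the pre-0.31.0 default)
# instead of letting "auto" pick the first provider from your preference order.
client = InferenceClient(provider="hf-inference")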

... (truncated)

Commits

Dependabot compatibility score

Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting @dependabot rebase.


Dependabot commands and options

You can trigger Dependabot actions by commenting on this PR:

  • @dependabot rebase will rebase this PR
  • @dependabot recreate will recreate this PR, overwriting any edits that have been made to it
  • @dependabot merge will merge this PR after your CI passes on it
  • @dependabot squash and merge will squash and merge this PR after your CI passes on it
  • @dependabot cancel merge will cancel a previously requested merge and block automerging
  • @dependabot reopen will reopen this PR if it is closed
  • @dependabot close will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually
  • @dependabot show <dependency name> ignore conditions will show all of the ignore conditions of the specified dependency
  • @dependabot ignore this major version will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)
  • @dependabot ignore this minor version will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)
  • @dependabot ignore this dependency will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)

Bumps [huggingface-hub](https://github.com/huggingface/huggingface_hub) from 0.30.2 to 0.31.2.
- [Release notes](https://github.com/huggingface/huggingface_hub/releases)
- [Commits](huggingface/huggingface_hub@v0.30.2...v0.31.2)

---
updated-dependencies:
- dependency-name: huggingface-hub
  dependency-version: 0.31.2
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
dependabot[bot] added the dependencies (Pull requests that update a dependency file) and python (Pull requests that update Python code) labels on May 14, 2025
dependabot[bot] (Contributor, Author) commented on behalf of GitHub on May 19, 2025

Superseded by #193.

dependabot[bot] closed this on May 19, 2025
dependabot[bot] deleted the dependabot/pip/huggingface-hub-0.31.2 branch on May 19, 2025 at 19:53