Package Details: python-transformers 4.57.1-1
| Git Clone URL: | https://aur.archlinux.org/python-transformers.git (read-only) |
|---|---|
| Package Base: | python-transformers |
| Description: | State-of-the-art Natural Language Processing for Jax, PyTorch and TensorFlow |
| Upstream URL: | https://github.com/huggingface/transformers |
| Keywords: | huggingface transformers |
| Licenses: | Apache-2.0 |
| Submitter: | filipg |
| Maintainer: | daskol |
| Last Packager: | daskol |
| Votes: | 16 |
| Popularity: | 0.81 |
| First Submitted: | 2021-10-23 09:30 (UTC) |
| Last Updated: | 2025-11-05 20:02 (UTC) |
Dependencies (23)
- python-filelock
- python-huggingface-hub (python-huggingface-hub-git (AUR))
- python-numpy (python-numpy-git (AUR), python-numpy1 (AUR), python-numpy-mkl-bin (AUR), python-numpy-mkl-tbb (AUR), python-numpy-mkl (AUR))
- python-packaging
- python-regex (python-regex-git (AUR))
- python-requests
- python-safetensors (AUR) (python-safetensors-bin (AUR))
- python-tokenizers (AUR)
- python-tqdm
- python-yaml (python-yaml-git (AUR))
- python-build (make)
- python-installer (make)
- python-setuptools (make)
- python-wheel (make)
- python-bitsandbytes (AUR) (python-bitsandbytes-rocm-git (AUR), python-bitsandbytes-git (AUR), python-bitsandbytes (AUR)) (optional) – 8-bit support for PyTorch
- python-flax (AUR) (optional) – JAX support
- python-hf-xet (optional) – xethub support
- python-keras (python-keras-git (AUR)) (optional) – Support for models in Keras 3
- python-onnxconverter-common (AUR) (optional) – TensorFlow support
- python-pytorch (python-pytorch-cxx11abi (AUR), python-pytorch-cxx11abi-opt (AUR), python-pytorch-cxx11abi-cuda (AUR), python-pytorch-cxx11abi-opt-cuda (AUR), python-pytorch-cxx11abi-rocm (AUR), python-pytorch-cxx11abi-opt-rocm (AUR), python-pytorch-cuda12.9 (AUR), python-pytorch-opt-cuda12.9 (AUR), python-pytorch-cuda, python-pytorch-opt, python-pytorch-opt-cuda, python-pytorch-opt-rocm, python-pytorch-rocm) (optional) – PyTorch support
Required by (60)
- coqui-tts (optional)
- dsnote (optional)
- dsnote-git (optional)
- ik-llama.cpp (optional)
- ik-llama.cpp-cuda (optional)
- ik-llama.cpp-vulkan (optional)
- llama.cpp (optional)
- llama.cpp-cuda (optional)
- llama.cpp-hip (optional)
- llama.cpp-vulkan (optional)
- localai-git
- localai-git-cuda
- localai-git-rocm
- manga-ocr-git
- mokuro
- monailabel (optional)
- open-webui-no-venv
- pix2tex
- python-accelerate (check)
- python-aqlm
Latest Comments
shayaknyc commented on 2025-11-17 17:31 (UTC)
@F1729 - thank you so much! This helped! Appreciate you!
F1729 commented on 2025-11-17 17:25 (UTC) (edited on 2025-11-17 17:25 (UTC) by F1729)
@shayaknyc You can locally resolve the issue by downgrading the package "python-huggingface-hub" to the last compatible version, 0.36.0.
@daskol Would it make sense to specify the dependency to "python-huggingface-hub < 1.0" until the issue is resolved upstream?
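A minimal sketch of what such a pin could look like in the PKGBUILD depends array (illustrative only, not the maintainer's actual change; pacman does accept versioned dependencies with comparison operators):

```shell
# Hypothetical PKGBUILD fragment: pin the hub below 1.0 until a
# transformers release supports huggingface-hub 1.x.
depends=(
  'python-huggingface-hub<1.0'
  # ... remaining dependencies unchanged ...
)
```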
shayaknyc commented on 2025-11-12 17:45 (UTC)
Sooo....like....do we just sit and wait for upstream to fix this? Is there no workaround or manually applied fix?
racehd commented on 2025-11-10 14:47 (UTC)
There's more info upstream: https://github.com/huggingface/transformers/issues/41970 . Mainly, huggingfacehub 1.0 is mentioned to have broke a lot of projects that otherwise worked well with the current release of transformers. So this was an intentional choice upstream made at some point to have tight interdependencies.
I do see the upstream setup.py has incremented the huggingface-hub version check. This AUR project is tied to releases, and there has not yet been a new release containing this change and whatever else upstream has done to make transformers compatible with huggingface-hub 1.0+.
Pyblo commented on 2025-11-10 12:55 (UTC) (edited on 2025-11-10 13:02 (UTC) by Pyblo)
The issue with the huggingface-hub dependency still persists. Do we actually know why the huggingface-hub version was originally limited to < 1.0? From GitHub I see that the main branch has "huggingface-hub>=1.0.0,<2.0" defined in its dependencies.
max2000warlord commented on 2025-11-07 02:06 (UTC)
Adding:
prepare() {
  cd "$_pkgname-$pkgver"
  sed -i 's/"huggingface-hub>=0.34.0,<1.0"/"huggingface-hub>=0.34.0"/g' src/transformers/dependency_versions_table.py
}
to the PKGBUILD, immediately after depends=(), worked for me.
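To see what that sed actually does, here is the same substitution run against a stand-in file (the real target is src/transformers/dependency_versions_table.py inside the unpacked source tree):

```shell
# Stand-in for the dependency table line the workaround rewrites.
printf '%s\n' 'deps = {"huggingface-hub": "huggingface-hub>=0.34.0,<1.0",}' > dep_table_demo.py

# Same substitution as in the prepare() workaround: drop the <1.0 upper bound.
sed -i 's/"huggingface-hub>=0.34.0,<1.0"/"huggingface-hub>=0.34.0"/g' dep_table_demo.py

cat dep_table_demo.py
# -> deps = {"huggingface-hub": "huggingface-hub>=0.34.0",}
```

Note this only silences the import-time version check; it does not make transformers' code compatible with any API changes in huggingface-hub 1.x.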
racehd commented on 2025-11-06 13:40 (UTC) (edited on 2025-11-06 13:57 (UTC) by racehd)
Hello, thank you for maintaining this. I am getting the same error during the check stage, which I realize is an issue with upstream hardcoding the huggingface-hub version check into the import step. It did seem strange to me that what triggered the python-transformers update was the update to huggingface-hub, even though the two versions are incompatible. But thinking about it differently:
Had python-transformers NOT updated (as suggested below), then anything trying to import transformers would still fail because:
I'm not really sure what the best solution there is besides users downgrading huggingface-hub and managing their versions themselves (which is what I did for now).
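For anyone wanting the concrete steps, a downgrade-and-hold sketch (the cached filename below is an example; check /var/cache/pacman/pkg for what you actually have):

```shell
# Hypothetical commands, assuming the 0.36.0 package is still in the
# pacman cache; adjust the filename to match your system.
sudo pacman -U /var/cache/pacman/pkg/python-huggingface-hub-0.36.0-1-any.pkg.tar.zst

# Then hold it back so a routine -Syu does not re-upgrade it,
# by adding this line to /etc/pacman.conf:
#   IgnorePkg = python-huggingface-hub
```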
envolution commented on 2025-07-29 22:30 (UTC)
@daskol I'm confused how you're running checks before bumping the version if the required version of huggingface-hub was released 2 hours ago
daskol commented on 2025-07-29 21:05 (UTC)
@envolution Of course, the check stage in the PKGBUILD addresses exactly this issue.
https://gitlab.archlinux.org/archlinux/packaging/packages/python-huggingface-hub/-/commits/main/PKGBUILD?ref_type=heads
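For context, such a check stage can be as simple as an import smoke test; this is a hypothetical sketch, not the actual PKGBUILD contents:

```shell
# Hypothetical check(): merely importing transformers runs its hardcoded
# huggingface-hub version check, so an incompatible hub fails the build.
check() {
  cd "$_pkgname-$pkgver"
  python -c 'import transformers'
}
```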
envolution commented on 2025-07-29 01:39 (UTC)
@daskol Out of curiosity, are you running build tests before bumping versions? There are a number of packages that depend on this, and it may be worthwhile to withhold version increments until upstream dependencies catch up:
If you would like I can help with this