Package Details: python-transformers 4.51.1-1
Git Clone URL: https://aur.archlinux.org/python-transformers.git (read-only)
Package Base: python-transformers
Description: State-of-the-art Natural Language Processing for Jax, PyTorch and TensorFlow
Upstream URL: https://github.com/huggingface/transformers
Keywords: huggingface transformers
Licenses: Apache-2.0
Submitter: filipg
Maintainer: daskol
Last Packager: daskol
Votes: 15
Popularity: 0.55
First Submitted: 2021-10-23 09:30 (UTC)
Last Updated: 2025-04-08 12:28 (UTC)
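Building and installing from this AUR repository follows the usual makepkg flow; a minimal sketch, assuming git and the base-devel group are installed:

```
# Clone the read-only AUR repository, then build and install:
git clone https://aur.archlinux.org/python-transformers.git
cd python-transformers
makepkg -si   # resolves repo dependencies via pacman and installs the built package
```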
Dependencies (23)
- python-filelock
- python-huggingface-hub (python-huggingface-hub-git [AUR])
- python-numpy (python-numpy-git [AUR], python-numpy1 [AUR], python-numpy-mkl-bin [AUR], python-numpy-mkl-tbb [AUR], python-numpy-mkl [AUR])
- python-packaging
- python-regex (python-regex-git [AUR])
- python-requests
- python-safetensors [AUR] (python-safetensors-bin [AUR])
- python-tokenizers [AUR]
- python-tqdm
- python-yaml (python-yaml-git [AUR])
- python-build (make)
- python-installer (make)
- python-setuptools (make)
- python-wheel (make)
- python-bitsandbytes (python-bitsandbytes-rocm-git [AUR], python-bitsandbytes-git [AUR]) (optional) – 8-bit support for PyTorch
- python-flax [AUR] (optional) – JAX support
- python-hf-xet (optional) – xethub support
- python-keras (python-keras-git [AUR]) (optional) – Support for models in Keras 3
- python-onnxconverter-common [AUR] (optional) – TensorFlow support
- python-pytorch (python-pytorch-cxx11abi [AUR], python-pytorch-cxx11abi-opt [AUR], python-pytorch-cxx11abi-cuda [AUR], python-pytorch-cxx11abi-opt-cuda [AUR], python-pytorch-cxx11abi-rocm [AUR], python-pytorch-cxx11abi-opt-rocm [AUR], python-pytorch-cuda, python-pytorch-opt, python-pytorch-opt-cuda, python-pytorch-opt-rocm, python-pytorch-rocm) (optional) – PyTorch support
- python-tensorflow (python-tensorflow-cuda-kepler [AUR], python-tensorflow-computecpp [AUR], python-tensorflow-rocm [AUR], python-tensorflow-opt-rocm [AUR], python-tensorflow-cuda, python-tensorflow-opt, python-tensorflow-opt-cuda) (optional) – TensorFlow support
- python-tf-keras [AUR] (optional) – Support for models in Keras 2 (e.g. BART)
- python-tf2onnx [AUR] (optional) – TensorFlow support
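As a usage note, the repo-packaged optional backends can be pulled in with pacman, while the [AUR]-marked alternatives must be built like any other AUR package. A minimal sketch for the common PyTorch + Keras 3 setup:

```
# Install optional backends from the official repos,
# marked as dependencies rather than explicitly installed:
pacman -S --asdeps python-pytorch python-keras
```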
Required by (44)
- coqui-tts (optional)
- dsnote (optional)
- dsnote-git (optional)
- localai-git
- localai-git-cuda
- localai-git-rocm
- manga-ocr-git
- mokuro
- monailabel (optional)
- open-webui-no-venv
- pix2tex
- python-accelerate (check)
- python-assistant
- python-auralis
- python-bark-git
- python-bitsandbytes-git
- python-colbert-ai
- python-compressed-tensors
- python-compressed-tensors (check)
- python-deepmultilingualpunctuation
- …and 24 more.
Sources (1)
Latest Comments
daskol commented on 2025-04-07 08:09 (UTC) (edited on 2025-04-07 08:09 (UTC) by daskol)
NOTE The next v4.51.0 requires xet-core (transitive through python-huggingface-hub>=1:0.30.0). It is now under testing in extra-testing.
envolution commented on 2025-03-28 20:42 (UTC)
no worries, I think they intended to publish 4.50.3 but somehow used the previous version tag - agree with all of your sentiments on the issue log
daskol commented on 2025-03-28 19:57 (UTC)
@envolution Thank you for reporting the issue.
HF maintainers have indeed changed the v4.50.2 release (the checksums of my backed-up and freshly downloaded tarballs differ). Here is the issue to track the incident on the HF side.
envolution commented on 2025-03-28 16:18 (UTC)
Yeah, the new archive tagged 4.50.2 actually builds a wheel called dist/transformers-4.50.3-py3-none-any.whl, so the package() section fails:
FileNotFoundError: [Errno 2] No such file or directory: '/tmp/20250328121057_python-transformers/python-transformers/src/transformers-4.50.2/dist/transformers-4.50.2-*-*.whl'
here's the wheel after build succeeds:
$ ls /tmp/20250328121057_python-transformers/python-transformers/src/transformers-4.50.2/dist/transformers*
/tmp/20250328121057_python-transformers/python-transformers/src/transformers-4.50.2/dist/transformers-4.50.3-py3-none-any.whl
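One common way to make the package() step tolerant of such wheel-name drift, following the usual Arch Python packaging pattern (a sketch, not the actual PKGBUILD):

```
package() {
  cd "transformers-$pkgver"
  # Glob the wheel instead of hardcoding pkgver into the filename,
  # so a mislabeled upstream wheel still installs:
  python -m installer --destdir="$pkgdir" dist/*.whl
}
```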
envolution commented on 2025-03-28 16:13 (UTC)
They probably replaced the archive for some reason; the correct sha256sum is:
sha256sums=('50e286552ddef214d49f75ac2ab5acd68efbc1c57b9930ead2e6d04f197e9024')
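If you hit the mismatch locally, the checksum array can be regenerated rather than edited by hand; updpkgsums ships in pacman-contrib:

```
# Rewrite the sha256sums array of the PKGBUILD in place:
updpkgsums
# Or just print freshly computed checksums for manual comparison:
makepkg -g
```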
trougnouf commented on 2025-03-21 13:08 (UTC) (edited on 2025-03-21 13:11 (UTC) by trougnouf)
Thanks! That's now fixed in python-safetensors-bin
archlinux-ai seemed enticing but I don't feel comfortable adding a Moscow-based maintainer to my trusted keys.
(I realize that's you and I appreciate your work and hope things improve one day.)
daskol commented on 2025-03-17 20:05 (UTC)
@trougnouf It is a packaging issue in python-safetensors-bin. The provides array must contain the package name with its version (e.g. provides=('python-safetensors-bin=0.5.3')).
BTW, if you are looking for pre-built packages, take a look at this.
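For illustration, a minimal PKGBUILD sketch of such a versioned provides entry (field values are hypothetical, not the actual python-safetensors-bin recipe); a versioned provides is what lets pacman match a versioned dependency like python-safetensors>=0.4.1 against a differently named provider:

```
# Hypothetical -bin package advertising the plain name with a version:
pkgname=python-safetensors-bin
pkgver=0.5.3
provides=("python-safetensors=$pkgver")
conflicts=('python-safetensors')
```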
trougnouf commented on 2025-03-17 19:06 (UTC)
trougnouf@bbripxlarch ~/D/python-transformers> makepkg
==> Making package: python-transformers 4.49.0-2 (Mon 17 Mar 2025 08:04:42 PM CET)
==> Checking runtime dependencies...
==> Missing dependencies:
-> python-safetensors>=0.4.1
==> Checking buildtime dependencies...
==> ERROR: Could not resolve all dependencies.
trougnouf@bbripxlarch ~/D/python-transformers [8]> pacman -Qs safetensors
local/python-safetensors-bin 0.5.3-2
Simple, safe way to store and distribute tensors. Installed via pypi.
local/python-safetensors-bin-debug 0.5.3-2
Detached debugging symbols for python-safetensors-bin
Is a minimum version requirement incompatible with providers under different package names?
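One way to check whether an installed provider actually advertises a versioned provides entry, which is what dependency resolution needs here:

```
# Inspect installed package metadata; Provides should read
# "python-safetensors=0.5.3", not just "python-safetensors":
pacman -Qi python-safetensors-bin | grep '^Provides'
```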
daskol commented on 2025-02-27 10:44 (UTC)
@mistersmee optdepends has been updated with python-keras and python-tf-keras (aka Keras 2).
mistersmee commented on 2025-02-26 14:42 (UTC)
@daskol, please add python-tf-keras as a dependency, as per my bug report upstream (https://github.com/huggingface/transformers/issues/36410); it is required until upstream fully moves to Keras 3, whenever that may be.
Pinned Comments
daskol commented on 2025-04-07 08:09 (UTC) (edited on 2025-04-07 08:09 (UTC) by daskol)
NOTE The next v4.51.0 requires xet-core (transitive through python-huggingface-hub>=1:0.30.0). It is now under testing in extra-testing.
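A side note on the 1: prefix above: it is a pacman epoch, which sorts ahead of any non-epoch version regardless of the numbers that follow. This can be checked with vercmp, which ships with pacman:

```
# vercmp prints a negative number when the first version is older:
vercmp 0.30.0 1:0.30.0   # -> -1: the epoch makes 1:0.30.0 strictly newer
```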