Package Details: python-transformers 4.50.3-1

Git Clone URL: https://aur.archlinux.org/python-transformers.git (read-only)
Package Base: python-transformers
Description: State-of-the-art Natural Language Processing for Jax, PyTorch and TensorFlow
Upstream URL: https://github.com/huggingface/transformers
Keywords: huggingface transformers
Licenses: Apache-2.0
Submitter: filipg
Maintainer: daskol
Last Packager: daskol
Votes: 15
Popularity: 0.66
First Submitted: 2021-10-23 09:30 (UTC)
Last Updated: 2025-03-28 19:42 (UTC)

Latest Comments

envolution commented on 2025-03-28 20:42 (UTC)

No worries - I think they intended to publish 4.50.3 but somehow reused the previous version tag. I agree with all of your sentiments on the issue log.

daskol commented on 2025-03-28 19:57 (UTC)

@envolution Thank you for reporting the issue.

The HF maintainers have indeed changed the v4.50.2 release (the checksums of my backed-up tarball and a freshly downloaded one differ). Here is the issue to track the incident on the HF side.

envolution commented on 2025-03-28 16:18 (UTC)

Yeah, the new archive tagged 4.50.2 actually builds a wheel named dist/transformers-4.50.3-py3-none-any.whl, so the package() step fails:

FileNotFoundError: [Errno 2] No such file or directory: '/tmp/20250328121057_python-transformers/python-transformers/src/transformers-4.50.2/dist/transformers-4.50.2-*-*.whl'

Here's the wheel after the build succeeds:

$ ls /tmp/20250328121057_python-transformers/python-transformers/src/transformers-4.50.2/dist/transformers*
/tmp/20250328121057_python-transformers/python-transformers/src/transformers-4.50.2/dist/transformers-4.50.3-py3-none-any.whl
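
A version-agnostic glob in package() would sidestep this kind of silent retag. A minimal sketch, assuming the PKGBUILD installs the wheel via python-installer (the actual PKGBUILD may differ):

package() {
    cd "transformers-$pkgver"
    # Glob on any version so a retagged tarball whose wheel version
    # no longer matches $pkgver still installs.
    python -m installer --destdir="$pkgdir" dist/transformers-*-py3-none-any.whl
}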

envolution commented on 2025-03-28 16:13 (UTC)

They probably replaced the archive for some reason; the correct sha256sum is:

sha256sums=('50e286552ddef214d49f75ac2ab5acd68efbc1c57b9930ead2e6d04f197e9024')
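
If you are rebuilding locally before the PKGBUILD is updated, updpkgsums from pacman-contrib can regenerate the checksum array in place (a general workflow sketch, not specific to this package):

updpkgsums    # rewrites the sha256sums=() array in ./PKGBUILD
makepkg -si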

trougnouf commented on 2025-03-21 13:08 (UTC) (edited on 2025-03-21 13:11 (UTC) by trougnouf)

Thanks! That's now fixed in python-safetensors-bin.

archlinux-ai seemed enticing but I don't feel comfortable adding a Moscow-based maintainer to my trusted keys.

(I realize that's you and I appreciate your work and hope things improve one day.)

daskol commented on 2025-03-17 20:05 (UTC)

@trougnouf It is a packaging issue in python-safetensors-bin. The provides array must contain a versioned entry (e.g. provides=('python-safetensors=0.5.3')); an unversioned provides cannot satisfy a versioned dependency like python-safetensors>=0.4.1.
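
A minimal sketch of the relevant lines in python-safetensors-bin's PKGBUILD (hypothetical values; check the actual file):

pkgver=0.5.3
provides=("python-safetensors=$pkgver")  # versioned, so python-safetensors>=0.4.1 resolves
conflicts=('python-safetensors')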

BTW, if you are looking for pre-built packages, take a look at this.

trougnouf commented on 2025-03-17 19:06 (UTC)

trougnouf@bbripxlarch ~/D/python-transformers> makepkg
==> Making package: python-transformers 4.49.0-2 (Mon 17 Mar 2025 08:04:42 PM CET)
==> Checking runtime dependencies...
==> Missing dependencies:
  -> python-safetensors>=0.4.1
==> Checking buildtime dependencies...
==> ERROR: Could not resolve all dependencies.
trougnouf@bbripxlarch ~/D/python-transformers [8]> pacman -Qs safetensors
local/python-safetensors-bin 0.5.3-2
    Simple, safe way to store and distribute tensors. Installed via pypi.
local/python-safetensors-bin-debug 0.5.3-2
    Detached debugging symbols for python-safetensors-bin

Is a minimum version requirement incompatible with a differently named package that provides it?

daskol commented on 2025-02-27 10:44 (UTC)

@mistersmee optdepends has been updated with python-keras and python-tf-keras (aka Keras 2).
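
For reference, the entries would look roughly like this (a sketch; the descriptions are mine, not necessarily the PKGBUILD's exact wording):

optdepends=('python-keras: Keras 3 backend'
            'python-tf-keras: legacy Keras 2 support (TF_USE_LEGACY_KERAS)')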

mistersmee commented on 2025-02-26 14:42 (UTC)

@daskol, please add python-tf-keras as a dependency, as per my bug report upstream: https://github.com/huggingface/transformers/issues/36410; it is required until upstream fully moves to Keras 3, whenever that may be.

mistersmee commented on 2025-02-26 05:09 (UTC)

Just to give additional context, I was attempting to create a BART pipeline when it failed. I believe some of the relevant lines of code that led to this error are: https://github.com/huggingface/transformers/blob/41925e42135257361b7f02aa20e3bbdab3f7b923/src/transformers/modeling_tf_utils.py#L91-L103

And yes, setting the TF_USE_LEGACY_KERAS environment variable did not work, for me at least. The environment variable and the try: import tf_keras block seem to fix the same problem in two different ways; I'll go upstream and try to figure out the intended approach.
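
For anyone else hitting this, a quick smoke test of both paths once python-tf-keras is installed (TFBartModel is just an example; any TF model class should exercise the same import logic):

python -c 'import tf_keras'    # is the fallback importable at all?
TF_USE_LEGACY_KERAS=1 python -c 'from transformers import TFBartModel'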