Package Details: python-transformers 4.50.3-1

Git Clone URL: https://aur.archlinux.org/python-transformers.git (read-only)
Package Base: python-transformers
Description: State-of-the-art Natural Language Processing for Jax, PyTorch and TensorFlow
Upstream URL: https://github.com/huggingface/transformers
Keywords: huggingface transformers
Licenses: Apache-2.0
Submitter: filipg
Maintainer: daskol
Last Packager: daskol
Votes: 15
Popularity: 0.63
First Submitted: 2021-10-23 09:30 (UTC)
Last Updated: 2025-03-28 19:42 (UTC)

Sources (1)

Latest Comments


mistersmee commented on 2025-02-26 04:35 (UTC)

transformers gives me an error message saying that Keras 3 isn't supported and that I should install the tf_keras pip package. I don't know if anyone else has had the same error, but I packaged tf-keras as python-tf-keras, so if anyone else hits the same issue and the error is reproducible, try installing it and check whether it fixes the issue; it does for me. Also, if it does, adding python-tf-keras as a dependency/optdepends to this package might help.
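For anyone scripting around this, here is a minimal sketch of the version logic involved. The helper name and threshold are my own illustration, not code from transformers; the underlying fact is that transformers' TensorFlow integration supports Keras 2, and with Keras 3 installed it asks for the tf_keras compatibility package instead:

```python
# Hypothetical helper (my own illustration, not part of transformers):
# transformers' TF path works with Keras 2; Keras 3 needs the tf-keras shim.
def needs_tf_keras(keras_version: str) -> bool:
    """Return True if this Keras version requires the tf-keras shim."""
    major = int(keras_version.split(".")[0])
    return major >= 3

print(needs_tf_keras("3.1.0"))   # True: Keras 3 triggers the error above
print(needs_tf_keras("2.15.0"))  # False: Keras 2 works natively
```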

BluePyTheDeer251 commented on 2024-10-04 02:35 (UTC)

The thing failed and threw this:

[code]
python-transformers - exit status 8
python-optax - exit status 8
python-flax - exit status 8
python-orbax-checkpoint - exit status 8
python-chex - exit status 8
python-safetensors - exit status 8
python-jax - exit status 8
[/code]

BluePyTheDeer251 commented on 2024-10-03 23:48 (UTC)

I hope this helps an LLM I'm working on. It's a coding AI assistant I'm making as a passion project. As Linus Torvalds said, "It won't be as big as GNU" (GNU in this case being GitHub Copilot).

daskol commented on 2024-05-07 09:58 (UTC)

@carsme Typo is fixed.

carsme commented on 2024-05-07 09:53 (UTC)

Tests fail both on my system and in a chroot:

==> Starting check()...
Traceback (most recent call last):
  File "<string>", line 1, in <module>
ModuleNotFoundError: No module named 'transformers'

daskol commented on 2023-09-20 20:25 (UTC) (edited on 2023-09-20 20:25 (UTC) by daskol)

@rekman Enforced a version constraint on python-tokenizers in order to prevent it from being updated along with a system update.

Thank you for the important news.

rekman commented on 2023-09-20 20:07 (UTC) (edited on 2023-09-20 20:09 (UTC) by rekman)

transformers 4.33.2 requires tokenizers<0.14, but the AUR currently has tokenizers 0.14.0. Downgrade to 0.13.3 to use transformers for now.

This will be fixed once upstream provides a release including this pull request (it has been merged, just not released yet).
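To make the incompatibility concrete, here is a rough sketch of the bound check (the helper is my own illustration, not packaging code): transformers 4.33.2 pins tokenizers<0.14, so 0.14.0 falls outside the allowed range while 0.13.3 is inside it.

```python
# Hypothetical helper (my own sketch): compare dotted versions numerically
# against an exclusive upper bound, as in the tokenizers<0.14 constraint.
def satisfies_upper_bound(version: str, bound: str) -> bool:
    """True if version < bound, comparing dotted components as integers."""
    as_tuple = lambda v: tuple(int(part) for part in v.split("."))
    return as_tuple(version) < as_tuple(bound)

print(satisfies_upper_bound("0.13.3", "0.14"))  # True: usable with 4.33.2
print(satisfies_upper_bound("0.14.0", "0.14"))  # False: too new
```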

daskol commented on 2023-07-08 12:11 (UTC) (edited on 2023-07-08 12:15 (UTC) by daskol)

@ttc0419 Exactly. Most of the dependencies of transformers are optional (see the repo). By design, HuggingFace targets the major deep learning frameworks (TF, PT, JAX), but the issue is that a user usually adheres largely to a single framework. So enumerating only the required dependencies gives users the freedom to manage the packages installed on their system in a more fine-grained way.

The errors I listed are NOT optional.

They are actually optional. Just check the transformers repo. It is a nightmare, actually: the list of actual dependencies depends on the framework and the model. In general, it is impossible to manage this kind of project properly.
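The "single framework per user" design discussed above can be sketched with a small backend probe. The backend list and helper below are my own illustration, not transformers' actual API, but they show why the frameworks fit better as optdepends than hard depends:

```python
# Illustration of the optional-backends idea (my own sketch, not
# transformers' API): detect which major frameworks are importable.
import importlib.util

OPTIONAL_BACKENDS = ("torch", "tensorflow", "jax")

def available_backends() -> list:
    """Return which of the major frameworks can be imported here."""
    return [name for name in OPTIONAL_BACKENDS
            if importlib.util.find_spec(name) is not None]

# On a PyTorch-only system this reports just ["torch"]; the library
# enables only the code paths for backends that are actually present.
print(available_backends())
```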