Package Details: python-transformers 4.41.2-1

Git Clone URL:
Package Base: python-transformers
Description: State-of-the-art Natural Language Processing for Jax, PyTorch and TensorFlow
Upstream URL:
Keywords: huggingface transformers
Licenses: Apache
Submitter: filipg
Maintainer: xiota (daskol)
Last Packager: daskol
Votes: 8
Popularity: 0.37
First Submitted: 2021-10-23 09:30 (UTC)
Last Updated: 2024-06-01 07:55 (UTC)

Dependencies (20)

Sources (1)

Latest Comments


daskol commented on 2024-05-07 09:58 (UTC)

@carsme Typo is fixed.

carsme commented on 2024-05-07 09:53 (UTC)

Tests fail both on my system and in a chroot:

==> Starting check()...
Traceback (most recent call last):
  File "<string>", line 1, in <module>
ModuleNotFoundError: No module named 'transformers'
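For context, a check() along these lines is the usual way to run such an import test — this is a hypothetical sketch, not the maintainer's actual fix, and the directory layout and PYTHONPATH value are assumptions:

```shell
# Hypothetical PKGBUILD check() sketch: point PYTHONPATH at the freshly
# built tree so `import transformers` resolves inside a clean chroot.
check() {
  cd "transformers-$pkgver"
  PYTHONPATH="$PWD/src" python -c 'import transformers'
}
```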

daskol commented on 2023-09-20 20:25 (UTC) (edited on 2023-09-20 20:25 (UTC) by daskol)

@rekman I've enforced a version constraint on python-tokenizers to prevent it from being pulled forward by a system update.

Thank you for the important news.
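In PKGBUILD terms, such a pin is expressed as versioned entries in the depends array; a minimal sketch (the exact bounds shown are an assumption based on the upstream constraint, not a copy of this PKGBUILD):

```shell
# Sketch: versioned depends entries keep pacman from updating
# python-tokenizers past the range transformers supports.
depends=(
  'python-tokenizers>=0.11.1'
  'python-tokenizers<0.14'
)
```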

rekman commented on 2023-09-20 20:07 (UTC) (edited on 2023-09-20 20:09 (UTC) by rekman)

transformers 4.33.2 requires tokenizers<0.14, but the AUR currently has tokenizers 0.14.0. Downgrade to 0.13.3 to use transformers for now.

This will be fixed once upstream provides a release including this pull request (it has been merged, just not released yet).
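The constraint quoted above can be checked mechanically; here is a minimal stdlib sketch (plain numeric versions only, no pre-release or epoch handling):

```python
# Check whether a tokenizers version satisfies transformers 4.33.2's
# constraint tokenizers>=0.11.1,<0.14,!=0.11.3 (numeric versions only).
def parse(version):
    return tuple(int(part) for part in version.split("."))

def satisfies(version):
    v = parse(version)
    return parse("0.11.1") <= v < parse("0.14") and v != parse("0.11.3")

print(satisfies("0.13.3"))  # True: the suggested downgrade is accepted
print(satisfies("0.14.0"))  # False: the current AUR version is rejected
```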

xiota commented on 2023-07-09 22:29 (UTC)

@ttc0419 Keeping track of hundreds of optional packages is extra work for maintainers that provides no user benefit in this case. Users would still need to figure out which extra packages apply to their specific use cases.

python-safetensors is already in depends.

daskol commented on 2023-07-08 12:11 (UTC) (edited on 2023-07-08 12:15 (UTC) by daskol)

@ttc0419 Exactly. Most of transformers' dependencies are optional (see the repo). By design, HuggingFace targets all the major deep learning frameworks (TF, PT, JAX), but in practice a user usually sticks to a single framework. So listing only the truly required dependencies gives users the freedom to manage the packages on their system in a more fine-grained way.

The errors I listed are NOT optional.

They are actually optional; just check the transformers repo. It is a nightmare, honestly: the actual dependency list depends on the framework and the model. In general it is impossible to manage this kind of project's dependencies exhaustively.
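The per-framework handling described here boils down to probing for optional backends instead of hard-depending on them; a minimal stdlib sketch of that pattern (the module list is illustrative, not the library's actual probe):

```python
# Probe for optional deep learning backends without importing them,
# mirroring how transformers treats TF/PT/JAX as optional extras.
import importlib.util

def backend_available(name):
    # find_spec returns None when a top-level module is not installed.
    return importlib.util.find_spec(name) is not None

for module in ("torch", "tensorflow", "jax", "sentencepiece"):
    status = "found" if backend_available(module) else "missing"
    print(f"{module}: {status}")
```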

ttc0419 commented on 2023-07-08 12:04 (UTC) (edited on 2023-07-08 12:07 (UTC) by ttc0419)

@xiota Why avoid "listing hundreds of packages"? One purpose of a package is dependency management; it's not nice to make users track down dependencies themselves. If a dependency is optional, list it as optional. The errors I listed are NOT optional: they occur immediately on import, which means required packages are missing. BTW, you can refer to pip for the dependencies of a package; the list for transformers is actually not very long:

├── filelock [required: Any, installed: 3.12.2]
├── huggingface-hub [required: >=0.14.1,<1.0, installed: 0.16.3]
│   ├── filelock [required: Any, installed: 3.12.2]
│   ├── fsspec [required: Any, installed: 2023.6.0]
│   ├── packaging [required: >=20.9, installed: 23.1]
│   ├── PyYAML [required: >=5.1, installed: 6.0]
│   ├── requests [required: Any, installed: 2.31.0]
│   │   ├── certifi [required: >=2017.4.17, installed: 2023.5.7]
│   │   ├── charset-normalizer [required: >=2,<4, installed: 3.1.0]
│   │   ├── idna [required: >=2.5,<4, installed: 3.4]
│   │   └── urllib3 [required: >=1.21.1,<3, installed: 2.0.3]
│   ├── tqdm [required: >=4.42.1, installed: 4.65.0]
│   │   └── colorama [required: Any, installed: 0.4.6]
│   └── typing-extensions [required: >=, installed: 4.7.1]
├── numpy [required: >=1.17, installed: 1.25.0]
├── packaging [required: >=20.0, installed: 23.1]
├── PyYAML [required: >=5.1, installed: 6.0]
├── regex [required: !=2019.12.17, installed: 2023.6.3]
├── requests [required: Any, installed: 2.31.0]
│   ├── certifi [required: >=2017.4.17, installed: 2023.5.7]
│   ├── charset-normalizer [required: >=2,<4, installed: 3.1.0]
│   ├── idna [required: >=2.5,<4, installed: 3.4]
│   └── urllib3 [required: >=1.21.1,<3, installed: 2.0.3]
├── safetensors [required: >=0.3.1, installed: 0.3.1]
├── tokenizers [required: >=0.11.1,<0.14,!=0.11.3, installed: 0.13.3]
└── tqdm [required: >=4.27, installed: 4.65.0]
    └── colorama [required: Any, installed: 0.4.6]

xiota commented on 2023-07-08 06:52 (UTC)

There are far too many dependencies to support everything provided by this package. To avoid listing hundreds of packages, even as optional dependencies, the list has been pared down to those required for core functionality. Since user needs vary, other packages should be installed as needed.

ttc0419 commented on 2023-07-08 06:13 (UTC)

Multiple missing dependencies:

LlamaTokenizer requires the SentencePiece library but it was not found in your environment. Checkout the instructions on the
installation page of its repo: and follow the ones
that match your environment. Please note that you may need to restart your runtime after installation.
Traceback (most recent call last):
  File "/usr/lib/python3.11/site-packages/transformers/utils/", line 1086, in _get_module
    return importlib.import_module("." + module_name, self.__name__)
  File "/usr/lib/python3.11/importlib/", line 126, in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
  File "<frozen importlib._bootstrap>", line 1206, in _gcd_import
  File "<frozen importlib._bootstrap>", line 1178, in _find_and_load
  File "<frozen importlib._bootstrap>", line 1149, in _find_and_load_unlocked
  File "<frozen importlib._bootstrap>", line 690, in _load_unlocked
  File "<frozen importlib._bootstrap_external>", line 940, in exec_module
  File "<frozen importlib._bootstrap>", line 241, in _call_with_frames_removed
  File "/usr/lib/python3.11/site-packages/transformers/models/llama/", line 31, in <module>
    from ...modeling_utils import PreTrainedModel
  File "/usr/lib/python3.11/site-packages/transformers/", line 40, in <module>
    from .pytorch_utils import (  # noqa: F401
  File "/usr/lib/python3.11/site-packages/transformers/", line 19, in <module>
    from safetensors.torch import storage_ptr, storage_size
ModuleNotFoundError: No module named 'safetensors'
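The traceback above is the raw failure mode; transformers itself usually surfaces a friendlier message (like the SentencePiece one quoted first) via a lazy-import guard. A minimal sketch of that guard pattern — a hypothetical helper, not the library's actual API:

```python
# Hypothetical guard: import an optional backend on demand and fail with a
# message naming the missing package, instead of a bare ModuleNotFoundError.
import importlib

def require(module, hint):
    try:
        return importlib.import_module(module)
    except ModuleNotFoundError as err:
        raise ModuleNotFoundError(
            f"{module} is required for this model; {hint}"
        ) from err

# Example: require("safetensors", "install python-safetensors") would raise
# the descriptive message above if the package is missing.
```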