Package Details: python-transformers 4.47.1-1

Git Clone URL: https://aur.archlinux.org/python-transformers.git (read-only)
Package Base: python-transformers
Description: State-of-the-art Natural Language Processing for Jax, PyTorch and TensorFlow
Upstream URL: https://github.com/huggingface/transformers
Keywords: huggingface transformers
Licenses: Apache
Submitter: filipg
Maintainer: daskol
Last Packager: daskol
Votes: 13
Popularity: 2.08
First Submitted: 2021-10-23 09:30 (UTC)
Last Updated: 2024-12-18 10:51 (UTC)

Dependencies (20)

Sources (1)

Latest Comments


filipg commented on 2022-12-10 08:26 (UTC)

Applied patch, updated version

kIERO commented on 2022-11-30 14:22 (UTC)

Maintainer: @filipg is MIA

kIERO commented on 2022-10-08 21:33 (UTC)

I've successfully built version 4.22.2 with the patch by @ChrisMorgan below

blinry commented on 2022-09-22 17:40 (UTC)

Very nice comments, ChrisMorgan! filipg, I'd love it if these changes were applied! \o/

ChrisMorgan commented on 2022-09-10 10:47 (UTC)

This package’s dependencies don’t correspond very well to the dependencies expressed by the actual Python package (for which refer to /usr/lib/python3.10/site-packages/transformers-4.19.2-py3.10.egg-info/requires.txt).

  • Remove cuda and nccl. (transformers does not depend on them at all; PyTorch may, so packages like python-pytorch-opt-cuda depend on these two.)
  • Keep python-filelock, python-tokenizers and python-tqdm.
  • Remove python-sacremoses. It’s marked as exclusively a test dependency, and a bunch of other things are missing from the checkdepends if you want to actually run the tests. (Caveat: a few children of transformers.models do actually use it, some optionally and some mandatorily. I’m not sure quite what’s up with that, but the requirements lists may be at fault and perhaps it deserves its own feature set upstream; here it should perhaps be an optdepends. I don’t know how this stuff is used.)
  • Add python-huggingface-hub, python-numpy, python-packaging, python-yaml, python-regex, python-requests.
  • Start by removing python-pytorch (it’s optional), then probably add it to optdepends as well as the items of at least the flax, torch and tf sets from requires.txt (and there are a lot more optional dependencies that could be worth including, if interested in exhaustiveness).
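The split between core requirements (→ depends) and per-feature extras like torch, tf and flax (→ optdepends) can be read straight out of that requires.txt. A minimal sketch of parsing such a file, using illustrative sample content rather than the real file:

```python
# Sketch: split a requires.txt (as shipped in the egg-info directory)
# into the core dependency list and per-extra lists.
# The sample content below is illustrative, not the real file.
sample = """\
filelock
numpy>=1.17

[torch]
torch>=1.0

[tf]
tensorflow>=2.3
"""

def parse_requires(text):
    # "core" collects requirements before the first [extra] section.
    section, result = "core", {"core": []}
    for line in text.splitlines():
        line = line.strip()
        if not line:
            continue
        if line.startswith("[") and line.endswith("]"):
            section = line[1:-1]
            result.setdefault(section, [])
        else:
            result[section].append(line)
    return result

deps = parse_requires(sample)
print(deps["core"])   # unconditional requirements -> depends
print(deps["torch"])  # extras -> optdepends candidates
```

Everything in the core list belongs in depends; each extra section maps naturally onto an optdepends entry.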

Here’s a start (with only one fly in the ointment, that python-optax doesn’t exist yet, so JAX support is actually not yet readily attainable):

diff --git a/PKGBUILD b/PKGBUILD
index 4763dde..d808b71 100644
--- a/PKGBUILD
+++ b/PKGBUILD
@@ -7,13 +7,25 @@ pkgdesc="State-of-the-art Natural Language Processing for Jax, PyTorch and TensorFlow"
 arch=('i686' 'x86_64')
 url="https://pypi.org/project/transformers"
 license=('Apache License 2.0')
-depends=('cuda'
-         'nccl'
-         'python-filelock'
-         'python-pytorch'
-         'python-sacremoses'
+depends=('python-filelock'
          'python-tokenizers'
+         'python-huggingface-hub'
+         'python-numpy'
+         'python-packaging'
+         'python-yaml'
+         'python-regex'
+         'python-requests'
          'python-tqdm')
+optdepends=(
+       'python-pytorch: PyTorch support'
+       'python-tensorflow: TensorFlow support'
+       'python-onnxconverter-common: TensorFlow support'
+       'python-tf2onnx: TensorFlow support'
+       'python-jax: JAX support'
+       'python-jaxlib: JAX support'
+       'python-flax: JAX support'
+       'python-optax: JAX support'
+)

 source=("https://github.com/huggingface/transformers/archive/refs/tags/v${pkgver}.tar.gz")
 sha256sums=('a1cdffb59b0a409cb5de414fcfaf5208f4526023cd021245f37f309bb15673a9')

jnphilipp commented on 2022-06-08 11:55 (UTC)

You only need one of the three (Jax, PyTorch, TensorFlow), so these should be optdepends. And cuda is a dependency of those frameworks, so it can be removed here completely.