Package Details: python-transformers 4.47.0-1
| Git Clone URL: | https://aur.archlinux.org/python-transformers.git (read-only) |
|---|---|
| Package Base: | python-transformers |
| Description: | State-of-the-art Natural Language Processing for Jax, PyTorch and TensorFlow |
| Upstream URL: | https://github.com/huggingface/transformers |
| Keywords: | huggingface transformers |
| Licenses: | Apache |
| Submitter: | filipg |
| Maintainer: | daskol |
| Last Packager: | daskol |
| Votes: | 13 |
| Popularity: | 2.08 |
| First Submitted: | 2021-10-23 09:30 (UTC) |
| Last Updated: | 2024-12-06 14:36 (UTC) |
Dependencies (20)
- python-filelock
- python-huggingface-hub [AUR] (python-huggingface-hub-git [AUR])
- python-numpy (python-numpy-flame [AUR], python-numpy-git [AUR], python-numpy1 [AUR], python-numpy-mkl-bin [AUR], python-numpy-mkl [AUR], python-numpy-mkl-tbb [AUR])
- python-packaging
- python-regex (python-regex-git [AUR])
- python-requests
- python-safetensors [AUR]
- python-tokenizers [AUR]
- python-tqdm
- python-yaml (python-yaml-git [AUR])
- python-build (make)
- python-installer (python-installer-git [AUR]) (make)
- python-setuptools (make)
- python-wheel (make)
- python-bitsandbytes (python-bitsandbytes-rocm-git [AUR], python-bitsandbytes-git [AUR]) (optional) – 8-bit support for PyTorch
- python-flax [AUR] (optional) – JAX support
- python-onnxconverter-common [AUR] (optional) – TensorFlow support
- python-pytorch (python-pytorch-mkl-git [AUR], python-pytorch-cuda-git [AUR], python-pytorch-mkl-cuda-git [AUR], python-pytorch-cxx11abi [AUR], python-pytorch-cxx11abi-opt [AUR], python-pytorch-cxx11abi-cuda [AUR], python-pytorch-cxx11abi-opt-cuda [AUR], python-pytorch-cxx11abi-rocm [AUR], python-pytorch-cxx11abi-opt-rocm [AUR], python-pytorch-rocm-bin [AUR], python-pytorch-cuda, python-pytorch-opt, python-pytorch-opt-cuda, python-pytorch-opt-rocm, python-pytorch-rocm) (optional) – PyTorch support (see the backend-availability sketch after this list)
- python-tensorflow (python-tensorflow-cuda-kepler [AUR], python-tensorflow-computecpp [AUR], python-tensorflow-rocm [AUR], python-tensorflow-opt-rocm [AUR], python-tensorflow-cuda, python-tensorflow-opt, python-tensorflow-opt-cuda) (optional) – TensorFlow support
- python-tf2onnx [AUR] (optional) – TensorFlow support
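
The deep-learning frameworks in the optional entries above are detected by transformers at runtime rather than being hard requirements. As a minimal sketch (not part of this package, and assuming only that python-transformers itself is installed), the availability helpers in transformers.utils can report which backends the library actually sees:

```python
# Sketch: report which optional deep-learning backends transformers can see.
# Assumes only that python-transformers itself is installed; the individual
# frameworks (python-pytorch, python-tensorflow, python-flax) may be absent.
from transformers.utils import is_flax_available, is_tf_available, is_torch_available

backends = {
    "PyTorch (python-pytorch)": is_torch_available(),
    "TensorFlow (python-tensorflow)": is_tf_available(),
    "JAX/Flax (python-flax)": is_flax_available(),
}

for name, available in backends.items():
    print(f"{name}: {'available' if available else 'not installed'}")
```

A missing backend simply shows up as unavailable here; transformers only raises an import error when model classes tied to that backend are actually used.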
Required by (30)
- coqui-tts (optional)
- dsnote (optional)
- dsnote-git (optional)
- localai-git
- localai-git-cuda
- localai-git-rocm
- manga-ocr-git
- monailabel (optional)
- pix2tex
- python-assistant
- python-auralis
- python-bark-git
- python-bitsandbytes-git
- python-colbert-ai
- python-deepmultilingualpunctuation
- python-deepspeed
- python-dmt
- python-evaluate (optional)
- python-monai (optional)
- python-open-clip-torch
- …and 10 more not shown
Latest Comments
BluePyTheDeer251 commented on 2024-10-04 02:35 (UTC)
The thing failed and threw this:
[code]
python-transformers - exit status 8
python-optax - exit status 8
python-flax - exit status 8
python-orbax-checkpoint - exit status 8
python-chex - exit status 8
python-safetensors - exit status 8
python-jax - exit status 8
[/code]
BluePyTheDeer251 commented on 2024-10-03 23:48 (UTC)
I hope this helps an LLM I'm working on. It's a coding AI assistant I'm making as a passion project; like Linus Torvalds said, "it won't be as big as GNU" (GNU in this case being GitHub Copilot).
daskol commented on 2024-05-07 09:58 (UTC)
@carsme Typo is fixed.
carsme commented on 2024-05-07 09:53 (UTC)
Tests fail both on my system and in a chroot:
daskol commented on 2023-09-20 20:25 (UTC) (edited on 2023-09-20 20:25 (UTC) by daskol)
@rekman Enforced a version constraint on python-tokenizers to prevent it from being updated along with a system update. Thank you for the important news.
rekman commented on 2023-09-20 20:07 (UTC) (edited on 2023-09-20 20:09 (UTC) by rekman)
transformers 4.33.2 requires tokenizers<0.14. Currently the AUR has tokenizers 0.14.0. Downgrade to 0.13.3 to use transformers for now. This will be fixed once upstream provides a release including this pull request (it has been merged, just not released yet).
daskol commented on 2023-07-08 12:11 (UTC) (edited on 2023-07-08 12:15 (UTC) by daskol)
@ttc0419 Exactly. Most of the dependencies of transformers are optional (see the repo). By design, HuggingFace targets the major deep learning frameworks (TF, PT, JAX), but a user usually sticks largely to a single framework. So enumerating only the required dependencies gives users the freedom to manage the packages installed on their systems in a more fine-grained way. They are actually optional; just check the transformers repo. It is a nightmare, actually: the list of actual dependencies depends on the framework and the model. In general it is impossible to manage this kind of project properly.
ttc0419 commented on 2023-07-08 12:04 (UTC) (edited on 2023-07-08 12:07 (UTC) by ttc0419)
@xiota Why avoid "listing hundreds of packages"? One purpose of a package is dependency management; it is not nice to make users track down dependencies themselves. If a dependency is optional, list it as optional. The errors I listed are NOT optional: they occur immediately after an import, which means required packages are missing. BTW, you can refer to pip for the dependencies of a package; the list for transformers is actually not very long:
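
The pip dependency list referred to above was not reproduced in the comment, but as a rough sketch (assuming transformers is installed locally) it can be printed from the installed package metadata; entries carrying an `extra == ...` marker are the optional ones discussed in this thread:

```python
# Sketch: print the dependencies declared in transformers' own package
# metadata. Requirements guarded by an 'extra == "..."' marker belong to
# optional extras; the unguarded ones are the hard runtime dependencies.
from importlib.metadata import requires

for req in requires("transformers") or []:
    kind = "optional" if "extra ==" in req else "required"
    print(f"{kind:8s} {req}")
```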