Package Details: stable-diffusion-webui-git 1.9.4.r0.gfeee37d-2
Git Clone URL: https://aur.archlinux.org/stable-diffusion-webui-git.git (read-only)
Package Base: stable-diffusion-webui-git
Description: Stable Diffusion Web UI (AUTOMATIC1111)
Upstream URL: https://github.com/AUTOMATIC1111/stable-diffusion-webui
Licenses: AGPL3
Provides: stable-diffusion-ui
Submitter: bendavis78
Maintainer: bendavis78
Last Packager: bendavis78
Votes: 8
Popularity: 0.27
First Submitted: 2024-06-29 22:44 (UTC)
Last Updated: 2024-07-10 23:33 (UTC)
trougnouf commented on 2024-10-25 15:41 (UTC)
Per EndlessEden: this doesn't work with ROCm, because it downloads whatever the default pip torch package is.
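A possible workaround for the ROCm case (untested here): launch_utils.py builds the torch install command from the TORCH_COMMAND environment variable, so it can be pointed at an AMD wheel index instead of the default CUDA one. The install path and the rocm6.0 index below are assumptions; match them to your setup and ROCm version.
# Rebuild the venv, then set TORCH_COMMAND before the first launch
rm -rf /opt/stable-diffusion-web-ui/venv
export TORCH_COMMAND="pip install torch torchvision --index-url https://download.pytorch.org/whl/rocm6.0"
cd "/opt/stable-diffusion-web-ui" && ./webui.sh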
EndlessEden commented on 2024-10-12 14:32 (UTC) (edited on 2024-10-12 14:34 (UTC) by EndlessEden)
Is there some way to get this to use the system version of PyTorch? Also, Python 3.11 works and has better performance.
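One way to try the system PyTorch (untested, and it assumes a ROCm- or CUDA-enabled python-pytorch package is already installed from the repos) is to rebuild the venv with access to system site packages and skip webui's own package installation:
# Rebuild the venv so it can see the system-wide torch
rm -rf /opt/stable-diffusion-web-ui/venv
python -m venv --system-site-packages /opt/stable-diffusion-web-ui/venv
# --skip-install tells webui not to install missing packages, including torch
cd "/opt/stable-diffusion-web-ui" && ./webui.sh --skip-install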
seeseemelk commented on 2024-10-06 12:48 (UTC)
The models directory in /var/opt/stable-diffusion-webui/data/models is only readable by the sdwebui user. Is this intentional?
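If the permissions are intentional but you want to browse or add models from your own account, ACLs are one option; the path comes from the comment above, and the exact ownership is an assumption worth checking first:
# See who owns the directory and how it is protected
ls -ld /var/opt/stable-diffusion-webui/data/models
# Grant your user access without changing the sdwebui ownership
sudo setfacl -R -m u:$USER:rwX /var/opt/stable-diffusion-webui/data/models
sudo setfacl -R -d -m u:$USER:rwX /var/opt/stable-diffusion-webui/data/models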
bhill commented on 2024-08-11 20:35 (UTC)
I've switched distros since originally making this and have not been using this package, so I'm disowning it.
My hope is that someone who uses this package and cares deeply will adopt and foster it.
Kanishk598 commented on 2024-07-12 12:41 (UTC)
The package requires a specific PyTorch version that isn't available on PyPI.
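A quick way to see which torch releases are actually reachable from the venv's interpreter (the available wheels depend on the Python version, which is the real constraint here); note that pip index is still marked experimental, so treat this as a sketch:
# Check the Python version the venv uses and the torch releases pip can see for it
/opt/stable-diffusion-web-ui/venv/bin/python3 --version
/opt/stable-diffusion-web-ui/venv/bin/python3 -m pip index versions torch --extra-index-url https://download.pytorch.org/whl/cu121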
kentechgeek commented on 2024-05-12 23:08 (UTC)
For those who are struggling, stabilitymatrix in the AUR works. Look it up to see what it is.
rufusreal commented on 2024-05-03 18:57 (UTC)
Thanks @rabcor, with this it works for me:
# Download python 3.11 from the AUR
yay -S python311
# Remove the previous virtual environment
rm -rf /opt/stable-diffusion-web-ui/venv
# Recreate the virtual environment with Python 3.11 and activate it
python3.11 -m venv /opt/stable-diffusion-web-ui/venv
source /opt/stable-diffusion-web-ui/venv/bin/activate
# Install wrapt as a dependency
pip install wrapt
# Install the remaining dependencies and start the server
cd "/opt/stable-diffusion-web-ui" && ./webui.sh
rabcor commented on 2024-04-30 07:54 (UTC) (edited on 2024-05-03 14:16 (UTC) by rabcor)
I got it to work. You have to use Python 3.10 or 3.11 or you run into errors; I use 3.11 because it is faster. Even if you bypass the torch version error you get on 3.12, you'll just run into a tokenizers error instead, so this is the most straightforward way.
yay -S python311
rm -rf /opt/stable-diffusion-web-ui/venv
python3.11 -m venv /opt/stable-diffusion-web-ui/venv
cd "/opt/stable-diffusion-web-ui" && ./webui.sh
# Wait for the script to set up the venv, then stop the webui and continue
/opt/stable-diffusion-web-ui/venv/bin/pip install fastapi==0.110.3
cd "/opt/stable-diffusion-web-ui" && ./webui.sh
The PKGBUILD could be updated to require python310 or python311, but that wouldn't really fix the problem, since stable-diffusion-web-ui won't use it by default (it always tries to use the system default python3; avoiding that would require a patch), and it also installs a combination of albumentations, fastapi and pydantic that aren't cross-compatible. Upgrading fastapi was the best solution to that problem.
Issue for that here https://github.com/AUTOMATIC1111/stable-diffusion-webui/issues/15662
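Instead of patching, webui.sh takes its interpreter from the python_cmd variable in webui-user.sh (present in upstream's template), so something like this should avoid the system default python3 without touching the launch code; the install path below is assumed:
# In /opt/stable-diffusion-web-ui/webui-user.sh, set:
python_cmd="python3.11"
# then recreate the venv so it is built with that interpreter
rm -rf /opt/stable-diffusion-web-ui/venv
cd "/opt/stable-diffusion-web-ui" && ./webui.sh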
PS: People with low VRAM should use the --medvram or --lowvram options; they make a huge difference.
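These can be passed directly or made persistent through COMMANDLINE_ARGS in webui-user.sh (install path assumed as above):
# One-off
cd "/opt/stable-diffusion-web-ui" && ./webui.sh --medvram
# Persistent, in webui-user.sh
export COMMANDLINE_ARGS="--medvram"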
PPS: This is optional since it's not strictly necessary (though certain components and extensions require it). Onnxruntime needs the correct version of CUDA to work properly (https://onnxruntime.ai/docs/execution-providers/CUDA-ExecutionProvider.html). Here's how to fix it for the current version:
/opt/stable-diffusion-web-ui/venv/bin/python3 -m pip uninstall onnxruntime-gpu
/opt/stable-diffusion-web-ui/venv/bin/python3 -m pip install onnxruntime-gpu==1.17.1 --extra-index-url https://aiinfra.pkgs.visualstudio.com/PublicPackages/_packaging/onnxruntime-cuda-12/pypi/simple/
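A quick sanity check afterwards (not from the upstream docs) to confirm the GPU build is the one installed in the venv:
/opt/stable-diffusion-web-ui/venv/bin/python3 -c "import onnxruntime; print(onnxruntime.get_available_providers())"
# CUDAExecutionProvider should appear in the list; if only CPUExecutionProvider shows up, the CPU build is still installed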
rufusreal commented on 2024-04-28 12:21 (UTC) (edited on 2024-04-28 12:23 (UTC) by rufusreal)
There's no option to install torch version 2.1.2
ERROR: Could not find a version that satisfies the requirement torch==2.1.2 (from versions: 2.2.0, 2.2.0+cu121, 2.2.1, 2.2.1+cu121, 2.2.2, 2.2.2+cu121, 2.3.0, 2.3.0+cu121)
ERROR: No matching distribution found for torch==2.1.2
Traceback (most recent call last):
  File "/opt/stable-diffusion-web-ui/launch.py", line 48, in <module>
    main()
  File "/opt/stable-diffusion-web-ui/launch.py", line 39, in main
    prepare_environment()
  File "/opt/stable-diffusion-web-ui/modules/launch_utils.py", line 380, in prepare_environment
    run(f'"{python}" -m {torch_command}', "Installing torch and torchvision", "Couldn't install torch", live=True)
  File "/opt/stable-diffusion-web-ui/modules/launch_utils.py", line 115, in run
    raise RuntimeError("\n".join(error_bits))
RuntimeError: Couldn't install torch.
Command: "/opt/stable-diffusion-web-ui/venv/bin/python3" -m pip install torch==2.1.2 torchvision==0.16.2 --extra-index-url https://download.pytorch.org/whl/cu121
Error code: 1
Pinned Comments
bhill commented on 2023-11-21 09:06 (UTC)
Make sure to consider your hardware before installing AI programs: they're very resource-intensive and require capable hardware just to run.
For example: if your intended device has less than 6 GB of VRAM, I would advise against installing this. Everything will download and the UI will boot, but once you try to generate an image, you'll receive an error saying there's not enough VRAM.
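If you're not sure how much VRAM your card has, a quick check (nvidia-smi ships with the NVIDIA driver, rocm-smi with ROCm):
# NVIDIA
nvidia-smi --query-gpu=name,memory.total --format=csv
# AMD / ROCm
rocm-smi --showmeminfo vram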