Package Details: ollama-vulkan-git 0.3.9+5.r3417.20240902.ad3eb00b-2
| Git Clone URL: | https://aur.archlinux.org/ollama-nogpu-git.git (read-only) |
|---|---|
| Package Base: | ollama-nogpu-git |
| Description: | Create, run and share large language models (LLMs). With Vulkan backend. |
| Upstream URL: | https://github.com/jmorganca/ollama |
| Licenses: | MIT |
| Conflicts: | ollama |
| Provides: | ollama, ollama-git |
| Submitter: | dreieck |
| Maintainer: | None |
| Last Packager: | dreieck |
| Votes: | 5 |
| Popularity: | 0.47 |
| First Submitted: | 2024-04-17 15:09 (UTC) |
| Last Updated: | 2024-09-03 11:26 (UTC) |
Dependencies (9)
- gcc-libs (gcc-libs-git [AUR], gccrs-libs-git [AUR], gcc11-libs [AUR], gcc-libs-snapshot [AUR])
- glibc (glibc-git [AUR], glibc-linux4 [AUR], glibc-eac [AUR], glibc-eac-bin [AUR], glibc-eac-roco [AUR])
- openssl (openssl-git [AUR], openssl-static [AUR])
- bash (bash-devel-static-git [AUR], bash-devel-git [AUR], busybox-coreutils [AUR], bash-git [AUR]) (make)
- cmake (cmake-git [AUR]) (make)
- git (git-git [AUR], git-gl [AUR]) (make)
- go (go-git [AUR], gcc-go-git [AUR], go-sylixos [AUR], gcc-go-snapshot [AUR], gcc-go) (make)
- openblas (openblas-lapack [AUR]) (make)
- openmpi (openmpi-git [AUR]) (make)
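A minimal sketch of a manual build using the clone URL above; it assumes the official-repository providers of each dependency are acceptable (the AUR alternatives in brackets are optional substitutes):

```sh
# Install the runtime and make dependencies from the official repos.
sudo pacman -S --needed gcc-libs glibc openssl bash cmake git go openblas openmpi

# Clone the package base, then build and install it.
git clone https://aur.archlinux.org/ollama-nogpu-git.git
cd ollama-nogpu-git
makepkg -si
```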
Required by (24)
- ai-writer (requires ollama)
- alpaca-ai (requires ollama)
- alpaca-git (requires ollama)
- alpaka-git (requires ollama)
- anythingllm-desktop-bin (requires ollama)
- chatd (requires ollama)
- chatd-bin (requires ollama)
- gollama (requires ollama) (optional)
- gollama-git (requires ollama) (optional)
- hoarder (requires ollama) (optional)
- hollama-bin (requires ollama)
- litellm (requires ollama) (optional)
- litellm-ollama (requires ollama)
- llocal-bin (requires ollama)
- lobe-chat (requires ollama) (optional)
- maestro (requires ollama) (optional)
- maestro-git (requires ollama) (optional)
- ollamamodelupdater (requires ollama) (optional)
- ollamamodelupdater-bin (requires ollama) (optional)
- python-ollama (requires ollama)
- … (4 more not shown)
Latest Comments
brianwo commented on 2024-06-07 14:50 (UTC)
@dreieck, I have no idea about that.
dreieck commented on 2024-06-07 13:42 (UTC) (edited on 2024-06-07 14:46 (UTC) by dreieck)
@brianwo,
Do you have an idea which packages / which upstream project provide those executables? I have no idea what they are.
--
Hacky workaround:
Disabled the upstream-added oneAPI build by setting ONEAPI_ROOT to a hopefully non-existent directory. See https://github.com/ollama/ollama/issues/4511#issuecomment-2154973327.
Regards!
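A minimal sketch of that workaround as it could appear in the PKGBUILD's build() step; the ONEAPI_ROOT gate follows the linked upstream issue, while the source directory and the placeholder path are assumptions:

```sh
build() {
  cd "$srcdir/ollama"  # assumed source directory name
  # Point ONEAPI_ROOT at a directory that does not exist, so the
  # generate scripts find no oneAPI installation and skip building
  # the oneAPI runner. The path itself is an arbitrary placeholder.
  export ONEAPI_ROOT="/nonexistent/oneapi"
  go generate ./...
  go build .
}
```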
brianwo commented on 2024-06-07 11:32 (UTC) (edited on 2024-07-29 10:38 (UTC) by brianwo)
Unable to build; it looks like it now tries to build with oneAPI. It was working for 0.1.39+12.r2800.20240528.ad897080-1 previously.
dreieck commented on 2024-05-22 08:38 (UTC)
Reactivated Vulkan build by deactivating testing options.
dreieck commented on 2024-05-21 21:12 (UTC) (edited on 2024-05-21 21:12 (UTC) by dreieck)
Disabled Vulkan build since it currently fails.
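These two comments describe disabling and reactivating the Vulkan build; a PKGBUILD usually exposes that kind of decision as a user-editable switch near the top of the file. A hypothetical sketch of the pattern (the _build_vulkan variable and the skip variable are assumptions, not taken from this PKGBUILD):

```sh
# Hypothetical switch; set to 'false' while the Vulkan build fails.
_build_vulkan=true

build() {
  cd "$srcdir/ollama"  # assumed source directory name
  if [ "$_build_vulkan" != true ]; then
    # Assumed name, mirroring upstream's OLLAMA_SKIP_*_GENERATE pattern.
    export OLLAMA_SKIP_VULKAN_GENERATE=1
  fi
  go generate ./...
  go build .
}
```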
dreieck commented on 2024-05-21 20:44 (UTC)
Upstream has implemented CUDA and ROCm skip variables 🎉 — implementing them and uploading a fixed PKGBUILD …
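A sketch of how those skip switches can be applied in build(); the variable names follow upstream's gen_linux.sh convention (OLLAMA_SKIP_CUDA_GENERATE, OLLAMA_SKIP_ROCM_GENERATE), though they should be read as assumptions here:

```sh
build() {
  cd "$srcdir/ollama"  # assumed source directory name
  # Tell the generate step to leave out the CUDA and ROCm runners
  # even when CUDA/ROCm files happen to be installed on the builder.
  export OLLAMA_SKIP_CUDA_GENERATE=1
  export OLLAMA_SKIP_ROCM_GENERATE=1
  go generate ./...
  go build .
}
```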
nmanarch commented on 2024-05-18 09:41 (UTC)
Ok! That is sad. Perhaps they will agree to accept your code changes into their master? So many thanks for all of this.
dreieck commented on 2024-05-18 09:32 (UTC) (edited on 2024-05-18 09:38 (UTC) by dreieck)
Ahoj @nmanarch,
it seems upstream is moving too fast, requiring me to change the patch too often.
If I do not find an easier way to avoid building with ROCm or CUDA even when some of their files are installed, I might just give up.
↗ Upstream feature request to add a "kill switch" to force-off ROCm and CUDA.
nmanarch commented on 2024-05-18 09:17 (UTC) (edited on 2024-05-18 09:37 (UTC) by nmanarch)
Hello @dreieck. I want to try your Ollama Vulkan build, but the patch failed to apply; I tried adding --fuzz 3 and --ignore-whitespace, but it did not get any better. Thanks for any trick.
Submodule path 'llm/llama.cpp': checked out '614d3b914e1c3e02596f869649eb4f1d3b68614d'
Applying patch disable-rocm-cuda.gen_linux.sh.patch ...
patching file llm/generate/gen_linux.sh
Hunk #5 FAILED at 143.
Hunk #6 FAILED at 220.
2 out of 6 hunks FAILED -- saving rejects to file llm/generate/gen_linux.sh.rej
==> ERROR: A failure occurred in prepare(). Aborting...
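For reference, a sketch of a prepare() that applies such a patch with the loosened matching tried above; the patch file name comes from the log, everything else is generic:

```sh
prepare() {
  cd "$srcdir/ollama"  # assumed source directory name
  git submodule update --init llm/llama.cpp
  # Loosened matching as attempted in the comment; if hunks still
  # fail, the patch has to be regenerated against the new checkout.
  patch -Np1 --fuzz=3 --ignore-whitespace \
    -i "$srcdir/disable-rocm-cuda.gen_linux.sh.patch"
}
```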