@agilob it's a '-git' package; the version is updated automatically during the build.
Package Details: ollama-rocm-git 0.5.8.git+0189bdd0-1
| Git Clone URL: | https://aur.archlinux.org/ollama-rocm-git.git (read-only) |
| --- | --- |
| Package Base: | ollama-rocm-git |
| Description: | Create, run and share large language models (LLMs) with ROCm |
| Upstream URL: | https://github.com/ollama/ollama |
| Licenses: | MIT |
| Conflicts: | ollama |
| Provides: | ollama |
| Submitter: | sr.team |
| Maintainer: | wgottwalt |
| Last Packager: | wgottwalt |
| Votes: | 5 |
| Popularity: | 0.56 |
| First Submitted: | 2024-02-28 00:40 (UTC) |
| Last Updated: | 2025-02-11 12:05 (UTC) |
Dependencies (26)
- comgr (opencl-amdAUR)
- gcc-libs (gcc-libs-gitAUR, gccrs-libs-gitAUR, gcc11-libsAUR, gcc-libs-snapshotAUR)
- hip-runtime-amd (opencl-amdAUR)
- hipblas (opencl-amd-devAUR)
- hsa-rocr (opencl-amdAUR)
- libdrm (libdrm-gitAUR)
- libelf (elfutils-gitAUR)
- numactl (numactl-gitAUR)
- rocblas (opencl-amd-devAUR)
- rocsolver (opencl-amd-devAUR)
- rocsparse (rocsparse-gfx1010AUR, opencl-amd-devAUR)
- gcc-libs (gcc-libs-gitAUR, gccrs-libs-gitAUR, gcc11-libsAUR, gcc-libs-snapshotAUR) (make)
- git (git-gitAUR, git-glAUR) (make)
- go (go-gitAUR, gcc-go-gitAUR, gcc-go-snapshotAUR, gcc-go) (make)
- hip-runtime-amd (opencl-amdAUR) (make)
- hipblas (opencl-amd-devAUR) (make)
- hsa-rocr (opencl-amdAUR) (make)
- libdrm (libdrm-gitAUR) (make)
- libelf (elfutils-gitAUR) (make)
- numactl (numactl-gitAUR) (make)
- (6 further dependencies omitted)
Required by (36)
- ai-writer (requires ollama)
- alpaca-ai (requires ollama)
- alpaca-git (requires ollama) (optional)
- alpaka-git (requires ollama)
- anythingllm-desktop-bin (requires ollama)
- calt-git (requires ollama)
- chatd (requires ollama)
- chatd-bin (requires ollama)
- codename-goose (requires ollama) (optional)
- codename-goose-bin (requires ollama) (optional)
- gollama (requires ollama) (optional)
- gollama-git (requires ollama) (optional)
- hoarder (requires ollama) (optional)
- hollama-bin (requires ollama)
- litellm (requires ollama) (optional)
- litellm-ollama (requires ollama)
- llocal-bin (requires ollama)
- lobe-chat (requires ollama) (optional)
- lumen (requires ollama) (optional)
- maestro (requires ollama) (optional)
- maestro-git (requires ollama) (optional)
- nyarchassistant (requires ollama) (optional)
- ollama-chat-desktop-git (requires ollama)
- ollama-lab (requires ollama) (optional)
- ollama-lab-bin (requires ollama) (optional)
- ollama-runner-git (requires ollama)
- ollamamodelupdater (requires ollama) (optional)
- ollamamodelupdater-bin (requires ollama) (optional)
- open-webui (requires ollama) (optional)
- open-webui-no-venv (requires ollama) (optional)
- pulse-browser-git (requires ollama)
- python-ollama (requires ollama)
- python-ollama-git (requires ollama)
- screenshot_llm (requires ollama) (optional)
- screenshot_llm-git (requires ollama) (optional)
- tlm (requires ollama) (optional)
Sources (5)
Latest Comments
sr.team commented on 2024-03-16 15:57 (UTC)
kyngs commented on 2024-03-08 16:51 (UTC)
Hi, I cannot build this package due to the following error:
error: option 'cf-protection=return' cannot be specified on this target
error: option 'cf-protection=branch' cannot be specified on this target
192 warnings and 2 errors generated when compiling for gfx1010.
make[3]: *** [CMakeFiles/ggml.dir/build.make:132: CMakeFiles/ggml.dir/ggml-cuda.cu.o] Error 1
make[3]: *** Waiting for unfinished jobs....
make[2]: *** [CMakeFiles/Makefile2:742: CMakeFiles/ggml.dir/all] Error 2
make[1]: *** [CMakeFiles/Makefile2:2906: examples/server/CMakeFiles/ext_server.dir/rule] Error 2
make: *** [Makefile:1193: ext_server] Error 2
llm/generate/generate_linux.go:3: running "bash": exit status 2
==> ERROR: A failure occurred in build().
Lindenk commented on 2024-03-07 03:50 (UTC)
Even with the pinned changes, the build is failing for me with:
CMake Error at /usr/share/cmake/Modules/CMakeTestCCompiler.cmake:67 (message):
The C compiler
"/usr/bin/cc"
is not able to compile a simple test program.
It fails with the following output:
Change Dir: '/home/lindenk/.cache/yay/ollama-rocm-git/src/ollama/llm/llama.cpp/build/linux/x86_64/cpu/CMakeFiles/CMakeScratch/TryCompile-den2H4'
Run Build Command(s): /usr/bin/cmake -E env VERBOSE=1 /usr/bin/make -f Makefile cmTC_5cbab/fast
/usr/bin/make -f CMakeFiles/cmTC_5cbab.dir/build.make CMakeFiles/cmTC_5cbab.dir/build
make[1]: Entering directory '/home/lindenk/.cache/yay/ollama-rocm-git/src/ollama/llm/llama.cpp/build/linux/x86_64/cpu/CMakeFiles/CMakeScratch/TryCompile-den2H4'
Building C object CMakeFiles/cmTC_5cbab.dir/testCCompiler.c.o
/usr/bin/cc -march=x86-64 -mtune=generic -O2 -pipe -fno-plt -fexceptions -Wp,-D_FORTIFY_SOURCE=2 -Wformat -Werror=format-security # -fstack-clash-protection -fcf-protection -g -ffile-prefix-map=/home/lindenk/.cache/yay/ollama-rocm-git/src=/usr/src/debug/ollama-rocm-git -flto=auto -fPIE -o CMakeFiles/cmTC_5cbab.dir/testCCompiler.c.o -c /home/lindenk/.cache/yay/ollama-rocm-git/src/ollama/llm/llama.cpp/build/linux/x86_64/cpu/CMakeFiles/CMakeScratch/TryCompile-den2H4/testCCompiler.c
cc: fatal error: no input files
compilation terminated.
make[1]: *** [CMakeFiles/cmTC_5cbab.dir/build.make:78: CMakeFiles/cmTC_5cbab.dir/testCCompiler.c.o] Error 1
make[1]: Leaving directory '/home/lindenk/.cache/yay/ollama-rocm-git/src/ollama/llm/llama.cpp/build/linux/x86_64/cpu/CMakeFiles/CMakeScratch/TryCompile-den2H4'
make: *** [Makefile:127: cmTC_5cbab/fast] Error 2
CMake will not be able to correctly generate this project.
Call Stack (most recent call first):
CMakeLists.txt:2 (project)
ZappaBoy commented on 2024-03-05 12:28 (UTC)
Same compile error as @nameiwillforget; fixed using @bullet92's workaround of commenting out the hardening flags in /etc/makepkg.conf:
# -fstack-clash-protection -fcf-protection
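The workaround above can be rehearsed safely on a copy before touching the system file (a minimal sketch; the exact CFLAGS layout in your makepkg.conf may differ, and the `makepkg --config` invocation at the end is an assumption about how you build):

```shell
# Strip the two hardening flags from a copy of makepkg.conf, leaving the
# system file untouched; the GPU-targeting compiler rejects -fcf-protection.
conf=$(mktemp)
printf 'CFLAGS="-march=x86-64 -O2 -fstack-clash-protection -fcf-protection"\n' > "$conf"
sed -i 's/ -fstack-clash-protection//; s/ -fcf-protection//' "$conf"
cat "$conf"   # → CFLAGS="-march=x86-64 -O2"
# build against the modified config instead of /etc/makepkg.conf:
#   makepkg --config "$conf" -si
```

Commenting the flags out directly in /etc/makepkg.conf, as described above, has the same effect but disables the hardening for every package you build afterwards.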
bullet92 commented on 2024-03-02 13:00 (UTC) (edited on 2024-03-02 15:50 (UTC) by bullet92)
Hi, without the hipblas package the build fails with the following compilation error:
CMake Error at CMakeLists.txt:500 (find_package):
By not providing "Findhipblas.cmake" in CMAKE_MODULE_PATH this project has
asked CMake to find a package configuration file provided by "hipblas", but
CMake did not find one.
Could not find a package configuration file provided by "hipblas" with any
of the following names:
hipblasConfig.cmake
hipblas-config.cmake
Add the installation prefix of "hipblas" to CMAKE_PREFIX_PATH or set
"hipblas_DIR" to a directory containing one of the above files. If
"hipblas" provides a separate development package or SDK, be sure it has
been installed.
Suggestion: add hipblas as a dependency.
In any case the compile failed, so I had to modify my /etc/makepkg.conf and comment out this line:
# -fstack-clash-protection -fcf-protection
After starting ollama with systemctl and checking the output of systemctl status, I got:
level=INFO source=routes.go:1044 msg="no GPU detected"
So I edited the service, adding "HSA_OVERRIDE_GFX_VERSION=10.3.0" to the environment in ollama.service:
sudo systemctl edit ollama.service
[Service]
Environment="HOME=/var/lib/ollama" "GIN_MODE=release" "HSA_OVERRIDE_GFX_VERSION=10.3.0"
and installed some dependencies:
pacman -S rocm-hip-sdk rocm-opencl-sdk clblast go
(in my case rocm-hip-sdk and rocm-opencl-sdk were missing). After that, the log shows "Radeon GPU detected".
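The service override described above, written out as a persistent drop-in (a sketch: `systemctl edit ollama.service` creates a file at the path shown, and the value 10.3.0 makes ROCm treat the card as gfx1030, so pick the value that matches your GPU generation):

```ini
# /etc/systemd/system/ollama.service.d/override.conf
[Service]
Environment="HOME=/var/lib/ollama" "GIN_MODE=release" "HSA_OVERRIDE_GFX_VERSION=10.3.0"
```

Apply it with `sudo systemctl daemon-reload && sudo systemctl restart ollama.service`, then re-check `systemctl status` for the GPU-detection message.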
sr.team commented on 2024-03-01 17:16 (UTC)
@nameiwillforget you can build the ROCm version in Docker, without CUDA.
nameiwillforget commented on 2024-03-01 11:18 (UTC) (edited on 2024-03-01 16:22 (UTC) by nameiwillforget)
@sr.team No, I installed cuda earlier because I thought I needed it, but once I realized I didn't I uninstalled it using yay -Rcs. That was before I tried to install ollama-rocm-git.
Edit: I re-installed cuda, and after I added it to my PATH, the compilation fails with the same error.
sr.team commented on 2024-02-29 17:11 (UTC)
@nameiwillforget do you have CUDA installed? The ollama generator tried to build llama.cpp with CUDA for you.
nameiwillforget commented on 2024-02-29 17:09 (UTC)
Compilation fails when installed with yay, with the following error message:
/home/alex/.cache/yay/ollama-rocm-git/src/ollama/llm/llama.cpp/ggml-cuda.cu:9673:5: note: in instantiation of function template specialization 'pool2d_nchw_kernel<float, float>' requested here
pool2d_nchw_kernel<<<block_nums, CUDA_IM2COL_BLOCK_SIZE, 0, main_stream>>>(IH, IW, OH, OW, k1, k0, s1, s0, p1, p0, parallel_elements, src0_dd, dst_dd, op);
^
/home/alex/.cache/yay/ollama-rocm-git/src/ollama/llm/llama.cpp/ggml-cuda.cu:6947:25: warning: enumeration value 'GGML_OP_POOL_COUNT' not handled in switch [-Wswitch]
switch (op) {
^~
error: option 'cf-protection=return' cannot be specified on this target
error: option 'cf-protection=branch' cannot be specified on this target
184 warnings and 2 errors generated when compiling for gfx1010.
make[3]: *** [CMakeFiles/ggml.dir/build.make:135: CMakeFiles/ggml.dir/ggml-cuda.cu.o] Error 1
make[3]: *** Waiting for unfinished jobs....
make[3]: Leaving directory '/home/alex/.cache/yay/ollama-rocm-git/src/ollama/llm/llama.cpp/build/linux/x86_64/rocm_v1'
make[2]: *** [CMakeFiles/Makefile2:745: CMakeFiles/ggml.dir/all] Error 2
make[2]: Leaving directory '/home/alex/.cache/yay/ollama-rocm-git/src/ollama/llm/llama.cpp/build/linux/x86_64/rocm_v1'
make[1]: *** [CMakeFiles/Makefile2:2910: examples/server/CMakeFiles/ext_server.dir/rule] Error 2
make[1]: Leaving directory '/home/alex/.cache/yay/ollama-rocm-git/src/ollama/llm/llama.cpp/build/linux/x86_64/rocm_v1'
make: *** [Makefile:1196: ext_server] Error 2
llm/generate/generate_linux.go:3: running "bash": exit status 2
==> ERROR: A failure occurred in build().
Pinned Comments
wgottwalt commented on 2024-11-09 10:46 (UTC) (edited on 2024-11-26 15:23 (UTC) by wgottwalt)
Looks like the ROCm 6.2.2-1 SDK has a malfunctioning compiler. It produces a broken ollama binary (fp16 issues). You may need to stay with ROCm 6.0.2 for now. I don't know if this got fixed in a newer build release. But the initial SDK version "-1" is broken.
ROCm 6.2.4 fixes this issue completely.
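For anyone who hit the broken 6.2.2-1 window, one way to stay on a known-good ROCm release until a fixed SDK lands is to hold the relevant packages back (a sketch; the package names are taken from the dependency list above and may not cover every ROCm component on your system):

```ini
# /etc/pacman.conf — skip upgrades for the ROCm packages this build depends on
IgnorePkg = hip-runtime-amd hipblas hsa-rocr rocblas rocsolver rocsparse comgr
```

Remove the entry again once a fixed SDK (6.2.4 or later, per the note above) is in the repos, so the stack does not fall behind.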