Package Details: ollama-rocm-git 0.5.7.git+42cf4db6-1

Git Clone URL: https://aur.archlinux.org/ollama-rocm-git.git (read-only)
Package Base: ollama-rocm-git
Description: Create, run and share large language models (LLMs) with ROCm
Upstream URL: https://github.com/ollama/ollama
Licenses: MIT
Conflicts: ollama
Provides: ollama
Submitter: sr.team
Maintainer: wgottwalt
Last Packager: wgottwalt
Votes: 5
Popularity: 0.62
First Submitted: 2024-02-28 00:40 (UTC)
Last Updated: 2025-01-16 08:36 (UTC)

Pinned Comments

wgottwalt commented on 2024-11-09 10:46 (UTC) (edited on 2024-11-26 15:23 (UTC) by wgottwalt)

Looks like the ROCm 6.2.2-1 SDK has a malfunctioning compiler. It produces a broken ollama binary (fp16 issues). You may need to stay with ROCm 6.0.2 for now. I don't know if this got fixed in a newer build release, but the initial "-1" release of the SDK is broken.

ROCm 6.2.4 fixes this issue completely.
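
If you need to hold ROCm at a known-good version until a fixed SDK hits the repos, one option is installing it from the Arch Linux Archive and pinning it. This is only a rough sketch, not something from the comment above: the exact file name and version are assumptions, and the SDK is split across several rocm-* packages, so rocm-hip-sdk below is just an illustration of the approach.

    # install an older build straight from the Arch Linux Archive
    sudo pacman -U https://archive.archlinux.org/packages/r/rocm-hip-sdk/rocm-hip-sdk-6.0.2-1-x86_64.pkg.tar.zst

    # keep pacman -Syu from pulling the broken build back in
    # by adding the package to IgnorePkg in /etc/pacman.conf:
    IgnorePkg = rocm-hip-sdk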

Latest Comments

kyngs commented on 2024-03-08 16:51 (UTC)

Hi, I cannot build this package due to the following error:

error: option 'cf-protection=return' cannot be specified on this target
error: option 'cf-protection=branch' cannot be specified on this target
192 warnings and 2 errors generated when compiling for gfx1010.
make[3]: *** [CMakeFiles/ggml.dir/build.make:132: CMakeFiles/ggml.dir/ggml-cuda.cu.o] Error 1
make[3]: *** Waiting for unfinished jobs....
make[2]: *** [CMakeFiles/Makefile2:742: CMakeFiles/ggml.dir/all] Error 2
make[1]: *** [CMakeFiles/Makefile2:2906: examples/server/CMakeFiles/ext_server.dir/rule] Error 2
make: *** [Makefile:1193: ext_server] Error 2
llm/generate/generate_linux.go:3: running "bash": exit status 2
==> ERROR: A failure occurred in build().

Lindenk commented on 2024-03-07 03:50 (UTC)

Even with the pinned changes, the build is failing for me with:

CMake Error at /usr/share/cmake/Modules/CMakeTestCCompiler.cmake:67 (message):
  The C compiler

    "/usr/bin/cc"

  is not able to compile a simple test program.

  It fails with the following output:

    Change Dir: '/home/lindenk/.cache/yay/ollama-rocm-git/src/ollama/llm/llama.cpp/build/linux/x86_64/cpu/CMakeFiles/CMakeScratch/TryCompile-den2H4'

    Run Build Command(s): /usr/bin/cmake -E env VERBOSE=1 /usr/bin/make -f Makefile cmTC_5cbab/fast
    /usr/bin/make  -f CMakeFiles/cmTC_5cbab.dir/build.make CMakeFiles/cmTC_5cbab.dir/build
    make[1]: Entering directory '/home/lindenk/.cache/yay/ollama-rocm-git/src/ollama/llm/llama.cpp/build/linux/x86_64/cpu/CMakeFiles/CMakeScratch/TryCompile-den2H4'
    Building C object CMakeFiles/cmTC_5cbab.dir/testCCompiler.c.o
    /usr/bin/cc   -march=x86-64 -mtune=generic -O2 -pipe -fno-plt -fexceptions         -Wp,-D_FORTIFY_SOURCE=2 -Wformat -Werror=format-security #        -fstack-clash-protection -fcf-protection -g -ffile-prefix-map=/home/lindenk/.cache/yay/ollama-rocm-git/src=/usr/src/debug/ollama-rocm-git -flto=auto  -fPIE -o CMakeFiles/cmTC_5cbab.dir/testCCompiler.c.o -c /home/lindenk/.cache/yay/ollama-rocm-git/src/ollama/llm/llama.cpp/build/linux/x86_64/cpu/CMakeFiles/CMakeScratch/TryCompile-den2H4/testCCompiler.c
    cc: fatal error: no input files
    compilation terminated.
    make[1]: *** [CMakeFiles/cmTC_5cbab.dir/build.make:78: CMakeFiles/cmTC_5cbab.dir/testCCompiler.c.o] Error 1
    make[1]: Leaving directory '/home/lindenk/.cache/yay/ollama-rocm-git/src/ollama/llm/llama.cpp/build/linux/x86_64/cpu/CMakeFiles/CMakeScratch/TryCompile-den2H4'
    make: *** [Makefile:127: cmTC_5cbab/fast] Error 2

  CMake will not be able to correctly generate this project.
Call Stack (most recent call first):
  CMakeLists.txt:2 (project)

ZappaBoy commented on 2024-03-05 12:28 (UTC)

Same compile error as @nameiwillforget; fixed it using @bullet92's workaround of commenting out the line in /etc/makepkg.conf:

45 # -fstack-clash-protection -fcf-protection
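
A note on this workaround: in /etc/makepkg.conf these flags sit inside a quoted, multi-line CFLAGS (and CXXFLAGS) assignment, so inserting a '#' mid-string does not comment them out; the '#' ends up on the compiler command line, the shell drops everything after it, and that is what produces the "cc: fatal error: no input files" failure in Lindenk's log above. Removing the two flags from the quoted string is the cleaner edit. A rough sketch (the surrounding flags are taken from the log above and may not match your makepkg.conf exactly):

    # /etc/makepkg.conf, roughly as shipped at the time:
    CFLAGS="-march=x86-64 -mtune=generic -O2 -pipe -fno-plt -fexceptions \
            -Wp,-D_FORTIFY_SOURCE=2 -Wformat -Werror=format-security \
            -fstack-clash-protection -fcf-protection"

    # edited: the two offending flags removed (mirror the change in CXXFLAGS)
    CFLAGS="-march=x86-64 -mtune=generic -O2 -pipe -fno-plt -fexceptions \
            -Wp,-D_FORTIFY_SOURCE=2 -Wformat -Werror=format-security"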

bullet92 commented on 2024-03-02 13:00 (UTC) (edited on 2024-03-02 15:50 (UTC) by bullet92)

Hi, without the hipblas package the build does not succeed, failing with the following compilation error:

CMake Error at CMakeLists.txt:500 (find_package):
  By not providing "Findhipblas.cmake" in CMAKE_MODULE_PATH this project has
  asked CMake to find a package configuration file provided by "hipblas", but
  CMake did not find one.

  Could not find a package configuration file provided by "hipblas" with any
  of the following names:

    hipblasConfig.cmake
    hipblas-config.cmake

  Add the installation prefix of "hipblas" to CMAKE_PREFIX_PATH or set
  "hipblas_DIR" to a directory containing one of the above files.  If
  "hipblas" provides a separate development package or SDK, be sure it has
  been installed.

Suggestion: add hipblas as a dependency.
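
hipblas is available in the official Arch repositories, so installing it should be enough to get past the find_package error above:

    sudo pacman -S --needed hipblas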

In any case the compile still failed, so I had to modify my /etc/makepkg.conf and comment out this line:

 45 # -fstack-clash-protection -fcf-protection

After starting ollama with systemctl and checking the output with status, I got:

level=INFO source=routes.go:1044 msg="no GPU detected"

So I edited the service, adding "HSA_OVERRIDE_GFX_VERSION=10.3.0" to the environment in ollama.service:

sudo systemctl edit ollama.service

    [Service]
    Environment="HOME=/var/lib/ollama" "GIN_MODE=release" "HSA_OVERRIDE_GFX_VERSION=10.3.0"

and installed some dependencies:

  pacman -S rocm-hip-sdk rocm-opencl-sdk clblast go

(In my case rocm-hip-sdk and rocm-opencl-sdk were missing.) That now gives me "Radeon GPU detected".
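
To apply an override like the one above and confirm the card is picked up, the usual systemd steps look roughly like this (the drop-in path is the systemd default; the grep is only an illustration, match it to whatever your build actually logs):

    # systemctl edit writes the override to
    # /etc/systemd/system/ollama.service.d/override.conf and reloads the daemon,
    # so a restart is enough for the new environment to take effect
    sudo systemctl restart ollama.service

    # then check the startup log for the GPU detection message
    journalctl -u ollama.service -b | grep -i gpu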

sr.team commented on 2024-03-01 17:16 (UTC)

@nameiwillforget you can build the ROCm version in Docker, without CUDA
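
sr.team did not post the exact steps, so this is only a rough sketch of a containerized build; the image, dependency list, and user setup are assumptions on my part (if makepkg complains about missing dependencies, install whatever the PKGBUILD actually lists):

    docker run --rm -it -v "$PWD":/out archlinux bash

    # inside the container; makepkg refuses to run as root, so build as a normal user
    pacman -Syu --needed --noconfirm base-devel git go cmake rocm-hip-sdk
    useradd -m builder
    su - builder -c 'git clone https://aur.archlinux.org/ollama-rocm-git.git && cd ollama-rocm-git && makepkg'
    cp /home/builder/ollama-rocm-git/*.pkg.tar.zst /out/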

nameiwillforget commented on 2024-03-01 11:18 (UTC) (edited on 2024-03-01 16:22 (UTC) by nameiwillforget)

@sr.team No, I installed cuda earlier because I thought I needed it, but once I realized I didn't, I uninstalled it using yay -Rcs. That was before I tried to install ollama-rocm-git.

Edit: I re-installed cuda, and after I added it to my PATH, the compilation failed with the same error.

sr.team commented on 2024-02-29 17:11 (UTC)

@nameiwillforget do you have cuda installed? The ollama generator tried to build llama.cpp with CUDA for you.

nameiwillforget commented on 2024-02-29 17:09 (UTC)

Compilation fails when installed with yay, with the following error message:

/home/alex/.cache/yay/ollama-rocm-git/src/ollama/llm/llama.cpp/ggml-cuda.cu:9673:5: note: in instantiation of function template specialization 'pool2d_nchw_kernel<float, float>' requested here
    pool2d_nchw_kernel<<<block_nums, CUDA_IM2COL_BLOCK_SIZE, 0, main_stream>>>(IH, IW, OH, OW, k1, k0, s1, s0, p1, p0, parallel_elements, src0_dd, dst_dd, op);
    ^
/home/alex/.cache/yay/ollama-rocm-git/src/ollama/llm/llama.cpp/ggml-cuda.cu:6947:25: warning: enumeration value 'GGML_OP_POOL_COUNT' not handled in switch [-Wswitch]
                switch (op) {
                        ^~
error: option 'cf-protection=return' cannot be specified on this target
error: option 'cf-protection=branch' cannot be specified on this target
184 warnings and 2 errors generated when compiling for gfx1010.
make[3]: *** [CMakeFiles/ggml.dir/build.make:135: CMakeFiles/ggml.dir/ggml-cuda.cu.o] Error 1
make[3]: *** Waiting for unfinished jobs....
make[3]: Leaving directory '/home/alex/.cache/yay/ollama-rocm-git/src/ollama/llm/llama.cpp/build/linux/x86_64/rocm_v1'
make[2]: *** [CMakeFiles/Makefile2:745: CMakeFiles/ggml.dir/all] Error 2
make[2]: Leaving directory '/home/alex/.cache/yay/ollama-rocm-git/src/ollama/llm/llama.cpp/build/linux/x86_64/rocm_v1'
make[1]: *** [CMakeFiles/Makefile2:2910: examples/server/CMakeFiles/ext_server.dir/rule] Error 2
make[1]: Leaving directory '/home/alex/.cache/yay/ollama-rocm-git/src/ollama/llm/llama.cpp/build/linux/x86_64/rocm_v1'
make: *** [Makefile:1196: ext_server] Error 2

llm/generate/generate_linux.go:3: running "bash": exit status 2
==> ERROR: A failure occurred in build().