Package Details: ollama-nogpu-git 0.5.4+r3746+ga72f2dce4-1
Git Clone URL: https://aur.archlinux.org/ollama-nogpu-git.git (read-only)
Package Base: ollama-nogpu-git
Description: Create, run and share large language models (LLMs)
Upstream URL: https://github.com/ollama/ollama
Licenses: MIT
Conflicts: ollama
Provides: ollama
Submitter: dreieck
Maintainer: envolution
Last Packager: envolution
Votes: 5
Popularity: 0.29
First Submitted: 2024-04-17 15:09 (UTC)
Last Updated: 2024-12-18 07:53 (UTC)
Dependencies (3)
- cmake (cmake-git [AUR]) (make)
- git (git-git [AUR], git-gl [AUR]) (make)
- go (go-git [AUR], gcc-go-git [AUR], gcc-go-snapshot [AUR], gcc-go) (make)
Required by (25)
- ai-writer (requires ollama)
- alpaca-ai (requires ollama)
- alpaca-git (requires ollama) (optional)
- alpaka-git (requires ollama)
- anythingllm-desktop-bin (requires ollama)
- chatd (requires ollama)
- chatd-bin (requires ollama)
- gollama (requires ollama) (optional)
- gollama-git (requires ollama) (optional)
- hoarder (requires ollama) (optional)
- hollama-bin (requires ollama)
- litellm (requires ollama) (optional)
- litellm-ollama (requires ollama)
- llocal-bin (requires ollama)
- lobe-chat (requires ollama) (optional)
- lumen (requires ollama) (optional)
- maestro (requires ollama) (optional)
- maestro-git (requires ollama) (optional)
- ollamamodelupdater (requires ollama) (optional)
- ollamamodelupdater-bin (requires ollama) (optional)
- …and 5 more.
Latest Comments
envolution commented on 2024-12-16 02:19 (UTC)
@lavilao you can try this new build; it should work without AVX.
nmanarch commented on 2024-10-29 18:30 (UTC)
@lavilao, I just saw your question about running without AVX; I hope you have had success. If not, you can apply this: https://github.com/ollama/ollama/issues/2187#issuecomment-2262876198 That is, download this ollama Vulkan AUR package, change the line of code described there, rebuild the package with that change, and install the new build.
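For readers following that advice, the general AUR workflow being described looks roughly like the following; the actual line to change lives in the linked GitHub comment and is deliberately not reproduced here:

```bash
# Clone the package, apply the change from the linked issue comment by hand,
# then rebuild and install the modified package.
git clone https://aur.archlinux.org/ollama-nogpu-git.git
cd ollama-nogpu-git
$EDITOR PKGBUILD   # edit as described in the upstream issue comment
makepkg -si        # build with the change applied and install the result
```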
nmanarch commented on 2024-10-29 18:20 (UTC)
Hi @dreieck! That is sad; this is the only package I found that runs ollama with Vulkan. Perhaps I can take it over, but could you provide help if problems occur with a future upgrade that becomes too hard to apply?
dreieck commented on 2024-09-03 11:29 (UTC)
Does anyone want to take over?
I notice that I don't use this software at all, so I am disowning it.
Regards!
dreieck commented on 2024-09-03 11:29 (UTC)
I now just changed `go get` to `go mod download`. Regards!
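As a rough illustration of where that change sits, here is a minimal sketch of a prepare() step using `go mod download`; the function layout and the source directory name are assumptions, not the actual PKGBUILD:

```bash
prepare() {
  cd "$srcdir/ollama"   # assumed source directory name
  # Download module dependencies into the module cache. Unlike `go get`,
  # this does not rewrite go.mod/go.sum; it only fetches what they list.
  go mod download
}
```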
lavilao commented on 2024-08-18 21:10 (UTC)
Hi, does anyone know how to build this package without AVX? I have added -DGGML_AVX=off -DLLAMA_AVX=OFF -DLLAMA_AVX2=OFF -DLLAMA_AVX512=OFF -DLLAMA_AVX512_VBMI=OFF -DLLAMA_AVX512_VNNI=OFF -DLLAMA_F16C=OFF -DLLAMA_FMA=OFF, yet when building the llama.cpp server it still shows -DGGML_AVX=on. I am trying to build it on GitHub Actions because my machine is too weak. Thanks in advance!
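One way such flags could be forced through is sketched below, under the assumption that the package still builds via `go generate` / `go build` and that the generate scripts forward CMAKE_ARGS to the embedded llama.cpp build; neither is guaranteed, and if the variable is ignored the flags have to be patched into the scripts themselves:

```bash
build() {
  cd "$srcdir/ollama"   # assumed source directory name
  # Hypothetical: disable AVX/AVX2/AVX-512, F16C and FMA for the embedded
  # llama.cpp build; only useful if the generate scripts honor CMAKE_ARGS.
  export CMAKE_ARGS="-DGGML_AVX=OFF -DGGML_AVX2=OFF -DGGML_AVX512=OFF -DGGML_F16C=OFF -DGGML_FMA=OFF"
  go generate ./...
  go build -o ollama .
}
```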
dreieck commented on 2024-08-03 11:36 (UTC) (edited on 2024-09-03 11:30 (UTC) by dreieck)
@beatboxchad,
Please discuss/report this with upstream.
The comments here are just for packaging issues, and yours does not appear to be a packaging issue.
@kkbs,
I am going to look into that sometime later, thanks for the information!
beatboxchad commented on 2024-08-03 04:37 (UTC)
Warm greetings,
I'm having this problem on two different laptops with Intel hardware:
I accidentally biked down to the coffeeshop without the other one, and my build output is obfuscated by that other issue y'all are discussing, but I've been fiddling with the build for a little while and don't see any obvious low-hanging fruit. There is one fact about the output I find peculiar: although the Vulkan code appears to build, it echoes that only the CPU variant is built. I'll come back with that output soon.
Meanwhile, here's some basic diagnostic info about the machine I have:
The output of `vulkaninfo` is too long to provide.
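As a side note for anyone collecting the same diagnostics, vulkan-tools has a summary mode that avoids pasting the full dump:

```bash
# Print a compact summary of Vulkan instance and device info
# instead of the full report.
vulkaninfo --summary
```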
Yeaxi commented on 2024-08-03 02:33 (UTC) (edited on 2024-08-03 02:33 (UTC) by Yeaxi)
@dreieck hello dude. After viewing the PKGBUILD and doing some searching, I have an idea: the problem may be caused by the `go get` in the PKGBUILD, which leaves the downloaded go module files with read-only permissions (r--r--r--), so when the AUR helper tries to delete them it fails. I've found that people recommend `go mod download` instead of `go get` to prevent this, but I'm not completely sure about it; maybe a pro like you can tell. By the way, paru is a Rust rewrite of yay started by the same developer.
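If that diagnosis is right, the usual fix on Arch is to keep the Go module cache writable; a minimal sketch, assuming the flags are exported in the PKGBUILD's build() step (the directory names are assumptions):

```bash
build() {
  cd "$srcdir/ollama"   # assumed source directory name
  # -modcacherw leaves module cache directories writable, so makepkg and
  # AUR helpers such as paru/yay can delete them afterwards.
  export GOFLAGS="-modcacherw"
  # Keep the module cache inside the build tree rather than in ~/go
  # (an assumption about how this PKGBUILD is laid out).
  export GOMODCACHE="$srcdir/gomodcache"
  go build -o ollama .
}
```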
dreieck commented on 2024-07-28 09:41 (UTC)
Ahoj @kkbs,
I do not have any idea about go specific stuff.
In the PKGBUILD there is no explicit `rm`, so it seems that upstream's build script wants to delete stuff(?).
I do not understand why the build user does not have permission to delete the files that the same user has created.
What is "paru"? I do not see any reference to it in the PKGBUILD. I have no idea; I have not encountered any such issue so far.
So maybe someone else who has an idea can give a hint?
Regards!