Package Details: ollama-cuda-git 0.4.7+r3696+gff6c2d6dc-1
| Git Clone URL: | https://aur.archlinux.org/ollama-cuda-git.git (read-only) |
|---|---|
| Package Base: | ollama-cuda-git |
| Description: | Create, run and share large language models (LLMs) with CUDA |
| Upstream URL: | https://github.com/ollama/ollama |
| Licenses: | MIT |
| Conflicts: | ollama |
| Provides: | ollama, ollama-cuda |
| Submitter: | sr.team |
| Maintainer: | envolution |
| Last Packager: | envolution |
| Votes: | 4 |
| Popularity: | 0.92 |
| First Submitted: | 2024-02-22 23:22 (UTC) |
| Last Updated: | 2024-12-01 01:18 (UTC) |
Dependencies (7)
- cuda (cuda11.1 [AUR], cuda-12.2 [AUR], cuda12.0 [AUR], cuda11.4 [AUR], cuda11.4-versioned [AUR], cuda12.0-versioned [AUR])
- clblast (clblast-git [AUR]) (make)
- cmake (cmake-git [AUR]) (make)
- git (git-git [AUR], git-gl [AUR]) (make)
- go (go-git [AUR], gcc-go-git [AUR], go-sylixos [AUR], gcc-go-snapshot [AUR], gcc-go) (make)
- pigz (pigz-git [AUR]) (make)
- nvidia-utils (nvidia-410xx-utils [AUR], nvidia-340xx-utils [AUR], nvidia-440xx-utils [AUR], nvidia-430xx-utils [AUR], nvidia-vulkan-utils [AUR], nvidia-535xx-utils [AUR], nvidia-470xx-utils [AUR], nvidia-550xx-utils [AUR], nvidia-390xx-utils [AUR], nvidia-utils-tesla [AUR], nvidia-utils-beta [AUR], nvidia-525xx-utils [AUR], nvidia-510xx-utils [AUR]) (optional) – monitor GPU usage with nvidia-smi
Required by (26)
- ai-writer (requires ollama)
- alpaca-ai (requires ollama)
- alpaca-git (requires ollama) (optional)
- alpaka-git (requires ollama)
- anythingllm-desktop-bin (requires ollama)
- chatd (requires ollama)
- chatd-bin (requires ollama)
- gollama (requires ollama) (optional)
- gollama-git (requires ollama) (optional)
- hoarder (requires ollama) (optional)
- hollama-bin (requires ollama)
- litellm (requires ollama) (optional)
- litellm-ollama (requires ollama)
- llocal-bin (requires ollama)
- lobe-chat (requires ollama) (optional)
- lumen (requires ollama-cuda) (optional)
- lumen (requires ollama) (optional)
- maestro (requires ollama) (optional)
- maestro-git (requires ollama) (optional)
- ollamamodelupdater (requires ollama) (optional)
- (6 more packages not shown)
Latest Comments
envolution commented on 2024-12-05 15:38 (UTC)
Until they merge https://github.com/ollama/ollama/pull/7499, there isn't a good way to manage this git package.
envolution commented on 2024-12-01 02:18 (UTC)
@jfiguero @sarudosi
https://github.com/ollama/ollama/pull/7499 https://gitlab.archlinux.org/archlinux/packaging/packages/ollama-cuda/-/commits/main?ref_type=HEADS
It's being worked on. If anyone has a working build() that enables CUDA, please flag the package out-of-date and post it in the comments.
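For reference, a minimal sketch of what such a build() might look like, assuming the older `go generate` build flow and the `/opt/cuda` install path used by the Arch cuda package. This is a hypothetical starting point only, not a confirmed working build() for current git master (which is exactly the open problem discussed here):

```shell
# Hypothetical sketch -- untested against current ollama git master.
build() {
  export CGO_CFLAGS="${CFLAGS}" CGO_CXXFLAGS="${CXXFLAGS}" CGO_LDFLAGS="${LDFLAGS}"
  export CUDA_HOME=/opt/cuda              # install path of the Arch cuda package
  export PATH="${CUDA_HOME}/bin:${PATH}"  # so nvcc can be found

  cd "${srcdir}/ollama"
  go generate ./...                       # builds the GPU runners (pre-0.4 layout)
  go build -trimpath -buildmode=pie .
}
```

If master has moved the runner build out of `go generate` (as PR 7499 suggests), the middle steps would need to change accordingly.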
jfiguero commented on 2024-11-29 18:41 (UTC) (edited on 2024-11-29 18:50 (UTC) by jfiguero)
I have installed ollama-cuda-git 0.4.6+r3691+gce7455a8e-1 and it does not use my GTX 1070 GPU; it defaults to the CPU. Using ollama-cuda from extra and extra-testing does use it, but both packages are outdated.
I confirmed this using nvidia-smi, which doesn't show ollama as a running process, and I see no change in power/RAM consumption while generating a response with this package.
Here's my output for `systemctl status ollama`. Any suggestions on what I can look for to further debug?

sarudosi commented on 2024-11-26 06:48 (UTC)
I cannot use the GPU with either this package or extra/ollama-cuda. Do you have any idea how to fix this issue?
sarudosi commented on 2024-11-26 06:43 (UTC) (edited on 2024-11-26 06:44 (UTC) by sarudosi)
@brauliobo: you should modify the PKGBUILD and run makepkg -si.
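That suggestion corresponds to the usual AUR workflow for this package (sketch only; the edit to PKGBUILD is whatever fix applies in your case):

```shell
# Standard AUR build-from-source workflow for this package.
git clone https://aur.archlinux.org/ollama-cuda-git.git
cd ollama-cuda-git
$EDITOR PKGBUILD   # apply your local changes here
makepkg -si        # build the package and install it with its dependencies
```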
brauliobo commented on 2024-11-14 18:59 (UTC)
got the error:
sr.team commented on 2024-08-21 04:42 (UTC)
@JamesMowery you need to install the makedepends before building the package.
JamesMowery commented on 2024-08-21 04:28 (UTC)
Getting the following error when installing on Nvidia 555 + Wayland + KDE Plasma.
nmanarch commented on 2024-04-17 07:49 (UTC) (edited on 2024-04-29 10:12 (UTC) by nmanarch)
I have found a little trick shown by others in an ollama GitHub issue. For those who want ollama with CUDA to run without AVX, try:
https://github.com/ollama/ollama/issues/2187#issuecomment-2082334649
Thanks to @sr.team and to all.
Hello, my apologies. Since 1.29, GPU support without an AVX CPU is blocked in ollama. Can someone help get this working again? See https://github.com/ollama/ollama/issues/2187, and the bypass proposed by dbzoo, which works but was not applied to main: https://github.com/dbzoo/ollama/commit/45eb1048496780a78ed07cf39b3ce6b62b5a72e3. Many thanks, have a nice day.
nmanarch commented on 2024-04-04 14:23 (UTC)
Yes, it is fixed; it builds and runs. Thanks.