Package Details: ollama-cuda-git 0.5.5+r3779+g6982e9cc9-3
| Git Clone URL: | https://aur.archlinux.org/ollama-cuda-git.git (read-only) |
|---|---|
| Package Base: | ollama-cuda-git |
| Description: | Create, run and share large language models (LLMs) |
| Upstream URL: | https://github.com/ollama/ollama |
| Licenses: | MIT |
| Conflicts: | ollama |
| Provides: | ollama |
| Submitter: | sr.team |
| Maintainer: | None |
| Last Packager: | envolution |
| Votes: | 5 |
| Popularity: | 0.004200 |
| First Submitted: | 2024-02-22 23:22 (UTC) |
| Last Updated: | 2025-01-14 06:03 (UTC) |
Dependencies (5)
Required by (64)
- ai-assistant-studio-bin (requires ollama)
- ai-writer (requires ollama)
- aingdesk (requires ollama) (optional)
- aingdesk-git (requires ollama) (optional)
- alpaca-ai (requires ollama)
- alpaca-git (requires ollama) (optional)
- alpaka-git (requires ollama)
- anythingllm-desktop-bin (requires ollama)
- calt-git (requires ollama)
- chatbox-git (requires ollama) (optional)
- chatd (requires ollama)
- chatd-bin (requires ollama)
- cherry-studio (requires ollama) (optional)
- cherry-studio-electron-bin (requires ollama) (optional)
- clara-verse (requires ollama) (optional)
- clippy-bin (requires ollama)
- cloudtolocalllm (requires ollama) (optional)
- codename-goose (requires ollama) (optional)
- codename-goose-bin (requires ollama) (optional)
- docspedia-git (requires ollama)
- (44 more not shown)
Latest Comments
LaptopDev commented on 2025-01-24 14:09 (UTC)
Can you inform me how to update this without removal?
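(For reference, the usual answer to this question: a *-git AUR package is updated by simply rebuilding and reinstalling it; pacman replaces the installed files in place, so no prior removal is needed. A minimal sketch, assuming either a manual clone or the paru AUR helper:)

```shell
# Manual route: clone (or git pull an existing clone) and rebuild.
git clone https://aur.archlinux.org/ollama-cuda-git.git
cd ollama-cuda-git
makepkg -si   # -s installs build deps, -i installs the package over the old version

# Or with an AUR helper such as paru, which handles the rebuild for you:
paru -S ollama-cuda-git
```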
envolution commented on 2024-12-28 02:55 (UTC)
@wanxp can you please try the new version?
Wanxp commented on 2024-12-28 00:05 (UTC) (edited on 2024-12-28 00:07 (UTC) by Wanxp)
I had already installed the packages nvidia-dkms, nvidia, and nvidia-utils. After installing opencl-nvidia and ollama-cuda-git, I get the following error:
Wanxp commented on 2024-12-27 23:42 (UTC)
The install fails with an error.
envolution commented on 2024-12-16 01:55 (UTC) (edited on 2024-12-16 05:36 (UTC) by envolution)
This seems to be working now. If it fails to compile, please comment with your GPU / nvidia-smi output and just the error line from the compilation.
envolution commented on 2024-12-05 15:38 (UTC)
Until they merge https://github.com/ollama/ollama/pull/7499, there isn't a good way to manage this git package.
envolution commented on 2024-12-01 02:18 (UTC)
@jfiguero @sarudosi
https://github.com/ollama/ollama/pull/7499 https://gitlab.archlinux.org/archlinux/packaging/packages/ollama-cuda/-/commits/main?ref_type=HEADS
It's being worked on. If anyone has a working build() that enables CUDA, please flag the package out-of-date and post it in the comments.
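(A hypothetical starting point for such a build() function, modeled on how older ollama revisions were built before the #7499 restructuring. This is a sketch under assumptions, not a verified PKGBUILD for current git HEAD; the `go generate` runner layout and paths may no longer match.)

```shell
# PKGBUILD fragment (assumed layout; verify against the current source tree)
build() {
  cd ollama
  export CGO_ENABLED=1
  export CUDA_HOME=/opt/cuda          # location of Arch's cuda package
  export PATH="$CUDA_HOME/bin:$PATH"  # make nvcc visible to the build
  go generate ./...                   # pre-#7499: compiles the GPU runner libraries
  go build -o ollama .
}
```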
jfiguero commented on 2024-11-29 18:41 (UTC) (edited on 2024-11-29 18:50 (UTC) by jfiguero)
I have installed ollama-cuda-git 0.4.6+r3691+gce7455a8e-1 and it does not use my GTX 1070 GPU; it defaults to the CPU. Using ollama-cuda from extra and extra-testing does use it, but both packages are outdated.
I confirmed this using nvidia-smi, which won't show ollama as a running process, and see no change in power/RAM consumption while generating a response when using this package.
Here's my output for systemctl status ollama. Any suggestions on what I can look for to further debug?
sarudosi commented on 2024-11-26 06:48 (UTC)
I cannot use the GPU with either this package or extra/ollama-cuda. Do you have any idea how to fix this issue?
sarudosi commented on 2024-11-26 06:43 (UTC) (edited on 2024-11-26 06:44 (UTC) by sarudosi)
Mr. brauliobo, you should modify the PKGBUILD and run makepkg -si.