Package Details: ollama-cuda13-bin 0.12.10-1
| Git Clone URL: | https://aur.archlinux.org/ollama-bin.git (read-only) |
|---|---|
| Package Base: | ollama-bin |
| Description: | Create, run and share large language models (LLMs) with CUDA 13 |
| Upstream URL: | https://github.com/ollama/ollama |
| Keywords: | ai llms local |
| Licenses: | MIT |
| Conflicts: | ollama-cuda |
| Provides: | ollama-cuda |
| Submitter: | Dominiquini |
| Maintainer: | Dominiquini |
| Last Packager: | Dominiquini |
| Votes: | 3 |
| Popularity: | 1.32 |
| First Submitted: | 2025-09-26 06:29 (UTC) |
| Last Updated: | 2025-11-06 23:59 (UTC) |
Dependencies (4)
- gcc-libs (gcc-libs-git (AUR), gccrs-libs-git (AUR), gcc-libs-snapshot (AUR))
- glibc (glibc-git (AUR), glibc-eac (AUR))
- ollama-bin (AUR)
- ollama-cuda (ollama-cuda12-bin (AUR), ollama-cuda13-bin (AUR)) (optional) – NVIDIA GPU Support
Required by (6)
- anythingllm-desktop-bin (requires ollama-cuda)
- lumen (requires ollama-cuda) (optional)
- mpvcovergrabber-git (requires ollama-cuda) (optional)
- ollama-bin (requires ollama-cuda) (optional)
- ollama-cuda12-bin (requires ollama-cuda) (optional)
- ollama-cuda13-bin (requires ollama-cuda) (optional)
Latest Comments
omnigenous commented on 2025-09-30 02:02 (UTC)
Thank you so much!
Dominiquini commented on 2025-09-29 20:10 (UTC)
@omnigenous For Pascal, replace both ollama and ollama-cuda from the main repo with ollama-bin and ollama-cuda12-bin (these packages conflict with each other, so the replacement is automatic). These packages do not require CUDA 13 to be installed on the system, but they do not conflict with it either, so you can keep it if you want!
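As a concrete sketch of the replacement described above, assuming the yay AUR helper is installed (any AUR helper, or a manual makepkg build, would work the same way):

```shell
# Install the AUR binary builds; since they conflict with the repo
# packages, yay/pacman will prompt to remove ollama and ollama-cuda.
yay -S ollama-bin ollama-cuda12-bin

# Restart the service so the CUDA 12 build is picked up.
sudo systemctl restart ollama
```

The conflict-driven replacement is the key point: because ollama-bin declares a conflict with the repo package, the package manager handles the removal automatically rather than requiring a separate uninstall step.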
In short: just run

to have ollama running on the GPU for Pascal cards!

omnigenous commented on 2025-09-29 10:10 (UTC)
Do I delete both the cuda and ollama packages and install this for a Pascal card?