Package Details: ollama-cuda-git 0.1.30.gc2712b55-1
| Git Clone URL: | https://aur.archlinux.org/ollama-cuda-git.git (read-only) |
|---|---|
| Package Base: | ollama-cuda-git |
| Description: | Create, run and share large language models (LLMs) with CUDA |
| Upstream URL: | https://github.com/jmorganca/ollama |
| Licenses: | MIT |
| Conflicts: | ollama, ollama-cuda |
| Provides: | ollama |
| Submitter: | sr.team |
| Maintainer: | sr.team |
| Last Packager: | sr.team |
| Votes: | 1 |
| Popularity: | 0.82 |
| First Submitted: | 2024-02-22 23:22 (UTC) |
| Last Updated: | 2024-04-01 12:42 (UTC) |
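For reference, a minimal sketch of the usual manual AUR workflow for this package, assuming `git` and the `base-devel` group are installed. The clone URL is the read-only one listed above; since the package conflicts with `ollama` and `ollama-cuda`, pacman will prompt to replace either of those if installed.

```sh
# Clone the AUR repository (read-only URL from the table above)
git clone https://aur.archlinux.org/ollama-cuda-git.git
cd ollama-cuda-git

# Review the PKGBUILD before building, then build and install.
# -s pulls in the build dependencies, -i installs the resulting package.
less PKGBUILD
makepkg -si
```

An AUR helper (e.g. `paru -S ollama-cuda-git` or `yay -S ollama-cuda-git`) performs the same steps automatically.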
Dependencies (4)
Required by (10)
- alpaka-git (requires ollama)
- chatd (requires ollama)
- chatd-bin (requires ollama)
- litellm (requires ollama) (optional)
- litellm-ollama (requires ollama)
- ollamamodelupdater (requires ollama) (optional)
- ollamamodelupdater-bin (requires ollama) (optional)
- python-ollama (requires ollama)
- python-ollama-git (requires ollama)
- tlm (requires ollama) (optional)
Latest Comments
nmanarch commented on 2024-04-17 07:49 (UTC) (edited on 2024-04-17 07:53 (UTC) by nmanarch)
Hello. My apologies. Since 0.1.29, GPU support without an AVX CPU has been blocked in ollama. Can someone help get this working again? See https://github.com/ollama/ollama/issues/2187 and the bypass proposed by dbzoo, which works but has not been applied upstream: https://github.com/dbzoo/ollama/commit/45eb1048496780a78ed07cf39b3ce6b62b5a72e3. Many thanks, have a nice day.
nmanarch commented on 2024-04-04 14:23 (UTC)
Yes, it is fixed; it builds and runs. Thanks.
sr.team commented on 2024-04-01 12:43 (UTC)
@nmanarch thanks for the report. This problem should be fixed now.
nmanarch commented on 2024-03-31 23:00 (UTC) (edited on 2024-03-31 23:59 (UTC) by nmanarch)
Hi! The build failed. Does anyone have a trick to solve this?
I found a hint at https://aur.archlinux.org/packages/ollama-rocm-git?O=10, so I removed -fcf-protection from my /etc/makepkg.conf.
But now I have
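For anyone following the workaround in the comment above: a hedged sketch of what that change could look like in /etc/makepkg.conf. The exact default CFLAGS line varies between Arch releases, so treat the flags shown here as placeholders; the only point is dropping -fcf-protection while leaving the rest of the line untouched.

```sh
# /etc/makepkg.conf (excerpt) -- illustrative only; your default CFLAGS may differ.
# Before (example of a line containing -fcf-protection):
#CFLAGS="-march=x86-64 -mtune=generic -O2 -pipe -fno-plt -fstack-clash-protection -fcf-protection"
# After: the same line with -fcf-protection removed, as described in the comment above.
CFLAGS="-march=x86-64 -mtune=generic -O2 -pipe -fno-plt -fstack-clash-protection"
```

Note that disabling -fcf-protection weakens hardening for every package built on that machine, so it is best reverted once the underlying build issue is fixed.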