Package Base Details: ollama-nogpu-git

Git Clone URL: https://aur.archlinux.org/ollama-nogpu-git.git (read-only)
Submitter: dreieck
Maintainer: dreieck
Last Packager: dreieck
Votes: 0
Popularity: 0.000000
First Submitted: 2024-04-17 15:09 (UTC)
Last Updated: 2024-05-12 10:11 (UTC)

Latest Comments

nmanarch commented on 2024-05-18 09:41 (UTC)

Ok! That's sad. Perhaps they will agree to accept your code changes into their master? Many thanks for all of this.

dreieck commented on 2024-05-18 09:32 (UTC) (edited on 2024-05-18 09:38 (UTC) by dreieck)

Ahoj @nmanarch,

it seems upstream is moving too fast, requiring the patch to be changed too often.

If I do not find an easier way to avoid building with ROCm or CUDA even when some of their files are installed, I might just give up.

↗ Upstream feature request to add a "kill switch" to force-off ROCm and CUDA.

nmanarch commented on 2024-05-18 09:17 (UTC) (edited on 2024-05-18 09:37 (UTC) by nmanarch)

Hello @dreieck. I want to try your ollama vulkan, but the patch fails to apply. I have tried adding --fuzz 3 and --ignore-whitespace, but it is no better. Thanks for any trick.

Submodule path 'llm/llama.cpp': checked out '614d3b914e1c3e02596f869649eb4f1d3b68614d'
Applying patch disable-rocm-cuda.gen_linux.sh.patch ...
patching file llm/generate/gen_linux.sh
Hunk #5 FAILED at 143.
Hunk #6 FAILED at 220.
2 out of 6 hunks FAILED -- saving rejects to file llm/generate/gen_linux.sh.rej
==> ERROR: A failure occurred in prepare(). Aborting...
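For anyone unfamiliar with the --fuzz and --ignore-whitespace options tried above: fuzz lets `patch` ignore a few drifted context lines at the edges of a hunk, which is why it can help when upstream has moved but cannot rescue a hunk whose changed lines themselves no longer exist (the "Hunk FAILED" case here). A minimal self-contained sketch, using made-up files rather than the real ollama tree, showing a hunk that fails under strict matching (-F0) but applies once fuzz is allowed:

```shell
# Hypothetical demo files, not the actual ollama-nogpu-git sources.
tmp=$(mktemp -d)
cd "$tmp"
printf 'line1\nline2\nline3\nline4\nline5\nline6\nline7\n' > gen_linux.sh

# This diff was "made" against a tree whose first context line differs
# (lineX instead of line1), simulating upstream drift.
cat > fix.patch <<'EOF'
--- gen_linux.sh
+++ gen_linux.sh
@@ -1,7 +1,7 @@
 lineX
 line2
 line3
-line4
+line4-patched
 line5
 line6
 line7
EOF

# Strict matching (fuzz 0): the drifted context line makes the hunk fail.
patch -F0 --dry-run gen_linux.sh fix.patch || echo "strict apply failed"

# With fuzz allowed, patch drops the mismatched edge context and applies.
patch --fuzz=3 --ignore-whitespace gen_linux.sh fix.patch

grep 'line4-patched' gen_linux.sh
```

When the changed lines themselves have moved or been rewritten upstream (as in the gen_linux.sh hunks #5 and #6 above), no fuzz factor will help; the rejects land in the .rej file and the patch has to be regenerated against the new upstream.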