Package Details: ik-llama.cpp-hip r3826.ae0ba31f-1

Git Clone URL: https://aur.archlinux.org/ik-llama.cpp-hip.git (read-only; see the build sketch below the package details)
Package Base: ik-llama.cpp-hip
Description: llama.cpp fork with additional SOTA quants and improved performance (ROCm backend)
Upstream URL: https://github.com/ikawrakow/ik_llama.cpp
Licenses: MIT
Conflicts: ggml, ik-llama.cpp, ik-llama.cpp-cuda, ik-llama.cpp-vulkan, libggml, llama.cpp, llama.cpp-cuda, llama.cpp-hip, llama.cpp-vulkan
Provides: llama.cpp
Submitter: Orion-zhen
Maintainer: None
Last Packager: Orion-zhen
Votes: 0
Popularity: 0.000000
First Submitted: 2025-08-01 10:02 (UTC)
Last Updated: 2025-08-01 10:02 (UTC)
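
For reference, a minimal sketch of the usual manual AUR workflow for this package, assuming git and the base-devel group (which provides makepkg) are installed; an AUR helper such as paru or yay would work just as well:

    # Clone the package's build files from the AUR (URL from the listing above)
    git clone https://aur.archlinux.org/ik-llama.cpp-hip.git
    cd ik-llama.cpp-hip

    # Review the PKGBUILD before building, then build and install with dependencies
    less PKGBUILD
    makepkg -si

Note that, per the Conflicts list above, installing this package will replace any existing llama.cpp, ggml, or related packages, and per the maintainer's comment below, the ROCm build may not currently be functional.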

Latest Comments

Orion-zhen commented on 2025-08-01 13:02 (UTC)

It seems that ik_llama.cpp does not currently support ROCm, so I will orphan this package.