Package Details: ik-llama.cpp-vulkan r3888.45afaf33-1

Git Clone URL: https://aur.archlinux.org/ik-llama.cpp-vulkan.git (read-only)
Package Base: ik-llama.cpp-vulkan
Description: llama.cpp fork with additional SOTA quants and improved performance (Vulkan Backend)
Upstream URL: https://github.com/ikawrakow/ik_llama.cpp
Licenses: MIT
Conflicts: ggml, ik-llama.cpp, ik-llama.cpp-cuda, libggml, llama.cpp, llama.cpp-cuda, llama.cpp-hip, llama.cpp-vulkan
Provides: llama.cpp
Submitter: Orion-zhen
Maintainer: Orion-zhen
Last Packager: Orion-zhen
Votes: 3
Popularity: 1.53
First Submitted: 2025-08-01 10:02 (UTC)
Last Updated: 2025-09-24 00:20 (UTC)
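
The clone URL above can be used to build and install the package in the usual AUR way; a minimal sketch, assuming `git` and the `base-devel` package group are already installed:

```shell
# Clone the AUR repository (read-only URL from above)
git clone https://aur.archlinux.org/ik-llama.cpp-vulkan.git
cd ik-llama.cpp-vulkan

# Inspect the PKGBUILD before building, then build and install:
# -s resolves build dependencies via pacman, -i installs the result
makepkg -si
```

An AUR helper such as `paru` or `yay` automates the same steps. Note the Conflicts list above: any installed `llama.cpp` or `ggml` variant must be removed before this package can be installed.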

Pinned Comments

Orion-zhen commented on 2025-09-02 03:19 (UTC) (edited on 2025-09-02 13:20 (UTC) by Orion-zhen)

I can't receive AUR notifications in real time, so if you have a problem that requires immediate feedback or discussion, please consider opening an issue in this GitHub repository, where I maintain all my AUR packages. Thank you for your understanding.
