Package Details: llama.cpp-vulkan b6558-1

Git Clone URL: https://aur.archlinux.org/llama.cpp-vulkan.git (read-only, click to copy)
Package Base: llama.cpp-vulkan
Description: Port of Facebook's LLaMA model in C/C++ (with Vulkan GPU optimizations)
Upstream URL: https://github.com/ggerganov/llama.cpp
Licenses: MIT
Conflicts: ggml, libggml, llama.cpp, stable-diffusion.cpp
Provides: llama.cpp
Submitter: txtsd
Maintainer: Orion-zhen
Last Packager: Orion-zhen
Votes: 12
Popularity: 1.75
First Submitted: 2024-10-26 20:10 (UTC)
Last Updated: 2025-09-24 00:20 (UTC)

Pinned Comments

Orion-zhen commented on 2025-09-02 03:17 (UTC) (edited on 2025-09-02 13:20 (UTC) by Orion-zhen)

I can't receive AUR notifications in real time, so if you have a problem that needs immediate feedback or discussion, please consider opening an issue in this GitHub repository, where I maintain all my AUR packages. Thank you for your understanding.

txtsd commented on 2024-10-26 20:15 (UTC) (edited on 2024-12-06 14:15 (UTC) by txtsd)

Alternate versions

llama.cpp
llama.cpp-vulkan
llama.cpp-sycl-fp16
llama.cpp-sycl-fp32
llama.cpp-cuda
llama.cpp-cuda-f16
llama.cpp-hip

Latest Comments

Crandel commented on 2025-09-24 06:39 (UTC)

The original URL https://github.com/ggerganov/llama.cpp is currently just a redirect to the new URL https://github.com/ggml-org/llama.cpp. For visibility, it would be better to use the current URL in the PKGBUILD.

Orion-zhen commented on 2025-09-02 10:31 (UTC) (edited on 2025-09-02 13:17 (UTC) by Orion-zhen)

Hi, @envolution. Thank you for pointing that out. I've realized that shallow cloning is discouraged by Arch Linux packaging guidelines, and I have spent some time learning how to maintain an AUR package properly. Currently, I'm having difficulty getting pkgctl to work in a GitHub Actions environment; I will fix this issue as soon as I have everything settled.

EDIT: After some searching and debugging, I have managed to get nvchecker working, and the PKGBUILD has now been updated with a proper source array. BTW, I have looked through @envolution's AUR repository. It has a really powerful management system, as well as a commit history of over 10k items. Impressive. Thank you again, @envolution, for the inspiration.

envolution commented on 2025-09-02 07:55 (UTC)

@Orion-zhen again, the repo/archive should be in the source array (https://wiki.archlinux.org/title/PKGBUILD, section 7.1). The way this package queries upstream for the latest tag to derive its version (which is not tracked by AUR) is more suitable for a -git package. There are many reasons why shallow cloning is a rejected feature for makepkg (https://wiki.archlinux.org/title/User:Apg#makepkg:_shallow_git_clones).

Upstream provides an archive file (~25 MB) if you're looking to avoid a ~200 MB clone, which would also allow the checksums to be populated appropriately. aur/llama.cpp can be used as a reference for passing the build number to cmake.
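For reference, a hedged sketch of what the archive-based source array could look like (the version is a placeholder and the checksum is deliberately left unset; this is not the actual PKGBUILD):

```shell
# Hypothetical PKGBUILD fragment: fetch the release archive through the
# source array so makepkg handles download, checksum verification and
# extraction itself. Bump pkgver and the checksum on each release.
pkgver=b6558
source=("llama.cpp-${pkgver}.tar.gz::https://github.com/ggml-org/llama.cpp/archive/refs/tags/${pkgver}.tar.gz")
sha256sums=('SKIP')  # replace with the real digest, e.g. via updpkgsums
```

makepkg would then download and verify the ~25 MB tarball instead of cloning the repository.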

Orion-zhen commented on 2025-09-02 06:50 (UTC) (edited on 2025-09-02 06:56 (UTC) by Orion-zhen)

Hi, @quickes.

Thank you for your report; the current behavior is indeed incorrect. But there is still one problem with your code. Suppose GitHub Actions hasn't updated the PKGBUILD file for several days (the interval is 7 days); then the pkgver stored in the file is older than the latest release.

The execution sequence is:

  1. run prepare()
  2. run pkgver()

So with your code, prepare() clones the old tag, and then pkgver() updates the version to the latest one. In this case, only the pkgver is updated, not the sources.

The right way is to fetch the latest tag directly in prepare(). There is still a minute chance of error, if llama.cpp happens to publish a new version between the execution of prepare() and pkgver(), but the probability is so small that it can safely be disregarded. I have updated the PKGBUILD file; thank you for your help.
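A minimal sketch of that idea, resolving the tag once in prepare() and reusing it in pkgver() (the helper file and variable names are assumptions, not the maintainer's actual PKGBUILD):

```shell
# Sketch only: resolve the release tag once in prepare() so the cloned
# sources and the reported pkgver can never disagree.
prepare() {
  cd "$srcdir"
  _tag=$(curl -s "https://api.github.com/repos/ggml-org/llama.cpp/releases/latest" \
    | grep '"tag_name":' | sed -E 's/.*"([^"]+)".*/\1/')
  git clone --depth 1 --single-branch --branch "$_tag" \
    "https://github.com/ggml-org/llama.cpp" "$_pkgname"
  printf '%s\n' "$_tag" > "$srcdir/.resolved_tag"
}

pkgver() {
  # Reuse the tag resolved in prepare() instead of querying the API again,
  # which closes the race window described above.
  cat "$srcdir/.resolved_tag"
}
```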

quickes commented on 2025-09-02 06:07 (UTC) (edited on 2025-09-02 06:09 (UTC) by quickes)

The current version from the master branch is always installed, not the version specified in the package. No matter which version I ask for, the latest one as of the installation date is what actually gets built. This means that installing the same package version on different computers on different dates produces different versions of the application. This is incorrect. You can fix this behavior by using:

git clone --depth 1 --single-branch --branch "${pkgver}" "${url}" "${_pkgname}"

In this case, the release version specified in the package will always be installed.

You take the last commit of the master branch, but you assign the version from the latest tag, which does not correspond to the fetched state of the master branch.

prepare() {
  cd "$srcdir"
  git clone --depth 1 --single-branch --branch master "${url}" "${_pkgname}"
}

pkgver() {
  # Fetch the latest release tag via the GitHub API
  curl -s "https://api.github.com/repos/ggml-org/llama.cpp/releases/latest" | \
    grep '"tag_name":' | \
    sed -E 's/.*"([^"]+)".*/\1/'
}

Orion-zhen commented on 2025-09-02 02:54 (UTC) (edited on 2025-09-02 03:12 (UTC) by Orion-zhen)

Hi, @markg85.

This approach is absolutely brilliant! It perfectly solves the problem. I have already updated the PKGBUILD file, adopting the first method you mentioned.

The reason I adopted this awkward approach in the PKGBUILD was that I wanted to obtain the very latest source code while keeping the original version-number format and avoiding the large storage footprint of a full repository clone. Also, given that this PKGBUILD is maintained on GitHub, I didn't want to bump pkgver every day, or the git commit history would overflow; that's why I wasn't using source() with release archives as proposed by @envolution. A shallow git clone was a necessary compromise. Your solution now perfectly addresses my needs, thank you!

markg85 commented on 2025-09-01 13:10 (UTC)

I too was having issues here (fatal: No names found, cannot describe anything.)

While the current update (increased depth) might work, may I suggest an alternative approach?

1. Define a dummy pkgver (it gets overwritten by the pkgver() function).
2. In pkgver(), obtain the last released tag using: curl -s "https://api.github.com/repos/ggml-org/llama.cpp/releases/latest" | grep '"tag_name":' | sed -E 's/.*"([^"]+)".*/\1/'
3. In prepare(), just shallow clone with the tag you obtained.
4. Proceed from there.

Or you can just get the latest archive URL (probably also in pkgver(), as you want to set that version): curl -s https://api.github.com/repos/ggml-org/llama.cpp/releases/latest | jq -r ".tarball_url". It would be nicer to do this in source=(...), but unfortunately the URL isn't known beforehand, which makes using source=(...) for this fairly impossible.
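The tag-extraction pipeline from the first approach can be sanity-checked offline against a canned API response (the sed pattern is written with the `.*` anchors that markdown tends to eat; the tag value below is just an example):

```shell
# Simulate the relevant lines of the GitHub releases/latest JSON response.
sample='{
  "tag_name": "b6558",
  "name": "b6558"
}'
# Same grep/sed pipeline, fed from the sample instead of curl.
tag=$(printf '%s\n' "$sample" | grep '"tag_name":' | sed -E 's/.*"([^"]+)".*/\1/')
echo "$tag"   # prints b6558
```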

Orion-zhen commented on 2025-09-01 11:49 (UTC)

Hi, @AnotherAURUser. Thank you for your message. This issue is caused by the limited clone depth in prepare(). I have increased it to 20, which is enough to reach at least one tag. Please try again, thanks.
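The failure mode is easy to reproduce with a toy repository; a minimal sketch (paths and the tag name are made up for the demonstration):

```shell
set -e
tmp=$(mktemp -d)
# Build a tiny upstream: two commits, with a tag only on the first.
git init -q "$tmp/upstream"
git -C "$tmp/upstream" -c user.email=a@b -c user.name=t commit -q --allow-empty -m one
git -C "$tmp/upstream" tag b0001
git -C "$tmp/upstream" -c user.email=a@b -c user.name=t commit -q --allow-empty -m two
# A depth-1 clone only sees the untagged tip commit, so describe fails
# with "fatal: No names found, cannot describe anything."
git clone -q --depth 1 "file://$tmp/upstream" "$tmp/shallow"
cd "$tmp/shallow"
git describe --tags 2>/dev/null || echo "describe failed"
# Deepening the history until a tag is reachable fixes it.
git fetch -q --deepen 1 origin
git fetch -q --tags origin
git describe --tags   # now succeeds, e.g. b0001-1-g<sha>
```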

AnotherAURUser commented on 2025-09-01 04:40 (UTC) (edited on 2025-09-01 04:41 (UTC) by AnotherAURUser)


Cloning into 'llama.cpp'...
remote: Enumerating objects: 1633, done.
remote: Counting objects: 100% (1633/1633), done.
remote: Compressing objects: 100% (1268/1268), done.
remote: Total 1633 (delta 344), reused 1087 (delta 310), pack-reused 0 (from 0)
Receiving objects: 100% (1633/1633), 24.08 MiB | 10.60 MiB/s, done.
Resolving deltas: 100% (344/344), done.
==> Starting pkgver()...
fatal: No names found, cannot describe anything.
==> ERROR: A failure occurred in pkgver().
    Aborting...
 -> error making: llama.cpp-vulkan-exit status 4
 -> Failed to install the following packages. Manual intervention is required:
llama.cpp-vulkan - exit status 4