Package Details: llama.cpp-hip b5845-1

Git Clone URL: https://aur.archlinux.org/llama.cpp-hip.git (read-only)
Package Base: llama.cpp-hip
Description: Port of Facebook's LLaMA model in C/C++ (with AMD ROCm optimizations)
Upstream URL: https://github.com/ggerganov/llama.cpp
Licenses: MIT
Conflicts: ggml, libggml, llama.cpp
Provides: llama.cpp
Submitter: txtsd
Maintainer: txtsd
Last Packager: txtsd
Votes: 7
Popularity: 1.72
First Submitted: 2024-10-26 19:54 (UTC)
Last Updated: 2025-07-08 13:51 (UTC)

Pinned Comments

txtsd commented on 2024-10-26 20:15 (UTC) (edited on 2024-12-06 14:15 (UTC) by txtsd)

Alternate versions

llama.cpp
llama.cpp-vulkan
llama.cpp-sycl-fp16
llama.cpp-sycl-fp32
llama.cpp-cuda
llama.cpp-cuda-f16
llama.cpp-hip

Latest Comments

Jawzper commented on 2025-07-05 13:08 (UTC)

Thank you @edtoml, those PKGBUILD changes seem to have done the trick.

I removed libggml-hip-git during the installation to avoid conflicts, and now I can't install it again (because of said conflicts). I only had it in the first place because it was a dependency of llama.cpp-hip. Is that fine?

edtoml commented on 2025-07-05 11:49 (UTC) (edited on 2025-07-05 11:50 (UTC) by edtoml)

edtoml commented on 2025-07-05 02:02 (UTC)

This has broken AGAIN due to libggml-hip-git being out of sync.

It can be made to build by changing an option in the build() section (ON has been changed to OFF):

-DLLAMA_USE_SYSTEM_GGML=OFF

If you are on RDNA3 or RDNA4, adding the line below speeds up flash attention:

-DGGML_HIP_ROCWMMA_FATTN=ON

You also need to remove the libggml-hip entry from the depends=( ) list.

There are also conflicts you will need to resolve, as files installed by libggml-hip-git will clash with the ones this package now builds.
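
A minimal sketch of applying those edits by hand, assuming the PKGBUILD currently passes -DLLAMA_USE_SYSTEM_GGML=ON in build() as described above (the yay cache path below is just an example; adjust for your AUR helper):

cd ~/.cache/yay/llama.cpp-hip   # assumed path to the cloned package sources
sed -i 's/-DLLAMA_USE_SYSTEM_GGML=ON/-DLLAMA_USE_SYSTEM_GGML=OFF/' PKGBUILD
# Then, by hand: delete libggml-hip from the depends=( ) array, and on
# RDNA3/RDNA4 optionally add -DGGML_HIP_ROCWMMA_FATTN=ON to the cmake options.
makepkg -si                     # rebuild and install the package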

Jawzper commented on 2025-07-04 08:17 (UTC)

I am not able to update... can anyone help me?

https://pastebin.com/rqdQNXDq

edtoml commented on 2025-07-01 17:01 (UTC)

Looks like libggml-hip has been updated, and once it is rebuilt this package builds correctly.

vhnvn commented on 2025-06-30 15:07 (UTC)

You need to build without LLAMA_USE_SYSTEM_GGML; libggml is missing these functions even on master.
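
For anyone building outside the package, a rough sketch of a manual build with the bundled ggml (the HIP backend flag name and GPU target below are assumptions; check llama.cpp's build documentation for your card):

# Assumptions: GGML_HIP is the HIP backend option and gfx1100 stands in for
# your GPU target; LLAMA_USE_SYSTEM_GGML=OFF forces the bundled ggml.
cmake -B build -DGGML_HIP=ON -DLLAMA_USE_SYSTEM_GGML=OFF -DAMDGPU_TARGETS=gfx1100
cmake --build build --config Release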

txtsd commented on 2025-06-30 00:56 (UTC)

@edtoml I'm seeing the same error. Could you report it upstream?

edtoml commented on 2025-06-29 19:00 (UTC) (edited on 2025-06-29 19:41 (UTC) by edtoml)

At b5780 building with the system ggml (libggml-hip.git) fails with:

/home/ed/.cache/yay/llama.cpp-hip/src/llama.cpp/src/llama-graph.cpp:564:23: error: use of undeclared identifier 'ggml_swiglu_split'
  564 |                 cur = ggml_swiglu_split(ctx0, cur, tmp);
      |                       ^
/home/ed/.cache/yay/llama.cpp-hip/src/llama.cpp/src/llama-graph.cpp:573:23: error: use of undeclared identifier 'ggml_geglu_split'
  573 |                 cur = ggml_geglu_split(ctx0, cur, tmp);
      |                       ^
[ 25%] Building CXX object src/CMakeFiles/llama.dir/llama-mmap.cpp.o
/home/ed/.cache/yay/llama.cpp-hip/src/llama.cpp/src/llama-graph.cpp:586:23: error: use of undeclared identifier 'ggml_reglu_split'
  586 |                 cur = ggml_reglu_split(ctx0, cur, tmp);

Dependencies aside, it works when system ggml is disabled.

txtsd commented on 2025-06-16 11:54 (UTC)

This package now uses the system libggml, so it should work alongside whisper.cpp.

Building of tests and examples has been turned off.
kompute has been removed.
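
For those reading along in the PKGBUILD, those changes roughly correspond to CMake options like the following (a sketch, not the exact packaged build() contents):

cmake -B build -DLLAMA_USE_SYSTEM_GGML=ON -DLLAMA_BUILD_TESTS=OFF -DLLAMA_BUILD_EXAMPLES=OFF
# (the kompute backend sources are simply no longer pulled in by the package)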

eSPiYa commented on 2025-06-11 13:07 (UTC) (edited on 2025-06-11 13:16 (UTC) by eSPiYa)

I'm getting an error when installing:

error: unresolvable package conflicts detected
error: failed to prepare transaction (conflicting dependencies)
:: blas-openblas-0.3.29-2 and blas-3.12.1-2.1 are in conflict

But blas is required by the following packages: cblas lapack lib32-blas proton-cachyos suitesparse

I can't remove it because proton-cachyos is using it, and also it seems blas is higher than blas-openblas.
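
For anyone hitting the same conflict: a sketch of one common way to resolve it on Arch, assuming blas-openblas provides blas, cblas, and lapack as it does in the current repos, is to install it explicitly and let pacman swap out the reference BLAS:

sudo pacman -S blas-openblas   # pacman should prompt to replace blas/cblas/lapack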