Package Details: ik-llama.cpp-hip r3826.ae0ba31f-1
| Git Clone URL: | https://aur.archlinux.org/ik-llama.cpp-hip.git |
|---|---|
| Package Base: | ik-llama.cpp-hip |
| Description: | llama.cpp fork with additional SOTA quants and improved performance (ROCm backend) |
| Upstream URL: | https://github.com/ikawrakow/ik_llama.cpp |
| Licenses: | MIT |
| Conflicts: | ggml, ik-llama.cpp, ik-llama.cpp-cuda, ik-llama.cpp-vulkan, libggml, llama.cpp, llama.cpp-cuda, llama.cpp-hip, llama.cpp-vulkan |
| Provides: | llama.cpp |
| Submitter: | Orion-zhen |
| Maintainer: | None |
| Last Packager: | Orion-zhen |
| Votes: | 0 |
| Popularity: | 0.000000 |
| First Submitted: | 2025-08-01 10:02 (UTC) |
| Last Updated: | 2025-08-01 10:02 (UTC) |
Dependencies (16)
- curl (curl-git [AUR], curl-c-ares [AUR])
- gcc-libs (gcc-libs-git [AUR], gccrs-libs-git [AUR], gcc-libs-snapshot [AUR])
- glibc (glibc-git [AUR], glibc-linux4 [AUR], glibc-eac [AUR])
- hip-runtime-amd (opencl-amd [AUR])
- hipblas (opencl-amd-dev [AUR])
- openmp
- python (python37 [AUR])
- rocblas (rocblas-gfx1103 [AUR], opencl-amd-dev [AUR])
- cmake (cmake3 [AUR], cmake-git [AUR]) (make)
- git (git-git [AUR], git-gl [AUR]) (make)
- rocm-hip-sdk (opencl-amd-dev [AUR]) (make)
- python-numpy (python-numpy-git [AUR], python-numpy1 [AUR], python-numpy-mkl-tbb [AUR], python-numpy-mkl [AUR], python-numpy-mkl-bin [AUR]) (optional) – needed for convert_hf_to_gguf.py
- python-pytorch (python-pytorch-cxx11abi [AUR], python-pytorch-cxx11abi-opt [AUR], python-pytorch-cxx11abi-cuda [AUR], python-pytorch-cxx11abi-opt-cuda [AUR], python-pytorch-cxx11abi-rocm [AUR], python-pytorch-cxx11abi-opt-rocm [AUR], python-pytorch-cuda, python-pytorch-opt, python-pytorch-opt-cuda, python-pytorch-opt-rocm, python-pytorch-rocm) (optional) – needed for convert_hf_to_gguf.py
- python-safetensors [AUR] (python-safetensors-bin [AUR]) (optional) – needed for convert_hf_to_gguf.py
- python-sentencepiece [AUR] (python-sentencepiece-git [AUR]) (optional) – needed for convert_hf_to_gguf.py
- python-transformers [AUR] (optional) – needed for convert_hf_to_gguf.py
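Like any AUR package, this one is built manually from its Git repository using the clone URL listed above. A minimal sketch of the standard workflow, assuming `git`, `base-devel` (which provides `makepkg`), and the dependencies above are installed:

```shell
# Clone the package's AUR repository (URL from the package details above).
git clone https://aur.archlinux.org/ik-llama.cpp-hip.git
cd ik-llama.cpp-hip

# Inspect the PKGBUILD before building, as is recommended for AUR packages.
less PKGBUILD

# Build the package; -s resolves dependencies via pacman, -i installs the result.
makepkg -si
```

An AUR helper such as `paru` or `yay` automates these same steps, but the manual route makes it easy to review the PKGBUILD first.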
Latest Comments
Orion-zhen commented on 2025-08-01 13:02 (UTC)
It seems that ik_llama.cpp does not currently support ROCm, so I will orphan this package.