Package Details: ollama-amd-igpu-docs 0.11.4-1
| Git Clone URL: | https://aur.archlinux.org/ollama-amd-igpu.git (read-only) |
|---|---|
| Package Base: | ollama-amd-igpu |
| Description: | Documentation for Ollama |
| Upstream URL: | https://github.com/Crandel/ollama-amd-igpu |
| Licenses: | MIT |
| Submitter: | Crandel |
| Maintainer: | None |
| Last Packager: | Crandel |
| Votes: | 1 |
| Popularity: | 0.163361 |
| First Submitted: | 2025-06-28 15:57 (UTC) |
| Last Updated: | 2025-08-10 20:39 (UTC) |
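For reference, the usual way to build this package is to clone the repository from the URL above and run makepkg; this is a generic AUR workflow sketch, not text from the package page itself:

```bash
# Clone the AUR repository (URL from the table above)
git clone https://aur.archlinux.org/ollama-amd-igpu.git
cd ollama-amd-igpu

# Review the PKGBUILD, then build and install.
# -s pulls in the make dependencies listed below, -i installs the result.
makepkg -si
```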
Dependencies (7)
- clblast (clblast-git [AUR]) (make)
- cmake (cmake3 [AUR], cmake-git [AUR]) (make)
- cuda (cuda11.1 [AUR], cuda-12.2 [AUR], cuda12.0 [AUR], cuda11.4 [AUR], cuda11.4-versioned [AUR], cuda12.0-versioned [AUR], cuda-12.5 [AUR]) (make)
- git (git-git [AUR], git-gl [AUR]) (make)
- go (go-git [AUR], gcc-go-git [AUR], gcc-go-snapshot [AUR], gcc-go) (make)
- hipblas (opencl-amd-dev [AUR]) (make)
- ninja (ninja-kitware [AUR], ninja-fuchsia-git [AUR], ninja-git [AUR], ninja-mem [AUR], ninja-noemacs-git [AUR], ninja-jobserver [AUR]) (make)
Required by (0)
Sources (5)
Latest Comments
Crandel commented on 2025-08-10 20:45 (UTC)
Unfortunately, due to the missing Vulkan backend, I'm no longer interested in maintaining this package and the corresponding Ollama fork. If somebody wants to take it over, just ping me and I will set you as co-maintainer.
Crandel commented on 2025-08-05 17:51 (UTC)
> Is it possible to split ollama-amd-igpu-rocm and ollama-amd-igpu-cuda and thus avoid extra compilation?

I just checked on two different laptops. Without an Nvidia GPU the build has only 203 compilation steps; with an Nvidia GPU it has 293 steps and compiles for at least 30 minutes longer. So I assume the build automatically checks whether the machine has an Nvidia GPU and adds the extra steps to the compilation process.
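(A rough illustration of that kind of detection, purely as a sketch and not taken from the actual PKGBUILD: a build script could probe for an Nvidia device before enabling the CUDA steps.)

```bash
# Hypothetical sketch: only enable the CUDA portion of the build when an
# Nvidia GPU is present. This is not from the real PKGBUILD; it only
# illustrates the auto-detection idea described above.
if lspci | grep -qi 'nvidia'; then
    echo "Nvidia GPU detected, enabling CUDA build steps"
    ENABLE_CUDA=1
else
    echo "No Nvidia GPU detected, skipping CUDA build steps"
    ENABLE_CUDA=0
fi
```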
Crandel commented on 2025-08-02 18:38 (UTC)
> Greetings! Is it possible to split ollama-amd-igpu-rocm and ollama-amd-igpu-cuda and thus avoid extra compilation? Thanks.

They are already split, and both depend on the regular ollama-amd-igpu package, where all the compilation happens. I'm not a true developer; I just take the official ollama recipe and adapt it to use my repo and my branch with the patch.
https://gitlab.archlinux.org/archlinux/packaging/packages/ollama
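For readers unfamiliar with the split-package mechanism mentioned above: a single PKGBUILD lists several names in pkgname and provides one package_*() function per sub-package, so one build() run serves all of them. The following is only a simplified, hypothetical sketch of that mechanism, not the actual PKGBUILD of this package:

```bash
# Simplified sketch of an Arch split package (hypothetical, not the real PKGBUILD).
# build() runs once; each package_*() function only assembles its own files,
# so the ROCm and CUDA sub-packages reuse the compilation done for the base package.
pkgbase=ollama-amd-igpu
pkgname=(ollama-amd-igpu ollama-amd-igpu-rocm ollama-amd-igpu-cuda ollama-amd-igpu-docs)
pkgver=0.11.4
pkgrel=1
arch=('x86_64')
license=('MIT')

build() {
    # all compilation happens once, here
    :
}

package_ollama-amd-igpu() {
    # install the common binaries
    :
}

package_ollama-amd-igpu-rocm() {
    depends=(ollama-amd-igpu hipblas)
    # install only the ROCm runner files
    :
}

package_ollama-amd-igpu-cuda() {
    depends=(ollama-amd-igpu cuda)
    # install only the CUDA runner files
    :
}

package_ollama-amd-igpu-docs() {
    # install the documentation
    :
}
```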
Crandel commented on 2025-08-02 18:21 (UTC)
> Would you please help to figure out which package is missing?

Which package are you trying to install? Make sure you have the hip-runtime-amd and hipblas packages installed.
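(As a concrete illustration of that advice, assuming the Arch package names mentioned in the comment, one can check for and install the HIP packages like this.)

```bash
# Check whether the HIP runtime and hipblas are installed
pacman -Qi hip-runtime-amd hipblas

# Install them if they are missing
sudo pacman -S --needed hip-runtime-amd hipblas
```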
flatmoll commented on 2025-08-02 18:05 (UTC)
Greetings! Is it possible to split ollama-amd-igpu-rocm and ollama-amd-igpu-cuda and thus avoid extra compilation? Thanks.
feng commented on 2025-08-02 10:31 (UTC)
I came across this compilation error:
```
-- Looking for a HIP compiler
-- Looking for a HIP compiler - /usr/bin/clang++
CMake Error at CMakeLists.txt:100 (find_package):
  By not providing "Findhip.cmake" in CMAKE_MODULE_PATH this project has
  asked CMake to find a package configuration file provided by "hip", but
  CMake did not find one.

  Could not find a package configuration file provided by "hip" with any of
  the following names:

    hipConfig.cmake
    hip-config.cmake

  Add the installation prefix of "hip" to CMAKE_PREFIX_PATH or set "hip_DIR"
  to a directory containing one of the above files. If "hip" provides a
  separate development package or SDK, be sure it has been installed.

-- Configuring incomplete, errors occurred!
==> ERROR: A failure occurred in build().
    Aborting...
```
Would you please help to figure out which package is missing?
Crandel commented on 2025-06-28 19:46 (UTC)
> May I suggest adding ollama to provides (and for sub-packages too) so that other packages that depend on stock ollama can use these builds as a substitute.
I've tried to do this, but I got conflicts between packages. If you can do this, feel free to send me a patch.
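(For context, the usual PKGBUILD mechanism for this is the provides and conflicts arrays; the sketch below only illustrates that mechanism and is not a tested change to this package, so the conflicts mentioned in the comment would still have to be resolved on top of it.)

```bash
# Hypothetical sketch: letting a sub-package stand in for the stock ollama package.
# These arrays would be set inside the corresponding package_*() function.
package_ollama-amd-igpu-rocm() {
    depends=(ollama-amd-igpu)
    provides=("ollama=$pkgver")   # satisfy dependencies on the stock ollama package
    conflicts=(ollama)            # cannot be installed alongside stock ollama
    # ... install files as before ...
}
```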
lightdot commented on 2025-06-28 19:44 (UTC)
May I suggest adding ollama to provides (and for sub-packages too) so that other packages that depend on stock ollama can use these builds as a substitute.
Thanks for keeping this patch up to date.
Pinned Comments
Crandel commented on 2025-08-10 20:45 (UTC)
Unfortunately, due to the missing Vulkan backend, I'm no longer interested in maintaining this package and the corresponding Ollama fork. If somebody wants to take it over, just ping me and I will set you as co-maintainer.