Please change
source=("${pkgname}-${pkgver}::git+${url}.git#tag=v${pkgver}"
to
source=("${pkgname}::git+${url}.git#tag=v${pkgver}"
so that the repo can actually be reused for different tags.
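For reference, a minimal sketch of how the proposed source entry expands — the variable values below are hypothetical stand-ins, the real PKGBUILD defines its own:

```shell
# Hypothetical values for illustration only; the actual PKGBUILD
# sets pkgname, pkgver, and url itself.
pkgname=local-ai
pkgver=3.4.0
url="https://github.com/mudler/LocalAI"

# Proposed form: the clone directory is just "${pkgname}", so makepkg
# reuses the same local git repo when pkgver is bumped; only the
# "#tag=" fragment selects which tag gets checked out.
source=("${pkgname}::git+${url}.git#tag=v${pkgver}")

echo "${source[0]}"
# prints local-ai::git+https://github.com/mudler/LocalAI.git#tag=v3.4.0
```

With the old `${pkgname}-${pkgver}` form, every version bump creates a fresh clone directory instead of updating the existing one.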
| Git Clone URL: | https://aur.archlinux.org/local-ai.git (read-only) |
|---|---|
| Package Base: | local-ai |
| Description: | Free, Open Source OpenAI alternative. Self-hosted, community-driven and local-first |
| Upstream URL: | https://github.com/mudler/LocalAI |
| Licenses: | MIT |
| Submitter: | robertfoster |
| Maintainer: | robertfoster |
| Last Packager: | robertfoster |
| Votes: | 3 |
| Popularity: | 0.011594 |
| First Submitted: | 2023-11-02 18:27 (UTC) |
| Last Updated: | 2025-12-25 10:17 (UTC) |
I can't get it to run models on an AMD GPU. Running on the CPU works. I think it might be an issue with Arch rather than upstream, but I'm not sure how to check.
nov 08 12:25:13 PCDesktop local-ai[35014]: 12:25PM DBG GRPC(gemma-3-12b-it-127.0.0.1:46315): stderr /var/lib/local-ai/backends/rocm-llama-cpp/llama-cpp-fallback: /var/lib/local-ai/backends/rocm-llama-cpp/lib/libc.so.6: version `GLIBC_2.38' not found (required by /usr/lib/libelf.so.1)
nov 08 12:25:13 PCDesktop local-ai[35014]: 12:25PM DBG GRPC(gemma-3-12b-it-127.0.0.1:46315): stderr /var/lib/local-ai/backends/rocm-llama-cpp/llama-cpp-fallback: /var/lib/local-ai/backends/rocm-llama-cpp/lib/libc.so.6: version `GLIBC_ABI_DT_RELR' not found (required by /usr/lib/libdrm.so.2)
nov 08 12:25:13 PCDesktop local-ai[35014]: 12:25PM DBG GRPC(gemma-3-12b-it-127.0.0.1:46315): stderr /var/lib/local-ai/backends/rocm-llama-cpp/llama-cpp-fallback: /var/lib/local-ai/backends/rocm-llama-cpp/lib/libc.so.6: version `GLIBC_2.38' not found (required by /usr/lib/libdrm.so.2)
nov 08 12:25:13 PCDesktop local-ai[35014]: 12:25PM DBG GRPC(gemma-3-12b-it-127.0.0.1:46315): stderr /var/lib/local-ai/backends/rocm-llama-cpp/llama-cpp-fallback: /var/lib/local-ai/backends/rocm-llama-cpp/lib/libc.so.6: version `GLIBC_ABI_DT_RELR' not found (required by /usr/lib/libdrm_amdgpu.so.1)
nov 08 12:25:13 PCDesktop local-ai[35014]: 12:25PM DBG GRPC(gemma-3-12b-it-127.0.0.1:46315): stderr /var/lib/local-ai/backends/rocm-llama-cpp/llama-cpp-fallback: /var/lib/local-ai/backends/rocm-llama-cpp/lib/libc.so.6: version `GLIBC_2.38' not found (required by /usr/lib/libdrm_amdgpu.so.1)
nov 08 12:25:13 PCDesktop local-ai[35014]: 12:25PM DBG GRPC(gemma-3-12b-it-127.0.0.1:46315): stderr /var/lib/local-ai/backends/rocm-llama-cpp/llama-cpp-fallback: /var/lib/local-ai/backends/rocm-llama-cpp/lib/libc.so.6: version `GLIBC_ABI_DT_RELR' not found (required by /usr/lib/libgflags.so.2.2)
nov 08 12:25:13 PCDesktop local-ai[35014]: 12:25PM DBG GRPC(gemma-3-12b-it-127.0.0.1:46315): stderr /var/lib/local-ai/backends/rocm-llama-cpp/llama-cpp-fallback: /var/lib/local-ai/backends/rocm-llama-cpp/lib/libc.so.6: version `GLIBC_2.38' not found (required by /usr/lib/libgflags.so.2.2)
nov 08 12:25:13 PCDesktop local-ai[35014]: 12:25PM DBG GRPC(gemma-3-12b-it-127.0.0.1:46315): stderr /var/lib/local-ai/backends/rocm-llama-cpp/llama-cpp-fallback: /var/lib/local-ai/backends/rocm-llama-cpp/lib/libstdc++.so.6: version `GLIBCXX_3.4.32' not found (required by /usr/lib/libgflags.so.2.2)
nov 08 12:25:13 PCDesktop local-ai[35014]: 12:25PM DBG GRPC(gemma-3-12b-it-127.0.0.1:46315): stderr /var/lib/local-ai/backends/rocm-llama-cpp/llama-cpp-fallback: /var/lib/local-ai/backends/rocm-llama-cpp/lib/libc.so.6: version `GLIBC_ABI_DT_RELR' not found (required by /usr/lib/libz.so.1)
nov 08 12:25:13 PCDesktop local-ai[35014]: 12:25PM DBG GRPC(gemma-3-12b-it-127.0.0.1:46315): stderr /var/lib/local-ai/backends/rocm-llama-cpp/llama-cpp-fallback: /var/lib/local-ai/backends/rocm-llama-cpp/lib/libc.so.6: version `GLIBC_ABI_DT_RELR' not found (required by /usr/lib/libzstd.so.1)
nov 08 12:25:51 PCDesktop local-ai[35014]: 12:25PM ERR failed starting/connecting to the gRPC service error="rpc error: code = Unavailable desc = connection error: desc = \"transport: Error while dialing: dial tcp 127.0.0.1:46315: connect: connection refused\""
nov 08 12:25:53 PCDesktop local-ai[35014]: 12:25PM DBG GRPC Service NOT ready
nov 08 12:25:53 PCDesktop local-ai[35014]: 12:25PM ERR Failed to load model gemma-3-12b-it with backend llama-cpp error="failed to load model with internal loader: grpc service not ready" modelID=gemma-3-12b-it
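The `GLIBC_2.38' not found lines suggest the prebuilt ROCm backend bundles its own, older libc.so.6 that gets resolved ahead of the system one, while the host libraries it links (libdrm, libgflags, etc.) need newer symbol versions. One way to check this, sketched below with paths taken from the log above — `max_glibc` is a helper defined here for illustration, not part of any tool, and the paths may differ on your system:

```shell
# Helper (defined here for illustration): print the highest GLIBC_x.y
# symbol version found in dynamic-symbol output piped into it.
max_glibc() { grep -o 'GLIBC_[0-9.]*' | sort -uV | tail -n 1; }

# Compare what a host library requires...
#   objdump -T /usr/lib/libdrm.so.2 | max_glibc
# ...against what the bundled libc actually provides:
#   objdump -T /var/lib/local-ai/backends/rocm-llama-cpp/lib/libc.so.6 | max_glibc
# If the first version is newer than anything the bundled libc defines,
# the backend's lib directory is shadowing the system glibc.

# Self-contained demonstration on synthetic objdump-like output:
printf 'GLIBC_2.17\nGLIBC_2.38\nGLIBC_2.4\n' | max_glibc
# prints GLIBC_2.38
```

If that turns out to be the cause, it would point at the downloaded backend bundle rather than the Arch package itself.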
I'm unable to build:
cp ~/.cache/yay/local-ai-vulkan/src/LocalAI-2.29.0/backend/go/image/stablediffusion-ggml/build/libstable-diffusion.a ./libsd.a
ar rcs libsd.a gosd.o
make[1]: Leaving directory '~/.cache/yay/local-ai-vulkan/src/LocalAI-2.29.0/backend/go/image/stablediffusion-ggml'
==> ERROR: A failure occurred in build().
Aborting...
-> error making: local-ai-vulkan-exit status 4
I'm having this error:
[100%] Built target server
make[1]: Leaving directory '/home/ciros/.cache/yay/local-ai-cublas/src/LocalAI-2.29.0/sources/bark.cpp/build'
==> ERROR: A failure occurred in build().
Aborting...
-> error making: local-ai-cublas-exit status 4
checking dependencies...
Packages (5) go-2:1.24.3-1 go.rice-1.0.3-1 grpc-1.72.0-1 protoc-gen-go-1.36.6-1 protoc-gen-go-grpc-1:1.5.1-2
Total Removed Size: 274,27 MiB
:: Do you want to remove these packages? [Y/n]
:: Processing package changes...
(1/5) removing go [##########################################################################################] 100%
(2/5) removing grpc [##########################################################################################] 100%
(3/5) removing protoc-gen-go-grpc [##########################################################################################] 100%
(4/5) removing protoc-gen-go [##########################################################################################] 100%
(5/5) removing go.rice [##########################################################################################] 100%
:: Running post-transaction hooks...
(1/2) Arming ConditionNeedsUpdate...
(2/2) Refreshing PackageKit...
-> Failed to install the following packages. Manual intervention is required:
local-ai-cublas - exit status 4