Package Details: local-ai-cublas 3.1.1-1

Git Clone URL: https://aur.archlinux.org/local-ai-cublas.git (read-only)
Package Base: local-ai-cublas
Description: Free, Open Source OpenAI alternative. Self-hosted, community-driven and local-first (with NVIDIA CUDA optimizations)
Upstream URL: https://github.com/mudler/LocalAI
Licenses: MIT
Conflicts: local-ai
Provides: local-ai
Submitter: robertfoster
Maintainer: robertfoster
Last Packager: robertfoster
Votes: 1
Popularity: 0.52
First Submitted: 2024-12-06 23:29 (UTC)
Last Updated: 2025-06-30 10:15 (UTC)

Latest Comments

ciros commented on 2025-06-10 10:43 (UTC)

I'm getting this error:

[100%] Built target server
make[1]: Leaving directory '/home/ciros/.cache/yay/local-ai-cublas/src/LocalAI-2.29.0/sources/bark.cpp/build'
==> ERROR: A failure occurred in build().
    Aborting...
 -> error making: local-ai-cublas-exit status 4
checking dependencies...

Packages (5) go-2:1.24.3-1  go.rice-1.0.3-1  grpc-1.72.0-1  protoc-gen-go-1.36.6-1  protoc-gen-go-grpc-1:1.5.1-2

Total Removed Size:  274,27 MiB

:: Do you want to remove these packages? [Y/n] 
:: Processing package changes...
(1/5) removing go                                                                                                                                  [##########################################################################################] 100%
(2/5) removing grpc                                                                                                                                [##########################################################################################] 100%
(3/5) removing protoc-gen-go-grpc                                                                                                                  [##########################################################################################] 100%
(4/5) removing protoc-gen-go                                                                                                                       [##########################################################################################] 100%
(5/5) removing go.rice                                                                                                                             [##########################################################################################] 100%
:: Running post-transaction hooks...
(1/2) Arming ConditionNeedsUpdate...
(2/2) Refreshing PackageKit...
 -> Failed to install the following packages. Manual intervention is required:
local-ai-cublas - exit status 4
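The log above only shows makepkg's generic "exit status 4"; the compiler error that actually killed the build is further up and is easy to lose in yay's scrollback. A minimal sketch for surfacing it, assuming the default yay cache path shown in the log (`build.log` is just an illustrative filename):

```shell
# Re-run the build by hand in yay's cache so the full output is kept.
cd ~/.cache/yay/local-ai-cublas

# -s installs missing dependencies, -f forces a rebuild;
# tee keeps a complete copy of the output for inspection.
makepkg -sf 2>&1 | tee build.log

# The first real error is usually well above the final
# "A failure occurred in build()" line.
grep -n -m1 -i 'error' build.log
```

Posting the lines around that first match is generally more useful to the maintainer than the exit-status summary alone.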

Moxon commented on 2024-12-21 16:01 (UTC)

Earlier in the build log, I find this:

-- Generating done (0.0s)
-- Build files have been written to: /tmp/makepkg/local-ai-cublas/src/LocalAI-2.24.2/sources/stablediffusion-ggml.cpp/build
make[1]: Entering directory '/tmp/makepkg/local-ai-cublas/src/LocalAI-2.24.2/sources/stablediffusion-ggml.cpp/build'
[  4%] Building C object thirdparty/CMakeFiles/zip.dir/zip.c.o
In file included from /tmp/makepkg/local-ai-cublas/src/LocalAI-2.24.2/sources/stablediffusion-ggml.cpp/thirdparty/zip.c:40:
/tmp/makepkg/local-ai-cublas/src/LocalAI-2.24.2/sources/stablediffusion-ggml.cpp/thirdparty/miniz.h:4988:9: note: ‘#pragma message: Using fopen, ftello, fseeko, stat() etc. path for file I/O - this path may not support large files.’
 4988 | #pragma message(                                                               \
      |         ^~~~~~~
[  4%] Built target zip
[  8%] Building C object ggml/src/CMakeFiles/ggml-base.dir/ggml.c.o
[ 13%] Building C object ggml/src/CMakeFiles/ggml-base.dir/ggml-alloc.c.o
[ 17%] Building CXX object ggml/src/CMakeFiles/ggml-base.dir/ggml-backend.cpp.o
[ 21%] Building CXX object ggml/src/CMakeFiles/ggml-base.dir/ggml-opt.cpp.o
[ 26%] Building CXX object ggml/src/CMakeFiles/ggml-base.dir/ggml-threading.cpp.o
g++-13: error: unrecognized command-line option ‘-Wp’; did you mean ‘-W’?
g++-13: error: unrecognized command-line option ‘-Wp’; did you mean ‘-W’?
make[1]: *** [Makefile:616: ggml/src/ggml-cuda/clamp.o] Error 1
make[1]: *** Waiting for unfinished jobs....
g++-13: error: unrecognized command-line option ‘-Wp’; did you mean ‘-W’?
g++-13: error: unrecognized command-line option ‘-Wp’; did you mean ‘-W’?
make[1]: *** [Makefile:616: ggml/src/ggml-cuda/acc.o] Error 1
[ 30%] Building C object ggml/src/CMakeFiles/ggml-base.dir/ggml-quants.c.o
g++-13: error: unrecognized command-line option ‘-Wp’; did you mean ‘-W’?
g++-13: error: unrecognized command-line option ‘-Wp’; did you mean ‘-W’?
make[1]: *** [Makefile:616: ggml/src/ggml-cuda/argmax.o] Error 1
[ 34%] Building C object ggml/src/CMakeFiles/ggml-base.dir/ggml-aarch64.c.o
g++-13: error: unrecognized command-line option ‘-Wp’; did you mean ‘-W’?
g++-13: error: unrecognized command-line option ‘-Wp’; did you mean ‘-W’?
make[1]: *** [Makefile:616: ggml/src/ggml-cuda/argsort.o] Error 1
g++-13: error: unrecognized command-line option ‘-Wp’; did you mean ‘-W’?
g++-13: error: unrecognized command-line option ‘-Wp’; did you mean ‘-W’?
make[1]: *** [Makefile:616: ggml/src/ggml-cuda/binbcast.o] Error 1
g++-13: error: unrecognized command-line option ‘-Wp’; did you mean ‘-W’?
g++-13: error: unrecognized command-line option ‘-Wp’; did you mean ‘-W’?
g++-13: error: unrecognized command-line option ‘-Wp’; did you mean ‘-W’?
g++-13: error: unrecognized command-line option ‘-Wp’; did you mean ‘-W’?
make[1]: *** [Makefile:626: ggml/src/ggml-cuda.o] Error 1
[ 39%] Linking CXX static library libggml-base.a
make[1]: *** [Makefile:616: ggml/src/ggml-cuda/arange.o] Error 1

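The repeated `g++-13: error: unrecognized command-line option '-Wp'` suggests a comma-separated flag being split before it reaches the host compiler: Arch's default `/etc/makepkg.conf` flags include `-Wp,-D_FORTIFY_SOURCE=3`, and if that is forwarded to `nvcc`, nvcc can break it at the comma and hand g++ a bare `-Wp`. A hedged workaround sketch, assuming that flag is the culprit on your system (this is a diagnosis, not something the PKGBUILD confirms):

```shell
# Show which comma-wrapped -Wp,... flags are in play on this machine.
grep -E '^\s*C(XX)?FLAGS=' /etc/makepkg.conf

# Make a per-build copy of makepkg.conf with the -Wp, wrapper rewritten
# to the plain preprocessor define it carries (g++ and nvcc both accept
# -D directly, so nothing is lost).
sed 's/-Wp,-D_FORTIFY_SOURCE=3/-D_FORTIFY_SOURCE=3/g' \
    /etc/makepkg.conf > /tmp/makepkg-nowp.conf

# Build using the adjusted config instead of the system-wide one.
makepkg -sf --config /tmp/makepkg-nowp.conf
```

If the `-Wp` errors disappear with the rewritten flags, the fix belongs upstream (the CUDA build should sanitize host flags before passing them to nvcc).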
Moxon commented on 2024-12-21 15:56 (UTC) (edited on 2024-12-21 15:58 (UTC) by Moxon)

Cool package ... but unfortunately it does not build on my machine:

...
cd build && cp -rf CMakeFiles/ggml.dir/k_quants.c.o ../llama.cpp/k_quants.o
cd build && cp -rf CMakeFiles/ggml.dir/ggml-alloc.c.o ../llama.cpp/ggml-alloc.o
cd build && cp -rf examples/CMakeFiles/common.dir/common.cpp.o ../llama.cpp/common.o
cd build && cp -rf examples/CMakeFiles/common.dir/grammar-parser.cpp.o ../llama.cpp/grammar-parser.o
cd build && cp -rf CMakeFiles/llama.dir/llama.cpp.o ../llama.cpp/llama.o
cd build && cp -rf CMakeFiles/ggml.dir/ggml-cuda.cu.o ../llama.cpp/ggml-cuda.o
ar src libbinding.a llama.cpp/ggml.o llama.cpp/k_quants.o llama.cpp/ggml-alloc.o llama.cpp/common.o llama.cpp/grammar-parser.o llama.cpp/llama.o binding.o llama.cpp/ggml-cuda.o
make[1]: Leaving directory '/tmp/makepkg/local-ai-cublas/src/LocalAI-2.24.2/sources/go-llama.cpp'
==> ERROR: A failure occurred in build().
    Aborting...
 -> error making: local-ai-cublas-exit status 4
 -> Failed to install the following packages. Manual intervention is required:
local-ai-cublas - exit status 4
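Failures like the two above can also come from stale state in the build tree or from local toolchain drift. One way to rule both out is a clean-chroot build with Arch's devtools; a sketch, using the clone URL from the package header above (assumes an x86_64 host with sudo access):

```shell
# devtools provides the clean-chroot build helpers.
sudo pacman -S --needed devtools

# Fetch the PKGBUILD and build it in a throwaway chroot, so host
# packages and leftover build directories cannot interfere.
git clone https://aur.archlinux.org/local-ai-cublas.git
cd local-ai-cublas
extra-x86_64-build
```

If the package builds there but not on the host, the problem is local configuration rather than the PKGBUILD itself.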