Package Details: local-ai-cublas 2.24.2-1

Git Clone URL: https://aur.archlinux.org/local-ai-cublas.git (read-only)
Package Base: local-ai-cublas
Description: Free, Open Source OpenAI alternative. Self-hosted, community-driven and local-first (with NVIDIA CUDA optimizations)
Upstream URL: https://github.com/mudler/LocalAI
Licenses: MIT
Conflicts: local-ai
Provides: local-ai
Submitter: robertfoster
Maintainer: robertfoster
Last Packager: robertfoster
Votes: 0
Popularity: 0.000000
First Submitted: 2024-12-06 23:29 (UTC)
Last Updated: 2024-12-11 23:20 (UTC)
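
For reference, a typical manual build from the clone URL above uses makepkg; this is standard AUR practice, not anything specific to this package (AUR helpers such as paru or yay automate the same steps):

git clone https://aur.archlinux.org/local-ai-cublas.git
cd local-ai-cublas
makepkg -si    # -s resolves repo dependencies, -i installs the built package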

Latest Comments

Moxon commented on 2024-12-21 16:01 (UTC)

Earlier in the build log, I find this:

-- Generating done (0.0s)
-- Build files have been written to: /tmp/makepkg/local-ai-cublas/src/LocalAI-2.24.2/sources/stablediffusion-ggml.cpp/build
make[1]: Entering directory '/tmp/makepkg/local-ai-cublas/src/LocalAI-2.24.2/sources/stablediffusion-ggml.cpp/build'
[  4%] Building C object thirdparty/CMakeFiles/zip.dir/zip.c.o
In file included from /tmp/makepkg/local-ai-cublas/src/LocalAI-2.24.2/sources/stablediffusion-ggml.cpp/thirdparty/zip.c:40:
/tmp/makepkg/local-ai-cublas/src/LocalAI-2.24.2/sources/stablediffusion-ggml.cpp/thirdparty/miniz.h:4988:9: note: ‘#pragma message: Using fopen, ftello, fseeko, stat() etc. path for file I/O - this path may not support large files.’
 4988 | #pragma message(                                                               \
      |         ^~~~~~~
[  4%] Built target zip
[  8%] Building C object ggml/src/CMakeFiles/ggml-base.dir/ggml.c.o
[ 13%] Building C object ggml/src/CMakeFiles/ggml-base.dir/ggml-alloc.c.o
[ 17%] Building CXX object ggml/src/CMakeFiles/ggml-base.dir/ggml-backend.cpp.o
[ 21%] Building CXX object ggml/src/CMakeFiles/ggml-base.dir/ggml-opt.cpp.o
[ 26%] Building CXX object ggml/src/CMakeFiles/ggml-base.dir/ggml-threading.cpp.o
g++-13: error: unrecognized command-line option ‘-Wp’; did you mean ‘-W’?
g++-13: error: unrecognized command-line option ‘-Wp’; did you mean ‘-W’?
make[1]: *** [Makefile:616: ggml/src/ggml-cuda/clamp.o] Error 1
make[1]: *** Waiting for unfinished jobs....
g++-13: error: unrecognized command-line option ‘-Wp’; did you mean ‘-W’?
g++-13: error: unrecognized command-line option ‘-Wp’; did you mean ‘-W’?
make[1]: *** [Makefile:616: ggml/src/ggml-cuda/acc.o] Error 1
[ 30%] Building C object ggml/src/CMakeFiles/ggml-base.dir/ggml-quants.c.o
g++-13: error: unrecognized command-line option ‘-Wp’; did you mean ‘-W’?
g++-13: error: unrecognized command-line option ‘-Wp’; did you mean ‘-W’?
make[1]: *** [Makefile:616: ggml/src/ggml-cuda/argmax.o] Error 1
[ 34%] Building C object ggml/src/CMakeFiles/ggml-base.dir/ggml-aarch64.c.o
g++-13: error: unrecognized command-line option ‘-Wp’; did you mean ‘-W’?
g++-13: error: unrecognized command-line option ‘-Wp’; did you mean ‘-W’?
make[1]: *** [Makefile:616: ggml/src/ggml-cuda/argsort.o] Error 1
g++-13: error: unrecognized command-line option ‘-Wp’; did you mean ‘-W’?
g++-13: error: unrecognized command-line option ‘-Wp’; did you mean ‘-W’?
make[1]: *** [Makefile:616: ggml/src/ggml-cuda/binbcast.o] Error 1
g++-13: error: unrecognized command-line option ‘-Wp’; did you mean ‘-W’?
g++-13: error: unrecognized command-line option ‘-Wp’; did you mean ‘-W’?
g++-13: error: unrecognized command-line option ‘-Wp’; did you mean ‘-W’?
g++-13: error: unrecognized command-line option ‘-Wp’; did you mean ‘-W’?
make[1]: *** [Makefile:626: ggml/src/ggml-cuda.o] Error 1
[ 39%] Linking CXX static library libggml-base.a
make[1]: *** [Makefile:616: ggml/src/ggml-cuda/arange.o] Error 1
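
If I read these errors right: -Wp,<option> is GCC syntax for forwarding an option to the preprocessor, and Arch's default makepkg.conf puts -Wp,-D_FORTIFY_SOURCE=3 into CFLAGS/CXXFLAGS. nvcc splits comma-separated option lists it forwards to the host compiler (that is how -Xcompiler arguments work), so that flag can reach g++-13 as a bare -Wp, which would also explain why only the ggml-cuda objects fail. A possible workaround, assuming the flag really does come from makepkg.conf, is to strip it before the CUDA parts build, e.g. in the PKGBUILD's build():

# hypothetical workaround: drop the flag that nvcc mangles into a bare -Wp
export CFLAGS="${CFLAGS/-Wp,-D_FORTIFY_SOURCE=3/}"
export CXXFLAGS="${CXXFLAGS/-Wp,-D_FORTIFY_SOURCE=3/}"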

Moxon commented on 2024-12-21 15:56 (UTC) (edited on 2024-12-21 15:58 (UTC) by Moxon)

Cool package ... but unfortunately it does not build on my machine:

...
cd build && cp -rf CMakeFiles/ggml.dir/k_quants.c.o ../llama.cpp/k_quants.o
cd build && cp -rf CMakeFiles/ggml.dir/ggml-alloc.c.o ../llama.cpp/ggml-alloc.o
cd build && cp -rf examples/CMakeFiles/common.dir/common.cpp.o ../llama.cpp/common.o
cd build && cp -rf examples/CMakeFiles/common.dir/grammar-parser.cpp.o ../llama.cpp/grammar-parser.o
cd build && cp -rf CMakeFiles/llama.dir/llama.cpp.o ../llama.cpp/llama.o
cd build && cp -rf CMakeFiles/ggml.dir/ggml-cuda.cu.o ../llama.cpp/ggml-cuda.o
ar src libbinding.a llama.cpp/ggml.o llama.cpp/k_quants.o llama.cpp/ggml-alloc.o llama.cpp/common.o llama.cpp/grammar-parser.o llama.cpp/llama.o binding.o llama.cpp/ggml-cuda.o
make[1]: Leaving directory '/tmp/makepkg/local-ai-cublas/src/LocalAI-2.24.2/sources/go-llama.cpp'
==> ERROR: A failure occurred in build().
    Aborting...
 -> error making: local-ai-cublas-exit status 4
 -> Failed to install the following packages. Manual intervention is required:
local-ai-cublas - exit status 4