Package Details: whisper.cpp-clblas 1.5.5-3

Git Clone URL: https://aur.archlinux.org/whisper.cpp.git (read-only)
Package Base: whisper.cpp
Description: Port of OpenAI's Whisper model in C/C++ (with OpenCL optimizations)
Upstream URL: https://github.com/ggerganov/whisper.cpp
Licenses: MIT
Conflicts: whisper.cpp
Provides: whisper.cpp
Submitter: robertfoster
Maintainer: robertfoster
Last Packager: robertfoster
Votes: 7
Popularity: 0.32
First Submitted: 2023-03-10 17:32 (UTC)
Last Updated: 2024-04-27 14:59 (UTC)

Dependencies (9)

Required by (1)

Sources (1)

Latest Comments


solarisfire commented on 2024-04-26 21:32 (UTC)

The build seems to be broken with the latest rocm-llvm?

-- Build files have been written to: /home/solarisfire/.cache/yay/whisper.cpp/src/whisper.cpp-hipblas/build
[  0%] Built target json_cpp
[  5%] Building CXX object CMakeFiles/ggml-rocm.dir/ggml-cuda.cu.o
c++: error: language hip not recognized
c++: error: language hip not recognized
make[2]: *** [CMakeFiles/ggml-rocm.dir/build.make:76: CMakeFiles/ggml-rocm.dir/ggml-cuda.cu.o] Error 1
make[1]: *** [CMakeFiles/Makefile2:211: CMakeFiles/ggml-rocm.dir/all] Error 2
make: *** [Makefile:146: all] Error 2
==> ERROR: A failure occurred in build().
    Aborting...
 -> error making: whisper.cpp-exit status 4
 -> Failed to install the following packages. Manual intervention is required:
whisper.cpp-cublas - exit status 4
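A possible workaround (not confirmed against this package, and the paths are assumptions) is to point the build at ROCm's own clang, so the HIP translation unit is not handed to the system c++:

# Assumed workaround: ROCm's clang understands '-x hip', the system c++ does not.
# /opt/rocm/llvm/bin is where rocm-llvm usually installs on Arch; adjust if needed.
export CC=/opt/rocm/llvm/bin/clang
export CXX=/opt/rocm/llvm/bin/clang++
makepkg -s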

Melon_Bread commented on 2024-04-25 00:48 (UTC)

Is there any chance we can get a whisper.cpp-hipblas package, since there is ROCm/hipBLAS support for whisper.cpp in the upstream CMake files? (Thank you for your packages.)
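For reference, a minimal configure sketch, assuming the upstream CMake option is named WHISPER_HIPBLAS (mirroring llama.cpp's LLAMA_HIPBLAS); the option name has not been verified against this exact release:

# Sketch only: configure and build whisper.cpp with ROCm/hipBLAS offloading.
cmake -B build -DWHISPER_HIPBLAS=ON -DCMAKE_BUILD_TYPE=Release
cmake --build build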

dreieck commented on 2024-04-17 15:33 (UTC)

ccache does not need to be an optional dependency.

It is only relevant while the package is being built; once the software is installed it serves no purpose.

The user can control whether ccache is used via the ccache option for makepkg.
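For example, in /etc/makepkg.conf this is toggled by the ccache entry in BUILDENV (the other entries shown here are just the usual defaults and may differ on your system):

# /etc/makepkg.conf: drop the leading '!' from ccache to let makepkg use it.
BUILDENV=(!distcc color ccache check !sign)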

Regards!

dreieck commented on 2024-03-27 09:35 (UTC)

Since several packages use ggml, I made a separate package for it:
libggml-git.

Regards!

Marzal commented on 2024-03-26 12:05 (UTC)

Hi, thanks for adding OpenBLAS support. Two questions:

dreieck commented on 2024-03-25 22:31 (UTC) (edited on 2024-03-26 19:40 (UTC) by dreieck)

llama-cpp and whisper.cpp need to conflict with each other:

error: failed to commit transaction (conflicting files)
llama-cpp-rocm-git: /usr/include/ggml.h exists in filesystem (owned by whisper.cpp-clblas)
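A minimal PKGBUILD sketch of what declaring that conflict could look like (the package names listed are illustrative):

# Illustrative PKGBUILD fragment: tell pacman the packages cannot coexist,
# since both would own /usr/include/ggml.h.
conflicts=('llama-cpp' 'llama-cpp-rocm-git')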

(It might be even better not to install /usr/include/ggml.h at all, but to ship it in a separate package.)

Regards and thanks for maintaining!

kearneyBack commented on 2024-02-24 12:28 (UTC)

@lyoneel Thanks for your kind advice. I tried it and got a better result (11 minutes).

lyoneel commented on 2024-02-21 07:38 (UTC)

@kearneyBack If you are using the CPU only, this build, as far as I understand, does not enable OpenBLAS, which may be affecting your performance; see: https://github.com/ggerganov/whisper.cpp?tab=readme-ov-file#blas-cpu-support-via-openblas
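If it helps, the linked README section describes an OpenBLAS-enabled rebuild roughly like this (flag name taken from that README, as far as I recall for releases around this one):

# Sketch based on the linked README section: rebuild with OpenBLAS support.
make clean
WHISPER_OPENBLAS=1 make -j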

kearneyBack commented on 2023-12-07 09:43 (UTC) (edited on 2023-12-27 07:42 (UTC) by kearneyBack)

Thanks for your kind packaging work. Transcribing 1 minute of audio with ggml-large-v2.bin (-t 12 -p 6) takes almost 35 minutes. Is that normal?
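For reference, an invocation along those lines (the binary name and sample path are the upstream defaults, shown as an assumption rather than the exact command used):

# Assumed example run: large-v2 model, 12 threads, 6 processors, 16 kHz WAV input.
./main -m models/ggml-large-v2.bin -t 12 -p 6 -f samples/jfk.wav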