Package Details: llama.cpp-sycl-f16 b4153-1

Git Clone URL: https://aur.archlinux.org/llama.cpp-sycl-f16.git (read-only)
Package Base: llama.cpp-sycl-f16
Description: Port of Facebook's LLaMA model in C/C++ (with Intel SYCL GPU optimizations and F16)
Upstream URL: https://github.com/ggerganov/llama.cpp
Licenses: MIT
Conflicts: llama.cpp
Provides: llama.cpp
Submitter: txtsd
Maintainer: txtsd
Last Packager: txtsd
Votes: 1
Popularity: 0.58
First Submitted: 2024-10-26 18:11 (UTC)
Last Updated: 2024-11-22 11:24 (UTC)

Pinned Comments

txtsd commented on 2024-10-26 20:15 (UTC)

Alternate versions

llama.cpp
llama.cpp-vulkan
llama.cpp-sycl-f16
llama.cpp-sycl-f32
llama.cpp-opencl
llama.cpp-cuda
llama.cpp-hip

Latest Comments

pepijndevos commented on 2024-11-22 07:32 (UTC)

I was told they won't update it until after the Python 3.13 migration.

I have updated it locally and am now getting:

/usr/lib64/gcc/x86_64-pc-linux-gnu/14.2.1/../../../../include/c++/14.2.1/array:217:2: error: SYCL kernel cannot call an undefined function without SYCL_EXTERNAL attribute

txtsd commented on 2024-11-22 06:44 (UTC)

It's the intel-oneapi-basekit package. It's been outdated for a long time now. I'll send a patch and see if they'll accept it.
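For anyone else hitting this, a quick way to check whether the installed basekit is behind the repos is a pair of standard pacman queries; the include path below assumes the usual /opt/intel/oneapi layout and may differ on your system:

```shell
# Compare the installed intel-oneapi-basekit with the repo version
# (the missing syclcompat/math.hpp header ships with newer basekit releases)
pacman -Qi intel-oneapi-basekit | grep '^Version'
pacman -Si intel-oneapi-basekit | grep '^Version'

# Check for the header itself; this path is an assumption based on the
# standard /opt/intel/oneapi install prefix
ls /opt/intel/oneapi/compiler/latest/include/syclcompat/math.hpp 2>/dev/null \
  || echo "syclcompat/math.hpp not present in this basekit release"
```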

ioctl commented on 2024-11-22 06:37 (UTC)

I have the same problem with 'syclcompat/math.hpp' file not found.

The latest intel-oneapi-basekit-2024.1.0.596-3 is installed.

txtsd commented on 2024-11-22 05:34 (UTC)

@pepijndevos Thanks for reporting! I'm trying to figure out if it's an upstream issue or if it's because of the outdated intel-oneapi-basekit package.

pepijndevos commented on 2024-11-21 17:20 (UTC)

I'm getting an error:

/home/pepijn/aur/llama.cpp-sycl-f16/src/llama.cpp/ggml/src/ggml-sycl/../ggml-sycl/dpct/helper.hpp:18:10: fatal error: 'syclcompat/math.hpp' file not found
   18 | #include <syclcompat/math.hpp>
      |          ^~~~~~~~~~~~~~~~~~~~~
