Package Details: ollama-nogpu-git 0.5.5+r3779+g6982e9cc9-1

Git Clone URL: https://aur.archlinux.org/ollama-nogpu-git.git (read-only)
Package Base: ollama-nogpu-git
Description: Create, run and share large language models (LLMs)
Upstream URL: https://github.com/ollama/ollama
Licenses: MIT
Conflicts: ollama
Provides: ollama
Submitter: dreieck
Maintainer: None
Last Packager: envolution
Votes: 5
Popularity: 0.153731
First Submitted: 2024-04-17 15:09 (UTC)
Last Updated: 2025-01-14 12:40 (UTC)

Dependencies (3)

Required by (29)

Sources (5)

Latest Comments


dreieck commented on 2024-09-03 11:29 (UTC)

> Maybe the problem is caused by "go get" in the PKGBUILD; it leaves the Go module files read-only, so when the AUR helper tries to delete them, it fails. I've found that people recommend "go mod download" instead of "go get" to prevent this.

I have now changed "go get" to "go mod download".

Regards!
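For readers hitting the same issue, here is a minimal sketch of what such a prepare() change could look like in a Go-based PKGBUILD. The function body and paths are illustrative, not the actual ollama-nogpu-git PKGBUILD:

```shell
# Illustrative PKGBUILD fragment (not the real one). "go mod download" only
# fetches modules into the module cache, while "go get" may additionally
# edit go.mod/go.sum, so "go mod download" is the safer choice for a
# reproducible prepare() step.
prepare() {
  cd "$srcdir/ollama"
  export GOPATH="$srcdir/gopath"  # keep the module cache inside the build dir
  go mod download                 # replaces the previous "go get"
}
```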

lavilao commented on 2024-08-18 21:10 (UTC)

Hi, does anyone know how to build this package without AVX? I have added -DGGML_AVX=off, -DLLAMA_AVX=OFF -DLLAMA_AVX2=OFF -DLLAMA_AVX512=OFF -DLLAMA_AVX512_VBMI=OFF -DLLAMA_AVX512_VNNI=OFF -DLLAMA_F16C=OFF -DLLAMA_FMA=OFF, yet when building the llama.cpp server it still shows -DGGML_AVX=on. I am trying to build it on GitHub Actions because my machine is too weak. Thanks in advance!
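One possible angle, assuming the build still goes through ollama's llm/generate/gen_linux.go wrapper scripts: those scripts built several CPU variants with hard-coded flag sets, so CMake flags passed from outside were simply ignored. If the OLLAMA_CUSTOM_CPU_DEFS environment variable is still honored by your checkout (verify against the generate script before relying on it), it replaces the hard-coded defaults:

```shell
# Sketch, assuming gen_linux.sh honors OLLAMA_CUSTOM_CPU_DEFS (check your
# ollama checkout first). When set, it replaces the hard-coded per-variant
# CPU flags, so -DGGML_AVX=off would actually reach CMake.
export OLLAMA_CUSTOM_CPU_DEFS="-DCMAKE_POSITION_INDEPENDENT_CODE=on \
  -DGGML_AVX=off -DGGML_AVX2=off -DGGML_AVX512=off \
  -DGGML_FMA=off -DGGML_F16C=off"
makepkg --cleanbuild --syncdeps
```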

dreieck commented on 2024-08-03 11:36 (UTC) (edited on 2024-09-03 11:30 (UTC) by dreieck)

@beatboxchad,

Please discuss/report this with upstream.

The comments here are only for packaging issues, and yours does not seem to be a packaging issue.


@kkbs,
I am going to look into that sometime later, thanks for the information!

beatboxchad commented on 2024-08-03 04:37 (UTC)

Warm greetings,

I'm having this problem on two different laptops with Intel hardware:

Aug 02 23:14:54 myhostname ollama[1504679]: time=2024-08-02T23:14:54.987-05:00 level=INFO source=gpu.go:204 msg="looking for compatible GPUs"
Aug 02 23:14:55 myhostname ollama[1504679]: time=2024-08-02T23:14:55.043-05:00 level=INFO source=gpu.go:347 msg="no compatible GPUs were discovered"

I accidentally biked down to the coffee shop without the other one, and my build output is obfuscated by that other issue y'all are discussing, but I've been fiddling with the build for a while and don't see any obvious low-hanging fruit. One thing about the output strikes me as peculiar: although the Vulkan code appears to build, it echoes that only the CPU variant was built. I'll come back with that output soon.

Meanwhile, here's some basic diagnostic info about the machine I have:

$ lsgpu
card1                    Intel Kabylake (Gen9)             drm:/dev/dri/card1
└─renderD128                                               drm:/dev/dri/renderD128

vulkaninfo is too long to provide.
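In case it helps others debug similar "no compatible GPUs were discovered" reports, here is a quick way to see only what the Vulkan loader actually detects, rather than the full dump (assumes the vulkan-tools package is installed; device names and ICD files will differ per machine):

```shell
# Print just the device/driver summary instead of the full vulkaninfo dump.
vulkaninfo --summary

# List which Vulkan ICD manifests are installed; for an Intel iGPU you would
# typically expect an intel_icd.*.json entry here.
ls /usr/share/vulkan/icd.d/
```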

Yeaxi commented on 2024-08-03 02:33 (UTC) (edited on 2024-08-03 02:33 (UTC) by Yeaxi)

@dreieck hello dude. After viewing the PKGBUILD and doing some searching, I have an idea. Maybe the problem is caused by "go get" in the PKGBUILD; it leaves the Go module files read-only, so when the AUR helper tries to delete them, it fails. I've found that people recommend "go mod download" instead of "go get" to prevent this, but I'm not entirely sure about it; maybe a pro like you can tell. By the way, paru is a Rust rewrite of yay started by the same developer.

dreieck commented on 2024-07-28 09:41 (UTC)

Ahoj @kkbs,

> Every time I try to upgrade this package, it shows that the rm command cannot remove the Go packages (permission denied), and I have to manually remove the paru cache to fix it. What could be wrong?

I do not have any idea about go specific stuff.
In the PKGBUILD there is no explicit rm.

So it seems that upstream's build script wants to delete stuff(?).

I do not understand why the build user does not have permission to delete the files that the same user has created.

What is "paru"? I do not see any reference to it in the PKGBUILD.

I have no idea, I did not encounter any such issue so far.

So maybe someone else who has an idea can give a hint?

Regards!
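One hint that may apply here: Go writes its module cache read-only by default, which is a well-known cause of "permission denied" errors when build tools later try to clean up. The Arch Go packaging guidelines recommend the -modcacherw flag for exactly this case. A sketch of the fix (fragment for illustration; it would be merged into the real PKGBUILD's build()):

```shell
# Go's module cache is read-only by default; -modcacherw makes it read-write
# so AUR helpers like paru can delete the build directory afterwards.
# The other flags are the standard Arch Go packaging defaults.
export GOFLAGS="-buildmode=pie -trimpath -mod=readonly -modcacherw"
go build .
```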

Yeaxi commented on 2024-07-28 02:35 (UTC)

Every time I try to upgrade this package, it shows that the rm command cannot remove the Go packages (permission denied), and I have to manually remove the paru cache to fix it. What could be wrong?

brianwo commented on 2024-06-07 14:50 (UTC)

@dreieck, I have no idea about that

dreieck commented on 2024-06-07 13:42 (UTC) (edited on 2024-06-07 14:46 (UTC) by dreieck)

@brianwo,

icx […] icpx

Do you have any idea which package or upstream project provides those executables? I have no idea what they are.

--

Hacky workaround:
Disabled the upstream-added oneAPI build by setting OLLAMA_ROOT to a hopefully non-existent directory. See https://github.com/ollama/ollama/issues/4511#issuecomment-2154973327.

Regards!
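For what it's worth: icx and icpx are, as far as I know, Intel's oneAPI DPC++/C++ compiler drivers. On Arch they are shipped by the Intel oneAPI packages (e.g. intel-oneapi-basekit; verify the exact package) and are only on PATH after sourcing the oneAPI environment script (path below assumes a default /opt/intel/oneapi install):

```shell
# icx/icpx come from Intel oneAPI; they are not on PATH until the
# environment script has been sourced in the current shell.
source /opt/intel/oneapi/setvars.sh
command -v icx icpx   # should now resolve to /opt/intel/oneapi/...
```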

brianwo commented on 2024-06-07 11:32 (UTC) (edited on 2024-07-29 10:38 (UTC) by brianwo)

Unable to build; it looks like it now tries to build with oneAPI. It was working for 0.1.39+12.r2800.20240528.ad897080-1 previously.

++++ '[' '' = /opt/intel/oneapi/compiler/2024.1/opt/compiler/lib:/opt/intel/oneapi/compiler/2024.1/lib ']'
++++ printf %s /opt/intel/oneapi/tbb/2021.12/env/../lib/intel64/gcc4.8:/opt/intel/oneapi/compiler/2024.1/opt/compiler/lib:/opt/intel/oneapi/compiler/2024.1/lib
+++ LD_LIBRARY_PATH=/opt/intel/oneapi/tbb/2021.12/env/../lib/intel64/gcc4.8:/opt/intel/oneapi/compiler/2024.1/opt/compiler/lib:/opt/intel/oneapi/compiler/2024.1/lib
+++ export LD_LIBRARY_PATH
++++ prepend_path /opt/intel/oneapi/tbb/2021.12/env/../include ''
++++ path_to_add=/opt/intel/oneapi/tbb/2021.12/env/../include
++++ path_is_now=
++++ '[' '' = '' ']'
++++ printf %s /opt/intel/oneapi/tbb/2021.12/env/../include
+++ CPATH=/opt/intel/oneapi/tbb/2021.12/env/../include
+++ export CPATH
++++ prepend_path /opt/intel/oneapi/tbb/2021.12/env/.. /opt/intel/oneapi/compiler/2024.1
++++ path_to_add=/opt/intel/oneapi/tbb/2021.12/env/..
++++ path_is_now=/opt/intel/oneapi/compiler/2024.1
++++ '[' '' = /opt/intel/oneapi/compiler/2024.1 ']'
++++ printf %s /opt/intel/oneapi/tbb/2021.12/env/..:/opt/intel/oneapi/compiler/2024.1
+++ CMAKE_PREFIX_PATH=/opt/intel/oneapi/tbb/2021.12/env/..:/opt/intel/oneapi/compiler/2024.1
+++ export CMAKE_PREFIX_PATH
++++ prepend_path /opt/intel/oneapi/tbb/2021.12/env/../lib/pkgconfig /opt/intel/oneapi/compiler/2024.1/lib/pkgconfig
++++ path_to_add=/opt/intel/oneapi/tbb/2021.12/env/../lib/pkgconfig
++++ path_is_now=/opt/intel/oneapi/compiler/2024.1/lib/pkgconfig
++++ '[' '' = /opt/intel/oneapi/compiler/2024.1/lib/pkgconfig ']'
++++ printf %s /opt/intel/oneapi/tbb/2021.12/env/../lib/pkgconfig:/opt/intel/oneapi/compiler/2024.1/lib/pkgconfig
+++ PKG_CONFIG_PATH=/opt/intel/oneapi/tbb/2021.12/env/../lib/pkgconfig:/opt/intel/oneapi/compiler/2024.1/lib/pkgconfig
+++ export PKG_CONFIG_PATH
++ temp_var=2
++ '[' 2 -eq 0 ']'
++ echo ':: oneAPI environment initialized ::'
:: oneAPI environment initialized ::
++ echo ' '

++ '[' 0 -ne 0 ']'
++ eval set -- ''\''--force'\'' \
 '
+++ set -- --force
++ prep_for_exit 0
++ script_return_code=0
++ unset -v SETVARS_CALL
++ unset -v SETVARS_ARGS
++ unset -v SETVARS_VARS_PATH
++ '[' 0 = '' ']'
++ '[' 0 -eq 0 ']'
++ SETVARS_COMPLETED=1
++ export SETVARS_COMPLETED
++ return 0
++ return
+ CC=icx
+ CMAKE_DEFS='-DCMAKE_POSITION_INDEPENDENT_CODE=on -DLLAMA_NATIVE=off -DLLAMA_AVX=on -DLLAMA_AVX2=off -DLLAMA_AVX512=off -DLLAMA_FMA=off -DLLAMA_F16C=off -DCMAKE_BUILD_TYPE=Release -DLLAMA_SERVER_VERBOSE=off  -DCMAKE_C_COMPILER=icx -DCMAKE_CXX_COMPILER=icpx -DLLAMA_SYCL=ON -DLLAMA_SYCL_F16=OFF'
+ BUILD_DIR=../build/linux/x86_64/oneapi
+ EXTRA_LIBS='-fsycl -Wl,-rpath,/opt/intel/oneapi/compiler/latest/lib,-rpath,/opt/intel/oneapi/mkl/latest/lib,-rpath,/opt/intel/oneapi/tbb/latest/lib,-rpath,/opt/intel/oneapi/compiler/latest/opt/oclfpga/linux64/lib -lOpenCL -lmkl_core -lmkl_sycl_blas -lmkl_intel_ilp64 -lmkl_tbb_thread -ltbb'
+ DEBUG_FLAGS=
+ build
+ cmake -S ../llama.cpp -B ../build/linux/x86_64/oneapi -DCMAKE_POSITION_INDEPENDENT_CODE=on -DLLAMA_NATIVE=off -DLLAMA_AVX=on -DLLAMA_AVX2=off -DLLAMA_AVX512=off -DLLAMA_FMA=off -DLLAMA_F16C=off -DCMAKE_BUILD_TYPE=Release -DLLAMA_SERVER_VERBOSE=off -DCMAKE_C_COMPILER=icx -DCMAKE_CXX_COMPILER=icpx -DLLAMA_SYCL=ON -DLLAMA_SYCL_F16=OFF
-- The C compiler identification is unknown
-- The CXX compiler identification is unknown
CMake Error at CMakeLists.txt:2 (project):
  The CMAKE_C_COMPILER:

    icx

  is not a full path and was not found in the PATH.

  Tell CMake where to find the compiler by setting either the environment
  variable "CC" or the CMake cache entry CMAKE_C_COMPILER to the full path to
  the compiler, or to the compiler name if it is in the PATH.


CMake Error at CMakeLists.txt:2 (project):
  The CMAKE_CXX_COMPILER:

    icpx

  is not a full path and was not found in the PATH.

  Tell CMake where to find the compiler by setting either the environment
  variable "CXX" or the CMake cache entry CMAKE_CXX_COMPILER to the full path
  to the compiler, or to the compiler name if it is in the PATH.


-- Configuring incomplete, errors occurred!
llm/generate/generate_linux.go:3: running "bash": exit status 1
==> ERROR: A failure occurred in build().
    Aborting...
 -> error making: ollama-nogpu-git-exit status 4
 -> Failed to install the following packages. Manual intervention is required:
ollama-vulkan-git - exit status 4