Package Details: anythingllm-desktop-bin 1.7.8-1
| Git Clone URL: | https://aur.archlinux.org/anythingllm-desktop-bin.git (read-only, click to copy) |
|---|---|
| Package Base: | anythingllm-desktop-bin |
| Description: | The all-in-one AI application, tool suite, and API for RAG & Agents for Docker & Desktop. (Prebuilt version. Uses system-wide electron.) |
| Upstream URL: | https://useanything.com/ |
| Keywords: | agents ai-agents-framework chroma desktop-application document-chat langchain-app llama lmstudio local-llm localai ollama openai-chatgpt pinecone rag retrieval-augmented-generation vector-database |
| Licenses: | MIT |
| Conflicts: | anythingllm, anythingllm-desktop |
| Provides: | anythingllm-desktop |
| Submitter: | zxp19821005 |
| Maintainer: | zxp19821005 (impulse) |
| Last Packager: | zxp19821005 |
| Votes: | 5 |
| Popularity: | 0.60 |
| First Submitted: | 2024-04-17 14:36 (UTC) |
| Last Updated: | 2025-03-28 01:56 (UTC) |
Dependencies (5)
- electron26 [AUR] (electron26-bin [AUR])
- nodejs (nodejs-git [AUR], python-nodejs-wheel [AUR], nodejs-lts-iron, nodejs-lts-jod)
- asar (make)
- fuse2 (make)
- ollama (ollama-cuda-git [AUR], ollama-nogpu-git [AUR], ollama-for-amd-git [AUR], ollama-rocm-git [AUR], ollama-git [AUR]) (optional) – Use your local AI model to generate answers
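For the optional ollama dependency, a minimal sketch of wiring up a local model backend (assumes the repo ollama package, systemd, and llama3 purely as an example model name):

```sh
# install the optional dependency and start its service
sudo pacman -S ollama
sudo systemctl enable --now ollama
# pull an example model; substitute whichever model AnythingLLM should use
ollama pull llama3
```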
Required by (0)
Sources (2)
zxp19821005 commented on 2025-03-20 10:30 (UTC)
@mrwsl Thanks for your feedback, it's my mistake, fixed it.
mrwsl commented on 2025-03-20 10:19 (UTC)
With the latest update I get:
error: failed to commit transaction (conflicting files)
anythingllm-desktop-bin: /usr/share/pixmaps exists in filesystem
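A conflict like this usually means the package ships /usr/share/pixmaps as a file or symlink instead of installing into the existing directory, which is a PKGBUILD-side fix. A minimal diagnostic sketch (the built package filename is only an example):

```sh
# which installed package, if any, owns the conflicting path?
pacman -Qo /usr/share/pixmaps
# what does the built package want to put there?
pacman -Qpl anythingllm-desktop-bin-1.7.8-1-x86_64.pkg.tar.zst | grep pixmaps
```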
Phaotee commented on 2025-01-31 02:22 (UTC)
Source no longer works.
==> Retrieving sources...
-> Downloading anythingllm-desktop-1.7.2.AppImage...
% Total % Received % Xferd Average Speed Time Time Time Current
Dload Upload Total Spent Left Speed
0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0
curl: (22) The requested URL returned error: 403
==> ERROR: Failure while downloading https://s3.us-west-1.amazonaws.com/public.useanything.com/latest/AnythingLLMDesktop.AppImage
Aborting...
When I build it with a new source, I get an issue when creating a new Workspace (which must be done before any chat can happen).
zxp19821005 commented on 2025-01-15 04:33 (UTC)
Upstream uses third-party download links: the file gets updated but the filename stays the same, so the checksum can change without a new release. If package installation fails because of this, please flag the package out-of-date.
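A minimal sketch of rebuilding locally when only the checksum has drifted (assumes you cloned the AUR repo; updpkgsums is provided by pacman-contrib):

```sh
cd anythingllm-desktop-bin
git pull
updpkgsums    # regenerates sha256sums=() in the PKGBUILD from the current download
makepkg -si   # rebuild and install
```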
Traace commented on 2024-12-26 11:27 (UTC)
AppImage checksum has changed again. It is now: fd3d8478682904905a1989816f3fa9549ea1f38ecd01fb3e4d996549daf100fd
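If you prefer to verify the posted hash and pin it by hand rather than regenerating it, a minimal sketch (file name taken from the download log above):

```sh
# compare the downloaded AppImage against the hash posted above
sha256sum AnythingLLMDesktop.AppImage
# then put that value into the PKGBUILD's sha256sums=() array and rebuild
makepkg -si
```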
impulse commented on 2024-10-08 14:07 (UTC)
Can I be a co-maintainer? We may want to update the electron version, for example; I can do that for you.
zxp19821005 commented on 2024-09-06 08:18 (UTC) (edited on 2024-09-06 08:19 (UTC) by zxp19821005)
@obamna Maybe you can install it again, start the app, wait for the program to download the debian-openssl-1.1.x engine, and then do this: sudo cp ~/.cache/prisma/master/61e140623197a131c2a6189271ffee05a7aa9a59/debian-openssl-1.0.x/libquery-engine /usr/lib/anythingllm-desktop/backend/node_modules/prisma/libquery_engine-debian-openssl-1.0.x.so.node
obamna commented on 2024-09-06 07:05 (UTC)
I am getting the same issue @zcyaya is getting:
Error: Invalid prisma.workspaces.create() invocation:
Prisma Client could not locate the Query Engine for runtime "debian-openssl-1.1.x". This happened because Prisma Client was generated for "debian-openssl-3.0.x", but the actual deployment required "debian-openssl-1.1.x".
Add "debian-openssl-1.1.x" to binaryTargets in the "schema.prisma" file and run prisma generate after saving it:
generator client {
  provider      = "prisma-client-js"
  binaryTargets = ["native", "debian-openssl-1.1.x"]
}
The following locations have been searched:
- /usr/lib/anythingllm-desktop/backend/node_modules/.prisma/client
- /usr/lib/anythingllm-desktop/backend/node_modules/@prisma/client
- /home/tim/Documents/anything-llm-desktop/anything-llm/server/node_modules/@prisma/client
- /tmp/prisma-engines
- /usr/lib/anythingllm-desktop/backend/prisma
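A minimal sketch of the regeneration step the error message asks for, assuming the packaged backend keeps its schema under /usr/lib/anythingllm-desktop/backend/prisma and that npx is available (the path is a guess based on the search list above):

```sh
cd /usr/lib/anythingllm-desktop/backend
# after adding "debian-openssl-1.1.x" to binaryTargets in prisma/schema.prisma:
npx prisma generate
```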
zxp19821005 commented on 2024-07-12 05:04 (UTC)
@draptik Thanks for your feedback, fixed it.
Pinned Comments
zxp19821005 commented on 2025-01-15 04:33 (UTC)
Upstream uses third-party download links: the file gets updated but the filename stays the same, so the checksum can change without a new release. If package installation fails because of this, please flag the package out-of-date.