Name | Version | Votes | Popularity | Description | Maintainer | Last Updated
---- | ------- | ----- | ---------- | ----------- | ---------- | ------------
claude-code | 0.2.56-1 | 5 | 3.66 | An agentic coding tool that lives in your terminal | cg505 | 2025-03-27 22:50 (UTC)
pupu-bin | 0.0.3-1 | 1 | 0.57 | A simple and easy-to-use UI for Ollama. (Use system-wide electron) | zxp19821005 | 2025-03-03 07:26 (UTC)
howto-bin | 0.0.0-1 | 1 | 0.49 | A terminal helper for querying LLMs | Dominiquini | 2025-02-19 14:08 (UTC)
hollama-bin | 0.30.0-1 | 1 | 0.32 | A minimal web UI for talking to Ollama servers. (Prebuilt version. Use system-wide electron) | zxp19821005 | 2025-03-25 01:10 (UTC)
psource-git | 0.4.3.f2eae02-1 | 2 | 0.16 | CLI tool to pretty-print source code to stdout or directly to the clipboard. | frederikstroem | 2024-09-15 09:09 (UTC)
codai | 1.8.4-2 | 2 | 0.15 | AI code assistant that helps developers through a session CLI | envolution | 2025-03-02 01:21 (UTC)
llama.cpp-bin | b4882-1 | 1 | 0.07 | LLM inference in C/C++ (precompiled Linux binaries) | neitsab | 2025-03-13 15:46 (UTC)
localai-git-cuda-python | 2.24.0.93.g0eb2911a-2 | 3 | 0.05 | Self-hosted OpenAI API alternative - Open Source, community-driven and local-first. | wuxxin | 2025-03-05 01:30 (UTC)
localai-git | 2.24.0.93.g0eb2911a-2 | 3 | 0.05 | Self-hosted OpenAI API alternative - Open Source, community-driven and local-first. | wuxxin | 2025-03-05 01:30 (UTC)
localai-git-rocm-python | 2.24.0.93.g0eb2911a-2 | 3 | 0.05 | Self-hosted OpenAI API alternative - Open Source, community-driven and local-first. | wuxxin | 2025-03-05 01:30 (UTC)
localai-git-cuda | 2.24.0.93.g0eb2911a-2 | 3 | 0.05 | Self-hosted OpenAI API alternative - Open Source, community-driven and local-first. (With CUDA support) | wuxxin | 2025-03-05 01:30 (UTC)
localai-git-rocm | 2.24.0.93.g0eb2911a-2 | 3 | 0.05 | Self-hosted OpenAI API alternative - Open Source, community-driven and local-first. (With ROCm support) | wuxxin | 2025-03-05 01:30 (UTC)
localai-git-python | 2.24.0.93.g0eb2911a-2 | 3 | 0.05 | Self-hosted OpenAI API alternative - Open Source, community-driven and local-first. | wuxxin | 2025-03-05 01:30 (UTC)
aichat-ng | 0.28.0-1 | 2 | 0.04 | OpenAI, ChatGPT, Ollama and more in your terminal. Fork with advanced features. | blob42 | 2025-03-02 23:57 (UTC)
ollamaurl | 1.0.2-1 | 1 | 0.03 | See what ollama pull would have fetched | xyproto | 2024-10-05 12:58 (UTC)
gpt4all-chat | 3.10.0-1 | 7 | 0.02 | Run open-source LLMs anywhere | ZhangHua | 2025-02-25 02:21 (UTC)
jan-appimage | 0.5.16-1 | 2 | 0.02 | An open-source alternative to ChatGPT that runs 100% offline on your computer | redponike | 2025-03-24 22:10 (UTC)
lsp-ai | 0.7.1-1 | 1 | 0.02 | A language server that performs completion using large language models (LLMs) | repsac | 2024-09-29 12:16 (UTC)
local-llama-git | 1.0.2.r0.g8ef4209-1 | 1 | 0.01 | Local Llama, also known as L³, is designed to be easy to use, with a user-friendly interface and advanced settings. | zxp19821005 | 2024-09-26 10:57 (UTC)
freedomgpt-git | 3.0.2.r0.gd138af7-1 | 1 | 0.01 | A desktop application that allows users to run Alpaca models on their local machine. | orphan | 2024-05-18 03:37 (UTC)
litellm | 1.63.11-1 | 2 | 0.00 | Call all LLM APIs using the OpenAI format | AlphaJack | 2025-03-23 22:34 (UTC)
litellm-ollama | 4-3 | 2 | 0.00 | Metapackage to set up Ollama models with an OpenAI API locally | orphan | 2024-01-30 03:59 (UTC)
shibuya-bin | 0.2.8-1 | 0 | 0.00 | A project built with Electron + React.js to dig out the potential of cross-platform AI completion. (Prebuilt version. Use system-wide electron) | zxp19821005 | 2024-10-25 07:02 (UTC)
risuai-bin | 158.2.0-1 | 0 | 0.00 | Make your own story. User-friendly software for LLM roleplaying. (Prebuilt version) | zxp19821005 | 2025-03-26 14:56 (UTC)
python-sacrebleu | 2.5.0-1 | 0 | 0.00 | Reference BLEU implementation that auto-downloads test sets | daskol | 2025-02-02 16:19 (UTC)
pupu-git | 0.0.1.r57.g8bbe904-1 | 0 | 0.00 | A simple and easy-to-use UI for Ollama. (Use system-wide electron) | zxp19821005 | 2025-02-18 08:45 (UTC)
pinac-workspace-git | 2.0.9.r71.gc2e5198-1 | 0 | 0.00 | Open-source and cross-platform alternative to "Copilot for Windows". (Use system-wide electron) | zxp19821005 | 2024-12-25 10:25 (UTC)
localchat-bin | 0.11.0-3 | 0 | 0.00 | Chat with generative language models locally on your computer with zero setup. | zxp19821005 | 2024-07-23 02:25 (UTC)
local-llama-bin | 1.0.2-1 | 0 | 0.00 | Local Llama, also known as L³, is designed to be easy to use, with a user-friendly interface and advanced settings. | zxp19821005 | 2024-09-23 02:42 (UTC)
local-ai-vulkan-openblas | 2.26.0-2 | 0 | 0.00 | The free, open-source alternative to OpenAI, Claude and others. (Vulkan acceleration with OpenBLAS as fallback) | ZhangHua | 2025-03-29 04:04 (UTC)
llocal-bin | 1.0.0_beta.9-1 | 0 | 0.00 | Aims to provide a seamless, privacy-driven chatting experience built on open-source technologies (Ollama), particularly open-source LLMs (e.g. Llama3, Phi-3, Mistral). Focused on ease of use. (Prebuilt version. Use system-wide electron) | zxp19821005 | 2025-03-17 02:10 (UTC)
lightrail-core-bin | 0.3.9-7 | 0 | 0.00 | An open-source AI command bar that seeks to simplify software development. It is designed to be a general-purpose, extensible platform for integrating LLM-based tooling into engineering/development workflows. | zxp19821005 | 2024-07-08 05:10 (UTC)
flowtestai-git | 1.2.0.r25.g3b9ca3a-1 | 0 | 0.00 | GenAI-powered open-source IDE for API-first workflows. (Use system-wide electron) | zxp19821005 | 2024-10-21 04:43 (UTC)
elia-git | 1.8.0.r0.gd265aa3-1 | 0 | 0.00 | A powerful terminal user interface for interacting with large language models | orphan | 2024-08-11 20:06 (UTC)
elia | 1.8.0-1 | 0 | 0.00 | A powerful terminal user interface for interacting with large language models | vitaliikuzhdin | 2024-08-11 19:51 (UTC)
docspedia-git | r32.f348a53-1 | 0 | 0.00 | Ollama client: chat with your documents using your local LLM. (Use system-wide electron) | zxp19821005 | 2024-10-10 19:34 (UTC)
converse | 0.2.0-1 | 0 | 0.00 | Desktop app to chat and create tasks with various LLMs. (Use system-wide electron) | zxp19821005 | 2025-03-26 23:24 (UTC)
chieapp-bin | 0.2.11-7 | 0 | 0.00 | An extensible desktop app for large language models like ChatGPT and Bing Chat. | zxp19821005 | 2024-08-30 03:49 (UTC)
cherry-studio-git | 0.9.10.r3.g3ef0a0a-1 | 0 | 0.00 | 🍒 Cherry Studio is a desktop client that supports multiple LLM providers. (Use system-wide electron) | zxp19821005 | 2025-02-10 07:22 (UTC)
cherry-studio-electron-bin | 1.1.10-1 | 0 | 0.00 | 🍒 Cherry Studio is a desktop client that supports multiple LLM providers. (Prebuilt version. Use system-wide electron) | zxp19821005 | 2025-03-24 03:54 (UTC)
chatd-bin | 1.1.1-2 | 0 | 0.00 | Chat with your documents using local AI. (Prebuilt version. Use system-wide electron) | zxp19821005 | 2025-03-05 02:20 (UTC)
chatd | 1.1.2-3 | 0 | 0.00 | Chat with your documents using local AI. (Use system-wide electron) | zxp19821005 | 2025-03-05 02:21 (UTC)
calt-git | 0.1.1.r0.ga4925f9-1 | 0 | 0.00 | Context-aware LLM Translator (CALT). (Use system-wide electron) | zxp19821005 | 2025-01-03 07:19 (UTC)
albert-launcher-git | 2.0.1.r0.g9a27946-1 | 0 | 0.00 | Your AI-powered file launcher and search assistant. Think Spotlight or Alfred, but with the intelligence to understand what you're looking for. (Use system-wide electron) | zxp19821005 | 2024-12-26 08:01 (UTC)
aihub-git | 1.8.9.r4.g5b413ca-1 | 0 | 0.00 | A client that brings together large-model capabilities from multiple providers. (Use system-wide electron) | zxp19821005 | 2024-12-27 09:29 (UTC)
aihub-bin | 1.8.9-2 | 0 | 0.00 | A client that brings together large-model capabilities from multiple providers. (Prebuilt version. Use system-wide electron) | zxp19821005 | 2024-12-27 09:04 (UTC)
aihub | 1.8.9-3 | 0 | 0.00 | A client that brings together large-model capabilities from multiple providers. (Use system-wide electron) | zxp19821005 | 2024-12-27 09:27 (UTC)
ai-chat-bin | 1.2.5-3 | 0 | 0.00 | A cross-platform desktop application that provides quick access to chatbots like OpenAI ChatGPT from the menu bar (tray). (Prebuilt version. Use system-wide electron) | zxp19821005 | 2024-12-27 08:59 (UTC)