Package Details: transformerlab-bin 0.27.8-1
| Git Clone URL: | https://aur.archlinux.org/transformerlab-bin.git (read-only, click to copy) |
|---|---|
| Package Base: | transformerlab-bin |
| Description: | Open Source Application for Advanced LLM Engineering: interact, train, fine-tune, and evaluate large language models on your own computer. (Prebuilt version; uses the system-wide Electron.) |
| Upstream URL: | https://transformerlab.ai/ |
| Keywords: | electron llama llms lora mlx rlhf transformers |
| Licenses: | MIT |
| Conflicts: | transformerlab |
| Provides: | transformerlab |
| Submitter: | zxp19821005 |
| Maintainer: | zxp19821005 |
| Last Packager: | zxp19821005 |
| Votes: | 1 |
| Popularity: | 0.000618 |
| First Submitted: | 2024-08-01 08:39 (UTC) |
| Last Updated: | 2026-01-19 04:16 (UTC) |
Dependencies (3)
- electron26 (AUR) (electron26-bin (AUR))
- asar (make)
- ollama (ollama-nogpu-git (AUR), ollama-for-amd-git (AUR), ollama-for-amd (AUR), ollama-git (AUR), ollama-bin (AUR)) (optional) – serve GGUF models instead of llama.cpp
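Since the page lists only a read-only Git clone URL, installing this package by hand follows the standard manual AUR workflow. A minimal sketch (standard makepkg flow, not anything specific to this package):

```shell
# Clone the AUR repository and build/install the package.
git clone https://aur.archlinux.org/transformerlab-bin.git
cd transformerlab-bin
# -s installs missing build dependencies (e.g. asar), -i installs the built package.
makepkg -si
```

An AUR helper such as yay or paru automates the same steps (`yay -S transformerlab-bin`).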
Latest Comments
zxp19821005 commented on 2025-02-17 03:49 (UTC)
@Korialo This package was built against electron26 (see: https://github.com/transformerlab/transformerlab-app/blob/46901823a4604a0809cc74f71936feb78d274dbf/package.json#L165). Running the package with a different Electron version may cause unknown issues. You can use electron26-bin to run it, too.

Korialo commented on 2025-02-15 14:23 (UTC)

@zxp19821005 Any reason not to use electron from extra instead of building electron26, to save bandwidth and build time?