Package Details: dar 2.5.6-2

Git Clone URL: (read-only)
Package Base: dar
Description: A full featured command-line backup tool, short for Disk ARchive
Upstream URL:
Keywords: archive backup dar disk
Licenses: GPL
Submitter: xyproto
Maintainer: MarcinWieczorek
Last Packager: MarcinWieczorek
Votes: 41
Popularity: 0.377106
First Submitted: 2011-12-14 16:48
Last Updated: 2016-10-22 15:34

Latest Comments

MarcinWieczorek commented on 2016-10-22 15:48

I didn't see that the pkgrel is equal to 2 :(

izmntuk commented on 2016-06-01 15:46

Disowned as I no longer have enough time for maintenance (and I cannot access an environment for testing the new release right now...), so feel free to adopt it if you are interested.

flyingDavid commented on 2016-02-19 17:24

Could you please re-add the --enable-mode={64,32} compile flags?
After they were removed, dar uses a special integer format (infinint) to store arbitrarily large integers, but for me this needs much more memory (we're talking about gigabytes here).
That format is needed for backups of zettabytes of data [1], but I guess that's far more than most users need.
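
For reference, a minimal sketch of how the flag could go back into the PKGBUILD's build() function; the surrounding options are assumptions based on a typical autotools PKGBUILD, not the maintainer's actual file:

    build() {
      cd "$srcdir/dar-$pkgver"
      # hypothetical: limit dar to 64-bit integers instead of the infinint format
      ./configure --prefix=/usr --enable-mode=64
      make
    }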


encbladexp commented on 2014-12-29 16:46

Yes, samples and docs should be in /usr/share/doc/dar, and not in /usr/share/dar ;)

Fixing your /etc/darrc should work; maybe I'll fix this with the next package update so /etc/darrc points to the right destination for new installations.
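
Until then, a one-liner like this should point an existing /etc/darrc at the packaged location (just a sketch, assuming the file only references the old /usr/share/dar path; it keeps a .bak copy):

    # hypothetical fix: rewrite the old path in the system-wide darrc
    sed -i.bak 's|/usr/share/dar|/usr/share/doc/dar|g' /etc/darrc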

MONOmah commented on 2014-12-24 03:52

Integration with par seems to be broken, as does anything else that requires files in /usr/share/dar. These files actually reside in /usr/share/doc/dar.

svg1234 commented on 2014-04-11 19:51

Just a note to anyone with an i686 OS considering doing the build with 32-bit integers: 1) infinint is quite a bit slower than int32, so if you can use int32, it is worth it. 2) If you do use int32, not only is the archive limited to 4GB per slice, but if you plan on using encryption you will not be able to restore in certain cases! It depends on how many files are in the encrypted dataset. The encryption works fine, but when you try to restore your files... DAR will issue an "out of memory" error.

(I will be upgrading my OS to 64-bit over the weekend in order to use int64 with DAR).

encbladexp commented on 2014-04-11 17:48


I'll upload this with the next dar release.

svg1234 commented on 2014-04-11 01:52

"You also might want to speed up the compilation process running ./configure
with the --disable-static option".

Might want to add --disable-static to the PKGBUILD?
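
That would just be one more flag on the configure call in build(), e.g. (a sketch; the other options are assumptions, not the current PKGBUILD):

    # hypothetical configure call with static libraries disabled
    ./configure --prefix=/usr --disable-static
    make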

svg1234 commented on 2014-04-11 00:36

Re: the 4GB limit, I found out it applies to the ARCHIVE size. I tested a backup with low compression; it went over 4GB and then errored out.

I have a 64-bit CPU but only 2GB of RAM (I don't need 64-bit to address that much memory), so a couple of years ago when I did my original Arch install, I went with 32-bit. One of these days I'll get around to switching to the 64-bit version.

I guess I'll try infinint and see what the difference is versus the 32-bit integer build that I did last night. I currently have about 120K files on my system. I have plenty of swap space, so that isn't an issue. Guess it just comes down to speed.


raw commented on 2014-04-10 05:55

I don't know, as I haven't tried it on a 32-bit OS, so almost everything I write here is speculation.

IMHO, as a 32-bit Linux kernel is actually built for a 32-bit machine, it does not allow access to 64-bit operations. If you have a 64-bit CPU, there is no reason not to use a 64-bit Linux.

While I still don't know exactly, I think your whole backup can be larger than 4GB; in the worst case you have to split your archive into multiple slices. Just give it a try: if you hit any limit, DAR will tell you. If you do not have many files to back up, you are probably fine with infinint anyway. If you are backing up millions of files (like e-mails or many, many images) AND want to use dar_manager, I strongly recommend a 64-bit OS, as it makes a serious difference in memory usage and performance.
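
Slicing is just the -s option, e.g. (paths and sizes here are made up):

    # hypothetical: full backup of /home, gzip-compressed, split into 2 GiB slices
    dar -c /mnt/backup/home_full -R /home -z -s 2G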

This integer mode also applies to dar_manager, so the maximum file count applies to the dardb archive too. If you have 1,000,000 files per backup and use dar_manager for easy restore, your dardb could only cover 4 backups before the file limit is hit.
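
For anyone unfamiliar with dar_manager, a minimal workflow looks roughly like this (archive and database names are hypothetical):

    # hypothetical: create a database and register two backups in it
    dar_manager -C home.dmd
    dar_manager -B home.dmd -A /mnt/backup/home_full
    dar_manager -B home.dmd -A /mnt/backup/home_incr1
    # restore the latest stored version of a single file
    dar_manager -B home.dmd -r home/user/.bashrc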
