
This question exists because it has historical significance, but it is not considered a good, on-topic question for this site, so please do not use it as evidence that you can ask similar questions here. While you are encouraged to help maintain its answers, please understand that "big list" questions are not generally allowed on Ask Ubuntu and will be closed per the help center.

Backup is incredibly important. Obviously there's no single best backup tool, but a comparison of the options would be very interesting.

  • Graphical Interface? Command line?
  • Incremental backups?
  • Automatic backups?
  • Install method: In standard repositories? PPA?
8128

  • I would say the backup solution depends on what you are using the machine you are backing up for. A collection of work/school critical projects/code has a far different set of needs from a computer storing an ungodly amount of porn and music. On my home setup I have a small script that backs up a couple of folders I wouldn't like to lose, it does this incrementally. My work laptop gets everything backed up to a server and never has mission critical stuff left on it anyway. – Toby Aug 18 '10 at 21:15
  • It's not a features comparison, but this poll might help: http://www.webupd8.org/2010/05/best-linux-backup-tool-software.html Read the comments too! – Alin Andrei Aug 18 '10 at 21:19

38 Answers


Déjà Dup

Déjà Dup is (from Ubuntu 11.10) installed by default. It is a GNOME tool intended for the casual Desktop user that aims to be a "simple backup tool that hides the complexity of doing backups the Right Way".

It is a front end to duplicity that performs incremental backups, where only the changes since the prior backup are stored. It has options for encrypted and automated backups. It can back up to local folders, Amazon S3, or any server to which Nautilus can connect.

Integration with Nautilus is superb, allowing for the restoration of files deleted from a directory and for the restoration of an old version of an individual file.

Main Window Screenshot

Restore earlier version of file

Note that as of February 2016 this project appears to be largely ignoring bug reports, with only minor triage activity; the last bugfix dates back to 2014, though there are new releases with minor changes.

8128

  • I don't quite understand? You can't restore specific versions of individual files very easily.

    However you can restore the entire backed up content to a specific backup. For instance I can restore to last week, or to the week before, or the week before that, etc

    – 8128 Aug 30 '10 at 07:12
  • What remote options does it have for back up? ssh, samba ... ? – Hamish Downer Sep 08 '10 at 19:15
  • It can connect to anything nautilus can see. So if you can mount it in the file system that's one option. There's also then the ability to connect to ftp, ssh, webdav or a windows share. My samba knowledge is limited I'm afraid. – 8128 Sep 08 '10 at 19:28
  • You can restore specific versions of individual files. It includes a nautilus extension. All you need to do is right click on a file and select "Revert to previous version." – andrewsomething Oct 13 '10 at 21:44
  • is there a command line interface for Deja Dup? – brillout Oct 24 '11 at 20:18
  • @brillout.com Deja Dup is based on Duplicity, which provides a command line interface. Another choice is duply. – nealmcb Jun 29 '12 at 05:46
  • Deja Dup is not designed for running as root in order to back up files that cannot be read by current user. If you use gksu to run it, the notification window at the end will crash and you have no idea if it worked. For a single user backup it is great. – Chris Good Mar 28 '13 at 11:02
  • Does Duplicity (and Deja Dup) save backed-up files as normal files in normal directories? Or does it use its own specific data format that you can only read with the program itself? – Lii Mar 09 '16 at 15:18
  • @Lii - the latter I'm afraid – 8128 Mar 10 '16 at 15:58
  • "keep files at least 6 months" - wtf?! – Martin Pfeffer Feb 19 '17 at 18:15
  • @Lii The former (mostly). From the doc: "The files used by duplicity to store backup data are tarfiles in GNU tar format. They can be produced independently by rdiffdir(1). For incremental backups, new files are saved normally in the tarfile. But when a file changes, instead of storing a complete copy of the file, only a diff is stored, as generated by rdiff(1). If a file is deleted, a 0 length file is stored in the tar. It is possible to restore a duplicity archive "manually" by using tar and then cp, rdiff, and rm as necessary. These duplicity archives have the extension difftar." – bernie Aug 03 '19 at 22:30

Back in Time

I have been using Back in Time for some time, and I'm very satisfied with it.

All you have to do is configure:

  • Where to save snapshot
  • What directories to backup
  • When backup should be done (manual, every hour, every day, every week, every month)

and forget about it.

To install Back in Time on Ubuntu 14.04-18.04:

sudo apt install backintime-gnome

To install Back in Time on Ubuntu 20.04 and later:

sudo apt install backintime-qt

The program GUI can be opened via Ubuntu search for "backintime".


The project is active as of August 2019.
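Back in Time also ships a command-line client, which is handy for scripted or cron-driven use. A minimal sketch (subcommand spellings may vary slightly between versions, so treat this as an assumption to check against `backintime --help`):

```shell
backintime backup          # take a snapshot now, using the saved profile
backintime snapshots-list  # show the snapshots that already exist
```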

Decio Lira
  • Is there a way to get this to backup to a remote server? When you select a target directory, all non-local directories are hidden, and typing it into the location bar doesn't work. – zacharyliu Dec 05 '10 at 07:23
  • you can mount a remote share and do the backup on that one. for more info look at a samba solution: https://vollkorn.cryptobitch.de/index.php?/archives/87-My-Backup-Solution-using-BackInTime.html or a ssh solution : http://www.oak-tree.us/blog/index.php/2009/07/20/back-in-time2 – gourgi Feb 11 '11 at 23:51
  • There's a "gotcha" with backintime - "dot" files are excluded by default. If you want your home directory's dot files, use backintime's Settings->Exclude and remove .* –  Feb 16 '11 at 17:49
  • To backup to a remote server you can use the ~/.gvfs folder, which is where remote servers are mounted by nautilus. But Déjà-Dup can do backups faster than back-in-time, while back-in-time is better for seeing files individually. – desgua Mar 27 '11 at 15:33
  • I like the feature to define separate profiles. This helps me define different profiles for different partitions of my drive and update the backups of only the partitions I need to. Also the first backup operation will take less time. – Chethan S. May 18 '11 at 12:28
  • Does Back in Time save backed-up files as normal files in normal directories? Or does it use its own specific data format that you can only read with the program itself? – Lii Mar 09 '16 at 15:18
  • @Lii BackInTime uses plain file copies which are hard-linked between snapshots. You can browse them with every tool you like. – Germar Mar 12 '16 at 00:25
  • Are you using gnome 2? Is that even possible? – Gowtham Jul 21 '16 at 14:14

rsnapshot vs. rdiff-backup

I often refer to this comparison of rsnapshot and rdiff-backup:

Similarities:

  • both use an rsync-like algorithm to transfer data (rsnapshot actually uses rsync; rdiff-backup uses librsync, a C library, through Python bindings)
  • both can be used over ssh (though rsnapshot cannot push over ssh without some extra scripting)
  • both use a simple copy of the source for the current backup

Differences in disk usage:

  • rsnapshot uses actual files and hardlinks to save space. For small files, storage size is similar.
  • rdiff-backup stores previous versions as compressed deltas to the current version similar to a version control system. For large files that change often, such as logfiles, databases, etc., rdiff-backup requires significantly less space for a given number of versions.

Differences in speed:

  • rdiff-backup is slower than rsnapshot because of its need to calculate delta files. There are ways to speed it up, though, like the --no-fsync and --no-compression options.

Differences in metadata storage:

  • rdiff-backup stores file metadata, such as ownership, permissions, and dates, separately.

Differences in file transparency:

  • For rsnapshot, all versions of the backup are accessible as plain files.
  • For rdiff-backup, only the current backup is accessible as plain files. Previous versions are stored as rdiff deltas.

Differences in backup levels made:

  • rsnapshot supports multiple levels of backup such as monthly, weekly, and daily.
  • rdiff-backup can only delete snapshots earlier than a given date; it cannot delete snapshots in between two dates.

Differences in support community:

  • rdiff-backup has seen a lot of recent development and bugfixing activity. From December 2019 till spring 2020, rdiff-backup was re-worked into version 2, which supports Python 3.

Supported file systems:

  • rdiff-backup supports all Unix-like file systems. FAT32, NTFS and HFS+ are supported too. As of today (July 2020), there are still problems with exFAT.
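The differences above are easiest to see in how each tool is driven; a rough sketch (paths and retention numbers are made-up examples):

```shell
# rsnapshot: retention levels are declared in /etc/rsnapshot.conf
# (fields must be tab-separated), e.g.
#   snapshot_root   /backup/snapshots/
#   retain          daily   7
#   retain          weekly  4
#   backup          /home/  localhost/
# then each cron-driven run rotates one level:
rsnapshot daily

# rdiff-backup: one command per run; the destination holds a plain
# mirror of the latest state plus compressed reverse deltas
rdiff-backup /home /backup/rdiff
rdiff-backup -r 10D /backup/rdiff/alice /tmp/alice-10-days-ago   # restore
rdiff-backup --remove-older-than 6M --force /backup/rdiff        # prune history
```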
ændrük
  • Do either support data deduplication? – intuited Feb 05 '11 at 21:48
  • So it sounds like rsnapshot is just generally better. – mlissner Apr 30 '11 at 06:26
  • librsync is not a Python library but a C library. It is based on the rsync algorithm and used by rdiff-backup directly from Python, so it doesn't have to call an external utility and parse the output as rsnapshot does. – Anthon Feb 21 '14 at 07:05
  • A huge pro of rdiff-backup is the accessibility of the files in the current backup, so you can abuse rdiff-backup as a file transfer tool. If you have two computers, you can back-up the Desktop directories to two folders on a (sufficiently large) USB stick, "Desktop A" and "Desktop B". To edit files on the other computer, you simply copy the file from the backup, and put it into the active Desktop folder. – user258532 Mar 30 '19 at 13:57

rsync

If you're familiar with command-line tools, you can use rsync to create (incremental) backups automatically. It can mirror your directories to other machines. There are lots of scripts available online showing how to do it. Set it up as a recurring task in your crontab. There is also a GUI frontend for rsync called Grsync that makes manual backups easier.

One very useful example is:

rsync -vahP --delete --backup-dir ../$(date --iso-8601=minutes) <source directory> <destination directory>

Among -vahP, the -a flag is important, as this preserves file permissions and recurses into subdirectories. --backup-dir stores changed and deleted files in the specified backup directory, which is conveniently named after the current date and time.

The idea below stores changed/deleted files with a suffix, which carries the current time/date:

rsync -vahP --delete --backup-dir ../backup --suffix .$(date --iso-8601=minutes) <source directory> <destination directory>

Though rsync is very fast and very versatile, only the most recent backup can be restored in an obvious way.

Another way to preserve deleted files would be using hard links.


Roalt

  • rsync is a useful tool, but it isn't great for backup. It doesn't keep historic versions. – Erigami Aug 19 '10 at 18:32
  • I've changed this to talk about rsnapshot, which is what I think the author was referring to. – 8128 Aug 19 '10 at 18:53
  • @fluteflute: No, I did not mean rsnapshot. So your changes completely changes the meaning of my post. I replaced rsnapshot by a link explaining a bit more about rsync using as a backup. – Roalt Aug 23 '10 at 11:00
  • I apologise sincerely. What would be the advantages of rsync over rsnapshot? (serious question, I hope it doesn't come across as aggressive) – 8128 Aug 23 '10 at 12:40
  • Using "cp --archive --link --verbose /MAKE_SNAPSHOT{,_date '+%Y-%m-%d'}/" and "rsync -avz --link-dest=../OLD_BACKUP_DIR SOURCE_DIR NEW_BACKUP_DIR" is just plain simple. rsnapshot adds some convenience, but maybe you don't need it. personal preference.. – webwurst Aug 23 '10 at 12:53
  • There is a GUI frontend for rsync called Grsync (http://www.opbyte.it/grsync/) that makes manual backups easier. I use it for making backups to my portable hard drive. – Dmitry Jun 11 '11 at 17:58

Duplicity

Duplicity is a feature-rich command line backup tool.

Duplicity backs up directories by producing encrypted tar-format volumes and uploading them to a remote or local file server. It uses librsync to record incremental changes to files, gzip to compress them, and gpg to encrypt them.

Duplicity's command line can be intimidating, but there are many frontends to duplicity, from command line (duply), to GNOME (deja-dup), to KDE (time-drive).
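To give a flavour of that command line, a minimal sketch (host name and paths are made-up; duplicity decides on its own whether a run is full or incremental):

```shell
# back up the home directory over SSH; the first run is a full backup,
# later runs are incremental (encryption passphrase via $PASSPHRASE or prompt)
duplicity ~ sftp://backupuser@backuphost//srv/backups/laptop

# see what the latest backup contains
duplicity list-current-files sftp://backupuser@backuphost//srv/backups/laptop

# restore a single file as it was three days ago
duplicity restore -t 3D --file-to-restore Documents/notes.txt \
    sftp://backupuser@backuphost//srv/backups/laptop /tmp/notes.txt
```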

vh1

  • There are also a number of GUI frontends to duplicity, such as Time Drive – Ryan C. Thompson Aug 25 '10 at 23:10
  • Time-Drive no longer has PPAs for current versions of Ubuntu (Precise) and source only seems to be available if you donate. This stopped me from evaluating it, and I now use 'duplicity' from the command line to do backups as root (as Deja-Dup doesn't handle root backups well) and can still use Deja-Dup's nice restore GUI options (from within Nautilus). – Chris Good Mar 29 '13 at 06:07
  • According to the duplicity website, it is still in beta. Not sure I'd recommend that anyone use beta software to backup or restore critical data, even if it's family photos. – bloudraak May 28 '13 at 04:05

Dropbox

A cross-platform (proprietary) cloud sync service for Windows, Mac, and Linux. 2GB of online storage is free, with paid options. Advertised as a way to "store, sync, and share files online", but it can be used for backup purposes too.

Note that even on paid accounts revision history is limited to one year and on free accounts it is only one month.

Note also that restoring a large number of files may be very time-consuming, as Dropbox was not built as a backup tool.

Dropbox in use on Ubuntu

Derek

luckyBackup

It's not been mentioned before, so I'll pitch in that luckyBackup is a superb GUI front end to rsync that makes taking simple or complex backups and clones a total breeze.


The all-important screenshots can be found on their website, with one shown below:

luckyBackup

Note: As of March 2024, the last release of luckyBackup was in November 2018.

Scaine
  • For me it is the most configurable option and includes an option to back up to a remote FAT32 partition (for those who have an old and poorly made NAS like me...). Wonderful! – desgua Jun 23 '11 at 16:15

BackupPC

If you want to back up your entire home network, I would recommend BackupPC running on an always-on server in your basement/closet/laundry room. From the backup server, it can connect via ssh, rsync, SMB, and other methods to any other computer (not just Linux computers), and back up all of them to the server. It implements incremental storage by merging identical files via hard links, even if the identical files were backed up from separate computers.

BackupPC runs a web interface that you can use to customize it, including adding new computers to be backed up, initiating immediate backups, and most importantly, restoring single files or entire folders. If the BackupPC server has write permissions to the computer that you are restoring to, it can restore the files directly to where they were, which is really nice.

BackupPC Web Interface - Server Status Page

karel

bup

A "highly efficient file backup system based on the git packfile format. Capable of doing fast incremental backups of virtual machine images."

Highlights:

  • It uses a rolling checksum algorithm (similar to rsync) to split large files into chunks. The most useful result of this is you can backup huge virtual machine (VM) disk images, databases, and XML files incrementally, even though they're typically all in one huge file, and not use tons of disk space for multiple versions.
  • Data is "automagically" shared between incremental backups without having to know which backup is based on which other one - even if the backups are made from two different computers that don't even know about each other. You just tell bup to back stuff up, and it saves only the minimum amount of data needed.
  • Bup can use "par2" redundancy to recover corrupted backups even if your disk has undetected bad sectors.
  • You can mount your bup repository as a FUSE filesystem and access the content that way, and even export it over Samba.
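Basic use follows a git-like index-then-save workflow; a minimal sketch (repository path and save name are made-up):

```shell
export BUP_DIR=/backup/bup-repo
bup init                       # create the repository (a git packfile store)
bup index ~/Documents          # scan for new and changed files
bup save -n docs ~/Documents   # store a deduplicated snapshot under the name "docs"
bup ls docs                    # list what has been saved under that name
bup fuse /mnt/bup              # browse the whole history as a filesystem
```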
ændrük
  • Some nice features, for sure. But note that so far it doesn't save file metadata (ownership, permissions, dates) and that you can't delete old backups so it eventually runs out of space. See a review: Git-based backup with bup -LWN.net and the README: apenwarr/bup - GitHub – nealmcb Jul 01 '11 at 20:28
  • Now metadata seems to be supported, see https://github.com/apenwarr/bup: 'bup save' and 'bup restore' have immature metadata support.

    On the plus side, they actually do have support now, but it's new, and not remotely as well tested as tar/rsync/whatever's. If you'd like to help test, please do (see t/compare-trees for one comparison method).

    – student Mar 20 '13 at 18:22
  • Bup is not available on Ubuntu 20.04 (and other recent Debian-based distros), apparently because it uses Python 2, which is no longer supported. – Matthew Mar 13 '21 at 08:02

CrashPlan

CrashPlan is a company providing business backup, with no plans for individual users.

Features

  • $10/month/device fee
  • Triple destination data storage and protection
  • Silent and continuous
  • Generous retention and versioning
  • Deleted file protection

I had considered a bunch of options and configurations (using rdiff-backup, duplicity, backup-ninja, amazon s3, remote server). What it finally came down to was simplicity.

CrashPlan is cross platform, but not open source.

It's also worth noting that with a (paid) CrashPlan Central 'family' plan you can backup all the computers you own.

Tim Lytle
  • CrashPlan could be good, but is insanely slow to backup. – Goddard Oct 21 '16 at 21:20
  • Do note that Crashplan is stopping their service to non-enterprise customers: https://www.crashplan.com/en-us/consumer/nextsteps/ – Ours Aug 28 '17 at 17:23

Bacula

I used Bacula a long time ago. Although you would have to learn its architecture, it's a very powerful solution. It lets you do backups over a network and it's multi-platform. You can read here about all the cool things it has, and here about the GUI programs that you can use for it. I deployed it at my university. When I was looking for backup solutions I also came across Amanda.

One good thing about Bacula is that it uses its own implementation for the files it creates. This makes it independent from a native utility's particular implementation (e.g. tar, dump...).

When I used it there weren't any GUIs yet. Therefore, I can't say if the available ones are complete and easy to use.

Bacula is very modular at its core. It consists of 3 configurable, stand-alone daemons:

  • file daemon (takes care of actually collecting files and their metadata in a cross-platform way)
  • storage daemon (takes care of storing the data - be it HDD, DVDs, tapes, etc.)
  • director daemon (takes care of scheduling backups and central configuration)

There is also an SQL database involved for storing metadata about Bacula and backups (Postgres, MySQL, and SQLite are supported).

The bconsole binary is shipped with Bacula and provides a CLI for Bacula administration.
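A bconsole session for day-to-day administration looks roughly like this (job and client names are made-up examples):

```shell
bconsole   # connects to the director daemon; the rest is typed at its * prompt:
# * status dir                            # what the director is doing and will do next
# * run job=BackupHome client=laptop-fd   # kick off a backup immediately
# * list jobs                             # job history from the catalog database
# * restore client=laptop-fd              # interactive file selection and restore
```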

alxlenc

Simple Backup

Note: As of 2021-01, the last release was in 2013.

Simple Backup is another tool to back up your files and keep a revision history. It is quite efficient (with full and incremental backups) and does not take up too much disk space for redundant data. So you can have historical revisions of files à la Time Machine (a feature Back in Time - mentioned earlier - also offers).

Features:

  • easy to set-up with already pre-defined backup strategies
  • external hard disk backup support
  • remote backup via SSH or FTP
  • revision history
  • clever auto-purging
  • easy scheduling
  • user- and/or system-level backups


As you can see the feature set is similar to the one offered by Back in time.

Simple Backup fits well in the Gnome and Ubuntu Desktop environment.

Huygens
  • Simple backup has failed for me multiple times, one time resulting in some pretty upsetting data loss. I would not recommend it. – Alex Launi Nov 01 '10 at 03:16
  • @Alex I'm interested... I use back in time, but I had tried Simple Backup before. I chose the first because I can browse the backups. Could you be more specific about the problem encountered? Just out of curiosity. – Huygens Nov 01 '10 at 21:57
  • The tarball it created had tons of invalid data in it, leaving it unextractable. This happened more than once. – Alex Launi Nov 02 '10 at 15:17
  • I would not recommend this tool; it's very hard to use it as root (by default it will save everything in your home directory, meaning that a bad rm command will purge everything), it keeps generating bad compressed files (though it gives a warning), and the GUI is not as nice as that of back in time. – user2413 Nov 08 '10 at 13:00
  • @kwak: I did not understand you. Why do you find it hard to use? Who would do a bad rm command? Why would it "purge" everything (as you've supposedly backed up the files)? Do you mean that you have unextractable archives like Alex? – Huygens Nov 08 '10 at 22:29
  • @Huygens:> Sorry for my poorly worded comment. My experience is that, by default, the current version of sbackup does not save the backups in a root-protected directory. If you do not change the default, your backups will obviously not survive a bad rm command. This second point is not related to Alex's point on bad tar.gz's and is linked to the choice of default behavior of sbackup, not to its intrinsic qualities. – user2413 Nov 09 '10 at 16:53
  • I've used sbackup for the past year or two for local backups. Never had any problems, unless I ran low on disk space, but it can send out an email with the status, so that hasn't been a problem. I like that it can save to a standard gzip file (or other options), and I don't even notice when it's running. Also, it has a nice default for archiving old backups. I really think that nobody should assume a backup program will choose the correct location to back up, and backing up to the same drive is not a good idea, anyway. But how can it choose where to backup without instructions? – Marty Fried Jan 25 '12 at 22:03

tar

tar, a simple and reliable tool for archiving files, can also be used for backups. But today, we have better and faster backup tools with more useful features. Depending on your needs, tar can still be useful.

Create a full backup of your home directory:

cd to the directory where you want to store the backup file, and then:

tar --create --verbose --file backup.tar <path to the home directory>

For subsequent backups, we want to avoid a full backup - because it takes too much time. So we simply update the files in backup.tar:

Again, cd to the directory where the backup file is, and then use --update:

tar --update --verbose --file backup.tar <path to the home directory>

All files that are either new or have been modified will be saved in backup.tar. Deleted files will be kept. To restore the most recent backup, right-click on the file and choose "Extract to...". To retrieve older versions of your files, you have to open backup.tar, and find the files (and versions) you want to restore.

Note: You cannot use --update on a compressed tar file (e.g. .tar.gz).
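Because --update appends new versions rather than replacing old ones, the archive can contain several copies of the same file, and GNU tar can address them individually (the file name below is just an example):

```shell
# list every stored version of one file (one line per occurrence)
tar --list --verbose --file backup.tar home/username/notes.txt

# extract only the first (oldest) occurrence instead of the final one
tar --extract --occurrence=1 --file backup.tar home/username/notes.txt
```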


DAR

DAR - the Disk ARchive program - is a powerful command line backup tool supporting incremental backups and restores. If you want to back up a lot of files, it may be considerably faster than rsync-like (rolling checksum) solutions.
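A basic usage sketch (archive base names and paths are made-up; dar writes archives as a base name plus slice numbers, e.g. full.1.dar):

```shell
dar -c /backup/full -R /home/username                     # full backup of the home directory
dar -c /backup/incr1 -R /home/username -A /backup/full    # incremental, referencing the full one
dar -x /backup/full -R /tmp/restore                       # restore into a scratch directory
```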

maxschlepzig

Attic Backup / Borg Backup

Note: As of 2021-01, the last release was in 2015.

Attic is a deduplicating backup program written in Python. The main goal of Attic is to provide an efficient and secure way to backup data. The data deduplication technique used makes Attic suitable for daily backups since only the changes are stored.

Main Features:

  • Easy to use
  • Space efficient storage: Variable block size deduplication is used to reduce the number of bytes stored by detecting redundant data.
  • Optional data encryption: All data can be protected using 256-bit AES encryption and data integrity and authenticity is verified using HMAC-SHA256.
  • Off-site backups: Attic can store data on any remote host accessible over SSH
  • Backups mountable as filesystems: Backup archives are mountable as userspace filesystems for easy backup verification and restores.

Requirements:

Attic requires Python >=3.2. Besides Python, Attic also requires msgpack-python and OpenSSL (>= 1.0.0). In order to mount archives as filesystems, llfuse is required.

Note:

There is also now a fork of Attic called Borg.
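Basic usage is repository-oriented, and the Borg fork keeps nearly the same commands; a minimal sketch (paths and archive names are made-up):

```shell
attic init /backup/attic-repo                         # create the deduplicated repository
attic create /backup/attic-repo::monday ~/Documents   # store an archive named "monday"
attic list /backup/attic-repo                         # list archives in the repository
attic mount /backup/attic-repo::monday /mnt/attic     # browse one archive via FUSE
```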

rcs

Spideroak

A Dropbox-like backup/syncing service with comparable features.

  • Access all your data in one de-duplicated location
  • Configurable multi-platform synchronization
  • Preserve all historical versions & deleted files
  • Share folders instantly in web ShareRooms w/ RSS
  • Retrieve files from any internet-connected device
  • Comprehensive 'zero-knowledge' data encryption

Listed supported systems: Debian Lenny, OpenSUSE, RPM-Based (Fedora, etc.), CentOS/RHEL, Ubuntu Lucid Lynx, Ubuntu Gutsy Gibbon, Ubuntu Karmic Koala, Ubuntu Maverick Meerkat, Ubuntu Intrepid Ibex, Debian Etch, Ubuntu Hardy Heron, Slackware 12.1, Ubuntu Jaunty Jackalope

More info at https://spideroak.com

Derek

  • Note that there's no automatic way to delete old backups. Thus, unless you're fond of manually hunting through their clunky UI, there'll be no end to the amount of space required. SpiderOak says that you should never need to delete old backups thanks to their deduplication. I disagree. Also, SpiderOak omits symlinks, claiming that they're complicated to handle due to the possibility of symlink loops. – Scott Severance May 29 '12 at 10:33
  • This really isn't a backup tool. I used SpiderOak in 2009 and it failed in multiple ways: failed to backup whole directory trees, never finished syncing properly, and I couldn't recover much of the data it did back up. Don't depend on SpiderOak for backup or sync is my view - even if they have fixed these bugs the architecture is still syncing all files to all PCs, and simply not suitable for backup. – RichVel Nov 01 '12 at 12:19
  • as mentioned for dropbox: backup and syncing are two different tasks! – DJCrashdummy Jun 18 '17 at 19:39
  • I previously recommended this tool, but it can go on backing things up in a cache directory for FOREVER and never uploading anything and the user remains unaware. When you finally need the files you will find none of them uploaded and even though the data is in the cache directory you can't do anything about it. it is useless. – Goddard May 16 '19 at 18:12

FlyBack

Warning: Unmaintained, last update in 2010.

Similar to Back in Time

Apple's Time Machine is a great feature in their OS, and Linux has almost all of the required technology already built in to recreate it. This is a simple GUI to make it easy to use.

FlyBack v0.4.0

Derek

  • Note that this software is not actively maintained: its last update was in 2010 (that's what I call back in time). – Jealie Jul 21 '15 at 17:23

Areca Backup

Warning: Unmaintained, last release in 2015.

Areca Backup is also a very decent GPL program for making backups easily.

Features

  • Archives compression (Zip & Zip64 format)
  • Archives encryption (AES128 & AES256 encryption algorithms)
  • Storage on local hard drive, network drive, USB key, FTP / FTPs server (with implicit and explicit SSL / TLS)
  • Source file filters (by extension, subdirectory, regular expression, size, date, status, with AND/OR/NOT logical operators)
  • Incremental, differential and full backup support
  • Support for delta backup (store only modified parts of your files)
  • Archive merges: you can merge contiguous archives into one single archive to save storage space.
  • As-of-date recovery: Areca allows you to recover your archives (or single files) as of a specific date.
  • Transaction mechanism: all critical processes (such as backups or merges) are transactional. This guarantees your backups' integrity.
  • Backup reports: Areca generates backup reports that can be stored on your disk or sent by email.
  • Post-backup scripts: Areca can launch shell scripts after backup.
  • Files permissions, symbolic links and named pipes can be stored and recovered. (Linux only)
AndyB

Jungledisk (paid application)

Jungledisk is a winner as far as I'm concerned. It backs up remotely to an optionally-encrypted Amazon S3 bucket, it's customisable, and it can run in the background (there are various guides available for setting that up). There's a decent UI, or you can hack an XML file if you're feeling so inclined.

I backup all of my home machines with the same account, no problem. I also can remotely access my backed-up data via myjungledisk.com .

It's not free, but in US terms it's certainly cheap enough (I pay around $8 a month). I feel that's more than acceptable for an offsite backup where someone else deals with hardware, (physical) security, and other such issues.

I can't recommend it enough.

nwahmaet
  • I've been using this one for years, and I agree. This is a very good product, and one bonus for me is that it is cross platform. You can use the same product across all platforms you use, be it Linux, Mac or Windows. – sbrattla Oct 04 '15 at 19:19
  • The big "$4" with small "As Jungle Disk is designed for 2-250 employee businesses each customer account is subject to a minimum monthly charge of $8 per month." below is a very discouraging start. – reducing activity Aug 07 '18 at 05:58

I run a custom Python script which uses rsync to save my home folder (less trash etc.) onto a folder labelled "current" on a separate backup HDD (connected by USB), and then the copy (cp) command to copy everything from "current" onto a date-time stamped folder, also on the same HDD.

The beautiful thing is that each snapshot has every file in your home folder as it was at that time, and yet the HDD doesn't just fill up unnecessarily. Because most files never change, there is only ever one actual copy of those files on the HDD; every other reference to it is a hard link. And if a newer version of a file is added to "current", all the snapshots pointing to the older version still share the single copy of that original, while "current" gets the new one. The file system's hard-link mechanism takes care of that by itself.

Although there are all sorts of refinements in the script, the main commands are simple. Here are a few of the key ingredients:

import subprocess

exclusion_path = "/home/.../exclusions.txt"  # don't back up trash etc
media_path = "/media/..."  # a long path with the HDD details and the "current" folder
# mirror the home folder into "current" on the backup HDD
subprocess.run(["rsync", "-avv", "--progress", "--delete",
                "--exclude-from=" + exclusion_path,
                "/home/username/", media_path], check=True)
current = "..."  # the "current" folder on the HDD
dest = "..."     # the timestamped folder on the HDD
# hard-link "current" into the date-time stamped snapshot folder
subprocess.run(["cp", "-alv", current, dest], check=True)

I had some custom needs as well. Because I have multiple massive (e.g. 60GB) VirtualBox disk images, I only ever wish to have one copy of those, not snapshot versions. Even a 1 or 2 TB HDD has limits.

Here are the contents of my exclusions file. The format is very sensitive to missing trailing slashes etc.:

/.local/share/Trash/
/.thumbnails/
/.cache/
/Examples/
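The space saving behind this approach comes from hard links, as created by cp -al: two directory entries can point at one inode, so a snapshot of an unchanged file costs no extra disk space. A minimal Python sketch of the same idea (directory and file names are made up for illustration):

```python
# Demonstrates the hard-link trick that "cp -al" relies on:
# snapshots share a single on-disk copy of each unchanged file.
import os
import tempfile

root = tempfile.mkdtemp()
current = os.path.join(root, "current")
snapshot = os.path.join(root, "2024-01-01T12.00")
os.makedirs(current)
os.makedirs(snapshot)

# One real file in "current"
src = os.path.join(current, "notes.txt")
with open(src, "w") as f:
    f.write("unchanged data")

# "cp -al" would create a hard link in the snapshot instead of a copy
os.link(src, os.path.join(snapshot, "notes.txt"))

# Both names point at the same inode: no extra disk space is used
same_inode = os.stat(src).st_ino == os.stat(os.path.join(snapshot, "notes.txt")).st_ino
print(same_inode, os.stat(src).st_nlink)  # True 2
```

If rsync later replaces `notes.txt` in "current" with a new file, the snapshot's link keeps holding the old data, which is exactly why each snapshot stays a complete picture of that moment.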
  • 2
    A tool that does something very similar for you (always having complete snapshots, using hard links to not waste disk space) is rsnapshot -- maybe you should give it a try – Marcel Stimberg Sep 02 '10 at 09:08
7

BorgBackup is a CLI tool that, with Vorta as its GUI, does everything you need and more. There is even a PPA for BorgBackup itself.

The main difference between BorgBackup and most other backup solutions is that it is a deduplicating backup program:

e.g., if you have multiple copies of a single file, that file takes up space only once.

  1. Install BorgBackup:

    sudo add-apt-repository ppa:costamagnagianfranco/borgbackup
    sudo apt update
    sudo apt install borgbackup
    
  2. Install Vorta:

    pip install vorta
    
  3. Initialise the backup repository on your external drive:

    borg init --encryption=repokey-blake2 /media/ExternalHDD/{user}
    
  4. Click the Vorta icon to open the GUI and configure your backups.

Fabby
  • 34,259
6

Dirvish

Note: As of 2021-01 the last release was in 2005.

A nice command-line snapshot backup tool which uses hard links to save disk space. It has a sophisticated scheme for purging expired backups.

student
  • 2,312
6

Duplicati

An open-source, gratis backup application that runs on Linux with a GUI and "securely stores encrypted, incremental, compressed backups on cloud storage services and remote file servers. It works with Amazon S3, Windows Live SkyDrive, Google Drive (Google Docs), Rackspace Cloud Files or WebDAV, SSH, FTP (and many more)".

Version 1.0 is considered stable; there is a version 2 in development with considerable internal changes that is currently working (though I wouldn't use it for production). There are standard or custom filter rules to select files to back up.

I have been using it for years, although infrequently, on both a Windows laptop and my Ubuntu 14.04 install. (Speaking as a developer: I'm not connected to anyone there, but I have considered looking at the API to add a backend.)

A fork of duplicity.

Breezer
  • 105
4

s3ql is a more recent option for using Amazon S3, Google Storage or OpenStack Storage as a file system. It works on a variety of Linux distros as well as Mac OS X.

Using it with rsync, you can get very efficient incremental offsite backups since it provides storage and bandwidth efficiency via block-level deduplication and compression. It also supports privacy via client-side encryption, and some other fancy things like copy-on-write, immutable trees and snapshotting.

See Comparison of S3QL and other S3 file systems for comparisons with PersistentFS, S3FS, S3FSLite, SubCloud, S3Backer and ElasticDrive.

I've been using it for a few days, starting from s3_backup.sh, (which uses rsync) and am quite happy. It is very well documented and seems like a solid project.

nealmcb
  • 3,647
4

TimeVault

Warning: unmaintained

TimeVault is a tool to make snapshots of folders and comes with Nautilus integration. Snapshots are protected from accidental deletion or modification since they are read-only by default.

Can be downloaded from Launchpad.

papukaija
  • 2,425
4

PING is a no-nonsense free backup tool that lets you make backups of entire partitions. It is a standalone utility that should be burned to a CD.

What I like about this program is that it copies the entire partition. Imagine this: while modifying your Ubuntu as a superuser, you changed a vital part and Ubuntu won't start up anymore.

You could format the hard disk and reinstall Ubuntu. While backup solutions such as Dropbox, Ubuntu One etc. might be useful for retrieving your important files, they won't restore your wallpaper, Unity icons and the other tweaks that made your Ubuntu the way you liked it.

Another option is to ask for help on the internet. But why not just restore the whole system to the way it was a few days ago? PING will do exactly this for you.

Pros:

  • Will not only backup documents, but system files as well
  • It's easy to use
  • It is possible to backup other (non-Linux) partitions as well
  • It will compress the backup in gzip or bzip2 format, saving disk space

Cons:

  • The PC will have to be restarted before you can make a backup
  • PING backs up an entire partition, even when only a few files have been modified
  • You'll need an external hard drive or some free space on your PC for your backups

An excellent Dutch manual can be found here.

3

Obnam

Warning: Software is no longer maintained, authors recommend not using it

'Obnam is an easy, secure backup program. Backups can be stored on local hard disks, or online via the SSH SFTP protocol. The backup server, if used, does not require any special software, on top of SSH.

Some features that may interest you:

  • Snapshot backups. Every generation looks like a complete snapshot, so you don't need to care about full versus incremental backups, or rotate real or virtual tapes.
  • Data de-duplication, across files, and backup generations. If the backup repository already contains a particular chunk of data, it will be re-used, even if it was in another file in an older backup generation. This way, you don't need to worry about moving around large files, or modifying them.
  • Encrypted backups, using GnuPG.'

An old version can be found in the Ubuntu software sources; for the newest version refer to Chris Cormack's PPA or Obnam's website.

shaddow
  • 400
3

inosync

A Python script that offers a more-or-less real-time backup capability.

Note that this software is not maintained anymore.

"I came across a reference to the “inotify” feature that is present in recent Linux kernels. Inotify monitors disk activity and, in particular, flags when files are written to disk or deleted. A little more searching located a package that combines inotify's file event monitoring with the rsync file synchronization utility in order to provide the real-time file backup capability that I was seeking. The software, named inosync, is actually a Python script, effectively provided as open-source code, by the author, Benedikt Böhm from Germany (http://bb.xnull.de/)."

http://www.opcug.ca/public/Reviews/linux_part16.htm

CentaurusA
  • 2,672
1

I recommend Timeshift for Ubuntu users who want to back up an entire partition.

Timeshift is a system restore utility which takes snapshots of the system at regular intervals. These snapshots can be restored at a later date to undo system changes. Timeshift creates incremental snapshots using rsync or BTRFS snapshots using BTRFS tools.

Timeshift can be installed from the default Ubuntu repositories in Ubuntu 20.04 and later with the following command.

sudo apt install timeshift

In Ubuntu 18.04 Timeshift is available from ppa:teejee2008/timeshift.

To get started using Timeshift read this tutorial: How to Use Timeshift to Backup and Restore Ubuntu Linux.

If you get the bug All snap apps doesn't launch after Timeshift system restore, delete all traces of snapd with sudo apt autoremove --purge snapd and reinstall the snaps that couldn't launch previously. In this scenario typically 2 or 3 snap packages need to be reinstalled.

karel
  • 114,770
1

saybackup and saypurge

There is a nice script called saybackup which allows you to do simple incremental backups using hard links. From the man page:

This script creates full or reverse incremental backups using the rsync(1) command. Backup directory names contain the date and time of each backup run to allow sorting and selective pruning. At the end of each successful backup run, a symlink '*-current' is updated to always point at the latest backup. To reduce remote file transfers, the '-L' option can be used (possibly multiple times) to specify existing local file trees from which files will be hard-linked into the backup.

The corresponding script saypurge provides a clever way to purge old backups. From the home page of the tool:

Sayepurge parses the timestamps from the names of this set of backup directories, computes the time deltas, and determines good deletion candidates so that backups are spaced out over time most evenly. The exact behavior can be tuned by specifying the number of recent files to guard against deletion (-g), the number of historic backups to keep around (-k) and the maximum number of deletions for any given run (-d). In the above set of files, the two backups from 2011-07-07 are only 6h apart, so they make good purging candidates...
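The spacing-out idea can be illustrated with a short Python sketch (purge_candidate is a made-up helper for illustration, not saypurge's actual code, and it omits the -g/-k/-d tuning options):

```python
# Toy model of the purge idea: the best deletion candidate is the
# backup closest in time to its predecessor, so the surviving
# backups stay spaced out over time as evenly as possible.
from datetime import datetime

backups = [
    datetime(2011, 7, 1, 6, 0),
    datetime(2011, 7, 4, 6, 0),
    datetime(2011, 7, 7, 6, 0),
    datetime(2011, 7, 7, 12, 0),  # only 6 h after the previous backup
]

def purge_candidate(stamps):
    """Return the backup whose gap to its predecessor is smallest."""
    stamps = sorted(stamps)
    # pair each backup (except the oldest) with the gap before it
    gaps = [(newer - older, newer) for older, newer in zip(stamps, stamps[1:])]
    return min(gaps)[1]

# The two backups from 2011-07-07 are only 6 h apart,
# so the newer of that pair is the best purging candidate.
print(purge_candidate(backups))  # 2011-07-07 12:00:00
```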

student
  • 2,312
1

backup2l

Warning: unmaintained, last commit on 2017-02-14

From the homepage:

backup2l is a lightweight command line tool for generating, maintaining and restoring backups on a mountable file system (e. g. hard disk). The main design goals are low maintenance effort, efficiency, transparency and robustness. In a default installation, backups are created autonomously by a cron script.

backup2l supports hierarchical differential backups with a user-specified number of levels and backups per level. With this scheme, the total number of archives that have to be stored only increases logarithmically with the number of differential backups since the last full backup. Hence, small incremental backups can be generated at short intervals while time- and space-consuming full backups are only sparsely needed.

The restore function allows to easily restore the state of the file system or arbitrary directories/files of previous points in time. The ownership and permission attributes of files and directories are correctly restored.

An integrated split-and-collect function allows to comfortably transfer all or selected archives to a set of CDs or other removable media.

All control files are stored together with the archives on the backup device, and their contents are mostly self-explaining. Hence, in the case of an emergency, a user does not only have to rely on the restore functionality of backup2l, but can - if necessary - browse the files and extract archives manually.

For deciding whether a file is new or modified, backup2l looks at its name, modification time, size, ownership and permissions. Unlike other backup tools, the i-node is not considered in order to avoid problems with non-Unix file systems like FAT32.

student
  • 2,312
0

zpaq

zpaq is a file archiver that grew out of the PAQ series of file-compression tools. It can be used for incremental backups, and you can revert to any previous file version. By default, the add command only adds files with a newer date or a changed file size. By default, zpaq cuts files into blocks of roughly 64 kilobytes and stores only the blocks that have not yet been encountered by the program.

As backup software, zpaq has a disadvantage: it does not allow you to delete old backups. On the other hand, once the initial backup has been done, incremental backups need very little space and are very fast.

The -key option encrypts the backup with AES-256.
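The block-level deduplication described above can be sketched in a few lines of Python (a toy model for illustration only, not zpaq's actual on-disk format):

```python
# Toy model of block-level deduplication: files are cut into
# fixed-size blocks, and each unique block is stored only once.
import hashlib

BLOCK_SIZE = 64 * 1024  # zpaq's blocks are roughly 64 KB

store = {}  # block digest -> block data (each unique block kept once)

def add_file(data: bytes) -> list:
    """Split data into blocks; store only blocks not seen before.
    Returns the file's "recipe": the ordered list of block digests."""
    recipe = []
    for i in range(0, len(data), BLOCK_SIZE):
        block = data[i:i + BLOCK_SIZE]
        digest = hashlib.sha256(block).hexdigest()
        store.setdefault(digest, block)  # no-op if block already stored
        recipe.append(digest)
    return recipe

file_a = b"x" * BLOCK_SIZE + b"y" * BLOCK_SIZE
file_b = b"x" * BLOCK_SIZE + b"z" * BLOCK_SIZE  # first block same as file_a's

add_file(file_a)
add_file(file_b)
print(len(store))  # 3 -- the shared "x" block is stored only once
```

This is why adding a second copy of a file, or a slightly modified version, costs almost no extra archive space.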

Backup

zpaq add backup.zpaq <path to the directory you want to back up>

Using the most extreme (and slowest) compression (default is 1):

zpaq add backup.zpaq <path to the directory you want to back up> -method 5

Using -method 0 does not compress any data. For backups, -method 1 is recommended, although -method 2 is nearly as fast.

List the files in your most recent backup

zpaq list backup.zpaq

List contents of second backup

zpaq list backup.zpaq -until 2

List all versions of all files

zpaq list backup.zpaq -all

The version is indicated by a four-digit number, starting with 0001. (Additional digits are added as needed.)

Extract the most recent backup

zpaq extract backup.zpaq <destination>

Extract the second version of your backup

zpaq extract backup.zpaq <destination> -until 2

Extract all versions of all files that have "diary" in their names

zpaq extract backup.zpaq -only "*diary*" -all

The file versions will be saved in different folders - 0001 for the first version, 0002 for the second, et cetera.

user258532
  • 1,258
  • 2
  • 16
  • 25
0

A newer option is Kopia: https://github.com/kopia/kopia

Kopia is a simple, cross-platform tool for managing encrypted backups in the cloud or locally. It provides fast, incremental backups; secure, client-side end-to-end encryption; compression; and data deduplication, and it can do automatic backups.

https://kopia.io

After comparing Kopia to restic, Duplicati, duplicacy, duplicity, bup, borg and others, I decided to use Kopia primarily, but I also run restic, Duplicati and bup at the same time, so I have four types of backup; if one fails I still have three.

Install method:

GUI: Linux (portable AppImage), Windows (installer exe)

Command line: a single self-contained portable Go executable


0

fwbackups

installation:

Download

sudo apt-get install gettext autotools-dev intltool python-crypto python-paramiko python-gtk2 python-glade2 python-notify cron

tar xfj fwbackups-VERSION.tar.bz2
cd fwbackups-VERSION
./configure --prefix=/usr
make
su -c "make install"

MandiYang
  • 2,019
0

faubackup

Another small tool which lets you do incremental backups with hard links is FauBackup.

From the homepage:

This Program uses a filesystem on a hard drive for incremental and full backups. All Backups can easily be accessed by standard filesystem tools (ls, find, grep, cp, ...)

Later Backups to the same filesystem will automatically be incremental, as unchanged files are only hard-linked with the existing version of the file.

It allows to create different levels of backups. From the man page:

FauBackup may be configured to keep certain backups for a long time and remove others. Have a look at traditional backup systems. You have tapes for daily, weekly, monthly and yearly backups, and store them according to your local backup policy. FauBackup can do this for you on harddisks, too. That is, it can keep some yearly, weekly, etc. backups for you and automatically remove other obsoleted backups.

Four different backup-types are recognized: daily, weekly, monthly and yearly. The first existing backup in such an interval will be considered belonging to the corresponding type. Thus, the first backup in a month (eg. 2000−12−01@06:30:00) will be a monthly backup; the first backup in 2001 will be of all four types, as January 1st, 2001 is a Monday.

The number of backups kept for each type is configurable (see faubackup.conf(5)). If a backup doesn't belong to such a type (eg. second backup in a day), or is too old for that type, it will be removed on faubackup --

student
  • 2,312
0

boxbackup

From the homepage:

Box Backup is an open source, completely automatic, on-line backup system. It has the following key features:

  • All backed up data is stored on the server in files on a filesystem - no tape, archive or other special devices are required.
  • The server is trusted only to make files available when they are required - all data is encrypted and can be decoded only by the original client. This makes it ideal for backing up over an untrusted network (such as the Internet), or where the server is in an uncontrolled environment.
  • A backup daemon runs on systems to be backed up, and copies encrypted data to the server when it notices changes - so backups are continuous and up-to-date (although traditional snapshot backups are possible too).
  • Only changes within files are sent to the server, just like rsync, minimising the bandwidth used between clients and server. This makes it particularly suitable for backing up between distant locations, or over the Internet.
  • It behaves like tape - old file versions and deleted files are available.
  • Old versions of files on the server are stored as changes from the current version, minimising the storage space required on the server. Files on the server are also compressed to minimise their size.
  • Choice of backup behaviour - it can be optimised for document or server backup.
  • It is designed to be easy and cheap to run a server. It has a portable implementation, and optional RAID implemented in userland for reliability without complex server setup or expensive hardware.

http://www.boxbackup.org/
student
  • 2,312
-1

rbackup

rbackup tries to combine the advantages of rdiff-backup and rsnapshot.

student
  • 2,312
  • 4
    This software hasn't seen an update in almost six years. Are you certain that it works on newer versions of Ubuntu? A more detailed answer would be helpful. – Kevin Bowen Mar 28 '13 at 10:24
-2

For the people that don't know, MEGA is a Dropbox alternative, with 50GB of free storage, available for Mac, Windows and Linux, created by Kim Dotcom.

Install

Download the MEGA Sync Client for Linux. Open a terminal in the directory where you downloaded the deb file, then run: sudo dpkg -i megasync-xUbuntu_14.04_amd64.deb. After that, start MEGA from the Dash; from then on it will start at login. Also note that the deb file adds a PPA to your sources list, meaning you will get future updates via Software Updater.

sudo add-apt-repository ppa:otto-kesselgulasch/mega
sudo apt-get update
sudo apt-get install megasync

Features

Here are some features that are touted by Mega:

  • Secure:

    • Your data is encrypted end to end. Nobody can intercept it while in storage or in transit.
  • Flexible:

    • Sync any folder from your PC to any folder in the cloud. Sync any number of folders in parallel.
  • Fast:

    • Take advantage of MEGA's high-powered infrastructure and multi-connection transfers.
  • Generous:

    • Store up to 50 GB for free!

Source: excerpt from a post of which I am the author.

As stated in other file-sharing-service answers, synchronisation is not backup (tl;dr: there's a risk of synchronising corrupted/deleted files, particularly if no file versioning is available). The key to decrypt the encrypted data at MEGA is secured and accessed by your account credentials (also stored remotely, but encrypted), so as long as you still have login access, or a user you shared the files with can log in, the files won't be lost unless the synchronised versions are overwritten by bad data.

Breezer
  • 105
blade19899
  • 26,704
  • 2
    For future references, MEGA/Dropbox/etc. is not backup solution but online storage with sync ... so that's a reason of downvotes (probably). – dmnc Sep 06 '19 at 14:00