
I have my app on a VPS (Ubuntu Server 18.04), and the host can do weekly backups of the whole system.

But this is not enough, so I want to do my own backups (besides, I have had the host's backup files fail in the past), but not of the whole system.

  1. I need to do incremental backups of multiple locations.
  2. The incremental backups should go into a folder/directory structure (all the locations in the same date folder, but still incremental):

    • folder_name__date1
    • folder_name__date2
    • folder_name__today
  3. Automate the process, so it runs daily
  4. Maintain a local backup of the folder/file structure
  5. Upload to a cloud (Dropbox, Box), but archive the backups with passwords before uploading

I read about rsync and cron, but I don't know:

  • how to do the increments using folders
  • how to archive/upload to the cloud
  • how to automate the whole process using bash

2 Answers


I also had to read some documentation to get ready to reach your goal, so let's start! Without knowing the type of data and the access frequency, you can use rsync to synchronize the folders you want to back up into a temporary directory, and then perform the incremental backup from there. I suggest tar for the incremental archive, 7zip for encryption, and rclone for the remote copy.

  1. Source tar: http://www.gnu.org/software/tar/manual/html_node/Incremental-Dumps.html
  2. Source rsync: https://download.samba.org/pub/rsync/rsync.html
  3. Source 7zip: https://linux.die.net/man/1/7z
  4. Source rclone dbox: https://rclone.org/dropbox/
  5. Source bash scripting: https://www.tldp.org/LDP/abs/html/

Here is how I would do this task. Suppose the folders to back up are /home/user and /home/user1, and the temporary directory is /media/Backup/folder_name__date1. I would start the script with two rsync jobs, one per folder, to obtain the following structure:

mkdir -pv /media/Backup/folder_name__$(date +%Y%m%d)

Backup/
└── folder_name__20180910
    ├── user
    └── user1

The rsync commands to achieve that are:

rsync -a /home/user /media/Backup/folder_name__20180910
rsync -a /home/user1 /media/Backup/folder_name__20180910

ATTENTION: if a source path ends with a /, rsync will not create a destination folder but will sync all the data straight into the root of the destination folder.
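To make the difference concrete, here is the same copy with and without the trailing slash (destination as used above):

rsync -a /home/user  /media/Backup/folder_name__20180910   # creates .../folder_name__20180910/user
rsync -a /home/user/ /media/Backup/folder_name__20180910   # copies the contents of user directly into .../folder_name__20180910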

Now you have an exact copy in a place where you are sure the data will not be accessed during the backup process. This is important to guarantee data quality and also to let tar work properly, since it bases the increments on the timestamps of the files. Let's create our archive:

tar --create \
           --file=/path/where/you/like/folder_name__20180910.tar \
           -C /media/Backup/ \
           --listed-incremental=/var/log/db.snar \
           folder_name__20180910
  • --create Tells tar to create an archive
  • --file The path where the tar archive will be saved
  • --listed-incremental The path of the snapshot database where tar looks to compute the delta of the data
  • -C Sets tar's working directory, so the archive does not contain the full source path; with this you will find folder_name__20180910 at the root of the tar
  • The last argument is the folder to add to the archive

IMPORTANT: the snapshot database tar creates has to remain the same in the future; if it changes, you will always get a full backup. Note also that tar tracks files by path, so a folder name it has never seen before is dumped in full the first time it appears.
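To restore later, extract each archive in order, oldest first. Per the GNU tar manual linked above, passing --listed-incremental=/dev/null at extraction time tells tar to handle the archive as an incremental dump without needing the snapshot database (/restore/target is just a placeholder for wherever you want the data back):

tar --extract \
           --file=/path/where/you/like/folder_name__20180910.tar \
           --listed-incremental=/dev/null \
           -C /restore/target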

Now we have our first backup! Let's encrypt it!

password=$(date | md5sum | cut -c1-15)
echo "${password}" > /root/folder_name__20180910.pwd && chmod 400 /root/folder_name__20180910.pwd
7z a -mhe=on -p"${password}" \
-t7z -m0=lzma -mx=9 -mfb=64 -md=32m -ms=on \
/path/where/you/like/folder_name__20180910.7z \
/path/where/you/like/folder_name__20180910.tar

Here we generate a random password from the md5sum of the date (you can do this in various ways), save the password in a file matching the archive name inside /root, and then create the compressed, encrypted archive.
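As noted, there are various ways to generate the password; a stronger alternative, assuming openssl is available (it is by default on Ubuntu Server), would be:

password=$(openssl rand -base64 24)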

Last step: copy it remotely to a cloud service. I assume you have read the documentation and configured the rclone remotes as remotecopydata for the data and, if you want to store the password off-site too, remotecopypassword:

rclone copy /path/where/you/like/folder_name__20180910.7z remotecopydata:Backup
rclone copy /root/folder_name__20180910.pwd remotecopypassword:Backup

BASH: to automate this in bash in a quick & dirty way, you could just adapt the commands in this post into a file (see the sketch below), and it should work! To add some control, read the bash documentation at tldp.org or search here; there are a lot of questions on how to do things in bash.
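A minimal sketch of such a script, combining the commands above (the paths, the remote names, and the /media/Backup staging area are the same assumptions used throughout this answer; adapt them to your setup):

#!/bin/bash
# Daily incremental backup: stage with rsync, archive with tar, encrypt with 7z, upload with rclone.
set -euo pipefail

DATE=$(date +%Y%m%d)
STAGING="/media/Backup/folder_name__${DATE}"
ARCHIVE_DIR="/path/where/you/like"      # where the .tar and .7z files are kept
SNAR="/var/log/db.snar"                 # tar snapshot database -- must never change between runs

# 1. Stage an exact copy of the source folders (no trailing slash: keep the folder names)
mkdir -pv "${STAGING}"
rsync -a /home/user  "${STAGING}"
rsync -a /home/user1 "${STAGING}"

# 2. Incremental tar archive
tar --create \
    --file="${ARCHIVE_DIR}/folder_name__${DATE}.tar" \
    -C /media/Backup/ \
    --listed-incremental="${SNAR}" \
    "folder_name__${DATE}"

# 3. Random password, stored root-only, then encrypt
password=$(date | md5sum | cut -c1-15)
echo "${password}" > "/root/folder_name__${DATE}.pwd"
chmod 400 "/root/folder_name__${DATE}.pwd"
7z a -mhe=on -p"${password}" \
    -t7z -m0=lzma -mx=9 -mfb=64 -md=32m -ms=on \
    "${ARCHIVE_DIR}/folder_name__${DATE}.7z" \
    "${ARCHIVE_DIR}/folder_name__${DATE}.tar"

# 4. Upload the encrypted archive and (optionally) the password file to separate remotes
rclone copy "${ARCHIVE_DIR}/folder_name__${DATE}.7z" remotecopydata:Backup
rclone copy "/root/folder_name__${DATE}.pwd" remotecopypassword:Backup

Remember to make it executable (chmod +x /path/to/your/script.sh) before pointing cron at it.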

To let script run automatically every day at midnight:

crontab -e 

and add:

MAILTO=amail@boxes.org
0 0 * * * /path/to/your/script.sh

The MAILTO= line makes all the output from the script get sent to the configured e-mail address. Add some checks to the script before running it automatically.

That's all, I hope you will find this post helpful!

AtomiX84

I suggest you install rsnapshot. In the configuration file, you can set the following:

backup /etc/ localhost/

would set /etc/ to be backed up into the localhost/ folder. You can also back up other directories and put them elsewhere (though I suggest you create a backup location for each host you plan to back up).

rsnapshot does incremental backups and saves space by using hard links. Thus each backup appears as a complete copy of the directory in question, without using additional space for unchanged files.
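As a sketch, the relevant parts of /etc/rsnapshot.conf might look like this (note that rsnapshot requires tab characters, not spaces, between fields; the retain names match the cron rules below):

snapshot_root	/media/Backup/rsnapshot/

# how many snapshots of each level to keep
retain	alpha	7
retain	beta	4
retain	gamma	6

# what to back up: source, then the folder inside each snapshot
backup	/etc/	localhost/
backup	/home/user/	localhost/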

In order to automate this, set a cronjob. Run sudo crontab -e and add rules like this:

0 4 * * *         /usr/bin/rsnapshot alpha
0 3 * * 3         /usr/bin/rsnapshot beta
0 2 1 * *         /usr/bin/rsnapshot gamma

This does an alpha backup daily, a beta backup once a week on Wednesdays, and a gamma backup once a month. Of course, set this however you want. The documentation has recommendations, including the order (within a day) of alpha, beta, and gamma. None of this (so far) requires bash scripting; rsnapshot and cron will handle everything here.

However, the last item on your list is encryption and uploading to the cloud. I honestly don't know how to do that -- sorry! If I backed up to Dropbox, I would have to continually remove files... something that rsnapshot already does.

I suppose you can use tar and gzip to archive and compress, and then gpg to encrypt (a sketch follows below). But if you have another computer or a NAS at home or work, I suggest you just copy your files to it. Or, as the configuration file (/etc/rsnapshot.conf) says, you can back up another computer remotely, i.e.:

backup root@example.com:/etc/ example.com/

There are options for you to mount (i.e., attach) an external drive, back up, and then unmount it, as well.
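Coming back to the tar + gzip + gpg idea, a minimal sketch (gpg --symmetric will prompt for a passphrase interactively; automate that part however suits you):

# archive, compress, and encrypt in one pipeline
tar -czf - /etc/ | gpg --symmetric --cipher-algo AES256 -o backup-20180910.tar.gz.gpg

# decrypt and unpack later
gpg --decrypt backup-20180910.tar.gz.gpg | tar -xzf -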

Anyway, I know this isn't everything you asked for, but I hope it helps.

Ray