I also had to read some documentation to get ready to reach your goal, so let's start!
Without knowing the type of data and the access frequency, you can use rsync to synchronize the folders you want to back up into a temporary directory, and then perform the incremental backup from there. I suggest using tar for the incremental archive, 7zip for compression and encryption, and rclone for the remote copy.
- source tar: http://www.gnu.org/software/tar/manual/html_node/Incremental-Dumps.html
- source rsync: https://download.samba.org/pub/rsync/rsync.html
- source 7zip: https://linux.die.net/man/1/7z
- source rclone (Dropbox): https://rclone.org/dropbox/
- source bash scripting: https://www.tldp.org/LDP/abs/html/
Here is what I would do to accomplish this task. Suppose the folders to back up are /home/user and /home/user1, and the temporary directory is /media/Backup/folder_name__date1.
I'll start the script with two rsync jobs, one per folder, to obtain the following structure:
mkdir -pv /media/Backup/folder_name__`date +%Y%m%d`
Backup/
└── folder_name__20180910
    ├── user
    └── user1
The rsync commands to do so are:
rsync -a /home/user /media/Backup/folder_name__20180910
rsync -a /home/user1 /media/Backup/folder_name__20180910
ATTENTION: if a source path ends with a /, rsync will not create a destination folder but will sync all the data directly into the root of the destination folder.
Now you have an exact copy in a place where you are sure the data will not be accessed during the backup process. This is important to guarantee data consistency and also to let tar work properly, since it bases the incremental backup on the files' timestamps.
Let's create our archive:
tar --create \
--file=/path/where/you/like/folder_name__20180910.tar \
-C /media/Backup/ \
--listed-incremental=/var/log/db.snar \
folder_name__20180910
- --create tells tar to create an archive
- --file indicates the path where the tar archive will be saved
- --listed-incremental is the path of the snapshot file (a small database) that tar consults to work out the delta since the last backup
- -C sets tar's working directory; this way the archive does not contain all the folders of the full path, and you will find folder_name__20180910 at the root of the tar
- the last argument is the folder to add to the archive
IMPORTANT: the snapshot file tar creates has to be reused unchanged in future runs; if it changes (or is deleted), you will always get a full backup instead of an incremental one.
Now we have our first backup! Let's encrypt it!
password=`date | md5sum | cut -c1-15`
echo ${password} > /root/folder_name__20180910.pwd && chmod 400 /root/folder_name__20180910.pwd
7z a -mhe=on -p`echo ${password}` \
-t7z -m0=lzma -mx=9 -mfb=64 -md=32m -ms=on \
/path/where/you/like/folder_name__20180910.7z \
/path/where/you/like/folder_name__20180910.tar
Here we generate a random password from the md5sum of the current date (you can do this in various ways), save the password to a file whose name matches the archive in the /root folder, and then create the compressed, encrypted archive.
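One of those "various ways" that is less guessable than hashing the current date is to draw the password directly from /dev/urandom. A minimal sketch (the temp file stands in for /root/folder_name__20180910.pwd so the demo does not need root):

```shell
set -eu
# 15 alphanumeric characters taken straight from the kernel's CSPRNG,
# instead of md5sum of `date` (which an attacker could guess).
password=$(tr -dc 'A-Za-z0-9' < /dev/urandom | head -c 15)

# In the real script this file would be /root/folder_name__20180910.pwd.
pwfile=$(mktemp)
printf '%s\n' "$password" > "$pwfile"
chmod 400 "$pwfile"
echo "generated a ${#password}-character password"
```

The rest of the pipeline is unchanged: pass `$password` to 7z with -p exactly as in the command above.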
The last step is to copy it to a remote cloud service. I assume you have read the documentation and configured the rclone remote as remotecopydata (and, if you want, remotecopypassword):
rclone copy /path/where/you/like/folder_name__20180910.7z remotecopydata:Backup
rclone copy /path/where/you/like/folder_name__20180910.7z remotecopypassword:Backup
BASH: to automate this in a quick & dirty way, you could just adapt the commands from this post into a script file and it should work! To add some error handling, read the bash documentation at tldp.org, or search here too; there are a lot of questions about how to do things in bash.
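Putting the commands from this post into one file could look like the sketch below. The paths and the remote name are the hypothetical ones used throughout this answer, so adapt them before use; since rsync, 7z, and rclone may not be installed everywhere, the sketch writes the script to a temp file and only syntax-checks it here:

```shell
set -eu
script=$(mktemp)
cat > "$script" <<'EOF'
#!/bin/bash
set -eu
DATE=$(date +%Y%m%d)
DEST="/media/Backup/folder_name__${DATE}"
ARCHIVE="/path/where/you/like/folder_name__${DATE}"

# 1. Stage a quiet copy of the data (no trailing slash on the sources).
mkdir -p "$DEST"
rsync -a /home/user  "$DEST"
rsync -a /home/user1 "$DEST"

# 2. Incremental tar; keep db.snar between runs or you get a full backup.
tar --create \
    --file="${ARCHIVE}.tar" \
    -C /media/Backup/ \
    --listed-incremental=/var/log/db.snar \
    "folder_name__${DATE}"

# 3. Random password, saved next to root only.
password=$(tr -dc 'A-Za-z0-9' < /dev/urandom | head -c 15)
printf '%s\n' "$password" > "/root/folder_name__${DATE}.pwd"
chmod 400 "/root/folder_name__${DATE}.pwd"

# 4. Compress and encrypt.
7z a -mhe=on -p"$password" \
    -t7z -m0=lzma -mx=9 -mfb=64 -md=32m -ms=on \
    "${ARCHIVE}.7z" "${ARCHIVE}.tar"

# 5. Ship it to the configured rclone remote.
rclone copy "${ARCHIVE}.7z" remotecopydata:Backup
EOF

bash -n "$script" && echo "syntax OK"
```

In your own copy you would of course run the script directly rather than just syntax-check it.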
To let the script run automatically every day at midnight:
crontab -e
and add:
MAILTO=amail@boxes.org
0 0 * * * /path/to/your/script.sh
The MAILTO= line makes all output from the script get sent to the configured e-mail address.
Add some error checking to the script before running it automatically.
That's all, I hope you will find this post helpful!