
I have two partitions:

  • /dev/sda is mounted as /
  • /dev/sdb is mounted as /stuff

I want to copy everything from / to /stuff/backup, but I also want the symbolic links to be updated so they point into /stuff/backup.

For example, if I have a symbolic link like so:

/path/to/some/link -> /path/to/the/real/file

Then after it is backed up it should look like this:

/stuff/backup/path/to/some/link -> /stuff/backup/path/to/the/real/file

I have googled a bit but can't seem to figure this out.
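One possible approach (a sketch only; `rewrite_links` is a hypothetical helper, not a standard tool): copy the tree with something that preserves symlinks as-is (e.g. `cp -a` or `rsync -a`), then rewrite every absolute link target so it falls under the new prefix:

```shell
# Hypothetical helper: after copying a tree to $dest, rewrite absolute
# symlink targets so they point under $dest instead of the original /.
rewrite_links() {
    dest=$1                      # e.g. /stuff/backup
    find "$dest" -type l | while read -r link; do
        target=$(readlink "$link")
        case "$target" in
            "$dest"/*) ;;                          # already under the prefix
            /*) ln -sfn "$dest$target" "$link" ;;  # absolute: prepend prefix
        esac
    done
}
```

Relative symlinks are left alone, which is usually what you want, since they already resolve correctly inside the copied tree.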

  • This is not really the best idea, what are you trying to do ? Backup ? See https://superuser.com/questions/138587/how-to-copy-symbolic-links – Panther Jul 15 '17 at 04:27
  • See this post. Mr Wong at bottom of answers. Is this what you might be looking for? I've had to do what he is suggesting, creating the destination directory first. – smjadtnf707 Jul 15 '17 at 13:17
  • @bodhi.zazen I want to make a viewable backup/copy of my Ubuntu install to a second HD so I can navigate it and view files and what not. – IMTheNachoMan Jul 16 '17 at 14:04
  • @bodhi.zazen that link doesn't help me figure out how to copy the entire / directory. – IMTheNachoMan Jul 16 '17 at 14:06
  • @burliway777 does that mean I need to copy each file/folder one at a time? i wanted to make a copy of the entire / folder. – IMTheNachoMan Jul 16 '17 at 14:09
  • I am not sure why you are doing this. You really can not copy all of / while it is mounted / running, you will need to boot a live CD. IMO this is a very inefficient method of backup and most if not all of / is available from apt (apt-get). You are best off booting a live cd and using clonzilla, dd, or tar. The link I gave you discusses several methods to preserve links, take your pick of which method you prefer. – Panther Jul 16 '17 at 16:15
  • @bodhi.zazen So I am going to re-install my OS from scratch (not restoring from backups). After I re-install I'll want to mimic a lot of settings/things from the current install so I'll need to be able to reference those files to read their settings and what not. What files I need to reference I won't know until I start the re-install. Not all the files are in /etc. So I thought if I could take a copy of the entire / to /mnt/data/backup then I could look at the files I need when I need it. – IMTheNachoMan Jul 16 '17 at 16:19
  • That method will result in a huge backup. What I personally do on servers is save a copy of any system files I edit in /root, mirroring /. So if I edit /etc/fstab for example, I save a copy of my new file in /root/etc/fstab. Then when I backup I only need to back up /home, /root, and any server data such as /var/www/html or mysql databases. – Panther Jul 16 '17 at 16:30
  • For what you want, IMO, tar is a better tool. cd / tar -cvpzf backup.tar.gz --exclude=/backup.tar.gz --exclude=/proc --exclude=/tmp --exclude=/mnt --exclude=/dev --exclude=/sys / . This excludes directories you do not want to copy, preserves links, and compresses the archive. – Panther Jul 16 '17 at 16:31
  • You can probably exclude other directories as well such as /var/log and /var/cache/apt/archives your downloaded packages or just tar /etc or wherever you edited files + /home + /root + /var/www/html . – Panther Jul 16 '17 at 16:34

1 Answer


From the discussion, IMO, tar is the best tool, as it preserves links. I also suggest doing this from a live CD: mount your Ubuntu root partition and cd into it.

If you are backing up a database, use the database tools, e.g. mysqldump.

From a running system (not live usb):

NOTE: Review these excludes before you run this command, see below for details.

Minimal excludes would be /proc, /tmp, /mnt, /dev, and /sys. The others are optional but will significantly reduce the size of your backup.

cd / # THIS CD IS IMPORTANT THE FOLLOWING LONG COMMAND IS RUN FROM /
tar -cvpzf backup.tar.gz \
--exclude=/backup.tar.gz \
--exclude=/proc \
--exclude=/tmp \
--exclude=/mnt \
--exclude=/dev \
--exclude=/sys \
--exclude=/run \
--exclude=/media \
--exclude=/var/log \
--exclude=/var/cache/apt/archives \
--exclude="/usr/src/linux-headers*" \
--exclude="/home/*/.gvfs" \
--exclude="/home/*/.cache" \
--exclude="/home/*/.local/share/Trash" /

This compresses the archive; it is completely portable and will be located at /backup.tar.gz

What the options mean (from https://help.ubuntu.com/community/BackupYourSystem/TAR):

tar - the command that creates the archive. It is modified by each letter immediately following it; each is explained below.

c - create a new backup archive.

v - verbose mode, tar will print what it's doing to the screen.

p - preserves the permissions of the files put in the archive for restoration later.

z - compress the backup file with 'gzip' to make it smaller.

f - specifies where to store the backup, backup.tar.gz is the filename used in this example. It will be stored in the current working directory, the one you set when you used the cd command.

The \ just continues the command on the next line; I added them for clarity.

The --exclude options should be self-evident; they exclude those directories from the archive.

You can use the --one-file-system option rather than all the excludes for /proc, /sys, /mnt, /media, /run, and /dev; however, if you have a separate /boot or /home (or other partition), you would need to add it to the archive explicitly.
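As a rough, safe-to-run illustration of the flag (the scratch paths under /tmp are my own; in real use you would run it from / and name any separate partitions such as /boot after the trailing /):

```shell
# Demo of --one-file-system on a scratch tree: tar stays on the filesystem
# of the paths it is given and will not descend into other mounts below them.
mkdir -p /tmp/ofs/src
echo "data" > /tmp/ofs/src/file.txt
tar -czf /tmp/ofs/backup.tar.gz --one-file-system -C /tmp/ofs src
tar -tzf /tmp/ofs/backup.tar.gz    # list what was archived
```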

You can also exclude directories you know you do not need (i.e. have no edits), such as /usr/share or similar.

To view the contents see How can I view the contents of tar.gz file without extracting from the command-line?

You can see the file contents with vim / gvim and list differences with zdiff
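A small self-contained demo of listing (-t) and extracting into another directory (-C), using a scratch tree (with the real archive you would list /backup.tar.gz and extract under e.g. /stuff/backup):

```shell
# Build a tiny archive, list it without extracting, then extract elsewhere.
mkdir -p /tmp/demo/src /tmp/demo/restore
echo "hello" > /tmp/demo/src/file.txt
tar -czf /tmp/demo/backup.tar.gz -C /tmp/demo src
tar -tzf /tmp/demo/backup.tar.gz               # list contents only
tar -xzf /tmp/demo/backup.tar.gz -C /tmp/demo/restore
```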

EDIT: From the comments "It bombs half way through showing: /lib/plymouth/themes/ /lib/plymouth/themes/ubuntu-text/ tar: /: file changed as we read it after 10 minutes on a mini-PCIe SSD (Sata II channel). This will take some time to fine tune"

This happens because the file system is in use: if changes are written to disk during the tar run, you will get these sorts of messages. This can be avoided by excluding as much as possible from the archive and/or by running tar from a live CD/USB.
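If you must run it on a live system, GNU tar exits with status 1 (a warning, not a hard failure) for "file changed as we read it", and that specific warning can be suppressed. A hedged sketch (the `tar_backup` wrapper and its name are my own):

```shell
# Wrap tar so GNU tar's exit status 1 ("file changed as we read it" and
# similar warnings) counts as success, while real errors (status 2) fail.
tar_backup() {
    src=$1; out=$2
    tar -cpzf "$out" --warning=no-file-changed -C "$src" .
    status=$?
    [ "$status" -le 1 ]    # 0 = clean, 1 = warnings only
}
```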

Also, from the comments, other candidates for exclusion are :

~/.cache # These files are completely unnecessary and in fact you can at any time recover disk space by deleting this directory.

/usr/src/linux-headers* # Again, a large amount of data you do not need.

~/.local/share/Trash # Review and delete trash or exclude this directory

/media # Will have, for example, Windows partitions

/var/run/user/$USER/... # A symbolic link into /run, so --exclude=/run as well. This will hold removable devices such as flash drives and Android phones, so it can probably be excluded.

Panther
  • This is also an excellent answer to the question "How to fully backup Ubuntu?" +1. The directory ~/.cache is large but I don't know what in it can be excluded. – WinEunuuchs2Unix Jul 16 '17 at 19:25
  • You can exclude all of ~/.cache . If you do so, use the full path --exclude=/home/your_user/.cache – Panther Jul 16 '17 at 19:28
  • I would exclude that directory also. As you can see, this method of backup give you a very large size. It is hard to write one command that works for everyone. Personally on workstations I only back up /home and on servers if I edit a config file I save a copy in /root and server backup is /root , /home , and server data such as mysql , /var/www/html, and similar and any shared user data. Smaller backups that way. – Panther Jul 16 '17 at 20:03
  • I will backup logs separate. – Panther Jul 16 '17 at 20:05
  • My ~/.cache is 1 GB so I excluded it along with /usr/src/linux-headers* which has 21 kernels. ~/.local/share/Trash was huge and caused backup to run out of disk space so it had to be emptied. /media had pointers to Windows on /dev/sdb which was 60 GB so had to be --exclude as well. My android phone is in /var/run/user/$USER/... which is a symbolic link to /run so I --exclude=/run as well. – WinEunuuchs2Unix Jul 16 '17 at 21:06
  • It bombs half way through showing: /lib/plymouth/themes/ /lib/plymouth/themes/ubuntu-text/ tar: /: file changed as we read it after 10 minutes on a mini-PCIe SSD (Sata II channel). This will take some time to fine tune... – WinEunuuchs2Unix Jul 16 '17 at 21:10
  • That is why you should not run it from a running desktop / server. Better to backup from a live usb . – Panther Jul 17 '17 at 02:20
  • GREAT Detailed answer Bodhi! :) – smjadtnf707 Jul 18 '17 at 01:45
  • thank you! this worked. and i can even extract the tar to /stuff if I need to interactively browse the file structure. – IMTheNachoMan Jul 18 '17 at 15:50
  • @bodhi.zazen Finally found out the reason why the backup was bombing out. I was creating the .tar in root directory which was part of the backup. Creating it in /tmp instead (which is excluded) solved the problem. The final script is part of this answer: https://askubuntu.com/questions/917562/backup-linux-configuration-scripts-and-documents-to-gmail/922493#922493 – WinEunuuchs2Unix Jul 27 '17 at 00:57
  • --exclude=/root/backup.tar.gz should have fixed that. – Panther Jul 27 '17 at 01:19