
My disk space is dwindling by about 2GB a day! I only have a few more days before I run out of space.

$ df -h
Filesystem            Size  Used Avail Use% Mounted on
/dev/sda4             143G  126G   11G  93% /
udev                  491M  4.0K  491M   1% /dev
tmpfs                 200M  696K  199M   1% /run
none                  5.0M     0  5.0M   0% /run/lock
none                  499M  144K  499M   1% /run/shm
/dev/sda2             1.9G  580M  1.2G  33% /tmp
/dev/sda1              92M   29M   58M  33% /boot

I have been searching for the biggest directories/log files, deleting and compressing. But I am still losing the war. Finally, I realised I have a big misunderstanding:

julian@server1:~$ sudo du -h / | tail -n 1
16G     /

All of my files in / only add up to 16 GB. That leaves 110 GB unaccounted for!

Clearly I have a misunderstanding: I thought the '/dev/sda4' line represented all the files visible from '/'. What should I be reading to understand where the other storage has gone?
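
For what it's worth, the comparison I am trying to make is roughly this (a sketch assuming GNU coreutils; the -x flag keeps du on the root filesystem so /tmp and /boot don't muddy the numbers):

$ df -h /            # space the kernel says is allocated on /dev/sda4
$ sudo du -shx /     # space accounted for by files reachable from /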

More details:

  • I have an Ubuntu 11.10 server that was set up by data-center staff.
  • It is running

    • my own code (which is fairly prolific with log files, but otherwise doesn't store much stuff on the drive)
    • duplicity for backups (which tends to store a lot of signature files)
    • various other standard services, like Apache, Nagios, etc. They are very lightly used.
  • It has been up for about 4 months without a reboot.

  • I lied about the du output (simplified it for effect). It also complained about not being able to access GVFS and the du process's own /proc entries. I believe they are irrelevant:


 du: cannot access `/home/julian/.gvfs': Permission denied
 du: cannot access `/proc/10841/task/10841/fd/4': No such file or directory
 du: cannot access `/proc/10841/task/10841/fdinfo/4': No such file or directory
 du: cannot access `/proc/10841/fd/4': No such file or directory
 du: cannot access `/proc/10841/fdinfo/4': No such file or directory
  • I had the same issue: "a little further examination reveals that rsync is running and / is obviously growing, for whatever reason, copying itself as a backup. Killing rsync stops the increased production of used disk space." http://askubuntu.com/questions/105951/low-disk-space-filesystem-root – Ringtail Apr 10 '12 at 03:45
  • @BlueXrider: That case is interesting, but it shows almost the opposite symptoms: df was reporting everything okay, while another tool was reporting that space was running out. – Oddthinking Apr 10 '12 at 03:58
  • You could always try rebooting. On the other hand, if it's not your own server, or you can't stop your code, that obviously won't work. – zpletan Apr 10 '12 at 02:43
  • I shall do that. It is inconvenient (has to be done off-peak), and it may just push the problem back another 4 months, but if it saves the system from dying in 5 days, I'll take that as a win! – Oddthinking Apr 10 '12 at 02:47

1 Answer


This may be caused by some application writing to an unlinked temporary file: such files do not show up in du output (they no longer have an entry in any directory), but the application can still write to them, so the file keeps growing and taking up space.
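
A quick way to see the effect for yourself (a minimal sketch, assuming a bash shell; /tmp/demo.log is just a throw-away example path):

$ exec 3> /tmp/demo.log                  # open a file on descriptor 3
$ rm /tmp/demo.log                       # unlink it -- du can no longer see it
$ dd if=/dev/zero bs=1M count=100 >&3    # keep writing through the open descriptor
$ df -h /tmp                             # usage still climbs, with no file to show for it
$ exec 3>&-                              # closing the last descriptor frees the space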

You can use the lsof +L1 command to find files whose link count is zero and see which process is holding each one open. Restarting that process should free up the space.
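
For example (a sketch, assuming lsof is available; the apache2 restart at the end is purely illustrative -- restart whichever process lsof actually names):

$ sudo lsof +L1                 # open files with a link count below 1, i.e. deleted but still held open
$ sudo lsof +L1 /               # the same, restricted to the root filesystem
$ sudo service apache2 restart  # hypothetical: restart the service that owns the file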

See this answer for a slightly longer explanation.

Sergey
  • Interesting. lsof +L1 shows two files. One small, and one size 0t0, neither on the filesystem that is the problem. I shall schedule a reboot and see if that unjams any unlinked temporary files. – Oddthinking Apr 10 '12 at 02:45
  • I have no idea if this is the real solution, but the reboot helped, so I will give it to you. – Oddthinking Apr 20 '12 at 11:57
  • Awesome. I forgot I had set the scrollback in gnome-terminal to store unlimited lines. lsof +L showed huuuge files. I set it back to store ~4000 lines and then cleared up the scrollback buffers. Freed up nearly 30 GB of space... sigh The problems I face because of years of uptime... ;) – Aaron C. de Bruyn Nov 21 '13 at 03:21