
I have a path, /opt/abc, where a new folder is created daily, and a series of files is created inside each folder.

I would like to run a script every Sunday at 02:00 that compresses each folder older than 2 days. I do not want everything compressed into a single archive; each folder should be compressed individually. If the compression is successful, the original folder can then be deleted.

I have tried to create a script, but all it does is compress everything into a single file, and it does not delete the original folders.

How should I proceed?

john

3 Answers


This single command should do the job, so it can simply be entered into your crontab:

find /opt/abc/* -maxdepth 0 -mtime +2 ! -name '*.tar.gz' -exec tar czf {}.tar.gz {} \; -exec rm -rf {} \;
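
For the Sunday 02:00 schedule asked about, the corresponding crontab entry (edited with crontab -e) would be a sketch along these lines:

0 2 * * 0 find /opt/abc/* -maxdepth 0 -mtime +2 ! -name '*.tar.gz' -exec tar czf {}.tar.gz {} \; -exec rm -rf {} \;

(The five fields are minute, hour, day of month, month and day of week; 0 in the last field means Sunday.)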

I haven't tested it that thoroughly, but I am fairly sure it won't accidentally delete anything: the second -exec only runs if the first one (the tar) exits successfully.

It will, however, overwrite an existing archive if one has the same name as a directory being processed plus .tar.gz, because GNU tar's create switch, c, replaces the target file. It is safer to archive into a separate directory that already exists, such as /opt/abcompressed here; since {} expands to the full path /opt/abc/<dirname>, a small sh -c wrapper with basename is needed to build the target file name:

find /opt/abc/* -maxdepth 0 -mtime +2 ! -name '*.tar.gz' -exec sh -c 'tar czf /opt/abcompressed/"$(basename "$1")".tar.gz "$1" && rm -rf "$1"' _ {} \;

Also, the archives will contain the directory structure /opt/abc/<dirname>/<files>, which I wasn't able to work around yet.
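
One possible workaround, as an untested sketch (it reuses the /opt/abcompressed target from above and adds -type d so only directories are matched): let tar change into /opt/abc with its -C option before adding the directory, so the archive only contains <dirname>/<files>:

find /opt/abc -mindepth 1 -maxdepth 1 -type d -mtime +2 -exec sh -c 'd=$(basename "$1"); tar czf /opt/abcompressed/"$d".tar.gz -C /opt/abc "$d" && rm -rf "$1"' _ {} \;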

s3lph
  • You can always strip unwanted parts of the paths in a tar archive when extracting, using the --strip-components option; set its argument to 2 and it will remove the opt/abc/ part of the path. – Arronical Jul 03 '15 at 14:38
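
For instance, assuming an archive produced by the first command above (the directory name here is made up):

# members are stored as opt/abc/<dirname>/<files>; drop the two leading path components
tar xzf 2015-07-01.tar.gz --strip-components=2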

You can use xargs -n 1 to pass the directory names to the compression command one at a time.

Be careful about file names containing spaces.
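
A minimal sketch of that approach for the layout in the question: -print0 and -0 keep names with spaces intact, -n 1 hands one directory at a time to the command, and the directory is only removed if tar succeeds:

find /opt/abc -mindepth 1 -maxdepth 1 -type d -mtime +2 -print0 | xargs -0 -r -n 1 sh -c 'tar czf "$1".tar.gz "$1" && rm -rf "$1"' _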

AlexAD
find /path/to/directory -mtime +2 -exec ls "{}" \;

This is a useful snippet to list files older than 2 days. Note that -mtime only counts full days and involves some rounding, so measuring the age in minutes with the -mmin option may work better. You can also replace the -exec with -print0 and pipe the output to xargs -0, which handles unusual file names more robustly. You can tell find what type of file to look for, too: -type d makes it match only directories.
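
For example, a variant using those options (the path is hypothetical, 2880 minutes equals 2 days, and -mindepth 1 is added so the top-level directory itself is skipped):

find /path/to/directory -mindepth 1 -maxdepth 1 -type d -mmin +2880 -print0 | xargs -0 -r ls -ld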

You can replace the ls in the command with other commands. I often use ls first to make sure I'm happy with the output, then replace it with mv {} /path/to/target when I'm moving files over a certain age out of the way. You could use a tar command in place of the ls to achieve the compression you want, as sketched below.
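
For instance (the paths are hypothetical, and -mindepth/-maxdepth are added here so only items directly under the directory are touched), once the ls output looks right you might switch to one of:

# move matching items out of the way ...
find /path/to/directory -mindepth 1 -maxdepth 1 -mtime +2 -exec mv {} /path/to/target \;

# ... or compress each matching directory into its own archive
find /path/to/directory -mindepth 1 -maxdepth 1 -type d -mtime +2 -exec tar czf {}.tar.gz {} \;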

You could script this and call the script as a cron job; cron is the job scheduler on Ubuntu (and other Linux systems).
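
A minimal sketch of that approach, with a hypothetical script saved as /usr/local/bin/compress-old-dirs.sh and made executable with chmod +x:

#!/bin/sh
# Compress each directory under /opt/abc that is older than 2 days into its
# own archive next to it, and delete the original only if tar succeeds.
find /opt/abc -mindepth 1 -maxdepth 1 -type d -mtime +2 -exec sh -c 'tar czf "$1".tar.gz "$1" && rm -rf "$1"' _ {} \;

Then add a line like this with crontab -e to run it every Sunday at 02:00:

0 2 * * 0 /usr/local/bin/compress-old-dirs.sh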

That should start you off!

Arronical