
I wrote a script to back up a database. When I execute it directly, it works. I turned it into a cron job, and while cron itself is running (I checked with service cron status), the job seems to fail silently.

Here is the script:

#!/bin/bash
echo "Starting mongo backup"
mkdir /home/ubuntu/backups
docker exec -it mongodb mongodump --archive=/root/mongodump.gz --gzip
docker cp mongodb:/root/mongodump.gz /home/ubuntu/backups/mongodump_$(date +%Y-%m-%d_%H-%M-%S).gz
echo "Mongo dump complete"
printf "[default]\naccess_key=\nsecret_key=\nsecurity_token=\n" > ~/.s3cfg
s3cmd put /home/ubuntu/backups/* s3://my-backup-bucket/
echo "Copy to S3 complete"
rm /home/ubuntu/backups/* -r
echo "Files cleaned"

I used only absolute paths (EDIT: yeah, actually I didn't), no environment variables, no un-escaped %. I don't know what I missed.

1 Answer


One possible reason is that you are not using absolute paths to your commands, and some of them are not located in /usr/bin or /bin, which are the only directories in cron's default $PATH.

You can find out where each executable is located with the command which, for example which s3cmd. Then put the commands with their absolute paths in your script.
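As a sketch (using bash as a stand-in, since install locations for docker and s3cmd vary by machine, run the same check for each command your script calls):

```shell
# Print the absolute path of a command as cron would need it.
# `which` searches your interactive shell's $PATH, so run it in the
# shell where the script already works.
which bash
# Repeat for the script's commands, e.g.:
#   which docker
#   which s3cmd
# ...then replace the bare names in the script with those full paths.
```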

Another approach is to assign a new value to $PATH in your script or in the crontab: Why crontab scripts are not working?
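For example, a minimal sketch of setting $PATH at the top of the script (the exact directory list is an assumption; include whichever directories which reported for your commands):

```shell
#!/bin/bash
# Set PATH explicitly so cron resolves the same commands as your
# login shell. Adjust the directories to match `which` output.
export PATH="/usr/local/bin:/usr/bin:/bin"
echo "$PATH"
```

The equivalent in the crontab itself is a line PATH=/usr/local/bin:/usr/bin:/bin placed above the job entries.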

You can redirect the output of your cron job to a file to see where the problem is. For this purpose, modify the job in this way:

* * * * * /path/to/the-script >/path/to/log-file 2>&1

In addition, I would prefer to use $HOME instead of ~ within scripts.
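The difference matters because ~ is only expanded by the shell in certain positions, while $HOME is an ordinary variable that cron sets for the crontab's owner. A small illustration:

```shell
# Tilde is NOT expanded inside double quotes, but $HOME is:
echo "~/backups"      # prints the literal string ~/backups
echo "$HOME/backups"  # prints e.g. /home/ubuntu/backups
```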

pa4080