
My goal is to regularly pack the most important files on a server into a tar.gz file. To automate this, I am using a cron job that executes a script:

$ crontab -l
# blablabla
* * * * * /home/backup/run_backup.sh >> /home/backup/output.txt 

For debugging purposes, I am currently running the job every minute.

As far as I can tell, it is not a cron problem, since the job executes every minute and runs as root (`whoami` in the script prints `root`).
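(Editor's note: a quick way to check what environment cron actually gives a script, to diff against an interactive run, is a diagnostic sketch like the following; the output path `/tmp/cron-env.txt` is just an example.)

```shell
#!/bin/sh
# Diagnostic sketch: dump the environment this script sees when run
# from cron, so it can be compared against an interactive shell.
# The output path is an arbitrary example.
env | sort > /tmp/cron-env.txt
```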

The script that is running looks as follows:

#!/bin/sh -l

echo "Starting ..."

DATE=`date '+%Y-%m-%d_%H-%M-%S'`
BACKUP_PATH="backups/$DATE.tar.gz"

# MAKE BACKUP OF ENTIRE DATABASE
echo "Backing up database ..."
DATABASE_BACKUP_PATH="/home/backup/mysql.sql"
docker exec central-mysql sh -c 'exec mysqldump --all-databases -uroot -p"xxx"' > "$DATABASE_BACKUP_PATH"

# MAKE ARCHIVE
echo "Creating tar archive ..."
tar -vczf "$BACKUP_PATH" \
        /home/database/docker-compose.yml \
        /home/docker_res_usage.sh \
        /home/gitlab/docker-compose.yml \
        /home/gitlab/data/git-data/repositories \
        /home/mailserver/docker-compose.yml \
        /home/mailserver/mail/dkim \
        /home/mailserver/rainloop/_data_/_default_/configs/application.ini \
        /home/mailserver/rainloop/_data_/_default_/domains \
        /home/mailserver/readme.md \
        /home/php7-apache-alpine \
        /home/sftp \
        /home/traefik \
        /home/websites \
        "$DATABASE_BACKUP_PATH"

# CLEAN-UP
echo "Cleaning up ..."
rm "$DATABASE_BACKUP_PATH"

The docker command runs without problems, creating a .sql file. However, the tar command starts to list the first few files and then just stops. output.txt looks like this:

Cleaning up ...
Starting ...
Backing up database ...
Creating tar archive ...
/home/database/docker-compose.yml
/home/docker_res_usage.sh
/home/gitlab/docker-compose.yml
/home/gitlab/data/git-data/repositories/
/home/gitlab/data/git-data/repositories/websites/
/home/gitlab/data/git-data/repositories/websites/sbpp.git/
/home/gitlab/data/git-data/repositories/websites/sbpp.git/config
/home/gitlab/data/git-data/repositories/websites/sbpp.git/HEAD
/home/gitlab/data/git-data/repositories/websites/sbpp.git/hooks
/home/gitlab/data/git-data/repositories/websites/sbpp.git/info/
/home/gitlab/data/git-data/repositories/websites/sbpp.git/info/exclude
Cleaning up ...

Question: Are there known reasons for tar to just stop mid-archive? Do you see any errors in my script?

The strange thing is that when I run the script manually, it works just fine. Is it a cron thing after all? Do I have a permissions problem, even though I am running as root?
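(Editor's note: tar's error message goes to stderr, which the crontab line above does not capture. A small sketch of capturing both streams, using a temporary log file as a stand-in for `/home/backup/output.txt`; note that the file redirection must come before `2>&1`, or stderr keeps pointing at the old stdout.)

```shell
#!/bin/sh
# Demonstrate capturing stdout AND stderr in one log file.
# The file redirection comes first, then 2>&1 duplicates stderr
# onto the (already redirected) stdout.
log=$(mktemp)   # stand-in for /home/backup/output.txt

{ echo "on stdout"; echo "on stderr" >&2; } >> "$log" 2>&1
```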

  • Rather than only capturing stdout, you usually also need to capture stderr to help you debug. Maybe do `/home/backup/run_backup.sh >> /home/backup/output.txt 2>&1`. But a very typical problem in cron jobs is that you need to use absolute rather than relative paths. – HBruijn May 10 '19 at 07:35
  • Thanks. I replaced all paths with absolute paths, but forgot my BACKUP_PATH variable. Changing it to an absolute path fixed the issue. – Marco7757 May 10 '19 at 08:11
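(Editor's note: for later readers, the fix from the comment above amounts to a one-line change. The exact directory is an assumption here; adjust it to wherever `backups/` actually lives.)

```shell
#!/bin/sh
DATE=$(date '+%Y-%m-%d_%H-%M-%S')

# Before: relative path, resolved against whatever working directory
# cron starts the script in (typically not /home/backup):
#BACKUP_PATH="backups/$DATE.tar.gz"

# After: absolute path (assuming backups/ lives under /home/backup),
# so the script behaves identically under cron and interactively.
BACKUP_PATH="/home/backup/backups/$DATE.tar.gz"
```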
