  • I'm using cronolog to rotate the Apache log files.
  • I'm using fail2ban to monitor those log files and ban IPs in case of abuse.

  • There's always an access.log symlink, created by cronolog, which points to the current log file.

  • Each night, I run a cron job to compress yesterday's log files:

    find /var/log/apache2/ -daystart -mtime +0 \( -name "*access*.log" -or -name "*error*.log" \) -type f -exec gzip {} \;
    

The problem is that, for websites with low traffic, the access.log symlink is now dead because the file it was pointing to has been renamed by gzip. As a result, fail2ban gives up on those jails because it can no longer stat the files.
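The breakage is easy to reproduce in a scratch directory (the file names below are illustrative, not from the actual setup):

```shell
d=$(mktemp -d)                         # scratch directory
echo hit > "$d/site.2024-01-01.log"    # stand-in for a cronolog output file
ln -s "$d/site.2024-01-01.log" "$d/access.log"
gzip "$d/site.2024-01-01.log"          # renamed to site.2024-01-01.log.gz
test -e "$d/access.log" || echo "access.log is now dangling"
```

`test -e` follows the symlink, and since the target was renamed by gzip, the link now resolves to nothing.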

The solution is to compress only the log files to which no symlink points.

I found a way of doing so, and it works, but I would like to know if there is an easier way, because this one is really convoluted and not very fast (a find inside a find):

    find /var/log/apache2/ -daystart -mtime +0 \( -name "*access*.log" -or -name "*error*.log" \) -type f -exec sh -c 'test "$(find /var/log/apache2/ -lname "$1" | wc -l)" -eq 0' sh {} \; -exec gzip {} \;
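One subtlety worth noting (a small illustrative check, with a scratch directory standing in for the real log directory): `-lname` matches the symlink's *literal* link text, so the inner find only detects the link when the symlink stores exactly the path string the outer find prints:

```shell
d=$(mktemp -d)
touch "$d/site.log"
ln -s "$d/site.log" "$d/access.log"   # link text is the absolute path
find "$d" -lname "$d/site.log"        # prints the matching symlink
```

If cronolog created the symlink with a relative target, the absolute path printed by the outer find would not match and the file would be compressed anyway.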
Fox
  • 952
  • 2
  • 12
  • 21

1 Answer


Possibly something like this:

LOGDIR=/path/to/log/files
# Extract the symlink target from stat's "'link' -> 'target'" output
CURRENT=$(stat -c "%N" "$LOGDIR/access.log" | sed -e "s/.* -> //" -e "s/[\`']//g")

find "$LOGDIR" -type f -name '*.log' | while read -r logfile
do
  if [ "$logfile" != "$LOGDIR/$CURRENT" ]
  then
    gzip "$logfile"
  fi
done

May need modification depending on your exact requirements and the locations of the files.
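If there are several symlinks (one access.log and error.log per vhost, say), a variant of the same idea collects every symlink target once and skips those files. This is only a sketch: `readlink -f` (GNU coreutils) and the directory layout are assumptions based on the question, and it is wrapped in a hypothetical function for convenience:

```shell
# Compress old Apache logs in $1, skipping any file a symlink resolves to.
compress_unlinked_logs() {
  logdir=$1
  # Canonical targets of every symlink in the log directory, one per line
  targets=$(find "$logdir" -type l -exec readlink -f {} \;)

  find "$logdir" -daystart -mtime +0 \
       \( -name "*access*.log" -o -name "*error*.log" \) -type f |
  while read -r f
  do
    # Compress only files that no symlink resolves to
    if ! printf '%s\n' "$targets" | grep -qxF "$(readlink -f "$f")"
    then
      gzip "$f"
    fi
  done
}
```

Because it compares canonical paths rather than literal link text, it also handles symlinks created with relative targets, at the cost of one `readlink` per file.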

Unbeliever
  • 2,286
  • 1
  • 9
  • 17