
I have a config for Bacula. I'm trying to remove the dump files after the job, but it doesn't work :(

Part of my config:

Job {
  Name = "spass.mysql"
  Type = Backup
  Level = Full
  Client = spass-fd
  RunScript {
    RunsWhen = Before
    FailJobOnError = No
    RunsOnClient = Yes
    Command = "/usr/bin/mysqldump -EB syspass --result-file /tmp/syspass-%i.sql"
  }
  Client Run After Job = "/usr/bin/rm -f /tmp/*.sql"
  Schedule = "WorkTimeEveryHour"
  Storage = File-spass
  Pool = spass
  FileSet="spass.mysql"
  Messages = Standard
  Priority = 10
  Write Bootstrap = "/var/spool/bacula/%c_%n_%e.bsr"
}

FileSet {
  Name = "spass.mysql"
  Include {
    Options {
      signature = MD5
      compression = GZIP
      wildfile = "*.sql"
    }
    Options {
      Exclude = yes
      RegexFile = ".*"
      RegexDir = ".*"
    }
    File = /tmp/
  }
}

In the job messages everything looks OK:

14-Jun 17:02 bnode10-dir JobId 364: Start Backup JobId 364, Job=spass.mysql.2017-06-14_17.02.09_10
14-Jun 17:02 bnode10-dir JobId 364: Using Device "FileStorage-spass" to write.
14-Jun 17:02 spass-fd JobId 364: shell command: run ClientBeforeJob "/usr/bin/mysqldump -EB syspass --result-file /tmp/syspass-364.sql"
14-Jun 17:02 bacula-sd JobId 364: Volume "spass0006" previously written, moving to end of data.
14-Jun 17:02 bacula-sd JobId 364: Ready to append to end of Volume "spass0006" size=995952259
14-Jun 17:02 spass-fd JobId 364: shell command: run ClientAfterJob "/usr/bin/rm -f /tmp/*.sql"
14-Jun 17:02 bacula-sd JobId 364: Elapsed time=00:00:01, Transfer rate=200.8 K Bytes/second
14-Jun 17:02 bnode10-dir JobId 364: Bacula bnode10-dir 5.2.13 (19Jan13):

How to fix it?

PS: I think the problem might be permissions, but I have no idea how to fix it.

Nikita

2 Answers


You can put your command in a script that writes its output to the console.

Bacula will write this output to the log file of your job.

Like you, I have a job which saves database dumps. For that, I have one script which does the dump before the job and another which deletes the dump after the job.

Here is an example of my script which does the dump before the job (it's in PowerShell, but the idea is the same):

$dump="e:\scripts\save\sql\"+$bdd+'.my.sql'

   $process = Start-Process -FilePath "$($mysqldump_path)mysqldump.exe" 
                                    -ArgumentList "-u uuu -ppassword -B database_name" `
                                    -RedirectStandardOutput $dump `
                                    -Wait -WindowStyle hidden -PassThru `
                                    -RedirectStandardError $log_temp

        if ($process.ExitCode -eq 0){
            write-host "   Dump OK "
            $file=Get-Item $($dump)
            write-host ("   "+$file.Name+" : "+$file.Length+" octets : "+$file.LastWriteTime)
        }
        else {
            write-host "   Error Dump of $bdd "
            Get-Content $log_temp |  Out-file -FilePath $log_error -Append
            exit_script
        }

If the dump is OK, I get a "Dump OK" line in the Bacula job log:

shell command: run ClientRunBeforeJob "c:\windows\system32\WindowsPowerShell\v1.0\powershell.exe -NonInteractive -file E:\scripts\save\command_bacula.ps1"
ClientRunBeforeJob: start script : 06/13/2017 22:59:23
ClientRunBeforeJob:
ClientRunBeforeJob: ---- PostgreSQL ----
ClientRunBeforeJob: Dump bdd1
ClientRunBeforeJob:    Dump OK
...

With that, you can check the output of your script and see where it fails.
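
For the Linux client in the question, a cleanup script in the same spirit could look like the sketch below. The /tmp/*.sql pattern comes from the question; the script name and the echo lines are just illustrative, so that the result of each delete shows up in the Bacula job log:

#!/bin/bash
# ClientAfterJob cleanup: delete the SQL dumps and report every
# action on stdout so Bacula records it in the job log.
for f in /tmp/*.sql; do
    if [ -e "$f" ]; then
        rm -f "$f" && echo "Deleted $f" || echo "Could not delete $f"
    else
        echo "Nothing to delete: no file matches /tmp/*.sql"
    fi
done

If the log then shows "Could not delete ...", that would confirm the permission theory from the question.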

Sorcha

You need to wrap your call in a shell. Plainly calling rm -f /tmp/*.sql will not apply globbing, so Bacula tries to remove a single file literally named /tmp/*.sql, which does not exist. There is also no warning, because you passed -f. Try using

Client Run After Job = "/usr/bin/bash -c '/usr/bin/rm -f /tmp/*.sql'"

which runs your command in bash, and the shell then does the globbing.
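
The same fix can also be written with the RunScript syntax the job already uses for the before-job command. A minimal sketch (everything not shown stays as in the question):

Job {
  Name = "spass.mysql"
  ...
  RunScript {
    RunsWhen = After
    RunsOnClient = Yes
    Command = "/usr/bin/bash -c '/usr/bin/rm -f /tmp/*.sql'"
  }
  ...
}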

Andreas Rogge