11

Does anyone know a good way to delete files on a remote server that are older than X days, using just SCP/SFTP? Sure, I could write a script in Perl etc., but that feels like overkill.
Is there a UNIX way?
A one-liner?
A separate utility?

Thanks

P.S. The task is to delete some outdated backup files.

Mike

4 Answers

7

Sure, I could write a script in Perl etc., but that feels like overkill.

You don't need a script to achieve the intended effect - a one-liner will do if you have shell access to send a command:

ssh user@host 'find /path/to/old_backups/ -type f -mtime +7 -exec rm {} \;'

-mtime +7 matches files whose contents were last modified more than 7 days (i.e. more than 7×24 hours) before now.

danlefree
  • Sad, but this uses SSH and a remote one-liner. There is no shell access, just SCP/SFTP. – Mike Sep 27 '10 at 10:09
  • @Mike - Well that one-liner can save you some time over writing a perl script, if that is the case - you could use `atime` instead of `mtime` to match the last access time (i.e. when your files were last downloaded) and run a daily cron job. – danlefree Sep 27 '10 at 16:37
  • there is no shell access to remote machine. – Mike Oct 02 '10 at 11:03
  • @Mike I was under the impression that you could negotiate adding a cron job with the administrator of the server hosting your backup files - my apologies if this is not possible. – danlefree Oct 03 '10 at 03:43
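The cron-job idea from the comments above can be sketched as follows. This is a minimal sketch, not the answerer's actual setup: the directory is a placeholder, and it uses find's -atime test so files are kept as long as they are still being accessed (downloaded).

```shell
#!/bin/sh
# prune_old DIR DAYS: remove regular files under DIR whose last
# access time is more than DAYS days old, using find's -atime test.
prune_old() {
    find "$1" -type f -atime "+$2" -exec rm -f {} \;
}

# A daily crontab entry on the backup host might look like (hypothetical):
#   0 3 * * * find /path/to/old_backups -type f -atime +7 -exec rm -f {} \;
```

This only works if the administrator of the backup host will install the job for you, which is exactly the negotiation the comment refers to.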
5

This question is very old, but I still wanted to add my bash-only solution, as I was searching for one when I came here. The grep tar in the listing command is just for my own purpose of listing only tar files; it can of course be adapted.

RESULT=$(echo "ls -t path/to/old_backups/" | sftp -i ~/.ssh/your_ssh_key user@server.de | grep tar)

i=0
max=7
while read -r line; do
    (( i++ ))
    if (( i > max )); then
        echo "DELETE $i...$line"
        echo "rm \"$line\"" | sftp -i ~/.ssh/your_ssh_key user@server.de
    fi
done <<< "$RESULT"

This deletes all tar files in the given directory except the newest 7. It does not consider the date, but if you only have one backup per day it is good enough.
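A hypothetical variant of the loop above that batches all deletions into a single sftp session (key path, host, and remote path are the same placeholders as in the original):

```shell
#!/bin/sh
# build_rm_batch MAX: read filenames (newest first) on stdin and print
# an sftp "rm" command for every file past the MAX newest ones.
build_rm_batch() {
    tail -n "+$(( $1 + 1 ))" | while read -r f; do
        echo "rm \"$f\""
    done
}

# All deletions then go through one sftp connection, e.g.:
#   echo "ls -t path/to/old_backups/" \
#     | sftp -i ~/.ssh/your_ssh_key user@server.de | grep tar \
#     | build_rm_batch 7 \
#     | sftp -i ~/.ssh/your_ssh_key user@server.de
```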

dirkaholic
  • While this is interesting and all, doesn't it make more sense to make one connection which completes the entire task instead of a connection for each file to be deleted plus one more to get the list of files? – chicks Feb 15 '17 at 22:14
  • I'm not sure it is possible to run a sequence of commands like this over one connection, and I think it would be over-optimizing as well. For the use case of deleting old backups that run once a day, it means you effectively make 2 ssh connections per day: one for the list and one for the single file that is now beyond max. I think that is quite an acceptable tradeoff. – dirkaholic Feb 16 '17 at 09:10
2

If you insist on SCP/SFTP, you can list the files, parse them with a simple script, and delete the old backup files.

The batch-mode "-b" switch should help you out - it reads sftp commands from a file. http://linux.die.net/man/1/sftp
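A minimal sketch of that approach, assuming you already have the list of old filenames (host, remote directory, and filenames here are placeholders): generate the batch file locally, then run sftp non-interactively with -b.

```shell
#!/bin/sh
# make_batch LISTING OUTFILE: turn a file of old filenames (one per
# line) into an sftp batch file that changes into the backup
# directory and removes each listed file.
make_batch() {
    {
        echo "cd /path/to/old_backups"
        while read -r name; do
            echo "rm \"$name\""
        done < "$1"
        echo "quit"
    } > "$2"
}

# Usage (requires non-interactive authentication, e.g. a key):
#   make_batch old_files.txt batch.txt
#   sftp -b batch.txt user@host
```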

M_1
  • Sure it's possible, but I'm looking for a more elegant UNIX way, if one exists. – Mike Sep 25 '10 at 10:12
  • Do you have any ideas? Maybe a suggestion from your side would help? Then we can improve the idea a bit. – M_1 Sep 25 '10 at 12:08
0

None of the answers above worked for me, especially when you are limited to password authentication and cannot use a private key with the sftp utility.
I found a good script using lftp (gist).

Uncomment # STORE_DAYS=6 to specify the count manually instead of passing it as an argument.

#!/bin/bash
# Simple script to delete files older than a specific number of days from FTP. Provided AS IS without any warranty.
# This script uses 'lftp', and 'date' with the '-d' option, which is not POSIX compatible.

# FTP credentials and path
FTP_HOST="ftp.host.tld"
FTP_USER="username"
FTP_PASS="password"
FTP_PATH="/ftp/path"
# Full path to lftp executable
LFTP=$(which lftp)

# Read the number of days to keep from the first passed argument, or hardcode it below; uncomment one to use
STORE_DAYS=${1:? "Usage: ${0##*/} X, where X is the number of daily archives to keep"}
# STORE_DAYS=6

function removeOlderThanDays() {

# Make some temp files to store intermediate data
LIST=$(mktemp)
DELLIST=$(mktemp)

# Connect to ftp get file list and store it into temp file
${LFTP} << EOF
open ${FTP_USER}:${FTP_PASS}@${FTP_HOST}
cd ${FTP_PATH}
cache flush
cls -q -1 --date --time-style="+%Y%m%d" > ${LIST}
quit
EOF

# Print obtained list, uncomment for debug
#    echo "File list"
#    cat ${LIST}
# Delete list header, uncomment for debug
#    echo "Delete list"

    # Let's find date to compare
    STORE_DATE=$(date -d "now - ${STORE_DAYS} days" '+%Y%m%d')
    while read LINE; do
        if [[ ${STORE_DATE} -ge ${LINE:0:8} && "${LINE}" != *\/ ]]; then
            echo "rm -f \"${LINE:9}\"" >> ${DELLIST}
            # Print files which are subject to deletion, uncomment for debug
            #echo "${LINE:9}"
        fi
    done < ${LIST}
    # More debug strings
    # echo "Delete list complete"
    # Print a notice if the list is empty and exit.
    if [ ! -f ${DELLIST} ] || [ -z "$(cat ${DELLIST})" ]; then
        echo "Delete list doesn't exist or is empty, nothing to delete. Exiting"
        exit 0;
    fi
# Connect to ftp and delete files by previously formed list
${LFTP} << EOF
open ${FTP_USER}:${FTP_PASS}@${FTP_HOST}
cd ${FTP_PATH}
$(cat ${DELLIST})
quit
EOF

# Remove temp files
rm ${LIST} ${DELLIST}

}

removeOlderThanDays
Maxim