
I currently have access to an FTP server with files uploaded onto it. I was interested in knowing how to:

Automatically delete files in a specific folder on the FTP server that are more than x days old?

I'm pretty new to Linux/Unix. I've read suggestions to use a cron job that runs every night to delete those files, and I do know that the server is running some variation of Linux. However, I'm looking for something a little more elegant:

I do not currently know if PHP or Perl etc. is installed on the server, or if I am able to install them. But are there any open source alternatives that would:

1) Allow the user a web-based interface to upload files to the server
2) Allow another user to download the file from the same interface
3) The server automatically keeps track of the file from the date of upload and deletes files which are x days old

I'd appreciate any suggestions. I wasn't able to find any open source solutions from an initial Google search ...


1 Answer


Welcome to the Linux world. Actually, crontab is a very elegant way of doing things, and you should follow the advice you have got so far. It is the standard Linux way of scheduling tasks. Unless you want to reinvent the wheel, just use crontab.

The command you are looking for is:

find /path/to/your/folder -mtime +[days old]

e.g.:

sysadmin@omg:~/sync/0434$ find . -mtime +180
./201305_10min.csv
./201308_10min.csv
./201307_10min.csv
./201303_10min.csv
./201312_10min.csv
./201301_10min.csv
./201311_10min.csv
./201302_10min.csv
./201306_10min.csv
./201304_10min.csv
./201211_10min.csv
./201401_10min.csv
./201309_10min.csv
./201212_10min.csv
./201310_10min.csv

This will show all files older than 180 days...

To delete these files I would add:

find . -mtime +180 -type f -print0 | xargs -0 rm

xargs will pass each file name returned by find to rm as an argument. The -print0 and -0 flags make find and xargs separate names with a NUL byte instead of a newline, so file names containing spaces are handled safely, and -type f ensures that only regular files (not directories) are matched.
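
If your server's find is GNU find, as it is on most Linux distributions, you can also skip xargs entirely with the built-in -delete action (the folder path below is a placeholder for your own upload directory):

find /path/to/your/folder -mtime +180 -type f -delete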

And if I run this from cron, I will never be bothered by files older than 180 days on my system again.
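
As a sketch, assuming the same placeholder folder as above, a crontab entry that runs this cleanup every night at 02:00 could look like the following (install it with crontab -e):

# minute hour day-of-month month day-of-week command
0 2 * * * find /path/to/your/folder -mtime +180 -type f -delete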

The web-based interface is a whole subject in itself. You should be familiar with HTML forms and the GET/POST methods to communicate with the client, and you would probably use PHP's ftp_put function to upload to your server.

http://www.w3schools.com/html/html_forms.asp
http://php.net/manual/en/function.ftp-put.php
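
If you just want to test uploads before writing any PHP, curl can also push a file to an FTP server from the command line. This is only an illustration with placeholder file name, server address, and credentials, not part of the eventual web interface:

curl -T localfile.csv ftp://your-server-IP/path/to/your/folder/ --user username:password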

Also note that pointing your web server at your FTP directory will list its contents as files that are downloadable with a click, so you don't need to code a download interface yourself. For example, try:

sudo apt-get install apache2                 # install the Apache web server
sudo rm /var/www/index.html                  # remove the default page so the directory index is shown
sudo cp -r /tmp/* /var/www                   # copy your files into the web root (use your FTP folder here)
sudo chown -R www-data:www-data /var/www     # let Apache read the files

Type your server's address into your browser:

http://your-server-IP

You should see something that satisfies your requirement 2) Allow another user to download the file from the same interface.

I hope you find this a useful introduction to Linux. Apart from the web interface for uploading files, which is purely a web-design question rather than a Linux one, ask about anything that seems confusing in the comments and I will try to make it clear for you. But please do learn at least which distribution you are using, not 'some variation of Linux'.

  • I would use atime (access time) instead of mtime. If you have a popular file that gets downloaded every week, you would like to keep it around. However, if the filesystem is mounted with the noatime option, it will not work. – ThoriumBR Aug 22 '14 at 14:23
  • Thanks Louis for the helpful reply. I've setup a machine with Ubuntu on it and plan to implement the things you have told me. Its a great start and I appreciate your patient answer rather than just being down-voted, I wasn't meaning to be lazy ... – c0d3rz Aug 22 '14 at 18:41