You can't rely on file dates to tell when a file was copied or moved into a folder; Windows preserves the modified date across filesystems, drives, network shares, and so on. You might be able to work something out with a Linux file server, or prevent people from copying files directly by making them use FTP or a web-based upload system.
If you are OK with people not being able to modify the files after they upload, you could have separate upload and access folders, and a script that moves files between them and re-dates them, along the lines of the sketch below. But it sounds like you want people to be able to modify the files directly.
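For reference, a minimal sketch of that mover script, assuming hypothetical D:\upload and D:\files folders (name collisions and subfolders aren't handled here):

$upload = "D:\upload"
$access = "D:\files"

# Move each uploaded file into the access folder and reset its modified
# date to the time of upload, since the copy-preserved date can't be trusted.
Get-ChildItem -Path $upload | ForEach-Object {
    $dest = Join-Path $access $_.Name
    Move-Item -Path $_.FullName -Destination $dest
    (Get-Item $dest).LastWriteTime = Get-Date
}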
So a simple, if somewhat hacky, solution would be to mess with the dates. I would write two scripts:
Hourly Date Changer Script
Have a script run once an hour or so, in your preferred language, that:
- Looks for any file with a date modified within the last 20 years.
- When it finds such a file, changes its date modified to today minus 20 years.
In PowerShell, it would look something like this:
$path = "D:\test"
$today = Get-Date
$before = $today.AddDays(-7300) #356*20 days
Get-ChildItem -Recurse -Path $path | foreach {
if ($_.LastWriteTime -gt $before) {
Write-Host $_.Name
$_.LastWriteTime = $before
}
}
Running this script today (May 27) sets the modified date of all files to June 1st, 1994, exactly 7300 (365*20) days ago. Because it only changes files newer than the $before value, it won't touch files it has already set to the past.
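To get it running once an hour, a scheduled task is enough; for example with schtasks (the script path here is just a placeholder, and the task needs to run under an account with write access to the folder):

schtasks /Create /TN "HourlyDateChanger" /TR "powershell.exe -NoProfile -File C:\scripts\datechanger.ps1" /SC HOURLY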
Cleanup Script
The cleanup script would run every night, and:
- Search for files whose modified date is more than 20 years and X days in the past
- Delete them
I won't write the script for this part; there are plenty of utilities that can handle deleting files older than a specified date, so choose whichever you like. The important part is to look for files that are 7300+X days old, where X is the number of days you want to keep a file after it was last modified.
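That said, if you'd rather stay in PowerShell, here's a minimal sketch of the cleanup, assuming X = 30 days of retention:

$path   = "D:\test"
$cutoff = (Get-Date).AddDays(-7330) # 7300 + 30 (X) days

# Delete files whose stamped date has aged past 7300+X days, i.e. files
# stamped more than X days ago and not modified since.
Get-ChildItem -Recurse -Path $path |
    Where-Object { -not $_.PSIsContainer -and $_.LastWriteTime -lt $cutoff } |
    Remove-Item -Force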
Advantages
This has a few advantages over the other answers here:
- The timer will reset if someone modifies the file.
- No need for NTFS alternate data streams to mark the files (those streams are preserved when a file is moved, so they could cause premature deletion of a modified file)
- Should have minimal if any performance impact. No need to keep a database or list of filenames and/or hashes.
- Nothing breaks horribly if the scripts fail to run. There is no service or constantly running program needed to update the dates, just a couple of scheduled tasks. Solutions that rely on watching for new files and setting their last modified time to the current time can end up deleting new files if the service fails or hits a race condition.
The only problem I can see is if someone copies a file that was last modified more than 20 years ago into the drop folder: its date already looks aged out, so the cleanup script would delete it on its next run. In most scenarios that's unlikely to be much of an issue, but it could come up.
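If it does matter, a partial guard is possible: copying a file resets its creation time to the time of the copy (though a move within the same volume preserves it, so this won't catch everything). The hourly script could also stamp files that just arrived carrying an ancient modified date, with something like this added as an elseif inside the ForEach-Object block above:

# Speculative guard: a file copied in since the last run, but whose
# modified date already looks stamped, would be deleted by the next
# cleanup. Treat it like any other new arrival. The 2-hour window is
# an assumption; it just needs to comfortably cover the run interval.
elseif ($_.CreationTime -gt $today.AddHours(-2) -and $_.LastWriteTime -le $before) {
    $_.LastWriteTime = $before
}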