
How are most system administrators managing the automatic cleanup of hundreds of thousands of old files spread across a wide range of folder locations?

These folders are located on many servers across the enterprise. I'm looking to manage their automatic cleanup by describing each location and the specific rules that govern it.

Such rules might include the age of the file (based on its created, last-changed, or last-modified date), the size of the file, or the naming convention of the folder or the filenames.

Ideally, cleanup would be invoked by triggers rather than manual intervention: free disk space dropping below an absolute or percentage threshold, or simply a periodic schedule.
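To make the kind of rule I have in mind concrete, here is a rough sketch using the built-in `forfiles` tool; the share path, file mask, and 30-day threshold are placeholders, and a real deployment would read per-folder rules from a configuration rather than hard-coding them:

```
rem Hypothetical example: purge *.log files older than 30 days from one managed folder.
rem forfiles matches on the last-modified date; /s recurses into subfolders.
forfiles /p "\\server\share\DropFolder" /s /m *.log /d -30 /c "cmd /c del @path"

rem Run the cleanup script periodically via Task Scheduler (here: daily at 3 AM).
schtasks /create /tn "Nightly cleanup" /tr "C:\scripts\cleanup.cmd" /sc daily /st 03:00
```

A free-space trigger would need a small wrapper that checks the volume's free space before running the purge; `forfiles` itself only expresses the age, mask, and recursion rules.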

Mark Henderson
Robert Kerr
  • Edited to not speak of a specific product that accomplishes these things. We wouldn't want to know what someone's using, just that they're using one. Right? – Robert Kerr Jul 02 '14 at 20:26
  • There's always Disk Cleanup... – Michael Hampton Jul 02 '14 at 20:35
  • Disk Cleanup does not provide automated unmonitored rule-based deletion. – Robert Kerr Jul 02 '14 at 20:37
  • Actually it can, at least to some extent. Look at its `/sageset` and `/sagerun` command line switches, and then throw it in Scheduled Tasks. [KB315246](http://support.microsoft.com/kb/315246) has more. It's probably not scalable enough for your needs, so I wouldn't make a whole answer out of this, but it might help someone; a sketch of the invocation follows these comments. – Michael Hampton Jul 02 '14 at 20:41
  • Disk Cleanup is completely INADEQUATE and doesn't even begin to address the requirements. I'm looking to automatically delete files and folders from network folder locations that are not among the choices in the sageset options. I'm not looking for how to clean temp files and other special folders on a Windows XP computer, or I would have gone to Super User. – Robert Kerr Jul 02 '14 at 21:06
  • ...and we're back to a product recommendation or a script we're not going to write for you, for free. For what it's worth, the better approach to cleaning up old files is not to. Disk is cheap, and cleaning up old files will inevitably result in someone throwing a fit about their "critical" file being deleted. Instead, enforce disk quotas so that users or departments have to clean up after their own mess, so as not to run out of their allotment of disk (a bare-bones quota example also follows these comments). – HopelessN00b Jul 02 '14 at 21:21
  • Any down-voters want to explain the problem with this question? – Maslow Jul 02 '14 at 22:23
  • @Maslow `This question does not show any research effort; it is unclear or not useful`. – HopelessN00b Jul 02 '14 at 23:32
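For reference, the Disk Cleanup automation mentioned above looks roughly like this; the set index, task name, and schedule are arbitrary choices:

```
rem Run once, interactively, to pick which cleanup categories set #1 includes
rem (the selection is stored in the registry, per KB315246):
cleanmgr /sageset:1

rem Then schedule the stored set to run unattended, e.g. weekly on Sunday at 2 AM:
schtasks /create /tn "Disk Cleanup set 1" /tr "cleanmgr /sagerun:1" /sc weekly /d SUN /st 02:00
```

The quota approach can start as simply as per-user NTFS quotas on a volume; the user name and byte values below are placeholders:

```
rem Warn at roughly 9 GB and hard-limit at roughly 10 GB for one user on drive D:
fsutil quota modify D: 9000000000 10000000000 CONTOSO\jsmith
```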

1 Answer


I don't. None of the criteria you mention ("age of file, based on date created, last change, or last-modified dates, the size of file, naming convention of folder or filenames") is adequate for deciding whether or not a file is "valuable." For example, a script keyed to the created/last-modified date could delete Marketing's important promotional video but leave the iTunes library someone thought it would be clever to hide nearby. For similar reasons, you can't just delete all MP3 files, because Marketing might be creating MP3s for legitimate promotional purposes.

The only way to judge whether or not a file is worth retaining is for a human being to make that determination, and in the case of user files I'm not the best judge. The user is.

Push that task back on the people who are the "experts" on those files: the people who created them.

Katherine Villyard
  • Not at all correct. We know the value of files based on the folder they are located in. Many of these folders accumulate thousands of files rather quickly, so I am looking to automate their deletion. There are hundreds of these folders in various hierarchies, and not every subfolder should have its contents deleted, which is why I need rules-based processing. – Robert Kerr Jul 02 '14 at 23:00
  • @RobertKerr Then write a script to do it, and don't say we didn't warn you when it blows up in your face. – HopelessN00b Jul 02 '14 at 23:29
  • What @HopelessN00b said, more or less. If you're certain that you know the contents of these directories and no one will "hide" anything in them and cry later, write a script. However, if you're looking for a product recommendation to do this for you, that's off-topic *and* I know of no such product. – Katherine Villyard Jul 03 '14 at 01:10
  • Appreciate your efforts; I was directing this question to admins whose environments have similar circumstances. This type of admin would work in a company that processes large volumes of files per day, perhaps tens of thousands or hundreds of thousands, as I do. Some portion of those files aren't needed beyond a given time frame and can be reclaimed. And yes, I have previously written a utility to do this and am writing a new one to do all the above and more. It's smarter not to reinvent. – Robert Kerr Jul 03 '14 at 01:24
  • If I were in your position, I'd consider something that creates a directory for deleted files, moves the files based on date to that directory, and runs `rd /s /q` on it; I find that's the fastest way to get rid of large chunks (a sketch of that approach follows). It almost sounds like you want something like a Nagios event handler to kick it off, although experience would make me chicken to do that myself. With my luck, I'd nuke the CEO's iTunes library. ;) – Katherine Villyard Jul 03 '14 at 01:29
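A minimal sketch of that move-then-delete approach, using `robocopy`'s `/MOV` and `/MINAGE` switches; the paths and the 30-day threshold are placeholders:

```
rem Move files older than 30 days into a staging folder
rem (/MOV deletes them from the source; /E recurses into subfolders).
robocopy "D:\Data\DropFolder" "D:\Data\_purge" /MOV /MINAGE:30 /E

rem Then remove the staging folder in one fast operation, as suggested above.
rd /s /q "D:\Data\_purge"
```

Staging before deletion also leaves a window to spot-check what is about to disappear before the `rd` runs.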