
We have multiple file servers where I work containing terabytes and terabytes of files, and we currently have no system in place for archiving or deleting stale data. My task is to create a script that will move/delete files older than 3 years. Moving/deleting the files aside, just "getting" the files is an issue. I'm using PowerShell so far.

I'm running into files where the filename/path is longer than ~248 characters, which PowerShell doesn't seem to be able to handle. I've also looked into Robocopy, but it has no way to access the LastAccessTime file property. On top of that, PowerShell seems to be really slow. Here is the code I have been testing with:

Get-ChildItem "\\path" -Force -Recurse | 
        where {!($_.PSIsContainer) -and ($_.LastAccessTime -lt (Get-Date).AddYears(-3))} | 
        select Mode,LastAccessTime,Length,Name

My question is this: is there a faster, more efficient way to "get" terabytes worth of files recursively (including the LastAccessTime property) where some paths are longer than 248 chars?

Looking for free solutions mostly, but if there are some good paid solutions, I would be willing to check them out.

Benjamin Hubbard

4 Answers


If you're on Win2k8 or newer, you can use File Server Resource Manager to create a File Expiration task which will move files older than x to a different directory.

mfinni

The code example will be slow, as you're enumerating every single file and directory recursively before applying the filter. This is not PowerShell's fault; the underlying filesystem classes in .NET are actually very fast.

You want to use the built-in filter parameters instead of piping to Where-Object (in general, do filtering as far "left" as possible).
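For a date property like LastAccessTime, one way to keep the filtering as cheap and as early as possible is to stream the enumeration through .NET and test each file as you go, rather than letting Get-ChildItem build every object up front. A rough sketch of that idea (my own illustration, not tested against your share; the UNC path and the 3-year cutoff are placeholders, and very long paths may still trip up the underlying .NET Framework):

    # Sketch: stream files via .NET and filter inside the loop
    $cutoff = (Get-Date).AddYears(-3)
    $files  = [System.IO.Directory]::EnumerateFiles(
                  '\\server\share', '*', [System.IO.SearchOption]::AllDirectories)
    foreach ($path in $files) {
        $info = [System.IO.FileInfo]$path
        if ($info.LastAccessTime -lt $cutoff) {
            # keep only the properties you care about
            [PSCustomObject]@{
                LastAccessTime = $info.LastAccessTime
                Length         = $info.Length
                FullName       = $info.FullName
            }
        }
    }

Note that EnumerateFiles will throw and stop if it hits a directory it cannot read, so if permissions are patchy you may need to recurse directory-by-directory inside a try/catch.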

Trondh

I got bitten by PowerShell not long ago whilst doing something similar. It takes ages and ages. Luckily I had Cygwin installed and could test the corresponding manoeuvre using native *nix commands. The difference in time taken was staggering.

So speed-wise, the slowness has less to do with Windows or the filesystem per se and more to do with the execution efficiency of the language used.

Path depth/length I have not yet experimented with, but for these very time-consuming file handling tasks I now use Cygwin over PowerShell on a Windows machine. It sucks, but there it is.
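For reference, the kind of find command I mean looks roughly like this (the path and the ~3-year day count are placeholders, not from an actual run):

    # list regular files whose last access time is more than ~3 years (1095 days) ago
    find /cygdrive/d/share -type f -atime +1095

Once you are happy with what it lists, you can bolt on -exec or xargs to actually move or delete the matches.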

There are probably other, more recommendable alternatives that also make for speedier file operations than the native Windows commands. If there's one, there are many, is my thinking on that.

ErikE
  • Tried Cygwin, but it was incredibly slow for me. – Benjamin Hubbard Dec 16 '13 at 18:46
  • I'm sorry to hear that, it is obviously not the solution suited to your task. I'm filing your experience under very interesting though, as I use it next to daily just for performance reasons. Is it possible to briefly describe the test you did? – ErikE Dec 16 '13 at 19:24
  • I typed `find "./pathtofolder"` in Cygwin. Using System.IO.DirectoryInfo in PowerShell, it took about 2.8 min to finish the command. However, Cygwin took 14 minutes. Robocopy takes less than 2 minutes. Same folder for all. – Benjamin Hubbard Dec 16 '13 at 20:51

I am just going to use Robocopy and go by LastModifiedDate.
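A rough sketch of the command I have in mind (the share, the archive path and the ~3-year day count are placeholders): /MINAGE filters on the file's last-modified date, /MOV removes each source file after it has been copied, and adding /L first turns it into a dry run that only lists what would move.

    robocopy "\\server\share" "D:\Archive" /E /MOV /MINAGE:1096 /R:1 /W:1 /LOG:C:\archive-robocopy.log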

Benjamin Hubbard