We have multiple file servers at work holding terabytes and terabytes of files, and currently no system in place for archiving or deleting stale data. My task is to write a script that will move/delete files older than 3 years. Moving/deleting the files aside, just "getting" the files is an issue. I've been using PowerShell so far.
I'm running into files where the full path is longer than ~248 characters, which PowerShell doesn't seem able to handle (the classic Windows MAX_PATH limitation). I've also looked at Robocopy, but I couldn't find a way to get at the LastAccessTime property with it. On top of that, PowerShell seems really slow. Here is the code I have been testing with:
$cutoff = (Get-Date).AddYears(-3)   # compute the cutoff date once instead of per file
Get-ChildItem "\\path" -Force -Recurse |
    Where-Object { -not $_.PSIsContainer -and $_.LastAccessTime -lt $cutoff } |
    Select-Object Mode, LastAccessTime, Length, Name
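For what it's worth, a variant I've been sketching drops down to .NET's streaming enumerator instead of Get-ChildItem, since it yields one path string at a time rather than building a full FileInfo object pipeline. This is only a sketch: "\\path" is still a placeholder for the real share, and whether the \\?\UNC\ long-path prefix works here depends on the .NET Framework version in use (4.6.2+ handles it much better):

```
$cutoff = (Get-Date).AddYears(-3)

# Stream file paths one at a time; often noticeably faster than
# Get-ChildItem on very large trees.
# Caveat: with 'AllDirectories' this throws on the first access-denied
# folder, so a production version would need to recurse manually and
# catch per-directory errors.
foreach ($f in [System.IO.Directory]::EnumerateFiles("\\path", '*', 'AllDirectories')) {
    $atime = [System.IO.File]::GetLastAccessTime($f)
    if ($atime -lt $cutoff) {
        [PSCustomObject]@{ LastAccessTime = $atime; Path = $f }
    }
}
```

I haven't benchmarked this against the Get-ChildItem version at scale, and it still doesn't solve the long-path problem by itself, which is part of why I'm asking.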
My question is this: is there a faster, more efficient way to "get" terabytes worth of files recursively (including the LastAccessTime property) when some paths are longer than 248 characters?
I'm mostly looking for free solutions, but if there are some good paid ones, I'd be willing to check them out.