12

I had a user with more than 100,000 files in a single directory. The machine locks up and becomes unusable for a long time while Explorer fills up memory, and the machine can crash. Is there a simple way of dumping the directory and its contents? We tried using the command-line deltree and this did not seem much better.

DaveF

Extra -

We have had this problem several times with the same user (actually a client). A program creates hundreds of thousands of temp files during its run. If the program works correctly this does not matter, as it slowly removes them as it finishes with them. But every now and again it dies, leaving the mess of files behind. We have tried rmdir, del and similar methods; they are faster than using Explorer, but we generally end up with the same problems: the partition gets damaged, there is an unexplained locked file, or some file has a permission problem, which stops the delete from working. Normally these problems can be fixed with a reboot, but as this is a critical system we can't reboot during production time. On Unix you can just do rm -rf and it is quick, with generally no real problem. None of the command-line tools we have tried seem to work reliably.

BTW - the application is being changed, but this will not go into production for some time, and I thought this problem was probably interesting to others.

Update - because we have MKS Nutcracker on the system (without the command-line utilities), we got the programmer to knock together a simple rm. This is several orders of magnitude faster than del/deltree etc. I find it strange that there is no such simple app readily available as a standard Windows system admin tool.
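
For the curious, a minimal sketch of what such a tool can look like against the raw Win32 API - this is an illustration, not the actual MKS-based program, and it assumes paths stay under MAX_PATH:

#include <windows.h>
#include <stdio.h>
#include <string.h>

/* Recursively delete a directory tree, logging failures and carrying on
   instead of aborting the whole run. */
static void rm_rf(const char *dir)
{
    char pattern[MAX_PATH], path[MAX_PATH];
    WIN32_FIND_DATAA fd;
    HANDLE h;

    _snprintf(pattern, MAX_PATH, "%s\\*", dir);
    h = FindFirstFileA(pattern, &fd);
    if (h == INVALID_HANDLE_VALUE)
        return;

    do {
        if (strcmp(fd.cFileName, ".") == 0 || strcmp(fd.cFileName, "..") == 0)
            continue;
        _snprintf(path, MAX_PATH, "%s\\%s", dir, fd.cFileName);

        /* Clear read-only so the delete is not refused for that reason. */
        if (fd.dwFileAttributes & FILE_ATTRIBUTE_READONLY)
            SetFileAttributesA(path, FILE_ATTRIBUTE_NORMAL);

        if (fd.dwFileAttributes & FILE_ATTRIBUTE_DIRECTORY)
            rm_rf(path);            /* recurse, then remove the directory */
        else if (!DeleteFileA(path))
            fprintf(stderr, "skipped %s (error %lu)\n", path, GetLastError());
    } while (FindNextFileA(h, &fd));
    FindClose(h);

    if (!RemoveDirectoryA(dir))
        fprintf(stderr, "skipped %s (error %lu)\n", dir, GetLastError());
}

int main(int argc, char **argv)
{
    if (argc != 2) {
        fprintf(stderr, "usage: rmrf <directory>\n");
        return 1;
    }
    rm_rf(argv[1]);
    return 0;
}

The speed comes from doing nothing per file beyond the delete call itself: no shell, no Recycle Bin, no prompting, and no abort on the first locked file.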

David Allan Finch
  • 273
  • 1
  • 2
  • 11

16 Answers

15

You could use rmdir:

rmdir /s /q FOLDERNAME

Flags: /s means delete recursively, /q is the "quiet" mode.

splattne
  • 28,348
  • 19
  • 97
  • 147
  • We did try this - I will add more to the question. – David Allan Finch May 27 '09 at 10:15
  • This should have worked for you. Did it work? – s_ruchit May 27 '09 at 10:23
  • Yes, mostly - I have added the problems we have seen to the question above. Thanks for your comment. – David Allan Finch May 27 '09 at 10:27
  • I am going to accept this answer. There is a better solution, which is to use a simpler Unix-style rm, but as that is not available to most users this is the best choice for 99% of them. There should be a Unix-style rm implemented for Windows which does nothing else but recursively delete. BTW, our programmer's MKS remove_tree (see above) thrashed rmdir, so the problem is not Windows but the built-in apps. – David Allan Finch Jun 01 '09 at 09:51
6

I used to have this problem with servers I was administering. Skip Explorer altogether and use the command prompt. Navigate to the folder and run del . (or del /f /q * to skip the confirmation prompt and force deletion of read-only files).

This way you avoid the overhead of the GUI (Explorer is trash) and the Recycle Bin.

user3913
  • 69
  • 1
4

Unless you need them for legacy 16-bit applications, try to increase performance by disabling short file name (8.3) generation. This can have a significant impact on directory operations with a large number of files.

In Regedt32, set:

"HKLM\SYSTEM\CurrentControlSet\Control\FileSystem\NtfsDisable8dot3NameCreation"=1

Then use:

RMDIR /S /Q [drive:]path

to remove the directory with all its files.

Peter Stuer
  • 1,473
  • 9
  • 11
3

How about moving that temp folder to a different drive/partition? Instead of deleting all the garbage, you could format the drive (either GUI or command line). With quick format, this should be reasonably fast.

  • 1
    We tried this, and sometimes it would not allow it. The client now runs the program in a different directory every time. – David Allan Finch May 27 '09 at 12:44
  • Create a hard link between the path and the drive, i.e. have the hard link "c:\my silly app\where the files are stored" point to drive X: in a startup batch. – Nime Cloud Nov 22 '11 at 12:15
2

This is a combination of the previous posts.

open a command window:
Start -> Run -> cmd

Remove the directory
rmdir /s /q FOLDERNAME

OR

Remove a pattern of files
cd \....\directory
del *.[something]

As you've noted in your question Windows Explorer can really slow things down.

After seeing your update, I have one of my own
Check out this List of file removers

Brad Bruce
  • 616
  • 8
  • 17
  • I had noticed this with only several hundred files in the past. It is a shame that Explorer can't be made to do it as quickly as the command-line apps - then we could get the client to do it themselves. Thanks for your answer. – David Allan Finch May 27 '09 at 10:39
1

I have nothing better than del and rm, but even if they take a very long time, they should not result in partition damage. The locked files may exist because the program that created them died unexpectedly.

Sometimes, even though the files cannot be deleted, they can be moved to another temporary subdirectory. I just create a _todel subdirectory and move them there. It usually works better if I move the entire directory one level up, rather than the files themselves.

Other than that, the only way to make the deletes go faster is to defrag the hard disk. A temporary workaround may be to move this directory to a Samba share, or use an NTFS junction to a portable USB hard disk. (A small partition would be better and faster than using the entire hard disk.) You can then just eject the hard disk and do a quick format on another PC.

chayim
  • 11
  • 1
  • That is what I think - i.e. the damaged files - but why are they not closed by the NT kernel? Speed is a problem with the program, so using Samba or a USB hard disk is a non-starter. But it has been suggested that the client create a separate partition for each run and reformat it. Not sure if they have tried it yet. - Thanks – David Allan Finch May 27 '09 at 12:52
1

You could also boot into the Recovery Console, or something along those lines. That should completely bypass any strange issues within the running OS.

Wesley
  • 32,320
  • 9
  • 80
  • 116
Yannone
  • 347
  • 3
  • 9
1

A damaged partition from deleting files? That's a major problem with the disk, regardless of the number of files being deleted.

I'd try the command-line delete and remove as many files as you can until you come across one that is locked. Then use Process Explorer to find which process has that file open. Kill that process (if you can), and then run the delete again. Rinse, repeat :)
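
If this has to happen unattended on a production box, the same lookup can be scripted. A rough sketch using the Restart Manager API - note this needs Vista/Server 2008 or later plus Rstrtmgr.lib at link time; on older systems stick with Process Explorer or Sysinternals Handle:

#include <windows.h>
#include <RestartManager.h>
#include <stdio.h>

/* Print the processes that currently hold the given file open. */
int wmain(int argc, wchar_t **argv)
{
    DWORD session, reasons;
    WCHAR key[CCH_RM_SESSION_KEY + 1] = {0};
    UINT needed = 0, count = 16;
    RM_PROCESS_INFO info[16];
    LPCWSTR files[1];

    if (argc != 2) {
        fwprintf(stderr, L"usage: whoholds <file>\n");
        return 1;
    }
    files[0] = argv[1];

    if (RmStartSession(&session, 0, key) != ERROR_SUCCESS)
        return 1;
    if (RmRegisterResources(session, 1, files, 0, NULL, 0, NULL) == ERROR_SUCCESS &&
        RmGetList(session, &needed, &count, info, &reasons) == ERROR_SUCCESS) {
        UINT i;
        for (i = 0; i < count; i++)
            wprintf(L"pid %lu: %s\n", info[i].Process.dwProcessId, info[i].strAppName);
    }
    RmEndSession(session);
    return 0;
}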

DougN
  • 670
  • 2
  • 7
  • 16
1

Reboot using a 'live' Linux CD, then rm -rf the directory.

1

Kinda hacky, but you can create a script that deletes the files in chunks. I don't know the naming format, but assuming the files start with the letters a-z, you can do

deltree /y a*
deltree /y b*
...
deltree /y z*

Replace deltree with your removal method of choice. It will probably be slower, but less error-prone.

Side note: you can also try installing Cygwin to get commands like rm, which would probably continue deleting files even after it encounters an error with one of them.

Roy Rico
  • 602
  • 1
  • 7
  • 20
1

The various methods above all work. If you have Cygwin loaded on a thumb drive, you can just plug it into a local USB port, fire up the Cygwin shell and run rm as referenced above. This is a handy way to keep Unix flexibility close at hand.

user13846
  • 266
  • 1
  • 7
1

Mount the folder from a Linux machine and do the recursive delete from there.

# mount -t cifs //server/share /mnt/tobedeleted -o username=yourshareusername

# rm /mnt/tobedeleted/* -R

You could even use a cron job to delete the files at a regular interval.

steve
  • 11
  • 1
1

Why not just use robocopy and mirror an empty directory over it?

Just create an empty folder, and then use:

ROBOCOPY C:\ThisIsAnEmptyFolder C:\Users\SomeUser\Desktop\SomeFolderWithTempFiles /MIR

The /MIR switch makes the destination mirror the (empty) source, so everything in the target gets deleted.
NotoriousPyro
  • 260
  • 1
  • 5
0

If none of the other suggestions are working, have you considered speeding up your I/O?

Buy better controllers, faster disks...

Or, I wonder: if the files are temporary (as you imply) and sizes permit, could you add extra RAM to the box (cheap) and set up a RAM disk?

Best case, performance wouldn't be degraded when deleting large volumes of files. Worst case, a reboot would clear the partition.

tomfanning
  • 3,308
  • 6
  • 33
  • 34
0

Try cutting and pasting it into the Recycle Bin; that should bypass moving every single file. Not sure what it will do to emptying the bin, though, because then it probably will go over every file.

Erwin Blonk
  • 151
  • 2
  • 4
  • 14
0

Does deleting with SHIFT+DEL make any difference as to the speed?

Kevin Kuphal
  • 9,064
  • 1
  • 34
  • 41