How to delete a huge number of files on Windows

22

9

I have a directory that contains millions of sub-directories and trillions of files, and now I have to clear it out. When I say trillions, I'm not talking about file size, but the number of files.

I've tried deleting it with del /s and with Windows Explorer; neither can complete the task. I've also tried deleting some of the sub-directories one by one, and that took me days. The problem is that every time, whether I use del or Explorer, I can see in Task Manager that the explorer instance consumes sky-high amounts of memory and gradually pushes my system towards a crash.

There are still a few hundred million files to be deleted. Is there any way to do this with one (or just a few) commands/actions?


[EDITED]

I've tried doing it with Cygwin's rm -fr and got the same result. To summarize:

  1. Whether I use Windows Explorer, DEL from the command prompt, or Cygwin's rm command, available system memory gradually drops to zero and the box eventually crashes.

  2. If, at any point before the system fails, the process is stopped (by Ctrl+C or whatever else), the box continues to work as normal. However, the used memory is NOT freed. Say I stop the process when system memory usage reaches 91%: Task Manager reports 4 GB of RAM in total, Cache at 329 MB, and Available at 335 MB. The memory usage then stays around that level until I reboot the machine. If I kill the explorer instance in Task Manager, the screen goes blank with the HDD light on all the time, and it never comes back. Normally, when I kill the explorer instance in Task Manager, I can re-launch it by pressing Win+E, or it is restarted automatically.

Well, really nice memory management!


[EDIT AGAIN] It seems that some of the used memory does get freed after a long while, but not all of it. Some of the Cached and Available memory did come back in Task Manager. I haven't waited any longer, so I'm not sure what happens after that.

Jackey Cheung

Posted 2012-04-24T09:46:16.883

Reputation: 341

So your main problem is the fact that directories and subdirectories aren't being deleted? – Sandeep Bansal – 2012-04-24T10:01:13.943

@Jackey Cheung: which version of Windows are you using? – Siva Charan – 2012-04-24T10:10:00.580

The version I'm using is Windows 7 64-bit. The files/directories that were processed did get deleted. The problem is that it can't process so many files in one run, and it eventually gets stuck or crashes. – Jackey Cheung – 2012-04-24T10:27:44.460

You could write a batch script that recursively deletes files, not starting from the top level but at, e.g., the fifth level of the folder structure. That would split the job into many separate, sequential 'rm's – None – 2012-04-24T11:27:09.847

Yeah, writing a batch file could do the trick. Actually, I was considering writing a program specifically to do this, but that's off topic. – Jackey Cheung – 2012-04-25T00:59:55.910

I have to know, how the hell did you get a trillion files, really... – Moab – 2012-04-25T01:35:42.060

I'm guessing Virus? Otherwise I don't know how it's possible to get that many files in various subdirectories – Mark Kramer – 2012-04-25T04:09:55.563

It's normal to have such a number of files on our server due to the nature of its work. – Jackey Cheung – 2012-04-25T05:28:56.560

@JackeyCheung Typical NTFS allocation unit size (desktop): 4 KB. Minimum size on disk for an empty file: one MFT record (one allocation unit, 4 KB). A trillion files at 4 KB each (approx. 1,000 × 1,000,000,000 files) comes to 4 petabytes... By the way, typical server allocation units (when drives are larger) are greater than 4 KB, so with a trillion files you should be using at least 4 petabytes unless you set a smaller allocation unit size when formatting. I can see why it's slow/why it crashes. – Bob – 2012-04-26T02:47:46.710

@JackeyCheung In all seriousness, though, you're going to have some serious MFT size growth if you have a whole lot of files being created and deleted. – Bob – 2012-04-26T02:48:43.323

It might be worth booting to the Windows install DVD (or a Windows PE DVD / USB stick if you're so inclined) and trying the delete from there. – Harry Johnston – 2012-04-26T04:59:18.510

Is this an NTFS volume? (If it is a volume with a third-party file system, this would explain both how you fit so many files on one volume and why you're running out of memory.) – Harry Johnston – 2012-04-26T19:24:19.673

A trillion files needs a file table that is 1 PB. The biggest hard disks nowadays are a few TB. How did you possibly get a partition that big? – user541686 – 2013-10-15T22:01:12.973

Answers

10

Technical Explanation

The reason that most methods are causing problems is that Windows tries to enumerate the files and folders. This isn’t much of a problem with a few hundred—or even thousand—files/folders a few levels deep, but when you have trillions of files in millions of folders going dozens of levels deep, then that will definitely bog the system down.

Let’s say you have “only” 100,000,000 files, and Windows uses a simple structure like this to store each file along with its path (that way you avoid storing each directory separately, thus saving some overhead):

struct FILELIST {                   // Total size is 264 to 528 bytes:
  TCHAR         name[MAX_PATH];     // MAX_PATH=260; TCHAR=1 or 2 bytes
  FILELIST*     nextfile;           // Pointers are 4 bytes for 32-bit and 8 for 64-bit
}

Depending on whether it uses 8-bit characters or Unicode characters (it uses Unicode) and whether your system is 32-bit or 64-bit, it will need between 25GB and 49GB of memory to store the list (100,000,000 × 264 bytes ≈ 25 GB; 100,000,000 × 528 bytes ≈ 49 GB), and this is a very simplified structure.

The reason why Windows tries to enumerate the files and folders before deleting them varies depending on the method you are using to delete them, but both Explorer and the command-interpreter do it (you can see a delay when you initiate the command). You can also see the disk activity (HDD LED) flash as it reads the directory tree from the drive.

Solution

Your best bet to deal with this sort of situation is to use a delete tool that deletes the files and folders individually, one at a time. I don’t know if there are any ready-made tools to do it, but it should be possible to accomplish with a simple batch-file.

@echo off
if not [%1]==[] cd /d %1
del /q *
for /d %%i in (*) do call %0 "%%i"

What this does is to check if an argument was passed. If so, then it changes to the directory specified (you can run it without an argument to start in the current directory or specify a directory—even on a different drive to have it start there).

Next, it deletes all files in the current directory. In this mode, it should not enumerate anything and simply delete the files without sucking up much, if any, memory.

Then it enumerates the folders in the current directory and calls itself, passing each folder to it(self) to recurse downward.
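
A hypothetical invocation (delall.bat and the paths below are placeholders; since the script re-invokes itself through %0 after it has changed directories, launching it by its full path is the safer way to start it):

rem Placeholder name and paths, for illustration only. Because the script calls
rem itself via %0 after changing directories, start it with its full path so the
rem recursive "call %0" can still resolve the script:
C:\scripts\delall.bat "D:\HugeTree"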

Analysis

The reason that this should work is because it does not enumerate every single file and folder in the entire tree. It does not enumerate any files at all, and only enumerates the folders in the current directory (plus the remaining ones in the parent directories). Assuming there are only a few hundred sub-directories in any given folder, then this should not be too bad, and certainly requires much less memory than other methods that enumerate the entire tree.

You may wonder about using the /r switch instead of using (manual) recursion. That would not work because while the /r switch does recursion, it pre-enumerates the entire directory tree which is exactly what we want to avoid; we want to delete as we go without keeping track.
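
To make the distinction concrete, this is what the two switches expand over inside a batch file (echo is used instead of del so it is safe to try):

rem /d expands only the directories in the current folder:
for /d %%i in (*) do echo %%i

rem /r walks files in the current folder and every folder beneath it:
for /r %%i in (*) do echo %%i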

Comparison

Let’s compare this method to the full-enumeration method(s).

You had said that you had “millions of directories”; let’s say 100 million. If the tree is approximately balanced, and assuming an average of about 100 sub-directories per folder, then the deepest nested directory would be about four levels down—actually, there would be 101,010,100 sub-folders in the whole tree. (Amusing how 100M can break down to just 100 and 4.)

Since we are not enumerating files, we only need to keep track of at most 100 directory names per level, for a maximum of 4 × 100 = 400 directories at any given time.

Therefore the memory requirement should be roughly 400 × 528 bytes = 211,200 bytes (~206.25 KB), well within the limits of any modern (or otherwise) system.

Test

Unfortunately(?) I don’t have a system with trillions of files in millions of folders, so I am not able to test it (I believe, at last count, I had about 800K files); someone else will have to try it.

Caveat

Of course memory isn’t the only limitation. The drive will be a big bottleneck too because for every file and folder you delete, the system has to mark it as free. Thankfully, many of these disk operations will be bundled together (cached) and written out in chunks instead of individually (at least for hard-drives, not for removable media), but it will still cause quite a bit of thrashing as the system reads and writes the data.

Synetech

Posted 2012-04-24T09:46:16.883

Reputation: 63 242

I'm pretty sure this doesn't work. I've tried it. The problem lies in the FOR loop. It turned out the FOR will cause the same problem as issuing DEL directly. – Jackey Cheung – 2013-10-17T13:46:48.940

It depends on the switches you use. If you used the /r switch, then as I explained, it will try to enumerate all files. If you use the /d switch, it only enumerates the folders in the current directory, so unless you have a billion folders in the current directory, it shouldn’t cause a problem. – Synetech – 2013-10-17T16:41:14.443

7

I can't speak to the trillions of files, but I recently nuked an old file share that contained ~ 1.8M files using:

robocopy EmptyTMPFolder FolderToDelete /MIR /MT:16 /ETA /R:30 /W:5

"EmptyTMPFolder " is an empty local directory. the /MIR option will make the target look like the source (empty).

The real benefit of this approach was the retry option (/R:30). It provided an opportunity to absorb any connectivity issues that might occur during the process. Local deletes might not benefit from this approach.
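
A minimal end-to-end sketch of the same idea, with placeholder paths:

rem Placeholder paths; /MT needs a robocopy build that supports multithreading.
mkdir C:\EmptyTMPFolder
robocopy C:\EmptyTMPFolder C:\FolderToDelete /MIR /MT:16 /R:30 /W:5
rem /MIR leaves the (now empty) target root behind, so remove it and the helper:
rmdir C:\FolderToDelete
rmdir C:\EmptyTMPFolder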

I don't have specific benchmarks to compare the times, but I would prefer this over some of the other options suggested because of the retry/wait options. The deletes began almost instantly.

matt.bungard

Posted 2012-04-24T09:46:16.883

Reputation: 71

I found that this is by far the most efficient method when running the cleanup on a large network drive folder tree. Thank you for the tip. – Tonny – 2017-05-10T15:43:59.167

5

Deleting all the folders will take a long time, and there is not a whole lot you can do about it. What you can do is save your data and format your drive. It is not optimal, but it will work (and quickly).

Another option is to use a Linux distro on a live CD that can read an NTFS partition. I know from personal experience that rm -rf folderName can run for at least 2 days without crashing a system with 2GB of RAM. It will take a while, but at least it will finish.

soandos

Posted 2012-04-24T09:46:16.883

Reputation: 22 744

Hm, Linux. I'm thinking about Cygwin, though it is supposed to use the underlying Windows functions; I'm just wondering if it will make any difference in this case. I shall try it out. – Jackey Cheung – 2012-04-24T11:10:58.513

You can use Git Bash – raindrop – 2013-08-18T08:57:51.247

4

Erm.. I don't want to know how you created so many.

What's happening is that Explorer is trying to enumerate every single file and store the information in memory before it starts deleting. And there are obviously way too many.

Have you tried the command rmdir /s? As long as it actually deletes the files as they are found rather than waiting on every single one to be enumerated, it may work.
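
For example (destructive, so double-check the path; D:\HugeTree is just a placeholder):

rem /s removes the whole tree including files; /q suppresses the confirmation prompt
rmdir /s /q D:\HugeTree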

How many levels of subdirectories are there? If there's only one, or some other low number, then a quick batch file that manually recurses through might work.

Any method will take a while, though.

Bob

Posted 2012-04-24T09:46:16.883

Reputation: 51 526

Aside from soandos' suggestion of reformatting, of course. That would be fast, but if this is your system drive you will have to reinstall Windows. – Bob – 2012-04-24T10:04:35.067

I am pretty sure that enumeration has to take place, just so the program knows what to delete next. rmdir cannot delete files as they are found, as it starts from the top, and has to traverse somehow. The only question is how much excess info it stores. – soandos – 2012-04-24T10:50:21.417

@soandos Explorer counts every file. I was thinking some method that implements a DFS style of enumeration: going as far as possible down a branch, deleting when it hits a file, before coming back up. In other words, recursion, which is what rm -rf does. That works best with relatively shallow directory structures. I'm not sure if rmdir /s does this. It should. – Bob – 2012-04-24T11:00:28.573

I don't think rmdir will help in this case, since the system hasn't finished deleting the files yet. I don't think there would be much of a problem with rmdir-ing all the directories; it would take a long time, but I don't think it would crash. The problem is deleting the files. As I've mentioned, I've tried using del (command prompt) and Explorer. Both crashed after working for about half a day or so, and in both cases the explorer instance in Task Manager showed insanely high memory consumption. – Jackey Cheung – 2012-04-24T11:17:26.553

@JackeyCheung rmdir /?: /s Removes all directories and files in the specified directory in addition to the directory itself. Used to remove a directory tree. In other words, the /s flag removes files as well. How did you use del? And yeah, it might be better to just use rm -rf as soandos suggested. – Bob – 2012-04-24T11:24:21.130

rmdir doesn't delete files within directories; actually, it doesn't even delete directories that are not empty. You have to del /s before rmdir. – Jackey Cheung – 2012-04-25T01:01:26.627

@JackeyCheung: you're wrong. If you give rmdir the /s flag, it deletes files as well as directories. – Harry Johnston – 2012-04-26T04:56:29.170

4

One possible cause of an issue like this is thin provisioning, typically found in SAN environments. Some solid-state drives might exhibit the same issue. If this is the case, this configuration change might solve your problem:

fsutil behavior set DisableDeleteNotify 1

Note that this change may impact performance on solid state drives, and may prevent automatic and/or manual rethinning of SAN drives.
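
If you want to check the current setting first and put it back when the cleanup is done, the same fsutil subcommand also has a query form:

rem Show whether delete notifications (TRIM/UNMAP) are currently disabled
fsutil behavior query DisableDeleteNotify

rem Disable them for the duration of the mass delete, then re-enable afterwards
fsutil behavior set DisableDeleteNotify 1
rem ... run the delete ...
fsutil behavior set DisableDeleteNotify 0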

Harry Johnston

Posted 2012-04-24T09:46:16.883

Reputation: 5 054

3

Shift+Delete skips the Recycle Bin, and might significantly speed up things.

If that doesn't work (extreme cases), try Fast Folder Eraser and / or Mass Directory Eraser

Tamara Wijsman

Posted 2012-04-24T09:46:16.883

Reputation: 54 163

When I tried to delete a huge number of picture files, Shift+Del was indeed quite fast compared with a normal Del. Thanks! – V-SHY – 2014-10-03T17:10:25.117

3

It's probably your antivirus/antimalware consuming all the memory and then crashing the system.

Windows itself doesn't have a problem deleting huge numbers of files, although it certainly is slower than a similar operation on most non-Microsoft filesystems.

Ben Voigt

Posted 2012-04-24T09:46:16.883

Reputation: 6 052

Nice point! Definitely worth a look. – Jackey Cheung – 2012-04-25T03:34:04.223

I've turned off antivirus, and the memory still got eaten as before. – Jackey Cheung – 2012-04-25T05:46:50.843

Turning off the antivirus doesn't help free the memory after the process is stopped, either. – Jackey Cheung – 2012-04-25T07:11:19.083

@JackeyCheung: Which antivirus program is it? Some do not actually turn off completely... – Ben Voigt – 2012-04-25T13:31:38.263

2

Try this, and modify it as you need.

It is a script tested on Win2003, based on Synetech's Technical Explanation and Analysis (answered Oct 15 '13 at 15:22).

@echo off

rem ### USE FULL PATH AS FIRST ARGUMENT TO SCRIPT, DONT FORGET QUOTES !
rem ### If you move this script, fix script path variable...
SET STATICFULLSCRIPTPATH="D:\scripts\FOLDER"
SET SCRIPTNAME="DeleteFast.bat"

rem ### If CD fails or IF condition has problems,
rem ### and DEL or RMDIR runs, it's better to be in a safe place.
if not exist "%TEMP%\SAFE" mkdir "%TEMP%\SAFE"
if exist "%TEMP%\SAFE" cd /d "%TEMP%\SAFE"

rem ### Fix quote overflow
set var1="%1"
set var1=%var1:"=%

if not [%1]==[] (
    cd /d "%var1%"

    echo # KILLING F AT : "%var1%"
    rem ### uncomment to do damage! ### 
    rem # del /f/q * > nul

    for /d %%i in (*) do call "%STATICFULLSCRIPTPATH%\%SCRIPTNAME%" "%var1%\%%i"

    rem ## Finish deleting the last dir
    cd /d "%var1%\.."

echo # KILLING  DIR : "%var1%"
rem ## Remove dir.. first try
rmdir /q "%var1%"

if exist "%var1%" (
    rem ## Remove dir.. second try
    rem ## If thousands of files/dirs had permission/ownership problems, then prepare to wait a long time.
    rem ### uncomment to do damage! ### 
    rem #cmd.exe /c takeown /f "%var1%" && icacls "%var1%" /grant SOMEBODY:F

    rem ### uncomment to do damage! ### 
    rem #rmdir /s/q "%var1%"
)
)

cd /d "%STATICFULLSCRIPTPATH%"

Test run: there are folders A1 to A4, B1 to B4 and C1 to C4, nested in different ways.

Z:\>"D:\scripts\FOLDER\DeleteFast.bat" "D:\scripts\TESTF\DIRS"
# KILLING F AT : "D:\scripts\TESTF\DIRS"
# KILLING F AT : "D:\scripts\TESTF\DIRS\A1"
# KILLING F AT : "D:\scripts\TESTF\DIRS\A1\B1"
# KILLING F AT : "D:\scripts\TESTF\DIRS\A1\B1\C 1"
# KILLING  DIR : "D:\scripts\TESTF\DIRS\A1\B1\C 1"
# KILLING  DIR : "D:\scripts\TESTF\DIRS\A1\B1"
# KILLING F AT : "D:\scripts\TESTF\DIRS\A1\B2"
# KILLING F AT : "D:\scripts\TESTF\DIRS\A1\B2\C 2"
# KILLING  DIR : "D:\scripts\TESTF\DIRS\A1\B2\C 2"
# KILLING  DIR : "D:\scripts\TESTF\DIRS\A1\B2"
# KILLING  DIR : "D:\scripts\TESTF\DIRS\A1"
# KILLING F AT : "D:\scripts\TESTF\DIRS\A2"
# KILLING F AT : "D:\scripts\TESTF\DIRS\A2\B3"
# KILLING F AT : "D:\scripts\TESTF\DIRS\A2\B3\C 3"
# KILLING  DIR : "D:\scripts\TESTF\DIRS\A2\B3\C 3"
# KILLING  DIR : "D:\scripts\TESTF\DIRS\A2\B3"
# KILLING  DIR : "D:\scripts\TESTF\DIRS\A2"
# KILLING F AT : "D:\scripts\TESTF\DIRS\A3"
# KILLING F AT : "D:\scripts\TESTF\DIRS\A3\B4"
# KILLING F AT : "D:\scripts\TESTF\DIRS\A3\B4\C 4"
# KILLING  DIR : "D:\scripts\TESTF\DIRS\A3\B4\C 4"
# KILLING  DIR : "D:\scripts\TESTF\DIRS\A3\B4"
# KILLING  DIR : "D:\scripts\TESTF\DIRS\A3"
# KILLING F AT : "D:\scripts\TESTF\DIRS\A4"
# KILLING  DIR : "D:\scripts\TESTF\DIRS\A4"
# KILLING  DIR : "D:\scripts\TESTF\DIRS"

D:\scripts\FOLDER>

I can't comment (the site complains about my reputation), so I'll add my comments here.

bvj's solution creates useless temporary file lists and then iterates over them a second time to do the actual work. https://superuser.com/a/892412/528695

Synetech's original script didn't work for me. https://superuser.com/a/416469/528695

@echo off
if not [%1]==[] cd /d %1
echo "%1"
for /d %%i in (*) do call %0 "%%i"

Results..

Z:\>C:\privscripts\TESTF\DeleteFastORIGINAL.bat "C:\privscripts\TESTF\DIRS"
""C:\privscripts\TESTF\DIRS""
""A1""
""B1""
""C1""
The system cannot find the path specified.
""B2""
The system cannot find the path specified.
""A2""
The system cannot find the path specified.
""A3""
The system cannot find the path specified.
""A4""

C:\privscripts\TESTF\DIRS\A1\B1\C1>

E.O

Posted 2012-04-24T09:46:16.883

Reputation: 21

I can verify that @user4350129 is correct when he says that Synetech's script doesn't work - I had the same behaviour on my Win7x64 box. – leinad13 – 2015-12-02T11:50:43.110

Damn, my script was not perfect either: problems with a missing argument and quote overflow broke the takeown and icacls commands, and I only checked folders, not files. I fixed those issues and edited the post, but always test before you use it. – E.O – 2015-12-02T12:55:24.167

2

I had similar problems some time ago with just 10 million files, but on Server 2003. To delete the files I used an FTP server/client and left the client deleting the files and folders. It's a slow solution, but it works perfectly.

You will probably have a second problem with the MFT in NTFS, for which there is no solution. The MFT is an array that, in Win 2003 (I am not sure whether Microsoft has a solution after Win 2003), stores all the files incrementally, so with trillions of files its size will be crazy. In my case the MFT had 17 million records and was around 19 GB in size with just 45,000 files left; from tests on other systems, it looks like 1 million records makes the MFT around 1 GB.

You can check the status of the MFT with this command:

defrag C: /a /v
  • C: - drive letter
  • /a - analyze
  • /v - verbose
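
If the defrag report is not detailed enough, fsutil can also print NTFS metadata figures for the volume (the exact fields shown vary by Windows version):

rem Reports NTFS details, including the MFT valid data length
fsutil fsinfo ntfsinfo C: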

Another, trickier solution: since there is no tool that can shrink the MFT (the tools just zero out the file names and properties, nothing more), you can use VMware Converter or another kind of P2V tool and create a virtual machine based on your server; that way you fix all the problems related to the MFT. I never tested the conversion from V2P (I now work only in virtual environments), but I have seen plenty of information about it on the internet.

That Win 2003 machine is working perfectly now; the size of the MFT is 40 MB and everything is OK. If you want, I can tell you more about backups, defrags, or other tasks related to millions of tiny files.

user5286776117878

Posted 2012-04-24T09:46:16.883

Reputation: 33

2

A problem you might be running into is that the directory does not get compacted when you delete a file/folder. So if you have a folder with 1 million files in it and delete the first 500K of them, there are a ton of blocks at the beginning of your directory that are, for all intents and purposes, blank.

BUT, Explorer and the command prompt still have to look through those blocks just in case there is a file there. Something that might help is to "move" a folder from somewhere down the tree to a new folder off the root of the drive, then delete that new folder. Moving the folder only moves the pointer to it, so it should go quickly and not actually move all the files under it to new space on the drive.
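
A rough sketch of that trick with placeholder paths (both locations must be on the same volume, so the move is just a rename of the directory entry):

rem Placeholder paths; a same-volume move only rewrites the directory entry.
move "D:\very\deep\path\folder" "D:\_to_delete"
rmdir /s /q "D:\_to_delete"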

Another thing you might try is a third-party tool like "PerfectDisk" to compact folders after deleting a bunch of files.

Kelly

Posted 2012-04-24T09:46:16.883

Reputation: 21

2

Trying various approaches to delete over 10 million fusion log files, I noticed about 30K files on average could be deleted over a 10 min period. That would take about 55 hours for the 10 million files...

Using the script below, the deletion rate increased by ~75%. File lists are created and executed by concurrent processes, increasing the disk operations (but not linearly). I'm showing 4 forks, but two might suffice.

There's an option to use PowerShell which significantly reduces the time required to prepare the lists.

BTW, I tested using two direct del operations allowing for collisions, but there was no noticeable reduction in overall deletion time compared to a single del operation. And while it might not be desirable to create delete lists, the time saved was worth it.

@ECHO OFF
SETLOCAL EnableDelayedExpansion

IF /I "%~1"=="timestamp" (
    CALL :ECHOTIMESTAMP
    GOTO END
)

rem directory structure to delete
SET "DELETE=c:\_delete\Content.IE5\???<<<change this>>>???"
rem primary list of discovered files to delete
SET "LIST=delete-list.txt"
rem base path for sub-lists
SET "LISTBASE=.\delete-list"
SET "TITLE=Batch Delete Process"
rem specifies number of batch delete processes to spawn
SET FORKS=4
rem when set to 1, use PowerShell for list building and delete.  Definitely improves time to build fork sublists
SET POWERSHELL=0
rem specifies max files to delete when greater than 0
SET MAXDEL=1000000

rem prompt for confirmation
SET /P CONT=About to delete all files and directories from !DELETE!. Continue (Y/N)?
IF /I NOT "!CONT!"=="Y" EXIT /B

CALL :ECHOTIMESTAMP

ECHO Accumulating list of files to delete...
dir /b /s "!DELETE!" > "!LIST!"

FOR /F "delims=" %%c IN ('type "!LIST!" ^| find /C ":"') DO SET "COUNT=%%c"
ECHO Discovered !COUNT! files and directories to delete.

IF  %MAXDEL% GTR 0 IF !COUNT! GTR %MAXDEL% (
    SET COUNT=%MAXDEL%
    ECHO Limiting files/directories deletion count to  !COUNT!
)

CALL :ECHOTIMESTAMP
ECHO Preparing !FORKS! delete processes...
SET /A LIMIT=!COUNT!/!FORKS!

IF !POWERSHELL! EQU 1 (
    SET SKIP=0
    FOR /L %%n IN (1,1,!FORKS!) DO (
        SET "CURRENT=!LISTBASE!-%%n.txt"
        SET "LIST[%%n]=!CURRENT!"
        DEL /f /q "!CURRENT!" > nul 2>&1
        IF %%n EQU !FORKS! SET /A LIMIT+=!FORKS!
        SET CMD=type \"!LIST!\" ^| select -first !LIMIT! -skip !SKIP!
        powershell -command "& {!CMD!}" > "!CURRENT!"
        SET /A SKIP+=!LIMIT!
    )

) ELSE (
    rem significantly slower but no PowerShell.
    SET L=1
    SET N=!LIMIT!
    SET C=0
    FOR /F %%f  IN (!LIST!) DO (
        IF !C! LSS !COUNT! (
            IF !N! GEQ !LIMIT! (
                SET "CURRENT=!LISTBASE!-!L!.txt"
                SET "LIST[!L!]=!CURRENT!"
                DEL /f /q "!CURRENT!" > nul 2>&1
                SET /A L+=1
                SET /A N=0
            ) ELSE (
                SET /A N+=1
            )
            ECHO %%f >> "!CURRENT!"
        ) ELSE (
            GOTO ENDLIST
        )
        SET /A C+=1
    )
)
:ENDLIST

CALL :ECHOTIMESTAMP
ECHO Forking !FORKS! delete processes...
FOR /L %%t IN (1,1,!FORKS!) DO (

    SET "CURRENT=!LIST[%%t]!"
    IF !POWERSHELL! EQU 1 (
        SET "TAB=        "
        SET BLANK=!TAB!!TAB!!TAB!!TAB!!TAB!!TAB!!TAB!!TAB!
        SET BLANK=!BLANK!!BLANK!!BLANK!!BLANK!
        SET DEL_CMD=del -force -recurse -ea SilentlyContinue -path \"$_\"
        SET $W_CMD=$w=$Host.UI.RawUI.WindowSize.Width
        SET $S_CMD=$s=\"$_\";$i=[math]::max^(0,$s.length-$w^);$s=$s.substring^($i, $s.length-$i^);$s=\"$s !BLANK!\";$s=$s.substring^(0,[math]::min($w,$s.length^)^)
        SET ECHO_CMD=Write-Host \"`r$s\" -NoNewLine
        SET CMD=type \"!CURRENT!\" ^| %% {!DEL_CMD!; !$W_CMD!; !$S_CMD!; !ECHO_CMD!}
        SET CMD=powershell -command "^& {!CMD!}" ^& ECHO\ ^& "%~dpnx0" timestamp
        ECHO CMD !CMD!
    ) ELSE (
        SET LOOP=FOR /F %%%f IN ^(!CURRENT!^) DO
        SET OP=del "%%%f"
        SET CMD=@ECHO OFF ^&^& ^(!LOOP! !OP!  ^> nul 2^>^&1 ^)  ^& "%~dpnx0" timestamp
    )
    rem ECHO !CMD!
    START "!TITLE! %%t" cmd /k  !CMD!
)

GOTO END

:ECHOTIMESTAMP
SETLOCAL
    SET DATESTAMP=!DATE:~10,4!-!DATE:~4,2!-!DATE:~7,2!
    SET TIMESTAMP=!TIME:~0,2!-!TIME:~3,2!-!TIME:~6,2!
    ECHO !DATESTAMP: =0!-!TIMESTAMP: =0!
ENDLOCAL
GOTO :EOF

:END
ENDLOCAL
EXIT /B

bvj

Posted 2012-04-24T09:46:16.883

Reputation: 131

1

I ran into the same issue some time ago. I wrote a small utility that does exactly that: recursively delete a directory. It will not enumerate the files, and it will not consume much memory (O(n+m) at most, with n = maximum directory depth and m = maximum file/directory count in one of the subdirectories). It can handle long file paths (> 256 characters). I'd love to get feedback on whether it solves your problem.

You can find it here: https://github.com/McNetic/fdeltree (executable in the releases folder)

Nicolai Ehemann

Posted 2012-04-24T09:46:16.883

Reputation: 121

1

I found this thread while looking for a better way than I had of deleting over 3 million files on several of the servers I support. The above are way overcomplicated IMO, so I ended up using my usual method: the "FORFILES" command-line tool in Windows (this was on Server 2003).

Anyway, below is the FORFILES command I used to delete ALL files in a folder from the command line.

forfiles /P "YOUR FOLDER PATH HERE (e.g. C:\Windows\Temp)" /C "cmd /c echo @file & del /f /q @file"

The above also ECHOes the names of the files being deleted to the screen, but only because I wanted to see some progress that it was actually doing something; if you don't echo something, it just looks like the DOS box has hung, even though it's doing the work as expected.

It does take a little while to get going, i.e., it looks like it's doing nothing for a while (about 30 minutes for ~3 million files), but eventually you should see the file names start to appear as they are deleted. This method also takes a long time to delete the files (deletion time might be reduced without the echo?), but it eventually works without crashing the machine; on my server, forfiles was using ~1,850 KB of memory during the deletion process...
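
If the echo does turn out to be the bottleneck, a quieter variant (untested on my side) simply drops it:

forfiles /P "YOUR FOLDER PATH HERE (e.g. C:\Windows\Temp)" /C "cmd /c del /f /q @file"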

The duration of the deletion can cause an issue if your servers have auto-logoff, as you need to keep the session alive (I'd recommend running as the console user, or via a third-party tool such as LanDesk or SCCM, etc., or MouseJiggle.exe).

Anyway, thought I'd share my answer, good luck all!

RatMonkey

Posted 2012-04-24T09:46:16.883

Reputation: 11

1

Per this answer on Stack Overflow, use a combination of del and rmdir:

del /f/s/q foldername > nul
rmdir /s/q foldername

Geoff

Posted 2012-04-24T09:46:16.883

Reputation: 2 335

Yeah, I've read that before. Actually, it doesn't help in my case, since the del command definitely crashes. – Jackey Cheung – 2012-04-25T01:02:52.027

1

Since deleting the files all at once uses too much memory, you need a way to delete them one at a time, but with the process automated. This sort of thing is a lot easier to do in a Unix-style shell, so let's use Cygwin. The following command generates a list of ordinary files, transforms that list into a sequence of rm commands, then feeds the resulting script to a shell.

 find dir \! -type d | sed 's/^/rm /' | sh

The script is being executed even as it is being generated, and there are no loops, so the shell does not (hopefully) have to create any big temp files. It will certainly take a while, since the script is millions of lines long. You might have to tweak the rm command (perhaps I should have used -f, but you understand your files better than I do) to get it to work.

Now you have nothing left but directories. Here's where things get dicey. Maybe you've deleted enough files that you can do rm -rf without running out of memory (and it will probably be faster than another script). If not, we can adapt this Stack Overflow answer:

 find dir | perl -lne 'print tr:/::, " $_"' | sort -n | cut -d' ' -f2 | sed 's/^/rmdir /' | sh

Again, tweaking may be necessary, this time with sort, to avoid creating huge temp files.

Isaac Rabinovitch

Posted 2012-04-24T09:46:16.883

Reputation: 2 645