Is there a Windows command-line utility to list the largest files exceeding a specific size in sub-directories?

17

10

I'd like to be able to find the full paths of files in a directory tree that exceed a specific size (say 10 MB).

I'm currently aware of Microsoft's Diruse (part of the Windows XP Service Pack 2 Support Tools), which does what I want, except that it only lists directory sizes rather than files.

Umber Ferrule

Posted 2009-11-02T15:56:54.100

Reputation: 3 149

Answers

29

forfiles /P D:\ /M *.* /S /D +"01/17/2012"  /C "cmd /c if @fsize gtr 209715200 echo @path @fsize @fdate @ftime"

will scan D:\ and its sub-directories, look for all files whose last-modified date is on or after 17-Jan-2012 and whose size is greater than 200 MB (209,715,200 bytes), then print their details.

forfiles is included with some Windows Server versions, but not by default on Windows XP. You can extract it from the "Windows Server 2003 Resource Kit" download at http://www.microsoft.com/download/en/details.aspx?id=17657 (although it says it is for Windows Server, it runs on Windows XP without problems).
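
Adapted to the original question's 10 MB threshold (10,485,760 bytes) and without the date filter, a minimal sketch along the same lines (D:\ is just a placeholder for the tree you want to scan):

forfiles /P D:\ /S /M *.* /C "cmd /c if @fsize gtr 10485760 echo @path @fsize"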

learnScrapy

Posted 2009-11-02T15:56:54.100

Reputation: 406

Superb - this did what I wanted (as soon as I fixed the date for the UK 17/01/2012). Thanks. – Umber Ferrule – 2012-01-23T23:02:28.887

7

This sounds like a job for PowerShell's

get-childitem

Navigate to the directory in question, then check the available properties with:

get-childitem | get-member

Length and FullName look interesting; for example:

get-childitem | ft fullname, length -auto

Once you have mastered the basics, try filtering with a Where-Object statement.

get-childitem | where-object {$_.length -gt 10000} | ft fullname, length -auto

Experiment with other thresholds, such as 100000.
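
Putting those pieces together for the original question (recursive search, 10 MB threshold), a minimal sketch, assuming your PowerShell version understands the MB size suffix:

get-childitem -recurse | where-object {$_.length -gt 10MB} | ft fullname, length -auto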

Guy Thomas

Posted 2009-11-02T15:56:54.100

Reputation: 3 160

Use Get-ChildItem -Recurse to search recursively. – themadmax – 2017-03-13T10:19:17.437

3

UnxUtils, a Windows port of the Linux utilities, includes the Linux find command.

You should rename find.exe to something else, for example xfind.exe, because find clashes with the find command Windows already provides. You can then find all files larger than 1000000 bytes with (the c suffix makes find count bytes rather than 512-byte blocks):

xfind directory -size +1000000c -print

The documentation for the GNU find command describes the available options, but I do not know exactly how it was implemented in UnxUtils or which version of find it corresponds to.
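
Adapted to the question's 10 MB threshold, a rough sketch, assuming the UnxUtils build accepts Windows-style paths and the usual GNU size suffixes (C:\Data is just a placeholder directory; 10485760 bytes = 10 MB):

xfind C:\Data -type f -size +10485760c -print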

harrymc

Posted 2009-11-02T15:56:54.100

Reputation: 306 093

I believe GnuWin32 is more up-to-date: http://gnuwin32.sourceforge.net/ ... plus there's always Cygwin: http://www.cygwin.com/ – quack quixote – 2009-11-03T15:07:38.533

@quack: You're right; it's just that the FileUtils package in GnuWin32 seems quite complex to install, requiring quite a few files. – harrymc – 2009-11-03T15:16:30.387

Fair enough. I'm a Cygwin user, personally, and I tend to prefer up-to-date over simple-to-install. But thanks for explaining your reasoning. – quack quixote – 2009-11-03T21:39:05.107

2

Take Command Console LE (which I have ended up recommending a lot recently), a free replacement for cmd.exe with a lot of extra features, has a command for this: PDIR

pdir /s /(fpn z) /[s10485760,]
  • /s means recurse into sub-directories; run the command from the directory you want to search.
  • /(fpn z) is the format for displaying the results, here: full path and file name, followed by the size.
  • /[s10485760,] means size = 10 MB (10,485,760 bytes) or bigger.

Snark

Posted 2009-11-02T15:56:54.100

Reputation: 30 147

1

I've just happened upon the command-line tool Disksum, which seems similar to diruse, but gives two forms of output:

  • sorted by file counts per directory (ascending)
  • sorted by directory size (ascending)

Umber Ferrule

Posted 2009-11-02T15:56:54.100

Reputation: 3 149

1

I believe this solution matches the description more accurately:

Download the command-line executable sfk.exe from http://sourceforge.net/projects/swissfileknife/files/

Use it like this:

sfk.exe list -big

Example output (abbreviated):

[listing 50 of 78 files by size:]
        3951 mens\noname_30.mht
        3996 mens\noname_14.mht
        3996 mens\noname_25.mht
        4060 mens\noname_24.mht
        4263 mens\noname_31.mht
        4701 mens\noname_1.mht
       14568 Thumbnail Restore.zip
       45056 netmeter.exe
     [...]
     12337752 rktools.exe
     16826024 sp35378.exe
     16926496 jre-6u30-windows-i586.exe
     19480227 SugarCE-6.2.4.zip
     21073936 vlc-1.1.11-win32.exe
     22083184 EasyPHP-5.3.8.1-setup.exe
     25517642 MPSOFTWARE.phpDesigner.v8.0.0.145-CRD.rar
     31085033 phpdesigner8usb.zip
     48835640 netbeans 7.exe
     58900704 ZendServer-CE-php-5.2.17-5.6.0-Windows_x86.exe    
     491538432 53400105.iso

If you only want the 10 biggest files, use:

sfk.exe list -big=10

You can customize it further by following the instructions at http://stahlworks.com/dev/index.php?tool=list

vicenteherrera

Posted 2009-11-02T15:56:54.100

Reputation: 185

1

The find command from the Cygwin utilities does this. For your requirement

find full paths of files in a directory tree that exceed a specific size (say 10MB).

this gives the result:

find -size +10M -type f -printf "%p %s\n"

-size +10M gives you "objects" bigger than 10 megabytes

-type f gives you files only

-printf prints the found files, %p is path, %s is size (in bytes) and \n is the newline.
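
As a concrete starting point, a sketch assuming the tree you want to scan is C:\Data (a placeholder); under Cygwin, Windows drives are exposed below /cygdrive:

find /cygdrive/c/Data -type f -size +10M -printf "%p %s\n"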

Gerd Klima

Posted 2009-11-02T15:56:54.100

Reputation: 912

-2

I know that the question is about the command line, but this question keeps coming up in Google, so I'm adding another simple way: via Explorer.

  1. Open the location (a disk or a folder) where you want to look for large files in Windows Explorer.
  2. In the top-right search box, type "size:gigantic" (the box will auto-suggest the syntax and other possible options).
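
As an aside (my own assumption, not something the original answer states): on recent Windows versions the same search box also accepts an explicit threshold via Advanced Query Syntax, which matches the question's 10 MB requirement more directly:

size:>10MB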

jitbit

Posted 2009-11-02T15:56:54.100

Reputation: 360

This is a good and valid reply. It shouldn't be downvoted. – zar – 2018-07-16T18:42:31.653

The question was looking for command-line answers. – john – 2014-01-06T12:36:00.147