Is it bad if millions of files are stored in one NTFS folder?


I have a very large NTFS volume with more than 20 TB of capacity that contains millions of files. If I put all of the files in the same folder, I know performance is bad when I open the folder in Windows Explorer.

Is the performance still bad if I open one of the files directly in my program?

flypen


Answers


If you are opening the file directly, it doesn't matter how many files you have in there. But if you are using TAB auto-completion to access files faster, it will definitely affect performance.
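To illustrate the first point: opening by full path is a single lookup in the folder's index, with no enumeration of its contents. A minimal sketch in Python (the path and filename here are made-up examples, not anything from the question):

```python
import os

# Hypothetical path to one file inside a folder holding millions of entries.
path = r"D:\bigfolder\file_0123456.dat"

# Opening by full path is a direct lookup in the NTFS directory index;
# it does not enumerate the folder, so it stays fast regardless of file count.
with open(path, "rb") as f:
    header = f.read(64)

# By contrast, anything that lists the folder (Explorer, TAB completion,
# os.listdir, globbing) must walk all entries and will be slow here:
# names = os.listdir(r"D:\bigfolder")   # avoid this on huge folders
```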

I found some clues here: https://stackoverflow.com/questions/197162/ntfs-performance-and-large-volumes-of-files-and-directories

To answer your question more directly: If you're looking at 100K entries, no worries. Go knock yourself out. If you're looking at tens of millions of entries, then either:

a) Make plans to sub-divide them into sub-folders (e.g., let's say you have 100M files; it's better to store them in 1,000 folders, with only 100,000 files per folder, than in 1 big folder. This creates 1,000 folder indices instead of a single big one that's more likely to hit the max # of fragments limit; a sketch of one such scheme follows this list), or

b) Make plans to run contig.exe on a regular basis to keep your big folder's index defragmented.
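Option (a) is easy to make deterministic: derive the subfolder from a stable hash of the filename, so the writer and any later reader compute the same location without consulting any lookup table. A minimal sketch, where ROOT, NUM_SHARDS, and the example filename are assumptions for illustration:

```python
import hashlib
import os

ROOT = r"D:\bigfolder"   # hypothetical storage root
NUM_SHARDS = 1000        # ~100,000 files per folder at 100M files total

def shard_path(filename: str) -> str:
    """Map a filename to one of NUM_SHARDS subfolders via a stable hash,
    so no single directory index ever has to hold millions of entries."""
    digest = hashlib.md5(filename.encode("utf-8")).hexdigest()
    shard = int(digest, 16) % NUM_SHARDS
    return os.path.join(ROOT, f"{shard:03d}", filename)

# Writer and reader both derive the same subfolder from the name alone.
p = shard_path("file_0123456.dat")
os.makedirs(os.path.dirname(p), exist_ok=True)
with open(p, "wb") as f:
    f.write(b"payload")
```

A cryptographic hash (rather than Python's built-in hash()) is used because it is stable across processes and runs, which matters when different programs need to agree on where a file lives.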

Read below only if you're bored.

The actual limit isn't on the # of fragments, but on the number of records of the data segment that stores the pointers to the fragments.

So what you have is a data segment that stores pointers to the fragments of the directory data. The directory data stores information about the sub-directories & sub-files that the directory supposedly stores. Actually, a directory doesn't "store" anything. It's just a tracking and presentation feature that presents the illusion of hierarchy to the user, since the storage medium itself is linear.

mnmnc
