I know that, technically, an individual Windows XP directory can hold an immense number of files (over 4.29 billion, according to a quick Google search).
However, is there a practical ceiling where too many files in one directory start to have an impact on reads of those files?
If so, what factors would exacerbate or mitigate the issue?
I ask because my employer has several hundred XP machines in the field at client sites, and the performance on some of the older ones is getting "sludgy."
The machines download and display client-defined images, and my supervisor and I suspect that our slacktastic approach to cache management could be to blame, as some of the machines have tens of thousands of images on them. I'm trying to gather evidence to support or contest that theory before spending time on a coding fix.
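For concreteness, here is a rough sketch of the kind of measurement that could settle it. It's Python purely for brevity (the real application may well be written in something else), and the cache path is a placeholder, not our actual layout:

    import os
    import time

    # Placeholder path; point this at the machine's real image cache directory.
    CACHE_DIR = r"C:\ImageCache"

    def main():
        start = time.time()
        names = os.listdir(CACHE_DIR)       # enumerate every entry in the folder
        enum_seconds = time.time() - start

        # Pick one actual file and time a plain open + read of it.
        target = None
        for name in names:
            candidate = os.path.join(CACHE_DIR, name)
            if os.path.isfile(candidate):
                target = candidate
                break

        open_seconds = None
        if target is not None:
            start = time.time()
            with open(target, "rb") as handle:
                handle.read()
            open_seconds = time.time() - start

        print("Entries in cache folder: %d" % len(names))
        print("Enumeration took %.3f s" % enum_seconds)
        if open_seconds is not None:
            print("Opening and reading one file took %.3f s" % open_seconds)

    main()

Run on one of the sludgy machines and on a healthy one, the difference (or lack of one) should show whether folder size is even part of the picture.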
There is a limit, just over 4.29 billion files per NTFS volume, but the more practical concern is the generation of DOS-compatible 8.3 short names, which is known to slow a folder down once it holds on the order of 300,000 files. Short-name creation can be switched off with: fsutil behavior set disable8dot3 1 – user1686 – 2010-03-31T18:32:46.317
Yeah, I know there's a limit. Thanks for confirming my brief research that it's just over 4.29 billion files. Do you know if the DOS-compatible short names thing is an issue down in the range of tens of thousands of files, rather than 300,000 files? – BlairHippo – 2010-03-31T19:12:12.610
I don't think there is any issue with tens of thousands of files in a folder. Nothing related to this has been found, my friend ;) – r0ca – 2010-03-31T19:29:37.493
@Blair: Tens of thousands are not that many files. Also, accessing them isn't a bottleneck in any case; if anything, enumerating them would be, but not at those counts, usually. – Joey – 2010-04-01T08:00:05.363
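If the cache does turn out to be the culprit, the coding fix the question alludes to could be as small as capping the folder and deleting the oldest images first. A minimal sketch, again with a hypothetical cache path and an arbitrary cap rather than anything taken from the real deployment:

    import os

    # Both values are assumptions for the sketch, not taken from the real deployment.
    CACHE_DIR = r"C:\ImageCache"
    MAX_FILES = 5000

    def prune_cache(path, max_files):
        """Delete the oldest files (by modification time) until at most max_files remain."""
        files = [os.path.join(path, name) for name in os.listdir(path)]
        files = [f for f in files if os.path.isfile(f)]
        excess = len(files) - max_files
        if excess <= 0:
            return 0
        files.sort(key=os.path.getmtime)    # oldest first
        removed = 0
        for victim in files[:excess]:
            try:
                os.remove(victim)
                removed += 1
            except OSError:
                pass                        # in use or already gone; skip it
        return removed

    print("Pruned %d files" % prune_cache(CACHE_DIR, MAX_FILES))

Sorting by modification time keeps the most recently fetched images around, which is probably the right bias for a display cache; the call could be hooked into the download path or a scheduled task.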