I'll answer my own question: yes, it's definitely slower.
I wrote a C# console application
that creates many empty files in a folder and then accesses them at random.
Here are the results:
10 files in a folder: ~26,000 operations/sec
1,000,000 files in a folder: ~6,000 operations/sec
Here is the source code:
// Requires: using System; using System.Collections.Generic;
//           using System.Diagnostics; using System.IO;
List<string> files = new List<string>();
Console.WriteLine("creating files...");
for (int i = 0; i < 1000 * 1000; i++)
{
    string filename = @"C:\test\" + Guid.NewGuid().ToString();
    File.Create(filename).Dispose(); // create an empty file and release the handle immediately
    files.Add(filename);
}

Console.WriteLine("benchmark...");
Random r = new Random();
Stopwatch sw = Stopwatch.StartNew();
int count = 0;
while (sw.ElapsedMilliseconds < 5000)
{
    // the files are empty, so this mostly measures name lookup and open/close cost
    string filename = files[r.Next(files.Count)];
    string text = File.ReadAllText(filename);
    count++;
}
Console.WriteLine("{0} operations/sec", count / 5); // 5-second run
Duplicate of http://superuser.com/questions/6382/how-many-files-can-you-put-in-a-windows-folder and http://superuser.com/questions/453348/is-it-bad-if-millions-of-files-are-stored-in-one-ntfs-folder – James P – 2013-07-25T08:27:45.917

Yes. MSDN advises not keeping more than 20k files in a single directory. On Windows Vista with 2 GB RAM, and in my experience on Windows 7 with 4 GB RAM, things grind to a halt once a folder goes past about 40k files: everything just hangs and stops working. But having 100k subdirectories does not affect speed at all :) – Piotr Kula – 2013-07-25T08:29:42.913
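Since subdirectories don't seem to hurt, a common workaround is to fan the GUID-named files out into a shallow subdirectory tree keyed by the first characters of the name, instead of one flat folder. A minimal sketch of that idea (the `ShardedPath` helper and the two-character fan-out are my own illustration, not part of the benchmark above):

```csharp
using System;
using System.IO;

static class FileSharding
{
    // Map a file name into a two-level subdirectory tree using its first
    // four characters, e.g. "abcd1234-..." -> "<root>\ab\cd\abcd1234-...".
    // With hex GUID names this caps each directory at 256 children per level.
    public static string ShardedPath(string root, string name)
    {
        string level1 = name.Substring(0, 2);
        string level2 = name.Substring(2, 2);
        return Path.Combine(root, level1, level2, name);
    }
}

class Demo
{
    static void Main()
    {
        string name = Guid.NewGuid().ToString();
        string path = FileSharding.ShardedPath(@"C:\test", name);

        // Create the bucket directory on demand before writing the file.
        Directory.CreateDirectory(Path.GetDirectoryName(path));
        File.Create(path).Dispose();

        Console.WriteLine(path);
    }
}
```

Reads then rebuild the same path from the name, so no index is needed; each directory stays far below the ~20k files the comment above warns about.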