This may have been asked but...
I'm trying to profile the impact of file fragmentation on one of our business applications. What's the easiest and cleanest way to cause a 20GB file to fragment into roughly 35,000 pieces?
Ideally the fragments would be distributed randomly across the disk and/or the process would be directly reproducible, but those are secondary concerns. It wouldn't have to be exactly that number of fragments either; anywhere within 2-3k of the target would be fine.
Naturally I wouldn't be working with live data, but I'd like to avoid anything which could cause wider file system problems (this would be on a production server during off-hours).
If there's no elegant way of doing this, I'm happy to explore alternatives, e.g. creating thousands of files from multiple simultaneous processes, using Perl or a C# console program.
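
For concreteness, here's a rough, untested sketch of a single-process twist on that C# idea: alternate between appending a chunk to the target file and dropping a small spacer file next to it, then delete the spacers afterwards so the gaps remain between the target's extents. The paths and sizes below are placeholders I made up, and whether NTFS actually interleaves the allocations this neatly depends on the free-space layout, so I'd verify the resulting fragment count afterwards (e.g. with Sysinternals Contig) rather than trusting the chunk count.

```csharp
using System;
using System.IO;

class Fragmenter
{
    // Rough targets: 20 GB split into ~600 KB chunks gives
    // about 35,000 writes (20 GB / 600 KB ≈ 35,000).
    const long TargetSize = 20L * 1024 * 1024 * 1024; // 20 GB
    const int  ChunkSize  = 600 * 1024;               // ~600 KB per intended extent
    const int  SpacerSize = 64 * 1024;                // small filler between chunks

    static void Main(string[] args)
    {
        string dir        = args.Length > 0 ? args[0] : @"D:\fragtest"; // placeholder path
        string targetPath = Path.Combine(dir, "bigfile.dat");
        Directory.CreateDirectory(dir);

        byte[] chunk  = new byte[ChunkSize];
        byte[] spacer = new byte[SpacerSize];
        long written  = 0;
        int  i        = 0;

        using (FileStream target = new FileStream(targetPath, FileMode.Create,
                                                  FileAccess.Write, FileShare.None))
        {
            while (written < TargetSize)
            {
                // Append one chunk to the target and flush, to encourage NTFS
                // to allocate its clusters before the spacer goes down.
                target.Write(chunk, 0, chunk.Length);
                target.Flush();
                written += chunk.Length;

                // Drop a small spacer file so the next chunk of the target
                // (hopefully) can't be allocated contiguously with this one.
                string spacerPath = Path.Combine(dir, "spacer_" + i + ".tmp");
                using (FileStream s = new FileStream(spacerPath, FileMode.Create))
                {
                    s.Write(spacer, 0, spacer.Length);
                    s.Flush();
                }
                i++;
            }
        }

        // Deleting the ~35,000 spacers (~2.2 GB) leaves free-space holes
        // between the target file's extents.
        foreach (string f in Directory.GetFiles(dir, "spacer_*.tmp"))
            File.Delete(f);

        Console.WriteLine("Wrote {0} bytes in {1} chunks.", written, i);
    }
}
```

If the chunks don't map one-to-one to extents in practice, the ChunkSize and SpacerSize constants could presumably be tuned until the measured fragment count lands in the right range.
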
Platform is Windows Server 2003. I'm not sure of the configuration of the physical disks, but the disk space is split between:
- 12GB OS partition
- 35GB page file partition
- 500GB "data" partition (this has the file[s] I'm interested in).