I have a program that is very heavily hitting the file system, reading and writing randomly to a set of working files. The files total several gigabytes in size, but I can spare the RAM to keep them all mostly in memory. The machines this program runs on are typically Ubuntu Linux boxes.
Is there a way to configure the file system to have a very large cache, and even to cache writes so they hit the disk later? I understand the risks around power loss and the like, and am prepared to accept them. Crashing aside, in normal operation the writes should eventually reach the disk!
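For context on the question above: Linux already uses all free RAM as page cache, so the read side largely takes care of itself; the write-back side is governed by the `vm.dirty_*` sysctls. A minimal sketch of loosening those knobs (the values are illustrative, not recommendations, and the commands require root):

```shell
# Allow dirty (unwritten) pages to grow to 80% of RAM before the
# kernel forces processes into synchronous writeback.
sysctl -w vm.dirty_ratio=80

# Start background writeback only once 50% of RAM is dirty.
sysctl -w vm.dirty_background_ratio=50

# Keep dirty pages in memory for up to 10 minutes before they are
# considered old enough to flush (value is in centiseconds).
sysctl -w vm.dirty_expire_centisecs=60000

# Wake the writeback threads every 30 seconds.
sysctl -w vm.dirty_writeback_centisecs=3000
```

The same keys can be placed in /etc/sysctl.conf to persist across reboots.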
Or is there a way to create a RAM disk that writes-through to real disk?
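One common approximation of the RAM disk idea is a tmpfs mount synced back to disk periodically with rsync. A sketch, assuming hypothetical paths /mnt/ramwork (RAM copy) and /data/work (on-disk copy):

```shell
# Create an 8 GB RAM-backed filesystem. Note tmpfs pages can be
# swapped out under memory pressure, unlike a classic ramdisk.
mkdir -p /mnt/ramwork
mount -t tmpfs -o size=8g tmpfs /mnt/ramwork

# Seed it from the on-disk copy.
rsync -a /data/work/ /mnt/ramwork/

# Periodically flush changes back to the real disk, e.g. via cron:
# */5 * * * * rsync -a --delete /mnt/ramwork/ /data/work/
```

Strictly speaking this is sync-behind rather than write-through: a crash loses up to one sync interval of changes.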
hmm ... last answer before mine was 3 months ago, OP hasn't been seen since posting the question, no other obvious activity, yet it appeared on the front page ... I guess the system is trying to get answers to questions – Anon – 2010-05-20T21:01:15.257
That is precisely what the Community user account is for - bumping unanswered questions. http://stackoverflow.com/users/-1/community – Corey – 2010-05-20T22:41:52.013
:shrug: the OP is long gone, and nobody else seems to care – Anon – 2010-05-21T13:19:26.153
I do care, and I'm back; it will take me a while to digest your question and answer it fairly, please be patient – Will – 2010-05-27T13:07:34.523