In Windows 8.1, is there a way to ensure a process is not the first to get killed when running out of RAM?

18

I wrote a .NET 4.5 application that buffers colour, infrared, and depth data from a Kinect v2, performs some processing on it, and then dumps it to disk in uncompressed form; the .NET application also starts ffmpeg as a subprocess and pipes colour data to it to be encoded as H.264.

Because I'm not using an SSD, the video data arrives faster than I can write it to disk. But that's OK: it's acceptable for me to discard video frames when I'm low on RAM. My only requirement is that whatever I keep be mostly contiguous 8- to 10-second chunks of video. So I have added some logic to my .NET 4.5 application to start discarding video frames when I don't have enough RAM to buffer a contiguous 8 to 10 seconds of video (roughly 1.5 to 2 GB).
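
In outline, the discard check is just a low-water-mark test on available physical memory, something like the following sketch ("Available MBytes" is a standard Windows performance counter; the class name and the exact threshold here are illustrative, not my production code):

    using System.Diagnostics;

    class FrameGate
    {
        // "Available MBytes" is a built-in Windows memory counter.
        static readonly PerformanceCounter AvailableMb =
            new PerformanceCounter("Memory", "Available MBytes");

        // Illustrative: keep ~2 GB free so a contiguous 8-10 s chunk
        // of raw frames can still be buffered.
        const float ThresholdMb = 2048f;

        public static bool ShouldBufferFrame()
        {
            return AvailableMb.NextValue() > ThresholdMb;
        }
    }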

And, to prevent page thrashing, I have completely disabled the paging file. This leaves me with a total of 16 GB of physical RAM.

My problem is that even with that mechanism in place, sometimes my .NET application or the ffmpeg subprocess still gets killed when Windows 8.1 freaks out about low RAM, because obviously my application is using the most RAM when it has a huge backlog of video data to write to disk. Is there a way to tell Windows that my processes are more important than others, so that Windows would start killing other, less important processes first?

Kal

Posted 2014-12-18T03:02:19.440

Reputation: 533

I didn't think Windows killed processes; I thought that was a Linux-only feature. – Scott Chamberlain – 2014-12-18T03:58:07.027

Can you try to limit the RAM used by your app so that it always leaves a couple of GB spare? It seems you are not able to discard frames as fast as the data is piling up. – A-b – 2014-12-18T11:15:15.500

@ScottChamberlain: That's because turning off the paging file on Windows is very rare. It gets you all kinds of unexpected and unusual behavior. The obvious answer here is "don't turn off the paging file; that forces Windows to keep unused data in RAM so your app can't use that RAM" – MSalters – 2014-12-18T13:44:46.957

If this were a Stack Overflow question, I could point you to CreateMemoryResourceNotification, which is a lot less hacky. – MSalters – 2014-12-18T13:49:55.423
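
That API hands back a waitable handle that the kernel signals when available memory runs low. A minimal P/Invoke sketch of the idea (the signatures follow the Win32 documentation; the wrapper class is illustrative):

    using System;
    using System.Runtime.InteropServices;

    static class LowMemoryWatch
    {
        // NotificationType: 0 = LowMemoryResourceNotification,
        //                   1 = HighMemoryResourceNotification
        [DllImport("kernel32.dll", SetLastError = true)]
        static extern IntPtr CreateMemoryResourceNotification(int notificationType);

        [DllImport("kernel32.dll", SetLastError = true)]
        static extern bool QueryMemoryResourceNotification(IntPtr handle, out bool state);

        static readonly IntPtr Handle = CreateMemoryResourceNotification(0);

        // True when Windows reports that available physical memory is low.
        public static bool MemoryIsLow()
        {
            bool low;
            return QueryMemoryResourceNotification(Handle, out low) && low;
        }
    }

The same handle can also be wrapped in a WaitHandle so a buffering thread blocks until the low-memory condition fires, instead of polling.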

If the bottleneck is writing to disk, then fix that. Uncompressed data is ridiculous. Use something like Lagarith http://lags.leetcode.net/codec.html to compress before writing to disk. It's lossless, but very fast.

– longneck – 2014-12-18T14:32:07.560

@longneck I have considered the compression option. However, the machine only has a dual-core Core i3, and ffmpeg is already struggling to keep up with encoding H.264 at 30 fps in real time. I think my best bet at the moment is to increase the padding. – Kal – 2014-12-18T14:39:45.540

Do you have to encode in realtime? – longneck – 2014-12-18T14:43:30.137

The H.264 encoding has to happen in real time because it runs 24/7. My estimate is that it will produce about 95 GB for every 24 hours. The size would be ridiculous in raw uncompressed form. In comparison, I'm only keeping raw uncompressed frames for select 8- to 10-second periods - when the Kinect detects a body - so the size is more manageable. – Kal – 2014-12-18T14:56:40.393

@Kal: If disk access is a bottleneck, use a stronger compression; if CPU is a bottleneck, use a faster compression. If both are a bottleneck, rethink your entire design and start over, or get better hardware. – Mooing Duck – 2014-12-18T20:26:42.903

You disabled the page file. What did you think was going to happen? – Factor Mystic – 2014-12-19T03:09:57.663

It's pretty insane to use .NET for ANY real-time application, but most of all for video processing. It is likely you would get "acceptable" performance by doing the entire pipeline in OpenCL, including a compression step. – Aron – 2014-12-19T03:10:05.187

@FactorMystic OMG, he did what? Disabling the page file is going to reduce your usable RAM significantly. – Aron – 2014-12-19T03:12:17.160

Disabling the page file is actually not as bad as you might imagine. The computer uses about 2 GB out of 16 GB when sitting idle. I find that reduction in usable RAM acceptable, and it's much better than page thrashing for my purposes. – Kal – 2014-12-19T03:27:19.497

Can you lower the resolution of your input/output? This could lead to much lower memory consumption and faster encoding performance for your Core i3 processor. – mordack550 – 2014-12-19T09:51:25.567

@Kal Beware what you mean by "uses". If you're referring to the working set/resident memory, which is represented as "used" in the Windows Task Manager, then note that that does not accurately represent available memory with a page file disabled. Windows will refuse to allocate ("commit") memory when there isn't enough virtual memory (physical + pagefile) available. Many applications will commit far more memory than they ever use, which has to be reserved and unused in physical RAM when the page file is disabled. If you have 2 GB "used", you could have > 4 GB "committed"/unavailable. – Bob – 2014-12-21T23:39:54.283
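
To see the gap Bob describes, the commit figures can be read directly rather than inferred from Task Manager. A sketch using GlobalMemoryStatusEx (struct layout per the Win32 docs; with the page file disabled, ullAvailPageFile is the remaining commit, not pagefile space):

    using System;
    using System.Runtime.InteropServices;

    static class CommitInfo
    {
        [StructLayout(LayoutKind.Sequential)]
        struct MEMORYSTATUSEX
        {
            public uint dwLength;
            public uint dwMemoryLoad;
            public ulong ullTotalPhys;
            public ulong ullAvailPhys;
            public ulong ullTotalPageFile;   // commit limit (~= RAM with no pagefile)
            public ulong ullAvailPageFile;   // remaining commit
            public ulong ullTotalVirtual;
            public ulong ullAvailVirtual;
            public ulong ullAvailExtendedVirtual;
        }

        [DllImport("kernel32.dll", SetLastError = true)]
        static extern bool GlobalMemoryStatusEx(ref MEMORYSTATUSEX buffer);

        public static void Print()
        {
            var m = new MEMORYSTATUSEX
            {
                dwLength = (uint)Marshal.SizeOf(typeof(MEMORYSTATUSEX))
            };
            if (GlobalMemoryStatusEx(ref m))
                Console.WriteLine("Avail phys: {0} MB, avail commit: {1} MB",
                    m.ullAvailPhys >> 20, m.ullAvailPageFile >> 20);
        }
    }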

Answers

45

Windows doesn't kill processes when all of the RAM is used. What actually happens is that processes fail to allocate memory and crash.

This is happening because all of your physical memory is in use and, with the pagefile disabled, the memory manager no longer has the ability to write out pages that are not being used. This keeps your physical RAM full, and when your process, or anything else running at the time, tries to allocate a page, the allocation fails. Some applications crash as a result.
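
In a .NET process like the asker's, that failed allocation typically surfaces as an OutOfMemoryException at the allocation site, which can at least be caught so frames are dropped instead of the process dying. A minimal sketch (the chunk size and both helper methods are hypothetical):

    try
    {
        // Try to buffer another ~64 MB of raw frames; this throws
        // when the system cannot commit the memory.
        byte[] chunk = new byte[64 << 20];
        EnqueueFrame(chunk);      // hypothetical: hand the buffer to the pipeline
    }
    catch (OutOfMemoryException)
    {
        // Commit exhausted: shed load instead of crashing.
        DropOldestFrames();       // hypothetical recovery hook
    }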

This TechEd presentation explains it in detail: http://channel9.msdn.com/Events/TechEd/NorthAmerica/2011/WCL405

The pagefile keeps applications from crashing when you use all of your memory by acting as a backstop for the overcommitment.

Virtual memory is pretty much the foundation of how modern operating systems allocate resources: it's all about keeping the things that are in use in RAM and moving everything else in and out from disk.

There are really only two answers:

  1. Re-enable the pagefile and increase the RAM on your computer to reduce disk thrashing.
  2. Reduce the memory requirements of your application.

The bottom line is that RAM is just another level of cache, and all of the stuff about virtual memory, pagefiles, memory mapped files, and all that basically comes down to this: if you're running out of memory, you need to add more.

Dawn Benton

Posted 2014-12-18T03:02:19.440

Reputation: 986

Or use less... – nhgrif – 2014-12-18T12:18:25.303

Please note that the backlog is building up because the data can't be written to disk fast enough. I don't think that enabling virtual memory on the very same disk can help there... – Alexander – 2014-12-18T12:41:35.967

In fact, the page file will be somewhere else on the disk. And since we know it's not an SSD, that means a physical seek, which is the slowest disk operation. – MSalters – 2014-12-18T13:46:21.477

Yes, that's exactly the reason I disabled the page file altogether. Having the video jitter due to page seeking is unacceptable. – Kal – 2014-12-18T14:50:35.890

Sounds like you need explicit memory management in your application then... – Joe – 2014-12-18T15:16:37.157

@Joe Exactly this. The garbage collector is going to make memory management a nightmare in these types of situations. This type of situation is trivial for me to deal with in C++ because I have fine-grained control over all memory usage. There are design patterns that will work just fine for this case in C# as well, but it's not as simple as what most people would try first. – Thebluefish – 2014-12-18T20:03:51.923
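
One such pattern is a preallocated buffer pool: allocate every frame buffer once at startup and recycle them, so the steady state performs no GC allocation at all and pool exhaustion doubles as the frame-drop signal. A sketch (counts, sizes, and names are illustrative):

    using System.Collections.Concurrent;

    class FramePool
    {
        readonly ConcurrentBag<byte[]> pool = new ConcurrentBag<byte[]>();

        public FramePool(int count, int frameBytes)
        {
            // Allocated once, up front; large arrays land on the LOH.
            for (int i = 0; i < count; i++)
                pool.Add(new byte[frameBytes]);
        }

        // Null means the pool is exhausted - that is the "drop this frame" signal.
        public byte[] Rent()
        {
            byte[] buf;
            return pool.TryTake(out buf) ? buf : null;
        }

        public void Return(byte[] buf)
        {
            pool.Add(buf);
        }
    }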

@Thebluefish What are you proposing you'd do in vanilla C++ that you can't do in C#? Large objects are allocated on the LOH, so the GC should pretty much leave them alone. The cool trick of memory-mapping the ring buffer twice (the first thing I'd do if I needed a low-latency ring buffer) is neat, but is doable in neither vanilla C++ nor C#; it can be done with both if you use platform-specific functions. – Voo – 2014-12-18T20:38:32.200

@Voo: Locate the large objects in a memory-mapped file straight away. Let the OS paging code deal with it. In virtual memory systems, memory is just as much disk as it is RAM. – MSalters – 2014-12-18T23:16:22.160
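
In .NET, that suggestion maps onto System.IO.MemoryMappedFiles, available since .NET 4.0. A sketch of backing the frame backlog with a file-backed mapping, so the OS pager rather than the GC heap owns the large buffers (names and sizes are illustrative):

    using System.IO;
    using System.IO.MemoryMappedFiles;

    class MappedBacklog
    {
        readonly MemoryMappedFile mmf;
        readonly MemoryMappedViewAccessor view;

        public MappedBacklog(string path, long capacityBytes)
        {
            // File-backed mapping: dirty pages are written back to this
            // file by the OS, not held hostage in physical RAM.
            mmf = MemoryMappedFile.CreateFromFile(
                path, FileMode.Create, null, capacityBytes);
            view = mmf.CreateViewAccessor(0, capacityBytes);
        }

        public void WriteFrame(long offset, byte[] frame)
        {
            view.WriteArray(offset, frame, 0, frame.Length);
        }
    }

Note that a file-backed mapping pages to its own backing file rather than to the pagefile, so it keeps working even with the pagefile disabled.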

@MSalters As I said, the interesting stuff is platform-dependent, and you can do that with C# as easily as with C++ (memory mapping alone wouldn't make any difference, though - but you can then map the file twice, contiguously, into virtual memory - that's useful). – Voo – 2014-12-19T07:16:48.747

0

Go through the Windows Control Panel and Advanced Settings and disable unneeded things, like window effects, if you haven't already, and get Sysinternals Process Explorer and/or System Monitor to find and turn off anything extraneous that's wasting CPU or memory.

More importantly, use Process Explorer and/or System Monitor to watch your program as it executes and see exactly where and how it fails. Which process runs short of memory and dies first - the main program or the ffmpeg subprocess? Is there a specific DLL or other shared resource that balloons unexpectedly in size? Or is the execution proceeding correctly, except biting off more data than it can chew?

Figuring out the nature of your problem more precisely will likely point you in the direction of a solution. You could, for instance, apply your frame-dropping policy more aggressively while optimizing for your 8-10 second chunk criterion, to achieve lower overall RAM overhead.
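
For example, rather than dropping scattered frames under pressure, the keep-or-drop decision can be made once per chunk boundary so that whatever survives stays contiguous. A hypothetical sketch (the frame count is illustrative; memoryLow would come from whatever low-RAM check the program already has):

    class ChunkGate
    {
        const int FramesPerChunk = 300;   // ~10 s at 30 fps
        int framesIntoChunk = 0;
        bool droppingThisChunk = false;

        // Decide once at each chunk boundary; every frame of an
        // abandoned chunk is then skipped, so kept chunks stay whole.
        public bool Accept(bool memoryLow)
        {
            if (framesIntoChunk == 0)
                droppingThisChunk = memoryLow;
            framesIntoChunk = (framesIntoChunk + 1) % FramesPerChunk;
            return !droppingThisChunk;
        }
    }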

Final suggestions: maybe consider switching to Linux, and in the meantime, re-enable the paging file (Linux calls it swap space, which makes it sound more fun IMHO, like a swap meet or something!). Good luck.

NS-MoCompSvc

Posted 2014-12-18T03:02:19.440

Reputation: 1