12

I have recently migrated a set of Windows Server 2008 R2 / IIS 7.5 servers to new servers running Windows Server 2012 / IIS 8.

I am experiencing some odd behavior from IIS. We have 2 identical servers; each server runs 2 web sites, each in its own app pool. The code for the web sites is identical (literally the same DLLs and everything, just slightly different configuration).

The app pools are set to recycle on a schedule every 24 hours, but during that 24-hour period, the CPU usage of the w3wp worker process jumps up in increments of 12.5% (the server has 8 processors, so I don't think that is a coincidence).

Once the CPU usage jumps up, it WILL NOT go back down until the app recycles. As far as I can tell, the app is doing nothing and processing NO requests at this time. I can block off all traffic to the server and the CPU usage will just stay there. I can even RESTART the web site, and the CPU usage stays the same. The only way to reset the CPU usage is to recycle or restart the app pool that it runs on.

I am fairly certain this issue has nothing to do with my code, but rather with a poor IIS configuration, or a change in IIS 8 that interacts badly with the hardware configuration.

Not sure if it's important or not, but these are Rackspace Performance Cloud servers.

Here is a screenshot showing the CPU load over time on these servers (green arrows point to the times when the app pool recycles). You can see that each plateau is an integral multiple of 12.5%:

[Screenshot: CPU load over time, with plateaus at multiples of 12.5%]

Has anyone observed this behavior? I found this question from 2009 where someone appears to have the same issue with IIS 6:

IIS w3wp using high cpu with no traffic

Any help is much appreciated.

Leland Richardson

5 Answers

2

This really looks like some code stuck in an infinite loop.

A request comes in, IIS starts serving it, something (probably a bug) triggers this behavior, a worker thread enters an infinite loop and pegs a CPU to 100%, and then it just stays this way until the app pool is recycled.

Even if no new requests come in, the CPU remains in use because the stuck thread never actually terminates.

Sometimes a new request triggers this behavior again, and then you get two stuck CPUs (or three, or four...).

Recycling the app pool of course terminates all worker threads, thus the problem gets solved... until it happens again.
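The arithmetic backs this up: on an 8-core box, one thread spinning at 100% of a single core shows up as 1/8 = 12.5% of total CPU, which matches the plateaus in the screenshot. A minimal sketch of that relationship (the thread counts are illustrative, not measured from the server):

```python
# Each stuck worker thread pegs exactly one core, so total reported CPU
# usage climbs in steps of 100% / core_count and never comes back down
# until the app pool is recycled.
def plateau_percent(stuck_threads: int, core_count: int = 8) -> float:
    """Total CPU % reported for `stuck_threads` spinning threads on `core_count` cores."""
    return min(stuck_threads, core_count) / core_count * 100

# On the 8-processor server from the question:
print(plateau_percent(1))  # one stuck thread    -> 12.5
print(plateau_percent(3))  # three stuck threads -> 37.5
```

Each new request that hits the bug adds another 12.5% step, which is exactly the staircase pattern described in the question.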

Massimo
1

Had the exact same issue with SharePoint 2013 and IIS 8 on Server 2012... We never tracked down the cause; instead we downgraded to SP2013 on 2008 R2 and all was well.

George
1

You can try using the Debug Diagnostic tool to track down what is causing the problem. It is usually used for troubleshooting crashes and memory leaks, but it could help find which component is causing the issue.

Greg Bray
  • How can I use the ***Debug Diagnostic tool*** programmatically when CPU or RAM usage goes above 90%? – Kiquenet Jul 02 '18 at 10:21
  • @Kiquenet You could try taking a memory dump of the process and then analyzing it on some other machine. I am facing a similar issue and was able to capture a dump in < 1 min on a server at ~100% CPU usage – Piyush Saravagi Jan 02 '20 at 19:17
  • Yeah, but then how do you capture a dump in < 1 min on a server at ~100% CPU usage *programmatically*? – Kiquenet Jan 15 '20 at 10:53
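For an unattended capture like the one asked about in the comments, Sysinternals ProcDump can sit and wait on a CPU threshold and write a dump automatically. A sketch that builds such a command line; the dump directory, threshold, and sustain window here are assumptions for illustration, not values from this thread:

```python
# Build a ProcDump command that writes a full memory dump of w3wp.exe
# once the process sustains the given CPU usage for N consecutive seconds.
def procdump_command(process: str = "w3wp.exe",
                     cpu_threshold: int = 90,
                     sustain_seconds: int = 10,
                     dump_dir: str = r"C:\dumps") -> list:
    return [
        "procdump.exe",
        "-accepteula",              # suppress the EULA prompt for unattended runs
        "-ma",                      # full memory dump (needed for thread stack analysis)
        "-c", str(cpu_threshold),   # trigger when CPU usage reaches this percentage...
        "-s", str(sustain_seconds), # ...sustained for this many consecutive seconds
        process,
        dump_dir,
    ]

print(" ".join(procdump_command()))
```

The resulting dump can then be opened in the Debug Diagnostic tool (or WinDbg) on another machine to see which thread stacks are burning CPU.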
0

You could attach a CPU profiler to the w3wp process and have a look at what's going on in there. You should be able to see what is consuming the CPU cycles.

MichelZ
  • How do you attach a CPU profiler to the w3wp process programmatically when ***CPU or RAM usage is above 90%***? – Kiquenet Jul 02 '18 at 10:15
0

Looks like an infinite loop to me. I've seen this a few times, even though IIS reported no outstanding requests. I'm not sure how that can be, but this is exactly what you would see. The difficult part is that IIS doesn't log requests until they complete, so identifying which request triggers this behavior is hard.

James