I work with servers, and I'm now starting to gravitate more toward server performance monitoring.
Some application developers I've met recently claim that Windows/Linux services and their applications (web services, file servers, math applications, BI, databases, etc.) start to suffer a considerable loss of processing power when CPU usage reaches around 75%, even though 25% of the processing power is still left.
Does CPU usage really have an impact on an application's performance once it reaches 75%?
The problem here is that it depends entirely on the workload. You can have something that uses 100% of the memory bandwidth available to the CPU but only 30% of the CPU time. The same goes for HDD, SSD, or even GPU resources. It's impossible to say definitively that 80%, 90%, or even 100% CPU usage will impact a user or service on that system. Something using 100% of the CPU but at minimum priority won't affect processes at a higher priority, and so won't affect the system at all. A system has to be specified and configured to match its workload, not assumed that "it'll do" if CPU < 80%. – Mokubai – 2016-02-10T13:37:55.580
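To make the priority point concrete, here is a minimal sketch (assuming the third-party `psutil` package is installed; the exact output columns are illustrative) that samples total CPU usage and then lists the busiest processes together with their nice value / priority. A box that reports "100% busy" may be running only low-priority work, so the headline percentage alone doesn't tell you whether higher-priority services are actually being starved.

```python
import psutil

# Prime the per-process CPU counters; the first call always returns 0.0
# because there is no previous sample to compare against.
procs = list(psutil.process_iter(['pid', 'name', 'nice']))
for p in procs:
    try:
        p.cpu_percent(interval=None)
    except (psutil.NoSuchProcess, psutil.AccessDenied):
        pass

# Sample total CPU usage over one second; this also serves as the wait
# between the two per-process samples.
total = psutil.cpu_percent(interval=1)
print(f"Total CPU usage: {total:.1f}%")

# Re-read per-process usage over that window, along with nice/priority
# (on Windows, psutil maps nice() to the process priority class).
rows = []
for p in procs:
    try:
        rows.append((p.cpu_percent(interval=None), p.pid,
                     p.info['name'], p.info['nice']))
    except (psutil.NoSuchProcess, psutil.AccessDenied):
        pass

print("Busiest processes (CPU%  PID  name  nice/priority):")
for cpu, pid, name, nice in sorted(rows, reverse=True)[:10]:
    print(f"{cpu:6.1f}%  {pid:>7}  {name}  {nice}")
```

If the top consumers are all low-priority (high nice value) batch jobs, a high total CPU figure may be harmless; if they are your latency-sensitive services, the same figure is a real warning sign.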