CPU usage and speed of program

1

This is for my server. The scenario goes as follows:

Computer A has a faster processor than computer B. They both execute the same program, but computer A's CPU usage only goes up to 30% while computer B's goes up to 80%.

Would the time taken/performance of the application on both computers be the same? (I think so because there is never a time on either computer at which the CPU goes to 100%; thus I would believe that no process needs to wait.)

The only difference between the two processors is clock speed. Architecture and # of cores are the same.

agz

Posted 2013-03-19T00:24:32.360

Reputation: 6 820

Answers

2

Impossible to say based on that alone.

The application might be single threaded, in which case CPU A, if it is a multi-core processor, has one core utilized heavily while the rest sit idle and contribute nothing. CPU B, on the other hand, could have fewer cores (or even just one, theoretically), be very efficient at the specific type of work being done, and end up faster than the "faster" processor.
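To put rough numbers on that first point (the core count here is purely an assumption for illustration), a single-threaded program that saturates exactly one core of a quad-core chip only registers about 25% overall usage:

    # Hypothetical illustration: overall usage reported for a single-threaded
    # program that keeps exactly one core busy on an assumed quad-core CPU.
    cores = 4          # assumed core count, not from the question
    busy_cores = 1     # single-threaded workload
    print(f"{100 * busy_cores / cores:.0f}% overall CPU usage")   # -> 25%

So a figure like 30% does not by itself mean the machine has headroom the program could actually use.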

If CPU A did not support certain instruction sets, it might only use 30% of its overall power, but again a more efficient processor that handles the instruction sets might be faster at this task despite being a slower processor.

CPU A might have a higher clock rate, and more cores, but a poor architecture. One possibility would be that it constantly mispredicts branches and has to flush and reload its pipeline. It might be the faster processor overall, yet still handle some tasks worse than a slower one (look back at the AMD Barton cores versus the NetBurst Pentium 4s: the AMD chips were clocked significantly lower but were faster on many tasks).
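As a back-of-the-envelope sketch of that trade-off (every number below is invented), execution time depends on instructions per clock as much as on the clock rate itself:

    # Toy model: time = instructions / (IPC * clock). Figures are made up
    # purely to show how a lower-clocked chip can still finish first.
    instructions = 10e9

    ipc_a, clock_a = 0.8, 3.0e9    # higher clock, but pipeline stalls hurt IPC
    ipc_b, clock_b = 1.5, 2.0e9    # lower clock, better IPC

    print(instructions / (ipc_a * clock_a))   # ~4.2 s
    print(instructions / (ipc_b * clock_b))   # ~3.3 s -- the "slower" CPU wins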

CPU B could also be better at shuttling data around from disk (an SSD, perhaps) and RAM. It might spike to 80% partly because it can pull data into cache fast enough to keep itself fed, while CPU A, attached to a 4,200 RPM laptop drive, burns through each segment of data faster but cannot keep enough data in cache to stay busy.
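A crude pipeline model makes the same point (all of these figures are invented for illustration): when each chunk of data has to be fetched before it can be processed, the slower of the two steps sets the pace, and the ratio between them is roughly the CPU usage you would observe:

    # Each chunk needs `cpu` seconds of processing and `io` seconds to arrive
    # from storage (no overlap assumed). The slower step sets the pace.
    def per_chunk(cpu, io):
        pace = max(cpu, io)            # wall time per chunk
        return pace, cpu / pace        # (cadence, observed CPU utilisation)

    print(per_chunk(cpu=3e-3, io=12e-3))   # fast CPU, 4,200 RPM drive -> (0.012, 0.25)
    print(per_chunk(cpu=8e-3, io=9e-3))    # slower CPU, SSD           -> (0.009, ~0.89)

In that made-up case the machine with the "slower" CPU actually gets through the data sooner, simply because its storage keeps it fed.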

Now, none of this is definitive, but for the sake of argument it goes to show there is WAY more to it than usage and raw measurements of speed.

Austin T French

Posted 2013-03-19T00:24:32.360

Reputation: 9 766

The information is great! For my scenario I'm assuming the clock speed to be the only varying factor. The architectures and core counts are the same. – agz – 2013-03-19T01:01:26.103

1Ah, well if you're going to take the fun out of it :D Then strictly theoretically, yes, they should complete at the same time. In the real world, though, I would suspect CPU A finishes first (although perhaps immeasurably by our mortal chronometers), because at 30% usage it has 70% of its time left to handle other tasks such as AV, moving scratch data around, playing music, etc. CPU B might be in the middle of another request when the next instruction for the original task is sent, thus delaying its completion by a billionth of a second or so each time this happens. – Austin T French – 2013-03-19T01:19:44.603
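The queuing effect described in that comment can be sketched with a standard rule of thumb (plain M/M/1 queueing intuition, not measurements from either machine): the expected wait behind whatever is already running grows sharply as utilisation climbs.

    # Rough M/M/1 intuition: expected queueing delay, in units of one service
    # time, is rho / (1 - rho) at utilisation rho. The 30% and 80% are the
    # figures from the question; the model itself is an assumption.
    for rho in (0.30, 0.80):
        print(f"{rho:.0%} busy -> waits ~{rho / (1 - rho):.2f}x a service time")

So the 80%-busy machine pays noticeably more of those billionth-of-a-second delays, even though both stay below 100%.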

Great information! – agz – 2013-03-20T19:57:55.150

1

Impossible to tell. If the CPU usage figure covers all processes (and there is no other real load), the process is not CPU bound (if it were, CPU usage would shoot up to nearly 100%). So the CPU is waiting for something else. That "something else" might be the disk, or anything else. The total time will then depend on how fast that is (and on how much overlap there is between the CPU and the "something else"). I wouldn't be too surprised by comparative numbers like the ones you cite: CPUs have gotten significantly faster, disks much less so. And CPU clock speed (which you compare) isn't everything; as AthomSfere's answer explains, AMD CPUs used to have much lower clock speeds than Intel's, but were faster anyway.
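One rough way to check whether a run is CPU bound, along the lines of this answer, is to compare the CPU time the process actually consumed with the wall-clock time it took (a minimal sketch; the workload itself is left as a placeholder):

    import time

    start_wall = time.time()
    start_cpu = time.process_time()
    # ... run the workload here ...
    wall = time.time() - start_wall
    cpu = time.process_time() - start_cpu
    # A ratio well below 1 means the process spent most of its life waiting
    # on something else (disk, network, locks, ...), not computing.
    print(f"CPU-bound fraction: {cpu / max(wall, 1e-9):.0%}")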

vonbrand

Posted 2013-03-19T00:24:32.360

Reputation: 2 083