Is there a meaningful SINGLE UNIT measure of computing power?

3

1

A discussion has arisen on the English Language SE site about the term computing cycles, where I personally think normal usage is clock or processor cycles.

Obviously, different processor & software architectures massively affect the significance of anything measured in "cycles" anyway. Comparing such values for a RISC processor against, say, an Intel Core i7 sounds pretty meaningless to me.

But it got me thinking. I realise "computing power" in general is a vague concept, involving many sub-elements (processor speed/flexibility, video/disc/memory speed, etc.). So is there any concept of a "rough & ready" measurement that somehow averages out all the potentially relevant performance metrics of any particular machine, so it can be compared to others?
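To make "averages out" a bit more concrete, here's a toy sketch of the kind of thing I have in mind: combine a handful of per-subsystem scores into one number using agreed weights. The subsystem names, scores and weights below are pure invention on my part; the question is whether any standardised version of something like this exists.

    # Toy "composite score" built from made-up per-subsystem benchmark results.
    # The subsystems, scores, and weights are invented for illustration only --
    # there is no agreed standard, which is exactly the problem.
    scores = {"cpu": 7.2, "memory": 6.8, "disk": 5.1, "graphics": 5.9}
    weights = {"cpu": 0.4, "memory": 0.2, "disk": 0.2, "graphics": 0.2}

    # Weighted geometric mean: a weak subsystem drags the result down harder
    # than a plain average would, which matches the "bottleneck" intuition.
    composite = 1.0
    for name, score in scores.items():
        composite *= score ** weights[name]

    print(f"Composite score: {composite:.2f}")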

FumbleFingers

Posted 2011-04-13T13:55:42.363

Reputation: 357

Windows attempts to do this with a WEI score: http://windows.microsoft.com/en-US/windows-vista/What-is-the-Windows-Experience-Index

– Moab – 2011-04-13T15:52:01.693

Yes I knew about this one. But WEI isn't a "unifying value". As I understand it, it's just the lowest score of all the subsystems (mainly in the context of running Windows itself, I think). – FumbleFingers – 2011-04-14T01:41:10.957

Yes, you are correct, I did say "attempt". What can I say, it's Windows. – Moab – 2011-04-14T01:50:58.030

@Moab: Well thanks for pointing it out, anyway. I really should have dismissed WEI in my Question. It doesn't so much rate your whole system as tell you which bit is holding it back. Which might sometimes be helpful, but mostly I just see it telling people with ordinary laptops that their video isn't up to snuff. Fine if you have a desktop and know how to buy & fit a video card upgrade, but with a lappie it's just something to make you sigh. – FumbleFingers – 2011-04-14T02:06:56.593

I don't think there is a good way to get a unified score or measurement that is truly accurate or objective; something will always be left out or obscured in the score. There are too many hardware aspects to benchmark these days, and "computing power" does not mean what it used to. In a way, Microsoft did it right by breaking down the numbers so an average user can see what the bottleneck is. I snicker at people who buy notebooks expecting desktop gaming performance because the specs "looked" good, then complain because they freeze, lock up, BSOD and overheat, sometimes all at once. :-> – Moab – 2011-04-14T02:21:16.350

Yeah, that's about right. Secretly I was hoping there might be some generally-agreed "weighting factors" for each different subsystem. Then I'd be really interested in seeing how these factors got adjusted over time as we shift our ideas about which ones are important. But I guess there ain't no such animal, so I can't settle back and watch it evolve. – FumbleFingers – 2011-04-14T02:28:22.010

Answers

3

No. Different types of architectures tend to excel at different things - for example, a GPU excels at heavily parallel tasks, while a CPU is a more generalist design. For HPC purposes, FLOPS (or teraFLOPS) is often used. Common alternative measures have historically included MIPS, Dhrystone, Whetstone and LINPACK.
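As a very rough illustration of what a FLOPS figure means (and emphatically not a substitute for LINPACK or any proper benchmark), you can time a large matrix multiply and divide the estimated operation count by the elapsed time. The sketch below assumes Python with NumPy, which is my choice rather than anything standard:

    # Rough FLOPS estimate: time a large dense matrix multiply with NumPy.
    # This is a toy illustration, not LINPACK -- real HPC benchmarks control
    # for far more (pinned cores, tuned BLAS, warm-up, repeated runs).
    import time
    import numpy as np

    n = 2000
    a = np.random.rand(n, n)
    b = np.random.rand(n, n)

    start = time.perf_counter()
    c = a @ b
    elapsed = time.perf_counter() - start

    # A dense n x n matrix multiply costs roughly 2 * n**3 floating-point ops.
    flops = 2 * n**3 / elapsed
    print(f"~{flops / 1e9:.1f} GFLOPS (naive estimate)")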

I'd probably add that in hobbyist circles the time to calculate a million digits of pi is also used, but this is nowhere near scientific.
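For what it's worth, a quick (and equally unscientific) way to try that at home is to time an arbitrary-precision library computing pi. The sketch below assumes Python's mpmath package, which is simply my own choice of tool:

    # Time computing pi to a million decimal places with mpmath.
    # Purely a hobbyist-style benchmark, as noted above; the result depends
    # heavily on the library and algorithm, not just the machine.
    import time
    from mpmath import mp

    mp.dps = 1_000_000              # one million decimal places of precision
    start = time.perf_counter()
    pi_val = +mp.pi                 # force evaluation at the current precision
    elapsed = time.perf_counter() - start
    print(f"Computed pi to {mp.dps} decimal places in {elapsed:.1f} s")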

Journeyman Geek

Posted 2011-04-13T13:55:42.363

Reputation: 119 122

OK thanks. That does it for me. I do remember "calculate pi" (and "Sieve of Eratosthenes") calculations being used as benchmarking aids donkey's years ago. But as you say - not very scientific, to say the least. Program-specific values, really. – FumbleFingers – 2011-04-14T01:35:04.110
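For the curious, a sieve-of-Eratosthenes timing run of that sort might look like the sketch below. The limit of ten million is arbitrary, and the result says at least as much about the language and implementation as about the machine:

    # Old-school benchmark: time how long it takes to count the primes up to
    # some limit with a sieve of Eratosthenes. Illustrative only.
    import time

    def sieve(limit):
        """Return the number of primes up to and including `limit`."""
        is_prime = bytearray([1]) * (limit + 1)
        is_prime[0:2] = b"\x00\x00"          # 0 and 1 are not prime
        for i in range(2, int(limit ** 0.5) + 1):
            if is_prime[i]:
                # Knock out every multiple of i, starting from i*i.
                is_prime[i * i :: i] = bytearray(len(range(i * i, limit + 1, i)))
        return sum(is_prime)

    start = time.perf_counter()
    count = sieve(10_000_000)
    print(f"{count} primes up to 10,000,000 in {time.perf_counter() - start:.2f} s")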

0

My friend and I realized the existence of this problem a while back and came up with an answer that is probably better for laughs than anything else: we measured computer power in terms of how many instances of a canonical application a machine could run at once without bogging down or seizing up. The application we picked as our reference was GNU Emacs, and so the corresponding unit was the "GNU". Of course this program isn't too heavy on the 3D graphics, disk bandwidth (most of the time), or network utilization, but it does require a respectable amount of memory and CPU power, so it seemed a reasonable choice. At the time, an average machine rated about 1 GNU on this scale, so you can tell this was a while ago :)

Nowadays, you could pick a different (large) application that might cover more bases: some people might suggest Photoshop, or for general-purpose use Firefox would probably serve well, or if you're a software developer you might use Eclipse, etc. Thus you could have a 2-Photoshop machine, for instance.
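A tongue-in-cheek sketch of how such a rating could be automated is below. The stand-in workload, and "load average above the core count" as the definition of bogging down, are my own assumptions (the original used Emacs and eyeballing), and it only works on Unix-like systems:

    # Launch copies of a reference workload until the machine "bogs down",
    # approximated here as the 1-minute load average exceeding the core count.
    import os
    import subprocess
    import time

    # Stand-in busy-loop workload; substitute Emacs, Photoshop, Firefox, etc.
    WORKLOAD = ["python3", "-c", "while True: pass"]

    procs = []
    try:
        while os.getloadavg()[0] < os.cpu_count():
            procs.append(subprocess.Popen(WORKLOAD))
            time.sleep(30)          # give the load average time to respond
        print(f"This machine rates roughly {len(procs)} units")
    finally:
        for p in procs:
            p.terminate()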

Steve

Posted 2011-04-13T13:55:42.363

Reputation: 9

The particular aspects of "pc power" that are considered important really do change over time. Today, my primary desktop "house computer" often has to do quite a few things at once, which wasn't really the case even a few years ago. It seems to me having plenty of memory and multiple processor cores are much more important now than they were a while back. – FumbleFingers – 2011-09-12T18:07:36.737