Actually, this is not a technical problem so much as a human-brain problem. That may surprise you, but let me explain; I have a good basis for what I say.
Part of the problem is how software updates and patches are applied, but I don't think that is the core of it.
Hardware has actually gotten significantly faster over the years, but software's ability to load it down has increased at an even faster rate, so some things really are slower, in both perception and fact.
For example, my first Z-80 box had a clock speed of 1 MHz. My current development platform runs at 2.66 GHz, more than 2,600 times faster. I don't recall exactly, but all of CP/M fit in about 16 KB. Windows is who knows how big now, but much, much bigger. It uses many layers of abstraction that get amazing things done in a more general way, but those layers take their toll on performance.
Let me get back to the human brain. It is well understood that for many years software engineers said, and believed with some good reason, that hardware would just keep getting faster and faster, so software didn't need to be careful about optimization. Programmers did whatever got things working quickly, at the cost of speed, thinking the hardware people would take care of that problem. So updates and patches are written with the thinking that they are temporary, i.e. short term.
It is short-term, micro thinking applied to a long-term, macro problem.
I read an interesting book many years ago in which a couple of scientists laid out this short-term versus long-term human thinking problem and ran experiments on a wide range of people to see how they make these tradeoffs. The book is New World New Mind, by Paul Ehrlich and Robert Ornstein. I would call it the most important book I have read in the past 20 years, because it provided a solid framework for how we solve this kind of problem.
What they noted was that the human brain evolved in a time when making short-term decisions made sense: live for the moment and the day, and don't think too much about the future, because it just wasn't worth it. So the gut sense we often use to make decisions is a very old part of the brain, not well suited to many modern problems. And the brain has had no realistic time to evolve as the world has changed rapidly with population growth and technology's impact on things.
What professors Ehrlich and Ornstein discovered was that very smart, well-educated Ph.D.s and janitors alike made the same mistakes when presented with short-term versus long-term problems, which is not something we generally assume to be the case.
One compelling example of how this same problem is playing out today has to do not with the hardware environment but with its big brother, the whole darn environment in which we live. We humans are generally making the mistake of living for today, for the moment, but the reality is that global warming is upon us precisely because we have not allowed for it or taken measures to deal with it. It's the slowing-of-the-hardware-by-the-software problem all over again, in a different context.
Ornstein and Ehrlich suggested that we might make more correct decisions by basing them not on gut instinct but on data and statistics. For example, if software engineers had statistics on how fast their software was bloating relative to how fast the hardware was getting faster, they might make better decisions about what to include, what to leave out, and how much to optimize algorithms. In other words, they would use actual data to make decisions, rather than gut instinct.
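To make that concrete, here is a tiny sketch of the kind of data-driven comparison the answer describes, using the Z-80/CP/M figures mentioned above plus assumed values (the ~30-year gap and the ~4 GB modern-OS footprint are illustrative guesses, not measurements):

```python
# Toy comparison: annualized growth of hardware clock speed vs. software
# footprint. The 1 MHz / 2.66 GHz and 16 KB figures come from the answer
# above; the 30-year span and 4 GB OS size are assumptions for illustration.

def annual_growth(old, new, years):
    """Compound annual growth factor taking `old` to `new` over `years` years."""
    return (new / old) ** (1 / years)

YEARS = 30  # assumed rough span between the 1 MHz Z-80 era and a 2.66 GHz PC

# Hardware: 1 MHz -> 2.66 GHz clock speed
hw = annual_growth(1e6, 2.66e9, YEARS)

# Software footprint: ~16 KB for CP/M -> ~4 GB for a modern OS (assumed)
sw = annual_growth(16 * 1024, 4 * 1024**3, YEARS)

print(f"hardware: ~{hw:.2f}x faster per year")
print(f"software: ~{sw:.2f}x larger per year")
print("software outpacing hardware:", sw > hw)
```

Even with these rough numbers, the point survives: a software footprint compounding faster than hardware speed will eventually feel slower, and only measuring both rates makes that visible before it happens.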
Thank you for the good question; sometimes the simple questions are the best, I think. It gave me the opportunity to consider this from a new angle. I had never before seen the parallel between the hardware/software issue and the human one.
Here is a great IEEE article written specifically about transistor aging; I urge anyone interested in this topic to read it. – Breakthrough – 2012-08-01T23:50:04.927
The computer fairies getting bored and going to a faster place :( – Phoshi – 2009-10-14T11:26:14.260
@Phoshi computer fairies? I thought it was gremlins eating up CPU cycles. – alex – 2009-10-14T11:28:12.993
The fairies keep the CPU cycling better. It's a constant battle. – Phoshi – 2009-10-14T11:30:28.620
I have a theory that hardware simply "burns out" with time. – Alexey Kulikov – 2009-10-14T11:58:12.870
computer fairies? bah. it's a well-known fact that older hamsters don't run as fast as the younger ones. you have to open the case and swap in a fresh hamster once in a while. – quack quixote – 2009-10-14T13:01:43.257
Generally, hardware doesn't get slower. Either you're not comparing the same software, or you've got a more specific problem. – David Thornley – 2009-10-14T15:58:40.550
the biggest question is what to do with the used-up hamsters. i named the last two "Cheaper Than" and "Cat Food", but i'm not convinced that's the best disposal method. – quack quixote – 2009-10-14T18:30:29.650
@quack: Cats are not optimal in this application. They make a game of it, resulting in far too much squeaking and---what's worse---sometimes decide to make you a gift of the corpse. Boas is where it's at. – dmckee --- ex-moderator kitten – 2009-10-15T00:49:04.303