Why does hardware get slower with time?

38

7

Why does hardware get slower with time? I have been a PC owner since 1990, and every computer I have had in my life became really, really slow after 3-4 years (even with a full system reinstall). This is the case with Windows PCs, and it is also the case with Apple hardware. Why is this happening? Can it be avoided?

Alexey Kulikov

Posted 2009-10-14T11:12:38.110

Reputation: 547

1

Here is a great IEEE article written specifically about transistor aging; I urge anyone interested in this topic to read it.

– Breakthrough – 2012-08-01T23:50:04.927

17The computer fairies getting bored and going to a faster place :( – Phoshi – 2009-10-14T11:26:14.260

1@Phoshi computer fairies? I thought it was gremlins eating up CPU cycles. – alex – 2009-10-14T11:28:12.993

2The fairies keep the CPU cycling better. It's a constant battle. – Phoshi – 2009-10-14T11:30:28.620

I have a theory that hardware simply "burns out" with time. – Alexey Kulikov – 2009-10-14T11:58:12.870

11computer fairies? bah. it's a well-known fact that older hamsters don't run as fast as the younger ones. you have to open the case and swap in a fresh hamster once in a while. – quack quixote – 2009-10-14T13:01:43.257

Generally, hardware doesn't get slower. Either you're not comparing the same software, or you've got a more specific problem. – David Thornley – 2009-10-14T15:58:40.550

2the biggest question is what to do with the used-up hamsters. i named the last two "Cheaper Than" and "Cat Food", but i'm not convinced that's the best disposal method. – quack quixote – 2009-10-14T18:30:29.650

1@quack: Cats are not optimal in this application. They make a game of it, resulting in far too much squeaking and---what's worse---sometimes decide to make you a gift of the corpse. Boas is where it's at. – dmckee --- ex-moderator kitten – 2009-10-15T00:49:04.303

Answers

33

Sometimes it IS the hardware, especially with laptops. Modern processors have circuitry to protect them from overheating, and will deliberately reduce the CPU speed if the core temperature gets too hot (or also to save power when demand is low and you're running on batteries - Intel calls the feature "SpeedStep" on their processors). If you notice your fan running all the time or the machine getting excessively hot around the cooling fan outlet, your computer's "airways" may have become clogged with dust.

I had a Dell Latitude that ran like new after I opened it up and removed about a quarter inch thick "sponge" of dust from between the fan and the heat sink. Dell actually has downloadable service instructions on their website that explain all the steps to open up the machine and get inside for this kind of service. If you're not comfortable with this, you probably have a techie friend who'll help you out. It's definitely worth the risk if you're planning to get rid of the machine otherwise!

If you think this might be what's happening on your machine, try downloading a utility like "SpeedFan" that allows you to check the temperature of your CPU as well as other components. With this app, you can graph the temperatures when you first start the machine. If they start climbing quickly and never seem to decrease, you can bet cooling is an issue. In my case, I also used a free app called "CS Fire Monitor" to show me the actual speed of my processor and I found that once it got hot, it was dropping to less than half speed. There's lots of good freeware out there that will show you this kind of information; just Google "CPU Temp Freeware" or "CPU Speed Freeware" or something along those lines and you'll find all sorts of options.
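
If you'd rather log this than watch a GUI, here is a minimal monitoring sketch. It assumes the third-party Python psutil package (my choice; the tools above don't need it), and temperature sensors are only exposed on some platforms (mainly Linux). The idea is just to record the reported clock speed and temperatures so you can see whether the clock drops as the machine heats up.

    # Sample CPU load, clock speed and (where available) temperatures once a second.
    # Requires the third-party psutil package: pip install psutil
    import psutil

    for _ in range(60):                           # about one minute of samples
        load = psutil.cpu_percent(interval=1)     # utilisation over the 1-second sample
        line = f"load={load:5.1f}%"
        freq = psutil.cpu_freq()                  # current/min/max clock in MHz, or None
        if freq:
            line += f"  clock={freq.current:7.1f} MHz (max {freq.max:.0f})"
        if hasattr(psutil, "sensors_temperatures"):   # exposed on Linux/FreeBSD only
            for name, entries in psutil.sensors_temperatures().items():
                for entry in entries:
                    line += f"  {name}/{entry.label or 'core'}={entry.current:.0f}C"
        print(line)

If the clock column falls as the temperatures rise, cooling is the problem rather than the software.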

Hopefully, this will save a few people from replacing or throwing away decent hardware that just needs some respiratory therapy!

Todd Corson

Posted 2009-10-14T11:12:38.110

Reputation: 346

39

There are a few effects here:

  1. Your perception of how fast the computer should be is changing. When you first get new hardware you have something concrete to compare it against - the old hardware. This gives you an empirical measure of the speed improvement. As time goes by, your memory of how slow the old hardware was fades, and you only have how fast the current hardware felt recently to compare against.
  2. New versions of software come out which add new features, either to extend functionality or to make use of the new hardware. This is, by definition, a larger program than before, which takes up more resources and thus causes your hardware to run a little bit slower.
  3. Accumulation of drivers, programs/tasks running in the background, etc. Each additional driver or background task takes up a little bit more of every resource - hard disk space, memory, CPU cycles, etc. While no single one is large, the effect is cumulative. People expect modern programs to update themselves, so there are extra tasks running that you aren't aware of. The longer you have the computer, the more of these programs you are likely to have installed (see the sketch after this list).
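
To put a number on point 3, here is a rough sketch that lists how many processes are running and which ones hold the most resident memory. It assumes the third-party Python psutil package (my choice, not something the answer depends on); comparing the output of a machine you have used for years with that of a fresh install makes the accumulation visible.

    # A rough look at accumulated background tasks, using the third-party psutil
    # package (pip install psutil). It only reads information; nothing is changed.
    import psutil

    procs = []
    for p in psutil.process_iter(["name", "memory_info"]):
        mem = p.info["memory_info"]
        if mem is not None:                      # some system processes deny access
            procs.append((mem.rss, p.info["name"] or "?"))

    procs.sort(reverse=True)
    print(f"{len(procs)} processes running")
    for rss, name in procs[:15]:                 # the 15 biggest resident-memory users
        print(f"{rss / (1024 * 1024):8.1f} MB  {name}")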

When taken together they give the impression that the hardware is slowing down.

There may be other effects due to wear and tear on the hardware (disk fragmentation, memory latency) too.

ChrisF

Posted 2009-10-14T11:12:38.110

Reputation: 39 650

5@Chris - "Apples to Apples", lol. Pun intended? – Moshe – 2010-03-17T13:26:21.923

3@Moshe I wish I could say Yes but it never crossed my mind. :-( Seriously, patches (esp security patches) often have performance impacts so testing a fully patched system vs. what you remember from years ago is fraught with unaccounted variables. – Chris Nava – 2010-03-17T14:12:12.013

Updates and patches on the Macintosh side often increase the performance of the system. – Benjamin Schollnick – 2012-03-02T17:57:17.230

+1 As rightly said, 'user perception' & 'hardware wear & tear' (e.g. transistors & capacitors wearing out, constant heat causing an increase in latency) take a toll on overall actual & perceived hardware performance. – Ganesh R. – 2009-10-14T11:44:23.427

10This is not true. For example, I filmed myself using my new PowerBook G4 a couple of days after it arrived. I did not have to wait a single second for the Finder to open up and do the usual file management stuff. CPU load average was under 2%. To prove my point, I have reinstalled Tiger, completely formatting the hard drive. I still get an average CPU load of 30% and have to wait 3-4 seconds before any Finder operation completes. – Alexey Kulikov – 2009-10-14T11:55:28.523

1@Alexey Kulikov - ChrisF didn't say "It's all in your head" was the only reason that hardware slows down. Wear and Tear on the hardware could cause what you are describing too. There are so many variables it's hard to say exactly why. – J. Polfer – 2009-10-14T14:01:31.477

1Did your full reinstall include applying updates? In that case it's not a true apples to apples comparison. – Chris Nava – 2009-10-14T14:27:03.353

14

When I have run benchmarks (both trivial ones like BogoMIPS, and more serious ones like Dhrystone and Whetstone) on five-to-eight-year-old hardware, I have always found that it turned in the same results as when it was new. (Always on Linux and Mac OS boxen, BTW.)

I have less experience with hard drives, but I did test one fast and wide SCSI2 drive about five years on (with hdparm) and got answers comparable to the original spec.
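
For the CPU side you don't even need a canned benchmark suite; a fixed workload timed with a monotonic clock is enough to compare a machine against itself years later. A minimal Python sketch (my own, not one of the benchmarks named above):

    # Not Dhrystone or Whetstone - just a fixed, CPU-bound workload timed with a
    # monotonic clock. Rerun the identical script on the same machine years later
    # and the numbers should match if the hardware itself hasn't degraded.
    import hashlib
    import time

    def cpu_workload(rounds=2000):
        data = b"x" * 65536
        for _ in range(rounds):
            # hash a constant-size buffer; the amount of work is identical every run
            data = hashlib.sha256(data).digest() * 2048
        return data

    times = []
    for _ in range(5):                            # take the best of five runs
        t0 = time.perf_counter()
        cpu_workload()
        times.append(time.perf_counter() - t0)

    print(f"best of 5: {min(times):.3f} s")       # note this down next to the date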

So, I think it is mostly, as others have said, a combination of new expectations and heavier software.

That said, I do currently have a PowerBook G4 which could use testing, as it sure feels slower now than it used to. The suggestion above that clock throttling may come into play if the cooling system gets fouled is a good one.

dmckee --- ex-moderator kitten

Posted 2009-10-14T11:12:38.110

Reputation: 7 311

13

Page's Law ;)

Wirth's law, also known as Page's law, Gates' law and May's law, is a computing adage which states that software is getting slower more rapidly than hardware becomes faster.

sYnfo

Posted 2009-10-14T11:12:38.110

Reputation: 1 902

3+1 It's not the hardware, but the software being slower – Ivo Flipse – 2009-10-14T12:02:22.347

I had this as a slap in the face when I got broadband (and thus could download new software). PC got so much slower so quickly, and nothing helped :( – Phoshi – 2009-10-14T15:24:36.703

6haha, Gates' law is also mentioned there: "... the speed of commercial software generally slows by fifty percent every 18 months thereby negating all the benefits of Moore's Law." – Bratch – 2009-10-14T20:04:24.500

6

Some slow-down is caused by hard disk fragmentation, whose cure is defragmentation. Fragmentation is defined as:

file system fragmentation, sometimes called file system aging, is the inability of a file system to lay out related data sequentially (contiguously), an inherent phenomenon in storage-backed file systems that allow in-place modification of their contents. It is a special case of data fragmentation. File system fragmentation increases disk head movement or seeks, which are known to hinder throughput. The correction to existing fragmentation is to reorganize files and free space back into contiguous areas, a process called defragmentation.
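
To get a feel for why those extra seeks matter, here is a toy Python sketch. The file name is hypothetical, and on a machine with plenty of free RAM the OS cache will hide the effect, so it only really illustrates the point on a spinning disk with a file larger than memory: it reads the same blocks once in order and once in shuffled order and compares the times.

    # Toy illustration of sequential vs. scattered (fragmented-style) reads.
    # "bigfile.bin" is a hypothetical file, ideally larger than your RAM.
    import os
    import random
    import time

    PATH = "bigfile.bin"
    BLOCK = 4096
    COUNT = 5000

    size = os.path.getsize(PATH)
    stride = max((size - BLOCK) // COUNT, BLOCK)
    offsets = [i * stride for i in range(COUNT)]

    def read_blocks(order):
        t0 = time.perf_counter()
        with open(PATH, "rb") as f:
            for off in order:
                f.seek(off)
                f.read(BLOCK)
        return time.perf_counter() - t0

    in_order = read_blocks(sorted(offsets))       # contiguous-style access pattern
    shuffled = list(offsets)
    random.shuffle(shuffled)
    scattered = read_blocks(shuffled)             # roughly one head seek per block
    print(f"in order: {in_order:.2f} s   scattered: {scattered:.2f} s")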

On Windows there is another reason: the Windows Registry.

The Windows Registry is a database that stores settings and options for Microsoft Windows operating systems. It contains information and settings for hardware, operating system software, most non-operating system software, and per-user settings. The registry also provides a window into the operation of the kernel, exposing runtime information such as performance counters and currently active hardware.

Over time, the registry accumulates junk and also needs to be cleaned out and optimized.
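
You can at least see what has piled up without changing anything. A small sketch using Python's standard-library winreg module (Windows only); it lists the per-user auto-start entries under the Run key, one of the places where installed software tends to accumulate:

    # Windows-only: list per-user auto-start entries. Read-only; nothing is modified.
    import winreg

    RUN_KEY = r"Software\Microsoft\Windows\CurrentVersion\Run"

    with winreg.OpenKey(winreg.HKEY_CURRENT_USER, RUN_KEY) as key:
        value_count = winreg.QueryInfoKey(key)[1]     # number of values under the key
        for i in range(value_count):
            name, command, _type = winreg.EnumValue(key, i)
            print(f"{name}: {command}")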

Another explanation is that newer versions of the operating system are usually more bloated and so slower. This means that just by installing the latest OS version or patches, you may after a few years suddenly notice that your computer is now slower, and that it is time to invest in new hardware that can efficiently support the requirements of the latest version of your operating system.

harrymc

Posted 2009-10-14T11:12:38.110

Reputation: 306 093

To add to Matthew's comment, I've never seen a computer speed up after the registry has been cleaned either. For the hundreds of thousands of entries in there, removing a couple of hundred isn't going to make any difference. – Richard – 2012-07-20T18:41:53.620

@Richard: I'm not trying to sell registry cleanup to anybody (the vast majority has no need of it), but I did see a computer on which some operations (I repeat - specific ones) were slightly sped up. Not to mention a load of old rubbish cleaned out (many tens of thousands of unusable items). But this special case was caused by the repeated reinstallation and final uninstallation of mammoth-size products, not exactly normal usage. – harrymc – 2012-07-20T20:15:31.330

assuming a "fresh install" includes formatting the system drive, neither of these affect a fresh OS install on older hardware. – quack quixote – 2009-10-14T13:03:33.070

The effect of fragmentation also depends to a large degree on the file system you are using. Some filesystems (to wit, FAT) suffer a lot, and others suffer very little. Nor do all OSes use an infinitely growing centralized database to store various switches and options. – dmckee --- ex-moderator kitten – 2009-10-14T15:08:55.563

I'm going to go out on a limb here and declare that in 20 years of heavy computer usage I've never seen a system speed up noticeably after a disk has been defragmented. Is there any objective evidence to contradict my experience? – Matthew Lock – 2009-10-20T03:42:32.500

1it depends on what you're doing. Trolling through my non-system data drive (say, performing MD5 calculations) is much faster when all files on the drive are defragmented than when 1+ GB files are scattered around in 200+ fragments. If you're not seeing a system speedup after a defrag, perhaps your system files weren't fragmented to begin with. – quack quixote – 2009-10-20T05:37:17.557

4

You get used to the speed and it no longer feels fast.

For example, I had a customer who had a routine (which they regarded as down-time) that took over an hour on an old computer and when they upgraded their computer the process took five minutes which made them very happy for a while.

Fast forward a few years and they now complain about this routine taking five minutes. And every time they complain, they genuinely seem to have forgotten about the time it took an hour.

sgmoore

Posted 2009-10-14T11:12:38.110

Reputation: 5 961

2

There's a certain amount of perception involved, but if you're actually measuring a reduction in performance, I'd look to moving parts in the system.

"Moving parts," you ask, "what moving parts?"

Two easy categories to check: fans and disk drives. Fans are obvious, but in addition to the fan itself, make sure the airflow and cooling are unobstructed to ensure that interior component temperatures are also where they were when the box was new. Disks are a little more subtle, but a deteriorating disk can cut down dramatically on performance while appearing to work. See if the disk benchmarks match new performance, or if the error count is up dramatically.
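
For the error-count part, the drive's own SMART counters are the place to look. A sketch, assuming the smartmontools package ("smartctl") is installed and a Linux-style device path - both of those are assumptions, and attribute names vary by drive vendor:

    # Print a few SMART attributes that commonly indicate a deteriorating disk.
    # Assumes smartmontools is installed; typically needs administrator/root rights.
    import subprocess

    DEVICE = "/dev/sda"    # hypothetical device path - adjust for your system
    WATCH = ("Reallocated_Sector_Ct", "Current_Pending_Sector", "Offline_Uncorrectable")

    output = subprocess.run(
        ["smartctl", "-A", DEVICE],               # -A prints the vendor attribute table
        capture_output=True, text=True, check=False,
    ).stdout

    for line in output.splitlines():
        if any(attr in line for attr in WATCH):
            print(line)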

While they don't really move, they're the moral equivalent: cable connectors. Any detachable end of each cable. Unplug, ensure clean, replug and ensure tight.

mpez0

Posted 2009-10-14T11:12:38.110

Reputation: 2 578

1

Perhaps it's purely down to your perception.

3-4 years ago it was sparkling new hardware, faster than the previous generation of hardware, and therefore it felt very fast.

In the 3-4 years since then, you have no doubt used computers with better hardware, so even if you do a clean install on the old machine, your experiences on newer hardware will leave you with a lackluster impression of the old machine.

Or do you have empirical evidence that the machine actually performs slower?

JRT

Posted 2009-10-14T11:12:38.110

Reputation: 633

Yes, there is evidence -- I filmed myself using my new PowerBook G4 a couple of days after it arrived. I did not have to wait a single second for the Finder to open up and do the usual file management stuff. CPU load average was under 2%. To prove my point, I have reinstalled Tiger, completely formatting the hard drive. I still get an average CPU load of 30% and have to wait 3-4 seconds before any Finder operation completes. – Alexey Kulikov – 2009-10-14T11:56:09.347

Ok, my other thought would be that over time the hard disk has developed bad sectors, and with that would come the overhead of having to re-read data that failed integrity checks. This would occur even after a complete format. – JRT – 2009-10-14T12:20:47.230

How long was the delay between when you filmed yourself using your new PowerBook, and when you reinstalled TIGER? – J. Polfer – 2009-10-14T14:10:35.050

4 years. I was always under the impression that the hardware was wearing out, and a couple of days ago I found a screencast I had recorded 4 years ago. So I went for the experiment, and voilà -- it's all fact. – Alexey Kulikov – 2009-10-14T15:47:49.900

1

I believe some driver updates these days may also update firmware on the related device. There are also occasional CPU microcode updates, though those are rare.

I've seen some popular diagnostic/benchmark tools claim things worked at normal speed, yet there was some kind of low level driver/hardware issue that caused the mouse pointer to crawl and jump. At the time I didn't know about measuring DPC latency - that tool probably would have indicated there was an issue.

The point is, it's possible for things to slow down in a way that makes the machine feel slower but doesn't show up in the kind of tools casual PC users use.

If someone wants to dig into this, I think they should have 2 identical computers, one of which is never connected to the net and never gets updates or new drivers installed. Time both computers using an external timer, or check the time against NTP just to be sure. After 4 years, time both again, and if there's a difference, clone the disk from the non-connected computer to the connected one and try again. Also check for any firmware version changes, etc. Edit: and when I say "time", I mean timing some custom task, not using an existing benchmark - both GPU and CPU vendors have been caught gaming known benchmarks, according to AnandTech and a few other sites I've read in the past years.

Anonymous Coward

Posted 2009-10-14T11:12:38.110

Reputation: 11

0

Most (if not all) benchmarks aren't reliable for measuring OS snappiness. Unless the benchmark is some USB-to-USB system that controls the UI of another computer, emulating a mouse/keyboard, the execution paths will be entirely different. The slowness in PCs that I know about arises from driver/security updates that can also update the firmware (and you don't know whether the firmware update in the driver persists or not), so the only true apples-to-apples comparison is to buy 2 computers and never connect the second one to the internet or update its drivers after the first install, but preserve it for later comparison using such an external benchmarking tool.

I started suspecting all benchmarks when I found a case where a benchmark returned "all good" numbers while some hardware issue was causing the mouse to freeze and jump around and the system was actually only barely controllable - clearly the benchmarks weren't affected by the kind of low-level things that can affect, e.g., the snappiness and controllability of the PC.

(A slightly different but similar case: even though a Q6600 benchmarked about the same as an equivalent-GHz dual core, I noticed that responsiveness was clearly lower. Back then this was explained as the Windows Vista scheduler not being good with 4 cores. The point being: just as most benchmarks that show FPS would not detect the tiny stutters a user would feel, the PC benchmarks the tech press uses don't measure things like interrupt-to-process latency and show the statistics of that, instead of just some average.)
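
A crude user-space stand-in for that kind of measurement - this is not a DPC-latency tool, just a sketch of the idea: sleep for a fixed short interval many times and report how late the wake-ups actually are, quoting the tail percentiles rather than only the average.

    # Rough scheduling-jitter probe: repeatedly sleep 5 ms and record the overshoot.
    # The worst 1% says more about perceived "snappiness" than the average does.
    import time

    INTERVAL = 0.005                      # 5 ms nominal sleep
    SAMPLES = 2000

    lateness_ms = []
    for _ in range(SAMPLES):
        t0 = time.perf_counter()
        time.sleep(INTERVAL)
        lateness_ms.append((time.perf_counter() - t0 - INTERVAL) * 1000.0)

    lateness_ms.sort()
    avg = sum(lateness_ms) / SAMPLES
    p99 = lateness_ms[int(0.99 * SAMPLES) - 1]
    print(f"avg +{avg:.2f} ms, p99 +{p99:.2f} ms, worst +{lateness_ms[-1]:.2f} ms")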

Edit: And if you're doing such a setup with an untouched reference PC, then if it has a battery and/or is ever powered, the hardware maker could cheat by running an LFO to covertly obsolete the hardware, e.g. by slowing down some operation that benchmarks don't exercise. A better benchmark than the usual games-press fare would be to run e.g. DOSBox, emulators, or latency measurements inside VMware/Hyper-V, as that taxes the CPU in more complex ways than otherwise.

Edit 2: And if they really wanted to, they could put in something that ages, or some ultra-low-power counter with a capacitor or a tiny battery charged at the factory. Then even if you never power the device, they could make it slower with time. This kind of thing could be a liability if someone found it, but it wouldn't really matter unless it were made illegal and the fines were enough to put them out of business.

Anonymous Coward

Posted 2009-10-14T11:12:38.110

Reputation: 1

0

Actually this is not a technical problem, but rather a human brain problem. This may surprise you, but let me explain; I have a good basis for what I say.

Part of the problem is how software updates and patches are applied, but I don't think that is the core of the problem.

The hardware has actually gotten significantly faster over the years, but the software's ability to load it down has increased at an even faster rate, giving both the perception and, in some cases, the actuality that some things are slower.

For example, my first Z-80 box had a clock speed of 1 MHz. Now my development platform runs at 2.66 GHz, or over 2,000 times faster. I don't recall exactly, but all of CP/M fit in about 16 KB. Now Windows is who knows how big, but much, much bigger. It uses many layers of abstraction which get amazing things done in a more general way, but these layers take their toll on performance.

Let me get back to the human brain. What is well understood is that software engineers have for many years said, and believed with some good reason, that hardware would just keep getting faster and faster, so software didn't need to be careful about optimization. So programmers did things to get things working quickly, at the cost of speed, thinking that the hardware people would take care of that problem. So updates and patches are made with the thinking that they are temporary, i.e. short term.

It is short-term, micro thinking applied to a long-term, macro problem.

I read an interesting book many years ago where a couple of scientists laid out this short term versus long term human thinking problem, and did some experiments on a wide range of humans to see how they make these tradeoffs. Their book is New World New Mind, and the authors are Paul Ehrlich and Robert Ornstein. I would put it down as the most important book I have read in the past 20 years because it provided a solid framework for how we solve the problem.

What they noted was that the human brain evolved in a time when making short-term decisions made sense. Live for the moment and the day, but don't think too much about the future - it just wasn't worth it. So our gut sense of things, which we often use to make decisions, is a very old part of the brain and not well suited to many modern problems. And the brain has had no realistic time to evolve as the world has rapidly changed with population growth and technology's impact on things.

What professors Ehrlich and Ornstein discovered was that very smart and well-educated Ph.D.s, but also janitors, made the same mistakes when presented with short-term-versus-long-term problems - not something we generally think is the case.

One very good and compelling example of how this same problem is playing out in the world today has to do not with the hardware environment, but with its big brother, the whole darn environment in which we live. We humans generally make the mistake of living for today, for the moment, but the reality is that global warming is upon us exactly because we have not allowed for it or taken measures to deal with it. It's the slowing of the hardware by the software problem all over again, but in a different context.

Ornstein and Ehrlich suggested that we might be able to make more correct decisions by basing them not on our gut instinct, but on data and statistics. For example, if a software engineer had statistics on how fast their software was bloating relative to how fast the hardware was getting faster, they might make better decisions about what to include, what to leave out, and how much to optimize algorithms. In other words, they would use actual data to make decisions rather than gut instinct.
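
As a toy example of using the data, here is a small calculation based on rule-of-thumb figures quoted elsewhere in this thread (hardware roughly doubling in speed every 18 months while commercial software roughly halves in speed over the same period). The numbers are illustrative only; the point is that writing the two rates down side by side makes the trade-off visible.

    # Toy compounding calculation with illustrative rates quoted in this thread:
    # hardware doubles every 18 months, software needs twice the cycles every 18 months.
    YEARS = 6
    PERIOD = 1.5                                  # 18 months, in years

    for year in range(YEARS + 1):
        periods = year / PERIOD
        hardware_gain = 2 ** periods              # relative raw hardware speed
        software_cost = 2 ** periods              # relative cycles the software now needs
        perceived = hardware_gain / software_cost # what the user actually experiences
        print(f"year {year}: hardware x{hardware_gain:.1f}, "
              f"software x{software_cost:.1f} cycles, perceived x{perceived:.1f}")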

Thank you for the good question. Sometimes the simple questions are the best, I think. It gave me the opportunity to consider this from a new angle; I had never before seen the parallel between the hardware/software issue and the human context.

Elliptical view

Posted 2009-10-14T11:12:38.110

Reputation: 864