CPU and HD degradation on source-based Linux distributions


I have been wondering for a long time whether source-based Linux distributions, like Gentoo or Funtoo, "destroy" your system faster than binary ones (like Fedora or Debian). I'm talking about CPU and hard drive degradation.

Of course, when you're updating your system, it has to compile everything from source, so updates take longer and your CPU works under harder conditions (it runs hotter and under heavier load).

Such systems compile hundreds of packages weekly, so does it really matter? Does such a system degrade faster than a binary-based one?

danilo2

Posted 2012-09-26T12:55:31.447

Reputation: 113

Answers


Computer hardware does not degrade faster when it is being used, assuming adequate cooling. Generally, what kills electronics is heat, and heat can be mitigated by sufficient cooling; in modern personal computers, this typically means active cooling by forced air, but other possibilities (including water cooling and, in low-powered systems, strictly passive/convective cooling) exist. The questions "Which malfunctions cause old computers to slow down and crash?" and "Is it possible for a router to 'go bad' with time?" touch on this.

There is one main exception to this, and that is flash-based storage such as that used in SSDs, which has a limited number of write cycles before each flash cell wears out. However, modern SSDs go to great lengths to mitigate this, and despite what people might tell you, drives selected for the intended workload are plenty durable enough in most client and server workloads, even from a flash-wear perspective. This includes compiling software, which, while it does tend to create a large number of files (involving lots of small writes), is also heavily cacheable by the system and thus doesn't necessarily imply that many writes to stable storage. As Serge pointed out, as an alternative you can consider running the build on a tmpfs-type file system, which normally uses RAM for storage but will resort to swap space if sufficient RAM is unavailable. That is also likely to speed up compilation, since, particularly for large projects, compilation is more likely to be IOPS-constrained than I/O-throughput- or CPU-constrained; and even if it is CPU-constrained, the higher IOPS attainable by using RAM to store source code files will not make the situation significantly worse.
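
As a concrete example, on Gentoo you can point Portage's build directory at a tmpfs. A minimal sketch, assuming the default build location /var/tmp/portage; the 6G size and the mount options are illustrative and should be adjusted to your RAM and the largest packages you build:

    # /etc/fstab -- mount a tmpfs over Portage's default build directory
    tmpfs  /var/tmp/portage  tmpfs  size=6G,uid=portage,gid=portage,mode=775,noatime  0 0

    # activate it without rebooting
    mount /var/tmp/portage

Packages whose build trees exceed the tmpfs size will fail to compile there, so it is worth checking available RAM before enabling this.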

The main electronics killer aside from heat is voltage impurities, which are a function of the power supply and largely unrelated to what tasks you perform on the computer. With a properly rated power supply (mostly a concern if you build a computer yourself from parts), and aside from impurities in the input AC (which will affect any electronics), this for all intents and purposes won't be an issue.

a CVn

Posted 2012-09-26T12:55:31.447

Reputation: 26 553


If you really tune all packages by disabling unnecessary functionality at compile time, or you have some specific x86-clone processor that requires particular optimizations from the compiler, then your system will run even faster than the same system installed from a binary distro. As for degradation of the hard drive: you may use a separate volume to keep all the intermediate files of such rebuilds, and simply reformat it each time an update completes. The other option is to perform all this building on a tmpfs device, which is actually backed by memory and swap files/devices, so its contents are cleared on each restart of the system anyway.
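
For instance, the per-architecture tuning described above is normally configured in Portage's make.conf on Gentoo; a minimal sketch, where the exact flags and USE settings are assumptions to adapt to your own CPU and needs:

    # /etc/portage/make.conf -- build for the local CPU and trim unneeded features
    CFLAGS="-march=native -O2 -pipe"   # -march=native targets this exact processor
    CXXFLAGS="${CFLAGS}"
    USE="-gnome -kde"                  # example: drop functionality you never use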

Serge

Posted 2012-09-26T12:55:31.447

Reputation: 2 585

Hi! Thank you for your answer. I know it would run faster because it would be optimized for a specific architecture. The question was whether the "endless" compilation would degrade and slow down the processor. I'm talking about long-term processor usage under high load (everyday compilation of packages) – danilo2 – 2012-09-26T14:07:45.537

If you lower the priority of the rebuild process so it does not affect the running services, the cooling system of your CPU is functioning properly (I mean the stock cooling, not some special additional cooling you may have installed), and the CPU stays in the temperature range it is designed to work in, then "no degradation will be observed" – Serge – 2012-09-26T14:16:37.663
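
A sketch of how that priority lowering might look in practice; the values are illustrative:

    # one-off: run the update at the lowest CPU priority and idle I/O priority
    nice -n 19 ionice -c 3 emerge --update --deep @world

    # or make it permanent via Portage's own setting in /etc/portage/make.conf
    PORTAGE_NICENESS="19"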

But I would install an extra drive if this process affects the HD performance of your running services – Serge – 2012-09-26T14:19:37.737

I have been using Gentoo for over 2 years, and I asked this question because I'm planning to buy a new laptop now. When I'm updating my system (compiling), the temperature is normally about 75 degrees Celsius (on an i7 ThinkPad laptop). Does that not degrade it? Where can I find the safe temperature range it can run in? – danilo2 – 2012-09-26T14:29:21.517

According to http://www.intel.com/content/www/us/en/processors/core/3rd-gen-core-lga1155-socket-guide.html, the Tcase_Max temperature (the maximum case temperature according to the processor spec) of modern i7 processors at maximum power consumption (i.e., maximum load) is about 70 degrees Celsius. Your present CPU is 2 years old, so you can look up the spec for your processor and see its characteristics.

– Serge – 2012-09-26T15:38:10.137
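
If you want to watch the temperature yourself during a build, lm-sensors or the kernel's thermal sysfs interface will report it; a minimal sketch, noting that thermal zone numbering varies between machines:

    # with the lm-sensors package installed and configured:
    sensors                                   # reports per-core temperatures

    # or read a kernel thermal zone directly (value is in millidegrees Celsius)
    cat /sys/class/thermal/thermal_zone0/temp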