- How do you keep your servers up to date?
- When using a package manager like Aptitude, do you keep an upgrade / install history, and if so, how do you do it?
- When installing or upgrading packages on multiple servers, are there any ways to speed the process up as much as possible?
11 Answers
On Linux/Debian-based systems, cron-apt is a very handy tool for automating apt via cron. I'm using it to run apt-get update every day and send me an email if new updates need to be installed.
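As a rough illustration of that kind of setup, here is a minimal sketch of /etc/cron-apt/config; the mail address is a placeholder and the option values should be checked against the cron-apt documentation on your system:

```
# /etc/cron-apt/config (sketch; verify option names/values against cron-apt's docs)
MAILTO="admin@example.com"   # hypothetical address
MAILON="upgrade"             # mail only when the nightly run found something to report
```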
I like to keep auto-updated packages to a minimum, and the important ones are the security updates. For this reason, I add the following to the cron-apt config file: `OPTIONS="-o Dir::Etc::SourceList=/etc/apt/security.sources.list"` and then make /etc/apt/security.sources.list have just the Debian security repositories enabled. That way, I get all the security updates installed automatically and in a timely manner (each night), and I can do other, riskier upgrades that might break things by hand. – Drew Stephens Feb 15 '12 at 18:20
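A sketch of what that split might look like; the suite name is an assumption and should match the release actually in use:

```
# /etc/apt/security.sources.list -- only the Debian security suite enabled
# ("bookworm-security" is an assumption; use the suite for your release)
deb http://security.debian.org/debian-security bookworm-security main
```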
Regarding your third question: I always run a local repository. Even if it's only for one machine, it saves time in case I need to reinstall (I generally use something like aptitude autoclean), and for two machines, it almost always pays off.
For the clusters I admin, I don't generally keep explicit logs: I let the package manager do it for me. However, for those machines (as opposed to desktops), I don't use automatic installations, so I do have my notes about what I intended to install to all machines.
wow; is everyone upvoting because I'm so brilliant, or are people racing to get a badge? ;) – Mikeage Apr 30 '09 at 07:52
I use apt-history for history. I've no idea why this useful tool isn't included by default; it's the first package I deploy with Puppet.
How does apt-history differ from what's recorded by default in /var/log? – jldugger May 01 '09 at 21:05
I wasn't aware of what you're referring to (from your answer); I guess I got to know apt-history and got accustomed to it. – mark May 04 '09 at 20:20
I run /usr/bin/apt-get update -qq; /usr/bin/apt-get dist-upgrade -duyq as a cron job every night. In the morning I have a notification of what packages need to be upgraded, and the files have already been downloaded to the machine.
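A minimal crontab sketch for that nightly run; the schedule and mail address are assumptions, and cron's own mailing of the job output provides the morning notification:

```
# /etc/cron.d/nightly-apt (sketch; time and address are assumptions)
MAILTO=admin@example.com
# Refresh package lists quietly, then download -- but do not install -- pending upgrades.
0 4 * * * root /usr/bin/apt-get update -qq; /usr/bin/apt-get dist-upgrade -duyq
```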
Then I typically take a snapshot of the machine (most of our servers are virtual), do an apt-get dist-upgrade, check nagios and ensure everything is still working, and remove the snapshot.
Finally, I keep a running list of all changes made to every server on a wiki, in order to track any problems that arise later.
As far as limiting redundant downloads goes, I understand that you can set up a caching web proxy (Squid?) between your servers and the internet that will cache the .deb files the first time they are accessed. Perhaps this is simpler than setting up a local package repository, and it has the added benefit of speeding up general web browsing.
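A hedged sketch of that approach, with the host names, port, object size, and refresh pattern all being assumptions to adapt. On each server, point apt at the proxy:

```
// /etc/apt/apt.conf.d/01proxy (sketch; host and port are assumptions)
Acquire::http::Proxy "http://squid.example.com:3128";
```

And on the proxy, let Squid hold on to large .deb objects:

```
# squid.conf excerpt (sketch): allow big objects and keep .deb files around longer
maximum_object_size 512 MB
refresh_pattern \.deb$ 129600 100% 129600
```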
For our Windows boxes we have a local WSUS server and a standard window to apply the monthly patches. For the Linux systems (RHEL) we have an RHN Satellite server on campus that they are all joined to. That provides a nice dashboard of every joined system that you administer, as well as the unapplied updates for each system. For those that are in Puppet, we push out a script that auto-applies patches during a regular window and sends an email notification with the results.
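A minimal sketch of what such a pushed patch script could look like on the RHEL side; the log path and recipient address are assumptions, and the real script would be site-specific:

```
#!/bin/sh
# Sketch of an auto-patch script pushed out via Puppet (paths/addresses are assumptions).
# Apply all pending updates non-interactively, then mail the output as the notification.
yum -y update > /tmp/yum-update.log 2>&1
mail -s "yum update on $(hostname)" sysadmins@example.com < /tmp/yum-update.log
```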
apt-cacher is handy for caching packages: it caches a package the first time it's needed rather than mirroring the entire repository, which saves disk space and bandwidth. It's also handy because it streams the first request for a package directly to the requester while caching it at the same time, so there's no additional delay.
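One common way to use it is to route each client's sources.list entries through the cache host; a sketch, where the cache host name, mirror, and suite are assumptions and 3142 is apt-cacher's customary default port:

```
# /etc/apt/sources.list on each client (sketch; host, mirror, and suite are assumptions)
# The original mirror path simply goes after the cache host and port.
deb http://apt-cacher.example.com:3142/deb.debian.org/debian bookworm main
```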
Running a local repository is the best way to manage exactly what's on your local servers. It also lets you easily deploy custom backports or custom local packages. I've been known to make local 'meta packages' that are just a huge list of dependencies to make a local install easy (e.g. `apt-get install local-mailserver`). This has the side effect of also letting you 'version' your config changes. (For more complicated config management you'll need something like Puppet.)
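One way to build such a meta package is with the equivs tool; a sketch of a control file for the hypothetical local-mailserver example above, with an illustrative dependency list:

```
# local-mailserver.control (sketch) -- build with `equivs-build local-mailserver.control`,
# then publish the resulting .deb in the local repository. Dependencies are illustrative.
Section: metapackages
Package: local-mailserver
Version: 1.0
Depends: postfix, dovecot-imapd, spamassassin
Description: Site meta package pulling in the local mail server stack
 The dependency list above is only an example; adjust it to the site's real stack.
```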
On openSUSE, SLES, and Novell OES (all SUSE-based products), we've got a script that runs zypper and looks for packages that need to be updated. When it finds one, it submits a ticket to JIRA and assigns it to the sysadmins. When we install the updates, we close the ticket, which leaves an audit trail that tells us when the update was installed and by whom. This can be reconciled/confirmed with the zypper and sudo logs, which are centralized on a syslogging server.
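A rough sketch of the checking half of such a script; the pattern used to detect pending updates and the hand-off address are assumptions (the actual JIRA submission is site-specific and only hinted at here):

```
#!/bin/sh
# Sketch: look for pending updates with zypper and hand them off for ticketing.
UPDATES=$(zypper --non-interactive list-updates)
# zypper marks available updates with a "v" in the status column of its table output
# (assumption: verify against your zypper version before relying on this).
if echo "$UPDATES" | grep -q '^v '; then
    echo "$UPDATES" | mail -s "Pending updates on $(hostname)" jira-intake@example.com
fi
```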
When using a package manager like Aptitude, do you keep an upgrade / install history, and if so, how do you do it?
apt keeps a log in /var/log/apt/, and dpkg uses /var/log/dpkg.log. dpkg in particular is fairly parsable.
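For example, a quick way to pull install and upgrade events out of the dpkg log (the grep pattern matches dpkg's usual log wording):

```
# List package installations and upgrades recorded by dpkg
grep -E ' (install|upgrade) ' /var/log/dpkg.log
# apt's own history (invocations, start/end times, package lists) lives here
less /var/log/apt/history.log
```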
You can have a local repository and configure all servers to point to it for updates. Not only do you get the speed of local downloads, you also get to control which official updates you want installed on your infrastructure, in order to prevent any compatibility issues.
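On the client side this just means pointing sources.list at the internal mirror; a sketch, where the host name, suites, and components are assumptions:

```
# /etc/apt/sources.list on each server (sketch; host, suites, and components are assumptions)
deb http://mirror.internal.example.com/debian bookworm main contrib
deb http://mirror.internal.example.com/debian-security bookworm-security main
```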
On the Windows side of things, I've used Windows Server Update Services with very satisfying results.