I actually strongly disagree with the answers so far, which is why I'm adding another perspective. My approach is roughly based on one of @Cyclic3's comments, amended as required.
**The actual attack vector:**

When there already is a root shell, why deliver the payload via the package manager? `apt-get` is not the main problem (it is not essential for installing software); inappropriate software provisioning by unauthorized users is, and those users should instead request that packages be installed for them. From a security perspective, letting them surf the internet with a `root` account would be about the same level of idiocy.
On a side note, independent of `apt-get`, they could always `wget` some 0-day exploit that performs privilege escalation. Preventing that would require whitelisting destinations on the firewall, which does not amuse users in general, because it shuts down all of their casual internet usage (which may also include job-related web research). This aspect is often completely ignored.
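To make the destination-whitelisting idea concrete, here is a rough `iptables` sketch with a default-deny outbound policy. The addresses are placeholders, not real infrastructure; adapt them to your own DNS server and package mirror:

```shell
# Default-deny outbound policy: a root shell cannot simply wget an
# exploit from an arbitrary host, only reach whitelisted destinations.
# 10.0.0.53 and 192.0.2.10 are placeholder addresses -- substitute yours.

iptables -P OUTPUT DROP                        # drop all outbound traffic by default
iptables -A OUTPUT -o lo -j ACCEPT             # allow loopback
iptables -A OUTPUT -m state --state ESTABLISHED,RELATED -j ACCEPT
iptables -A OUTPUT -p udp --dport 53 -d 10.0.0.53  -j ACCEPT   # internal DNS only
iptables -A OUTPUT -p tcp --dport 443 -d 192.0.2.10 -j ACCEPT  # internal apt mirror
```

As the text notes, in practice users tend to revolt against such a policy, because legitimate web research dies along with the `wget` vector.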
**One simple self-service solution:**

Since nobody wants to play the server serf... why not write a simple web UI that lets them select packages and then provisions those packages to the machine they requested them for? The backend just `ssh`es into the machine and runs `apt-get`. This is a form of whitelisting (I only know `yum` in detail, which supports blacklisting but not whitelisting). In general, this is a very simple way to come close to zero risk, as the `root` shell and `apt-get install` are kept out of the users' reach. By triggering this as a batch job from the `root` crontab (assuming the `root` SSH keys match on all machines), one does not even have to switch user accounts. Just push a notification when done and they'll be amazed.
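The crontab-driven batch job could look roughly like this minimal sketch. The queue directory, whitelist file, and request format are all assumptions for illustration, not an existing tool:

```shell
#!/bin/bash
# Minimal sketch of the crontab-driven install job (all paths/formats assumed).
# The web UI drops one-line request files ("hostname packagename") into $QUEUE;
# this job, run from the root crontab, installs whitelisted packages via ssh.

WHITELIST="/etc/pkg-selfservice/whitelist"   # one approved package name per line
QUEUE="/var/spool/pkg-selfservice"           # request files written by the web UI

is_whitelisted() {
    # exact, literal match against the whitelist -- no patterns, no substrings
    grep -Fxq -- "$1" "$WHITELIST" 2>/dev/null
}

for req in "$QUEUE"/*.req; do
    [ -e "$req" ] || continue                # glob matched nothing, skip
    read -r host pkg < "$req"
    if is_whitelisted "$pkg"; then
        # root SSH keys are assumed to match on all machines
        ssh "root@$host" apt-get install -y "$pkg" &&
            echo "installed $pkg on $host"   # hook your notification here
    else
        echo "rejected $pkg for $host (not whitelisted)" >&2
    fi
    rm -f -- "$req"
done
```

Because the match is exact (`grep -Fx`), users can only ever trigger installs from the approved list; they never see the `root` shell or the `apt-get` invocation itself.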
As I used to say when working in user support: what they cannot click, they cannot mess up.