
I have been thinking about deploying completely Linux-based networks, both servers and desktops.

I'm familiar with Linux servers and Linux desktops, but not with giving Linux desktops to standard users. So the problems I'm expecting to hit are centralized configuration and user management, centralized software management, security policies, and so on.

What I would like are any must-read docs or books, suggestions from the community on software being used or recommended distributions, or other information to get me started.

gacrux

4 Answers


The simplest configuration for this goes something like:

Servers

Build one or more servers:

  • a 'Level 0' server that has a DNS server such as BIND and NIS on it. NIS is by far the simplest identity management solution for Unix/Linux and has been in use since the days of Sun workstations. Read about NIS security and decide if this is acceptable. In most cases it is acceptable for use behind a firewall but not for a publicly facing web server. The advantage of NIS is that it's dead simple and you can reasonably expect to figure out how to set it up in a few hours. Administration is also fairly straightforward; take a look at the NIS HOWTO. A minimal setup sketch follows this list.

  • Another server running NFS for a file server plus any other services such as mail that you need. In addition, you can use this as a backup for your DNS and NIS servers. Unless you have a heavy load you probably don't need more than one machine for this. The individual apps play together much more nicely than their Windows counterparts.
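
A minimal sketch of the NIS master setup on the 'Level 0' server. The domain name and hostnames here are placeholders, and package and service names vary slightly by distribution (Debian-style shown); the NIS HOWTO has the details for your distro:

    # on the NIS master ('Level 0' server); Debian-style commands
    apt-get install nis                       # 'ypserv' on Red Hat-style systems
    domainname example.nis                    # NIS domain, unrelated to DNS domains
    echo "example.nis" > /etc/defaultdomain
    /usr/lib/yp/ypinit -m                     # build the maps from /etc/passwd, /etc/group, ...
    /etc/init.d/nis start

    # on each client: point ypbind at the master and enable NIS lookups
    echo "domain example.nis server level0.example.com" >> /etc/yp.conf
    # then add 'nis' to the passwd/group/shadow lines in /etc/nsswitch.conf, e.g.:
    #   passwd: files nis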

Tune the NFS server as necessary. There are several options for SMTP servers (e.g. postfix) and various IMAP servers and other bits of infrastructure kit. Select ones that you are comfortable with.
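
The exports themselves are just lines in /etc/exports; a sketch, assuming a LAN of 192.168.1.0/24 (substitute your own subnet and paths):

    # /etc/exports on the file server (subnet and paths are hypothetical)
    /home     192.168.1.0/24(rw,sync,no_subtree_check)
    /shared   192.168.1.0/24(rw,sync,no_subtree_check)

    exportfs -ra    # re-read /etc/exports without restarting the NFS server

    # a common first tuning knob is the number of nfsd threads
    # (RPCNFSDCOUNT in /etc/default/nfs-kernel-server on Debian-style systems)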

If you have application and database servers you can install lightly loaded ones on this server. On a larger 'general' server you could mount multiple volumes and put the database volumes on their own disks. This is particularly attractive if you are using a SAN.
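
As an illustration of the separate-volume idea, the database's data directory can simply be the mount point for its own disk or SAN LUN; a hypothetical /etc/fstab line:

    # dedicated volume for the database's data directory (device name is made up)
    /dev/sdb1   /var/lib/mysql   ext3   defaults,noatime   0   2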

In most cases you do not need servers at the rate of one per application. Linux apps tend to play nicely in their own space and do not usually trip each other up. Save admin and hardware costs by avoiding unnecessary proliferation of server hardware. You are probably better off going for fewer large servers than more small ones.

VMs are less of a win for this than they are for Windows apps, where people tend to deploy each app on its own server. There is probably no benefit to running the infrastructure on a VM. This also keeps it simple.

If you have an application with a heavy load you might be better off putting it on its own server so it doesn't affect other apps. This also lets you tune the server specifically for that application.

Workstations

Set up your workstations with the system installed on a local disk and /home mounted off the file server. Users' home directories are mounted off the NFS server and secured through standard system security. This configuration was historically called 'dataless' and gives you a single system image that can follow a user to any workstation with no local state.
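
The 'dataless' part is a single line in each workstation's /etc/fstab (or an equivalent autofs map); a sketch, assuming the file server is called fileserver:

    # /etc/fstab on a workstation - /home lives on the NFS server
    fileserver:/home   /home   nfs   rw,hard,intr,rsize=32768,wsize=32768   0   0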

  • Leave a scratch partition on the local disks if you have anyone who needs a large amount of local data storage but make it clear that anything they want backed up should be copied to the file server.

  • Create one or more NFS shared volumes. When your users need shared directories, make them on these and set up permissions appropriately (see the sketch below).
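
For the shared directories, the usual pattern is a group plus the setgid bit on the directory; a sketch on the file server (group and path names are made up, and with NIS the group would live in the NIS group map rather than the local /etc/group):

    groupadd engineering                   # or add the group to the NIS group map
    mkdir -p /shared/engineering
    chgrp engineering /shared/engineering
    chmod 2775 /shared/engineering         # setgid keeps new files owned by the group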

Voila: instant network infrastructure. This is about as simple as Linux network infrastructure gets, and this type of architecture has historically been scaled to entire university campuses. If you need more secure authentication you can do something with Kerberos/LDAP, but this is much, much more complex than NIS.

Legacy Windows interoperability

Where your users are stuck on Windows, you have several basic solutions:

  • Terminal services: Terminal Services or Citrix clients such as rdesktop or the Linux Citrix client can be used to publish apps from a terminal server.

  • Emulation: WINE/CrossOver or a VM can be used to run Windows applications on a Linux desktop.

  • Substitution: Find a substitute (e.g. OpenOffice for MS Office) and use that. In many cases you can do this with 95% of your users and let the 5% that absolutely must have Excel use it on a Windows desktop. If possible, find substitutes that will run on Windows as well so they can be deployed to Windows desktops where you need a mixed architecture.

  • Windows Desktops: Using Samba, you can publish the users' home directories so that they can be mounted on a Windows machine (a minimal smb.conf sketch follows this list). If you have a class of users with a legacy app that is not amenable to emulation (possibly a content creation app such as Adobe InDesign) they can run Windows locally and use an X server (Xming or StarNet are the best options) to get at the Linux apps. Be ruthless about this - make the user prove their dependency and make a business case to keep the Windows desktop.

    The key here is to treat Windows as a legacy system. Users get to keep their Windows desktop if and only if there is no credible substitute for the application - and running the application in emulation is not acceptable. Getting your network interoperability right allows you to migrate users in a phased manner, which avoids the need for a 'big bang' deployment.
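
A minimal sketch of the Samba piece - publishing home directories to the remaining Windows desktops - with illustrative settings only (a production smb.conf needs more thought about security and domain membership):

    # fragment of /etc/samba/smb.conf on the file server
    [global]
       workgroup = EXAMPLE
       security = user

    [homes]
       comment = Home Directories
       browseable = no
       read only = no

    # and in the other direction, a Linux desktop reaching a Windows terminal server:
    #   rdesktop -f termserver.example.com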

Network Security

Add a firewall for the network and a DMZ as necessary. Public SMTP servers can be outsourced to your ISP or placed in the DMZ. However, servers in this zone should not use NIS for authentication. Consider using OpenBSD for any machines exposed to the public internet. Squid is the canonical Unix web proxy software if you need to proxy your internet connection.
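
For a Linux-based firewall, the core of the ruleset is a default-deny INPUT policy plus a handful of accepts; a sketch only (interface names and permitted services are assumptions, and a real gateway also needs FORWARD and NAT rules):

    iptables -P INPUT DROP
    iptables -P FORWARD DROP
    iptables -A INPUT -i lo -j ACCEPT
    iptables -A INPUT -m state --state ESTABLISHED,RELATED -j ACCEPT
    iptables -A INPUT -i eth1 -p tcp --dport 22 -j ACCEPT   # ssh from the LAN (eth1 assumed internal)
    iptables -A INPUT -i eth0 -p tcp --dport 25 -j ACCEPT   # smtp on the public side (eth0 assumed external)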

'Group Policy' has no direct equivalent in Linux, as the concept is not relevant when you have centrally mounted user directories. 'Group Policy' is not a requirement but a Windows-specific kludge rooted in the single-user origins of Windows, where user identity and machine configuration are very heavyweight structures. Migrating user identity and permissions between Windows desktops is a very complex affair.

On unix-derived systems, all per-user config is stored as files in their home directory. When the user logs in their .profile is executed and per-user settings appear in the environment. If they manage to break their environment a simple script will restore the config to a known default. It is very difficult for an individual non-privileged user to do something that damages the machine configuration.
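
That restore script can be as small as copying the distribution's skeleton dotfiles from /etc/skel back over the user's home directory; a rough sketch, run as root (it overwrites dotfiles with the default versions but leaves the user's other files alone):

    #!/bin/sh
    # usage: reset-dotfiles <username>
    user="$1"
    home=$(getent passwd "$user" | cut -d: -f6)   # look up the home directory
    cp -a /etc/skel/. "$home"/                    # copy the default dotfiles back in
    chown -R "$user": "$home"                     # give the user ownership of the restored files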

There are a variety of ways to push updates out to workstations. These range from a central repository where systems automatically download updates from a central server (i.e. the default way that desktop distros do this now) to more elaborate enterprise configuration management systems such as cfengine.
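
At the simple end this is nothing more than a cron job driving the package manager against your chosen repository (Debian/Ubuntu commands shown as an example; yum-based distributions have equivalents):

    # /etc/cron.d/auto-update on each workstation - nightly unattended updates
    30 3 * * *   root   apt-get -qq update && apt-get -qq -y upgrade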

  • I argue against OpenBSD not because I think it is bad security-wise, but because an OS is only ever as secure as the admin configures it to be. Choosing OpenBSD because it has a good security record in the default install is no good if you don't know how to keep it secure. – Martin M. Jun 10 '09 at 10:41

I'm also interested in what others have to say. Meanwhile, be aware that one of the biggest obstacles you're going to run into is your users.

They will be very reluctant, and support will be swamped with very simple requests.

Start with the servers. Make the server-side transition transparent for your users.

Also, make sure to test all your hardware and applications first (in case you need Wine or VMware or similar), so any compatibility issues and quirks are resolved before deployment.

Ivan
  • The solution to recalcitrant users is a policy of 'two strikes and then you get sent on a mandatory training course'. Also, make sure you have good user documentation and maintain a FAQ when you start to see recurring themes. Refer people to the FAQ when they ask a FAQ. – ConcernedOfTunbridgeWells May 18 '09 at 19:09

You should definitely set up central configuration management (like Puppet or cfengine).

When you make all your configuration changes through it, you'll have the following benefits (a minimal Puppet sketch follows the list):

  • All (or most) of the settings and operations are centralized.
    • Put the configurations into version control and you have all machines' history as a bonus.
  • Servers with the same role are guaranteed to have the same configuration. You will not be surprised by a missing one-off setting on a fresh install.
  • Makes all of your configuration reproducible (and documented).
  • If a server explodes, install the OS on new hardware from scratch and let the configuration management tool apply the relevant directives.
    • Then restore your backups.
    • You should have a well-thought-out backup solution in place as a first item.
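
To give a feel for what 'configuration as code' looks like, a minimal sketch of a Puppet manifest (the resource here is just an example; cfengine expresses the same idea in its own syntax):

    # ntp.pp - declare the end state; the tool works out the steps
    package { 'ntp':
      ensure => installed,
    }

    service { 'ntp':
      ensure  => running,
      enable  => true,
      require => Package['ntp'],
    }

Each node runs an agent that pulls and applies manifests like this from a central master, so a rebuilt server converges on the same configuration automatically.
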
hayalci

Well I'm going to try and take a quick and dirty stab at this. Depending on the resources that are available to you it might be beneficial to look at the Linux Terminal Server Project. This would take care of the centralized configuration and user management by itself.

If this isn't an option (for example if you are using a lot of old hardware and don't have one beefy server) I would look into the Fedora Directory Server project. It won't provide you with centralized configuration, but it will provide you with everything you could want in user management.

TrueDuality
  • If the target is to deploy a Linux-only network, LTSP will not do any good. Rather, keep Windows Terminal Services around (probably on a box that is not too powerful) so that end users feel that Windows is considered legacy. Of course you need to weigh that carefully, because you must not sacrifice productivity. As soon as people start to feel that working with the "hated Linux desktop" is faster/better, they will leave the old system behind without being forced. – Martin M. Jun 10 '09 at 10:44