
I'm thinking about a convenient way to track my VPS configurations and to deploy them in one click (or more or less one).

I have been reading up on and looking into different config deployment tools, of which Puppet seems to be the most popular, only to come back to square one.

Even though the features offered by Puppet attract me a lot and I will probably use it, I can't seem to find a way to do the simple thing easily.

I have to maintain and change configuration files in /etc, and what I would really like to get rid of is ssh -> vim or nano, or sftp. I would like to have the entire folder on my development machine and enjoy the power and speed of Sublime Text to edit the files, then do git commit, git push, restart some services, and hopla, we're updated.

In the Puppet paradigm, it seems to be more like:

file {'/tmp/test1':
  ensure  => file,
  content => "Hi.\n",
}

It was my initial concern that git is not exactly a suitable tool for this kind of deployment (and not being entirely sure of all the risks involved) that led me to read up on Puppet, Chef, ...

Now, how do you tackle this problem? Are there production systems on which you use git/git-deploy/etckeeper in order to edit your /etc configuration offline?

  • You'd definitely want to put your puppet config in version control. Is your concern the fact that you need to define which resources you'd be managing, or? – Shane Madden Nov 17 '13 at 01:04
  • I'm worried about the difference in convenience and speed between having the /etc directory offline to play with and having to write manifest files for Puppet. Other things also come into play, such as how `ensure => file, content => xxx` fits in with system updates. It would seem that etckeeper would give you a nicer overview of how your system evolves. –  Nov 17 '13 at 01:14
  • Not every change to the system can be done by simply changing a file in `/etc`. Configuration management tools define the state that you want servers to be in, while something like `etckeeper` is for keeping track of what you're changing directly in files. If you want your servers to be unique snowflakes that you're fiddling with files in `/etc` regularly on, then don't use a configuration management tool. If you want to define what state your servers should be in and have that state repeatable and easily deployed, then use a config management tool and don't worry about watching all of `/etc`. – Shane Madden Nov 17 '13 at 01:29

1 Answer


Tools like Puppet are designed to bring a configuration to a correct state. Tools like git are designed to track versions of files. Both of these tasks have been referred to as configuration management, which can lead to confusion. In the following, I will use version control to refer to tools like git, and declarative configuration management to refer to tools like Puppet. I will assume you don't need to track which versions of the components are installed by the configuration management tool; this can be important when reproducing bugs. Configuration management tools can track the versions, and may use a version control tool to store the inventory.

Declarative configuration management allows you to specify the desired configuration and leaves it to the tool to work out how to get there. Once you have defined the desired configuration and started the tool, it will take the actions needed to install and configure the desired components. It is recommended that you use a version control tool to maintain this configuration.
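For example, a minimal Puppet sketch of the package/file/service pattern (the ntp package, file path, and pool server below are only illustrative):

# Install the package, manage its main config file, and keep the service running
package { 'ntp':
  ensure => installed,
}

file { '/etc/ntp.conf':
  ensure  => file,
  content => "server 0.pool.ntp.org iburst\n",
  require => Package['ntp'],
  notify  => Service['ntp'],
}

service { 'ntp':
  ensure => running,
  enable => true,
}

Running the agent repeatedly converges the host to this state; a hand edit to /etc/ntp.conf would be reverted on the next run.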

EDIT: When running automated configuration, it is important to run the package manager in a mode that does not prompt for configuration values. (Ubuntu/Debian can use a pre-seed file to override default configuration values.) If the package manager updates files in /etc, you will need to evaluate the changes and may need to adjust the declarative configuration. (Ubuntu/Debian will track the replacement by adding an extension like .dpkg-old or .dpkg-new to the filename.) I scan for changes after upgrading the O/S version, which often generates a number of configuration file changes.
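To sketch the pre-seed approach in Puppet (assuming the apt package provider; the postfix package, module name, and preseed path are only examples):

# Ship the preseed answers, then let the package manager consume them
file { '/var/cache/local/preseed/postfix.preseed':
  ensure => file,
  source => 'puppet:///modules/mail/postfix.preseed',
}

package { 'postfix':
  ensure       => installed,
  responsefile => '/var/cache/local/preseed/postfix.preseed',
  require      => File['/var/cache/local/preseed/postfix.preseed'],
}

The responsefile attribute hands the preseed answers to the package manager so the install does not stop to prompt.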

There are a variety of methods that can be used to update the configuration files in /etc. Three that I have used are:

- using the platform's configuration tool to update the files (Ubuntu/Debian has a pre-configuration mechanism for this);
- using the declarative configuration management tool to edit the configuration files; and
- copying the configuration file from a central repository.
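To sketch the last two of those in Puppet (the augeas type ships with Puppet; the sshd setting and module path are only examples):

# Edit a single value in place, leaving the rest of the file alone
augeas { 'sshd-disable-root-login':
  context => '/files/etc/ssh/sshd_config',
  changes => 'set PermitRootLogin no',
}

# Replace the whole file with a copy held in the central repository
file { '/etc/rsyslog.conf':
  ensure => file,
  source => 'puppet:///modules/rsyslog/rsyslog.conf',
}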

Each system will have a small number of configuration files which are host specific. These will include files related to host naming, static network configuration, and others. If you update a file by editing it, it is possible to introduce local changes which don't get corrected by the declarative configuration management tool.
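One way to handle a host-specific file is to derive the content from facts instead of hard-coding it per host; a sketch, assuming the standard fqdn fact and using Debian/Ubuntu's /etc/mailname as the example:

# The file content comes from the host's own fqdn fact, so one resource serves every node
file { '/etc/mailname':
  ensure  => file,
  content => "${::fqdn}\n",
}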

Version control tools track the contents of files as they change. Normally, they require a manual commit action, so it is possible for the managed files to get significantly out of sync with the controlled versions. Discipline is required to track changes.

The approach I use to ensure atomic changes is to use a version control tool to supply the configuration for my declarative configuration management tool:

- I edit the configuration in my own working directory. Once changes are complete, I verify the changes and commit them.
- The master server begins its process by updating its configuration. (Larger installations could have two or more branches and master servers, allowing changes to be pushed out to a small set of servers for testing.)
- The clients pull their configuration changes from the master server.
- The declarative configuration management tool compares the running configuration to the desired configuration and applies any required changes.
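As a rough sketch of the master updating its own configuration from version control (this assumes the puppetlabs/vcsrepo module; the repository URL, branch, and directory are hypothetical):

# Keep the master's manifests in sync with the named branch of the config repository
vcsrepo { '/etc/puppet/environments/production':
  ensure   => latest,
  provider => git,
  source   => 'git@git.example.com:ops/puppet-config.git',
  revision => 'production',
}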

BillThor
  • Maybe I wasn't clear, but I'm thinking about configurations like: apache, mysql, php, tor, gitolite, ... server security hardening, every time updates ask whether you want to keep your version or the package maintainer's version (and where you would ideally take the maintainer's version and then merge your customizations into the new one)... Do you do all of that in the syntax of your config management tool? Isn't that inconvenient? –  Nov 17 '13 at 02:27
  • @ufotds I've added a comment on how I work with the package manager's changes. My Apache configuration is mostly out of the box with a few custom files used to configure the virtual servers. Most other installs tend to use the default configuration, or minimal changes. – BillThor Nov 17 '13 at 02:44
  • thanks for the clarifications. That's interesting, but it still seems so much more inconvenient to scan for dpkg files and either diff them over ssh or ftp them to compare, when we have fancy colorful diff viewers like gitg that I could use if I could just commit the lot and do git pull. I think in the end I will try to cook up a failsafe git solution which might work together with config management for enabling different states on my servers... (turn logging on or off, manage user accounts, ...) –  Nov 17 '13 at 03:39
  • @ufotds It would be possible to have the new files copied/moved to a git working directory where you can diff/merge and commit any changes. I don't believe a git tool would handle package installation for you. – BillThor Nov 17 '13 at 18:39