
I'm moving from a single app server to a load-balanced configuration.

In the single-server config I synchronized the sources to S3 with s3cmd sync every few minutes, and on HostUp (a Scalr event) I sync the sources back down from S3. This ensures the server runs up-to-date sources.
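
For reference, the single-server workflow is roughly the following (bucket name and paths are placeholders):

    # cron, every few minutes: push the local sources up to S3
    s3cmd sync --delete-removed /var/www/app/ s3://my-app-bucket/sources/

    # Scalr HostUp script: pull the sources back down when a server comes up
    s3cmd sync --delete-removed s3://my-app-bucket/sources/ /var/www/app/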

What is the best practice when the number of app servers is dynamic?

Suppose I have two servers and server A gets updated sources. If the sync script runs on B first, server A will end up with the old files from B instead of the other way around.

What is the best practice in this case?

Niro

4 Answers


The following are the main protocols used to centralize storage:

  • SSHFS - Transfers everything over SSH, so it's secure and may be used over the Internet with no worries. SSH supports transparent compression if you want it. However, SSHFS can be difficult to get working.

  • NFS - Inherently insecure, and relies on client IP addresses for access control, but usually very easy to get working.

Personally, I would try SSHFS first and fall back to NFS if it doesn't work; a rough mount sketch for both is below.
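
For illustration only (the hostnames and paths are made up), mounting a shared source directory from a central storage host would look roughly like this:

    # SSHFS: mount the app sources from a storage host over SSH
    sshfs appuser@storage01:/srv/app /var/www/app -o reconnect,compression=yes

    # NFS: same idea, assuming storage01 exports /srv/app to the app servers
    mount -t nfs storage01:/srv/app /var/www/app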

Soviero

I'm not sure what you mean by sources, but I'm assuming you mean the web application code your servers are serving.

The best solution to this is to use a shared network file system to keep the sources updated dynamically. Two of the most popular are NFS and GlusterFS.
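
As a rough sketch (volume name, brick paths, and hostnames are placeholders), a two-node replicated GlusterFS volume could be set up and mounted like this:

    # on gluster1: add the second node, then create and start a replicated volume
    gluster peer probe gluster2
    gluster volume create appdata replica 2 gluster1:/bricks/app gluster2:/bricks/app
    gluster volume start appdata

    # on each app server: mount the shared volume where the sources live
    mount -t glusterfs gluster1:/appdata /var/www/app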

Kyle

You can use inotify (inotifywait) and/or rsync. Which fits best depends on how many files you have, how often you update them, and how large they are.
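
A minimal sketch of that idea (the peer hostname and paths are placeholders): watch the source tree and push each change to the other app server as it happens.

    #!/bin/bash
    # Watch the source tree and push every change to the peer server.
    SRC=/var/www/app/
    PEER=appuser@app-b

    while inotifywait -r -e modify,create,delete,move "$SRC"; do
        rsync -az --delete "$SRC" "$PEER":/var/www/app/
    done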

Mircea Vutcovici

Put your master source in S3. Have all servers only sync down (pull from S3) on startup.

When you want to update your servers:

  1. Update S3
  2. Start new instances (which will pull from S3 on start; see the sketch after this list)
  3. Stop and/or terminate your old instances running the old version
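
A rough sketch of the startup step (bucket name, path, and service name are placeholders; aws s3 sync would work the same way):

    #!/bin/bash
    # Run once at instance startup (user data or a Scalr HostUp script):
    # pull the master copy of the sources down from S3 -- never push back up.
    s3cmd sync --delete-removed s3://my-app-bucket/sources/ /var/www/app/
    service apache2 restart    # or whatever serves the app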
Matt Houser