
On CentOS 7:

I have a set of servers that together form the environment for a web application. Every client has their own set of servers, and the corresponding servers in every set have similar software installed.

So every set is the same, but not every server within a set is the same. During installation, all kinds of Bash commands are run to install the necessary software.

So far so good. However, servers need to be updated, and updates can break things, so I don't want a fully automated process for this: e.g. blindly running yum update -y and hoping things still work.

My plan is to have a "standard" for every server in the set, and update the same server in all sets to this standard.

E.g. when I upgrade PHP on the standard, I want to somehow roll that out to the corresponding server in every set without affecting the local files of the web app. I'm only talking about the software that supports the web app, for example PHP, Nginx, Dovecot, etc.

Is this somehow possible?
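For concreteness, here is a minimal sketch of what I have in mind on CentOS 7, comparing an exact package snapshot of the standard server against another server in a set. The snapshot contents and file paths below are made-up examples; on real machines the snapshots would come from rpm -qa.

```shell
# Sketch: pin every server to the "standard" server's exact package versions.
# On each real box the snapshot would be produced with:
#   rpm -qa --qf '%{NAME}-%{VERSION}-%{RELEASE}.%{ARCH}\n' | sort > pkgs.txt
# Below, two hypothetical snapshots stand in for that output.

printf 'nginx-1.20.1-1.el7.x86_64\nphp-5.4.16-48.el7.x86_64\n' | sort > /tmp/standard.txt
printf 'nginx-1.20.1-1.el7.x86_64\nphp-5.4.16-45.el7.x86_64\n' | sort > /tmp/local.txt

# Lines only in the standard snapshot = versions the local box must catch up on.
# (comm requires sorted input, hence the sort above.)
comm -23 /tmp/standard.txt /tmp/local.txt > /tmp/to-update.txt
cat /tmp/to-update.txt

# After reviewing the list, install exactly those versions. This needs repos
# that still carry them, e.g. a local mirror kept in sync with the standard:
#   xargs -a /tmp/to-update.txt -r yum install
```

The yum step is left commented out because the point is the review-then-apply workflow, not an unattended update.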

  • In this situation I would pull a list of installed packages on each server from dnf, pip3, etc., centralize them by concatenation, then sort/unique the lines into another file. You can then loop over that file, reading each line to update or install the package (without any risk of removing anything); every server will then have AT LEAST this list (plus a few extras due to each machine's history). At that point you can do a dnf makecache && dnf update (do not use -y on the first run). Of course, do all of this only after a cold backup of every server. The same method works on other distros (apt, pacman, etc.). – francois P Oct 24 '21 at 11:19
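The merge step the comment describes can be sketched as follows. The per-server list files and package names here are hypothetical examples; on CentOS 7 substitute yum for dnf.

```shell
# Sketch of the comment's approach: merge per-server package lists into one
# sorted, de-duplicated baseline, then install from it on every server.
# The two list files below are made-up stand-ins; real ones would come from
# e.g.:  rpm -qa --qf '%{NAME}\n' > serverN-pkgs.txt

printf 'php\nnginx\n'     > /tmp/server1-pkgs.txt
printf 'nginx\ndovecot\n' > /tmp/server2-pkgs.txt

# Concatenate and de-duplicate into a single baseline every server must meet:
sort -u /tmp/server1-pkgs.txt /tmp/server2-pkgs.txt > /tmp/baseline.txt
cat /tmp/baseline.txt

# Installing an already-present package is a no-op, so looping over the
# baseline cannot remove anything (omit -y on the first run, per the comment):
#   while read -r pkg; do yum install "$pkg"; done < /tmp/baseline.txt
```

The install loop is commented out; it assumes the servers share repositories so the same package names resolve everywhere.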

0 Answers