I manage several Linux servers for clients, in roles such as email, caching, web serving, filtering, firewalling/routing, and so on.
Since I don't own these computers and just provide remote support, central management systems like Puppet don't seem like they are the correct tool. (Please correct me if you think I am wrong about this assumption)
What tools do you recommend to track changes of configuration files, package installs and so on?
I am thinking something like etckeeper may be close to what I need, but I want to know if there is something better.
Update
We will have backups of the systems, and I wouldn't expect this type of tool to be an alternative to a backup. This is about keeping track of configuration changes and having a way to know what changed, when, by whom, and hopefully why.
I've got etckeeper on my personal workstation, but I've not had to do much with it yet (other than have it track all my changes). Seems like it does a reasonable job of making sure you at least know what's been fiddled with.
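For reference, getting etckeeper going is only a couple of commands. This is a minimal sketch for a Debian/Ubuntu box where etckeeper uses git as its backing VCS; adjust the package manager and VCS for your distribution:

    apt-get install etckeeper
    etckeeper init                      # create the repository in /etc
    etckeeper commit "initial import"   # first snapshot

    # later, to see what has been fiddled with:
    cd /etc
    git log --stat                      # commit-by-commit summary of changes
    git diff                            # uncommitted edits since the last snapshot

It also hooks into the package manager, so package operations trigger a commit of /etc automatically.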
I wouldn't write off Puppet as a solution -- as long as some of the services on the machine are your responsibility to maintain, then a system that makes sure that if someone jiggles your config that it gets put back the way you want it is a godsend.
On the other hand, if others make changes regularly (and they don't usually screw it up), you might have to resort to just tracking what other people have done for later disaster recovery. Don't forget that things will be changed all over the place, so a full-machine checkpoint tool might be better. I'd perhaps even consider going full-disk incremental backup on it (like rdiff-backup or something) to be sure you're tracking everything (maybe drop /home and other user-level areas out of the backup, if you just want to track administrative changes).
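As a rough sketch of that approach (the paths below are only examples): rdiff-backup keeps a current mirror plus reverse increments, so you can list and pull back earlier versions of any file:

    # incremental snapshot of the machine, minus user data and pseudo-filesystems
    rdiff-backup --exclude /home --exclude /proc --exclude /sys --exclude /tmp \
        / /srv/backups/myhost

    # what has changed in the last 7 days?
    rdiff-backup --list-changed-since 7D /srv/backups/myhost

    # restore a file as it was 7 days ago
    rdiff-backup -r 7D /srv/backups/myhost/etc/ssh/sshd_config /root/sshd_config.old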
You may want to look at Tripwire or AIDE. Both will track config file changes on your machines.
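With AIDE, for example, the basic workflow is to build a baseline database and then compare the live system against it. A minimal sketch; the database locations below are typical defaults and come from aide.conf, so check yours:

    aide --init                                         # build the baseline database
    mv /var/lib/aide/aide.db.new /var/lib/aide/aide.db  # promote it to the active database
    aide --check                                        # report anything changed since the baseline
    aide --update                                       # re-baseline once the changes are reviewed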
I've looked at etckeeper, but I haven't used it. However, I have used Changetrack. I've been using it on all of my home machines for many years, and at my previous job it was part of our standard server install. We used it there for the last five years, and had it installed on about 200 boxes.
The setup is trivial (I created an RPM for it at my last job), and the configuration is really simple. I generally set it up to monitor all of /etc/.
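If it helps, the changetrack side is just as small: its configuration file is essentially a list of files/globs to watch, and the check runs from cron. A sketch from memory (double-check the exact syntax and paths in changetrack(1) on your system):

    # /etc/changetrack.conf -- one file or glob per line
    /etc/*
    /etc/*/*

    # /etc/cron.d/changetrack -- run the check hourly
    0 * * * * root /usr/sbin/changetrack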
For tracking package changes (installs, upgrades, etc.) on RPM-based systems, as long as all changes are done with yum or yumex, each package change is logged in /var/log/yum.log.
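To review those package changes later, the yum log plus rpm's own database give you a timeline; the transaction id below is just an example:

    grep -E "Installed|Updated|Erased" /var/log/yum.log   # what yum did, and when
    rpm -qa --last | head -20                             # most recently installed/updated packages
    yum history list                                      # transaction history (newer yum versions)
    yum history info 42                                   # details of one transaction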
Other people have already answered tracking changes in /etc. Don't forget that you also want to track configuration changes to bind, which live partially in /var (at least on many Linux distributions), and that web pages are under /var/www on many Linux distributions. There will be directories outside /etc that have important configuration information in them.

Depending on how things are managed, you may also want to track /usr/local/etc and other directory trees (/opt, some trees under /var, and anything else that is specific to your customers).

As a starting point, you might want to look at Blueprint, which analyzes a system configuration. While it's intended to create configurations for Puppet or Chef, there's no reason you couldn't use it just for reporting.
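Separately from Blueprint, a low-tech way to cover those extra trees is to put each of them under its own git repository, in the same spirit as etckeeper does for /etc. The directories below are only examples:

    for dir in /usr/local/etc /var/named /opt/myapp/conf; do
        (cd "$dir" && git init && git add -A && git commit -m "baseline for $dir")
    done

Then git status / git diff in any of those directories shows you what has drifted since the baseline.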
A simple file monitoring script is filemon. I use it on my home PC, and combined with crontab it does a simple and easy job (a rough sketch of the idea is below). For a more complete integrity-checking solution (file changes, newly installed packages and much more) I use OSSEC on a bunch of servers.
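I don't have the exact filemon invocation to hand, so here is the same idea sketched with standard tools: checksum a tree, diff against the previous run, and mail the difference. The paths and the /etc target are only examples:

    #!/bin/sh
    # naive change detector for /etc, meant to be run from cron
    NEW=/var/tmp/etc.sha256.new
    OLD=/var/tmp/etc.sha256
    find /etc -type f -exec sha256sum {} + | sort -k 2 > "$NEW"
    if [ -f "$OLD" ] && ! diff -u "$OLD" "$NEW" > /var/tmp/etc.sha256.diff; then
        mail -s "/etc changed on $(hostname)" root < /var/tmp/etc.sha256.diff
    fi
    mv "$NEW" "$OLD"

A crontab entry such as 0 * * * * /usr/local/sbin/etc-check.sh (hypothetical path) runs it hourly.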
You could put /etc under a DVCS such as git. Commit every time you make changes; then you can just run git diff whenever you start a job and see everything that has changed.
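A bare-bones version of that, without etckeeper's package-manager hooks (remember to lock down /etc/.git, since files like /etc/shadow end up in the history):

    cd /etc
    git init
    chmod 700 .git          # keep the history readable by root only
    git add -A
    git commit -m "baseline"

    # at the start of a job:
    git status              # new or removed files
    git diff                # edits since the last commit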
LBackup supports logging of file deletions, modifications and additions.