I want to prevent my robots.txt file from syncing to the front end server. Here is my /etc/lsyncd.conf file:
settings = {
    logfile = "/tmp/lsyncd.log",
    statusFile = "/tmp/lsyncd.stat",
    statusInterval = 1,
}
sync {
    default.rsync,
    source = "/var/www/html/blog",
    target = "sync:/var/www/html/blog",
    rsyncOpts = "-ltus",
    excludeFrom = "/var/www/html/blog/robots.txt",
}
The /tmp/lsyncd.log shows:
Normal: recursive startup rsync: /var/www/html/blog/ -> sync:/var/www/html/blog/ excluding
HELLO WORLD
Normal: Startup of '/var/www/html/blog/' finished.
Normal: Calling rsync with filter-list of new/modified files/dirs
/robots.txt
/
Normal: Finished a list = 0
This setup doesn't seem to work.
I guess the excludeFrom path is relative to the source dir; thus, excludeFrom="/robots.txt" might work.

EDIT: Oh my God, forget what I wrote above.
excludeFrom works like rsync's --exclude-from: it specifies a text file containing a list of files to be excluded, one file or pattern per line. So your excludeFrom should point to a file, which in turn contains /var/www/html/blog/robots.txt or /robots.txt.
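For example (a minimal sketch, assuming the patterns live in a separate file at /etc/lsyncd.exclude, a path I'm making up here), the exclude file would contain just the pattern:

/robots.txt

and the sync block would point excludeFrom at that file instead of at robots.txt itself:

sync {
    default.rsync,
    source = "/var/www/html/blog",
    target = "sync:/var/www/html/blog",
    rsyncOpts = "-ltus",
    -- lsyncd reads exclusion patterns from this file; it is not the file to exclude
    excludeFrom = "/etc/lsyncd.exclude",
}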
Alternatively, use rsync's --exclude directive in lsyncd's config parameter rsyncOpts, or _extra, depending on the lsyncd version you use. See https://axkibe.github.io/lsyncd/manual/config/layer4/ and https://community.rackspace.com/general/f/34/t/3503 for more details.
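For instance, on a newer lsyncd that passes extra rsync flags through the rsync._extra table, a sketch of the sync block could look like this (untested against your setup; older releases would take the same flags through rsyncOpts instead):

sync {
    default.rsync,
    source = "/var/www/html/blog",
    target = "sync:/var/www/html/blog",
    rsync = {
        -- extra options appended to the rsync command line;
        -- --exclude keeps robots.txt from ever reaching the front end
        _extra = { "-ltus", "--exclude=/robots.txt" },
    },
}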