I am writing a script that downloads some files and executes them. I only want each filename executed once per host that downloads it. I am trying to create a script that pulls the files, executes them if they haven't already been executed, and then keeps a record of each executed file to prevent duplicate runs. So far every script gets executed on every run, and the .bak record never triggers deletion of the duplicates.
(cd /opt/trunk/mythos/medusa/remote-scripts/ && wget -r --no-parent --reject "index.html*" http://server/medusa/scripts/)
chmod +x /opt/trunk/mythos/medusa/remote-scripts/server/medusa/scripts/*.sh
for each in /opt/trunk/mythos/medusa/remote-scripts/server/medusa/scripts/*.sh
do
    if /usr/bin/test -e $each.bak
    then
        rm -rf /opt/trunk/mythos/medusa/remote-scripts/server/medusa/scripts/$each
    fi
    bash $each
    mv $each $each.bak
done
What am I doing wrong?
Put
bash $each
and
mv $each $each.bak
in an else block:
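For example, a sketch of the rewritten loop (keeping the variable name and paths from your question; note that $each already holds the full path from the glob, so the rm line only needs "$each" rather than the directory prefix repeated):

for each in /opt/trunk/mythos/medusa/remote-scripts/server/medusa/scripts/*.sh
do
    if /usr/bin/test -e "$each.bak"
    then
        # a .bak already exists: this script ran on a previous pass, so just drop the fresh copy
        rm -f "$each"
    else
        # first time this script is seen: run it, then record it as executed
        bash "$each"
        mv "$each" "$each.bak"
    fi
done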
That way the current script will be executed and moved only if
/usr/bin/test -e "$each.bak"
fails (i.e. the current script hasn't been executed and moved previously).

I quoted all the variables with double quotes to prevent them from breaking the commands in case they end up containing weird characters / strings.
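To illustrate the quoting point with a hypothetical filename (not one from your question), an unquoted expansion gets word-split by the shell:

each="/tmp/run me.sh"
bash $each      # unquoted: splits into /tmp/run and me.sh, so the script is not found
bash "$each"    # quoted: the single path is passed to bash intact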