I have a Linux box (Debian) which is up 24/7. I'd like to set it up so that once a day it connects to a specific FTP server and checks whether any files have changed since the last check. If they have, it should send me an email with a list of the changed files.
The check doesn't have to be thorough - I'll be happy if it just compares file dates. But it has to be recursive.
How can I do this? I understand I can use cron to schedule the process, but what do I use to connect and check for changes?
The ncftp client ships with the ncftpls tool, which can do a recursive long listing.
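For example, a sketch (the URL and credentials are placeholders; per the man page, -R requests a recursive long listing if the server allows it):

    # Recursive long listing, saved for comparison against yesterday's run.
    ncftpls -R -u user -p 'pass' ftp://ftp.example.com/ > today.listing
    diff yesterday.listing today.listing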
Here is a Perl solution using Net::FTP. The script below should print the filename and timestamp for each file on the FTP server. You can easily extend it to suit your actual needs.
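A minimal sketch of that approach; the host and credentials are placeholders, and the listing parser assumes a UNIX-style FTP server:

    #!/usr/bin/perl
    use strict;
    use warnings;
    use Net::FTP;

    my $host = 'ftp.example.com';   # placeholder
    my $user = 'anonymous';         # placeholder
    my $pass = 'me@example.com';    # placeholder

    my $ftp = Net::FTP->new($host) or die "Cannot connect to $host: $@";
    $ftp->login($user, $pass) or die "Login failed: " . $ftp->message;

    list_dir('');
    $ftp->quit;

    # Walk the tree depth-first, printing "mtime path" for every plain file.
    sub list_dir {
        my ($dir) = @_;
        for my $line ($ftp->dir($dir || '.')) {
            # Typical UNIX long listing: perms links owner group size date name
            my @f = split ' ', $line, 9;
            next unless @f == 9;
            my ($perms, $name) = ($f[0], $f[8]);
            next if $name eq '.' || $name eq '..';
            my $path = $dir eq '' ? $name : "$dir/$name";
            if (substr($perms, 0, 1) eq 'd') {
                list_dir($path);                 # recurse into subdirectory
            } else {
                my $mtime = $ftp->mdtm($path);   # undef if server lacks MDTM
                printf "%s %s\n", defined $mtime ? $mtime : '?', $path;
            }
        }
    }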
I'm pretty sure lftp can be scripted to do this; otherwise, I'd just use an FTP library for my favourite scripting language (Ruby/Perl/Python) and do the job that way.
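For instance, lftp's find command walks the tree server-side and prints one path per line (names only, no timestamps); a sketch, with ftp.example.com and the credentials as placeholders:

    # Recursive listing of the whole server; catches added, removed and
    # renamed files, though not date-only changes.
    lftp -e 'find /; quit' ftp://user:pass@ftp.example.com > new.listing
    diff old.listing new.listing      # cron mails any output
    mv new.listing old.listing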
Here is a shell script to help you on your way. If you can execute this over ssh, then you're done. If you have to do it over FTP, then you'll be meeting my old friend Expect. Good luck!
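A minimal sketch of the ssh variant, assuming GNU find on the remote box; the host and paths are placeholders:

    #!/bin/sh
    # Print "mtime path" for every file under $DIR on the remote host,
    # then diff against the previous run's listing.
    REMOTE=user@ftp.example.com   # placeholder
    DIR=/srv/ftp                  # placeholder
    OLD=$HOME/ftpcheck/old.listing
    NEW=$HOME/ftpcheck/new.listing

    mkdir -p "$HOME/ftpcheck"
    ssh "$REMOTE" "find $DIR -type f -printf '%T@ %p\n'" | sort > "$NEW"
    # Changed lines go to stdout; cron mails them to the job's owner.
    [ -f "$OLD" ] && diff "$OLD" "$NEW"
    mv "$NEW" "$OLD"

Drop it in /etc/cron.daily/, or schedule it yourself with a crontab entry such as 0 6 * * * $HOME/bin/ftpcheck.sh.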
Use wget. First day:
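Something like this (a sketch; user, pass, and YOURSERVER are placeholders):

    # Mirror the directory *listings* only: -m recurses and keeps each
    # directory's raw .listing file; --spider skips the actual downloads.
    wget -m --spider ftp://user:pass@YOURSERVER/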
This command will produce a local directory named YOURSERVER with the remote directory tree mirrored underneath, except that each directory contains only one file, called ".listing". The .listing files can then be compared against the previous day's copies with the diff command.
Now each day:
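Under the same assumptions, rotate yesterday's tree aside, fetch fresh listings, and compare the two trees:

    mv YOURSERVER YOURSERVER.old
    wget -m --spider ftp://user:pass@YOURSERVER/
    # Recursively diff yesterday's .listing files against today's.
    diff -r YOURSERVER.old YOURSERVER
    rm -rf YOURSERVER.old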
This will print all lines that differ, and cron will automatically mail the output to you (to the address in MAILTO, or to the crontab's owner by default).