I'm examining a particular setup, and they have their crontab as follows:
0 * * * * lynx http://www.example.com/cron/scriptA.php
Of course, this relies on security by obscurity: anyone on the internet who knows where those files are located can call them and potentially overload the server.
Besides that, is there anything inherently wrong with the above 'model' of running that script?
When I tested lynx http://www.example.com/cron/scriptA.php from the command prompt as root, it prompted me to accept a session cookie, so I'm thinking I should at least modify the above to:
lynx -accept_all_cookies http://www.example.com/cron/scriptA.php
Or should I be using:
wget -q -O /dev/null http://www.example.com/cron/scriptA.php
If you want to secure those files, you can configure your web server to allow connections to those particular scripts only from localhost. You didn't mention which web server you use, but in Apache, for example, this can be done with a Directory block and Allow/Deny directives, something like
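A sketch of such a configuration, using Apache 2.2-style Order/Allow/Deny directives (the path /var/www/cron is an assumption; use wherever the cron scripts actually live):

```apache
# Restrict the cron scripts to requests from the local machine only.
<Directory /var/www/cron>
    Order Deny,Allow
    Deny from all
    Allow from 127.0.0.1 ::1
</Directory>
```

On Apache 2.4 the equivalent would be a single `Require local` directive instead of the Order/Allow/Deny lines.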
For additional security you may modify your cron scripts to check the client address. If it's anything other than localhost, refuse to do the cron work and return an error instead.
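That check could look something like the following sketch inside scriptA.php (the helper name is_local_client is hypothetical; the comparison assumes the server reports the loopback client as 127.0.0.1 or ::1):

```php
<?php
// Sketch of a client-address check for a cron script.
// is_local_client() is a hypothetical helper; $_SERVER['REMOTE_ADDR']
// holds the client IP address as reported by the web server.
function is_local_client($addr) {
    return $addr === '127.0.0.1' || $addr === '::1';
}

// At the top of the cron script you would then do something like:
//
//   if (!is_local_client($_SERVER['REMOTE_ADDR'])) {
//       header('HTTP/1.0 403 Forbidden');
//       exit;
//   }
//   // ...the actual cron work goes here...
```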
When it comes to tools, lynx and wget are both fine. When I use lynx in cron, I tend to use it with the -dump flag, though.
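In the crontab that would look like, for example (the trailing redirect is an optional addition that keeps cron from mailing you the dumped page on every run):

```shell
0 * * * * lynx -dump http://www.example.com/cron/scriptA.php > /dev/null 2>&1
```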