I currently have a medium-sized website that probably has a few security flaws. That's probably normal; you can't catch everything. The problem is, I also have a couple of script kiddies who think it's fun to try day and night to crack my website, to do who knows what. Probably something like deleting the DB. I have backups, but it's still a toll on RAM and CPU, and I'd prefer to stop it. Is there any way I can analyze the server logs to easily find out which entries are caused by the script kiddies? They'd probably be identified by multiple hits per minute, but it's a pain to go through and pick out those entries by hand when I could be doing something worthwhile.
IMHO you should rather spend your time fixing your website; then you won't have to scan your log files all the time.
AWStats and Webalizer are well-known webserver-log-to-statistics tools; maybe you could get some use out of those.
I don't know what a "medium-sized" website is for you. But if it is large enough to have the DB and the webserver on two different servers, then you could use a database firewall such as GreenSQL. This will give you some more information about how they are trying to do it. But you will still need an HTTP log analyzer to find out where they are attacking (which form they try to misuse).
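For that last part, a rough starting point on the command line might be something like the following. It assumes the standard Apache common/combined log format, where the request URL is field 7; adjust the field number if your format differs.

# which URLs receive the most POST requests (busiest at the bottom)
grep ' "POST ' access_log | awk '{print $7}' | sort | uniq -c | sort -g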
awk '{print $1}' access_log | sort | uniq -c | sort -g
should produce an ordered list of the IP addresses that are hitting your site; the first column will be the number of hits, the second the IP.
You might have to change the value $1; this is the position of the IP address field in the log file line. On my webserver it's first, hence $1. Fields are separated by whitespace, so the next entry is $2, and so on.
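Since you said the script kiddies would show up as multiple hits per minute, you can extend this to count hits per IP per minute. This is a rough sketch that assumes the common/combined Apache format, where the timestamp is field 4 (e.g. [10/Oct/2023:13:55:36); again, adjust the field numbers if your log format differs.

# count requests per IP per minute; the busiest combinations appear at the bottom
awk '{ split($4, t, ":"); print $1, t[1] ":" t[2] ":" t[3] }' access_log | sort | uniq -c | sort -g | tail -20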
If you are using apache, you may want to look into implementing mod_security ( http://modsecurity.org/ ) - it can provide a level of protection against some kinds of attacks even if the underlying application is vulnerable. It doesn't catch everything, but it can help in the situation where you don't necessarily control the code you're running.
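On a Debian/Ubuntu-style system the setup is roughly as follows; the package and file names here are assumptions that vary by distribution and version, so check your distro's documentation.

# install and enable the module (package name varies; libapache2-mod-security2 on newer Ubuntu)
sudo apt-get install libapache2-mod-security2
# start from the shipped recommended config, then change SecRuleEngine from DetectionOnly to On in it
sudo cp /etc/modsecurity/modsecurity.conf-recommended /etc/modsecurity/modsecurity.conf
sudo a2enmod security2
sudo service apache2 restart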
Here is a somewhat similar question that I asked recently that may have a solution for you:
New to Ubuntu Server, which logs to monitor and what to do
There are a lot of free log analyzers that will all tell you the same info. Do a Google search, test drive a few, and see which you like.
I don't think I would stop backing up the site. You should always have backups around. Too many things can happen that could require those backups.
So: check the logs, make backups, keep your system patched, use strong passwords, use a firewall, etc.
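On the firewall point, once the log analysis has identified an abusive address, one rough option (assuming iptables; 203.0.113.5 is just a placeholder) is to drop its traffic:

# drop all traffic from the offending address; not persistent across reboots
sudo iptables -A INPUT -s 203.0.113.5 -j DROP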
I am sure you will get some good advice here.
Even if you do find the entries belonging to the script kiddies, I doubt that this would help you do anything about it. You can't just lock them out by blocking those IP addresses; most people have dynamic IP addresses assigned by their ISPs.
Besides, log files cannot show you all attacks. For example, what about brute-force SSH login attempts?
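Those would only turn up in the system's auth log, not in the webserver's access log. A rough check on a Debian-style box (the log path and message wording are assumptions that vary by distribution and sshd version) might look like:

# count failed SSH logins per source IP
grep 'Failed password' /var/log/auth.log | awk '{ for (i = 1; i <= NF; i++) if ($i == "from") print $(i+1) }' | sort | uniq -c | sort -g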
IMO the only sane approach to your problem is to fix as many security flaws as you can with reasonable effort and to keep backups in case of emergency.