When I was using Lighttpd, I could easily achieve this with entries like the ones below, and all my sites were protected.
Wget robots:
$HTTP["useragent"] =~ "Wget" {
$HTTP["url"] =~ "^/tagi(.*)" {
# $HTTP["url"] =~ "" {
url.access-deny = ( "" )
}
$HTTP["url"] =~ "^/tags(.*)" {
url.access-deny = ( "" )
}
$HTTP["url"] =~ "^/kom.php(.*)" {
url.access-deny = ( "" )
}
$HTTP["querystring"] =~ "^(.*)strony(.*)" {
url.access-deny = ( "" )
}
$HTTP["querystring"] =~ "^(.*)page(.*)" {
url.access-deny = ( "" )
}
$HTTP["url"] =~ "^(.*)/www/delivery/lg.php(.*)" {
url.access-deny = ( "" )
}
$HTTP["url"] =~ "^(.*)/reklamy/(.*)" {
url.access-deny = ( "" )
}
$HTTP["url"] =~ "^(.*)/ads/(.*)" {
url.access-deny = ( "" )
}
$HTTP["url"] =~ "^(.*)/www/delivery/ck.php(.*)" {
url.access-deny = ( "" )
}
}
Sites with fake traffic:
$HTTP["referer"] =~ "(.*)surfing.php(.*)" {
url.access-deny = ( "" )
}
$HTTP["referer"] =~ "(.*)promote.php(.*)" {
url.access-deny = ( "" )
}
$HTTP["referer"] =~ "(.*)trafficadder.php(.*)" {
url.access-deny = ( "" )
}
$HTTP["referer"] =~ "(.*)traffic.php(.*)" {
url.access-deny = ( "" )
}
$HTTP["referer"] =~ ".*loic*." {
url.access-deny = ( "" )
}
$HTTP["referer"] =~ ".*autosurf*." {
url.access-deny = ( "" )
}
How do I do the same in Apache? I don't want to add this to .htaccess.
You can use mod_rewrite, which requires a bit of effort. Here are some starting points:
http://httpd.apache.org/docs/2.4/rewrite/access.html
Notice in particular the section called "Blocking of robots": http://httpd.apache.org/docs/2.4/rewrite/access.html#blocking-of-robots
See also: http://en.linuxreviews.org/HOWTO_stop_automated_spam-bots_using_.htaccess
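A rough equivalent using mod_rewrite, placed in the <VirtualHost> or main server config (not .htaccess), might look something like the sketch below. It assumes mod_rewrite is enabled and simply carries over the paths and patterns from your Lighttpd rules, so treat it as a starting point rather than a drop-in config:

RewriteEngine On

# Block Wget on the listed paths and query strings
RewriteCond %{HTTP_USER_AGENT} Wget [NC]
RewriteCond %{REQUEST_URI} ^/tagi [OR]
RewriteCond %{REQUEST_URI} ^/tags [OR]
RewriteCond %{REQUEST_URI} ^/kom\.php [OR]
RewriteCond %{QUERY_STRING} (strony|page) [OR]
RewriteCond %{REQUEST_URI} /www/delivery/(lg|ck)\.php [OR]
RewriteCond %{REQUEST_URI} /(reklamy|ads)/
RewriteRule ^ - [F]

# Block requests whose referer points at the fake-traffic scripts
RewriteCond %{HTTP_REFERER} (surfing|promote|trafficadder|traffic)\.php [NC,OR]
RewriteCond %{HTTP_REFERER} (loic|autosurf) [NC]
RewriteRule ^ - [F]

The [OR] flags group the URL/query-string conditions together, and that group is ANDed with the user-agent condition; [F] answers matching requests with 403 Forbidden.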