I created a rewrite rule in the .htaccess of a website to block traffic with a blank User-Agent or a User-Agent of just "-", except when the request comes from the web server itself. The rule only works for requests made to the site root (example.com), but not for pages within the site, whether they exist or not (example.com/somerandompage). The web server is Apache 2.
.htaccess config
RewriteCond %{HTTP_USER_AGENT} "^$|^-$" [NC]
# x.x.x.x is the public IP address of the webserver
RewriteCond %{REMOTE_ADDR} "!^x\.x\.x\.x$"
RewriteRule ^.* - [F,L]
I tested this with the following curl commands:
curl -k -A "" https://example.com -> Returns status code 403 forbidden.
curl -k -A "" https://example.com/someradompage -> Serves the page or returns status code 404 not found if the page doesn't exist.
How can I also block requests from these blank and "-" user-agents for pages inside the site?
Apparently, this rule conflicts with another rule that was generated automatically by the application:
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule . /index.php [L]
It works as intended when the blocking rule is placed before this one. I'm not sure why the conflict happens in the first place, though; presumably the front-controller rule matches any request that isn't an existing file or directory, and its [L] flag ends that pass of rewrite processing, so a blocking rule placed after it never gets applied before the request is handed off to /index.php.
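For reference, a minimal sketch of the ordering that works, assuming the auto-generated front-controller block is left unchanged and with x.x.x.x standing in for the server's public IP:
RewriteEngine On

# Block blank or "-" user agents unless the request comes from the server itself
RewriteCond %{HTTP_USER_AGENT} "^$|^-$" [NC]
RewriteCond %{REMOTE_ADDR} !^x\.x\.x\.x$
RewriteRule ^ - [F]

# Auto-generated front controller: send anything that isn't a real file or directory to /index.php
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule . /index.php [L]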