I would like to create a publicly accessible Google Apps site (i.e. users do not need to be authenticated to access the content) while maintaining a crawler and bot exclusion policy via robots.txt. Does anyone know how to do that?
robots.txt doesn't prevent interactive browsers from using the site. It is only honored by robots such as crawlers, feed readers, and recursive download tools (though the latter usually let the user override it).
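If what you want is simply to ask all compliant crawlers to stay away while leaving the site itself open to anyone with the URL, a minimal robots.txt served from the root of your site would look like this (assuming you are able to place a file at /robots.txt for your domain):

User-agent: *
Disallow: /

Keep in mind this is purely advisory: well-behaved bots like Googlebot respect it, but it does not block access, and anything that ignores the convention can still fetch your pages.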