Is there a method or service that gives me a detailed text report on the pornographic content or status of a particular web site? I need this to audit users' accounts without having to open a particular web page and view its contents myself to determine that it is inappropriate.
Without using screen captures or pulling content from the proxy or its caches... I'm not sure. You'd essentially be exposed to the pornographic content no matter what, because how would you judge that something is "inappropriate" without seeing it in the first place?
I'm not sure what exactly you're trying to avoid?
Even if you had a service to hand it off to, you're relying on their judgment that something is inappropriate.
I suppose you could just visit the site with a text-only web browser if you want to get the text from the site without fear of burning your eyes.
Are you bound by a policy that says you, as an admin, are in charge of policing users without being able to view what they may be viewing? If you're the sysadmin in charge of this, you have to have the flexibility not to be bound by those rules: when there's a question about a user's behavior, you need to be able to check on this stuff without HR breathing down your neck for breaking them. What keeps you from surfing for porn at work is trust that you're not abusing your position, plus the fact that you have a legitimate reason to be looking at those websites outside the filter in the first place.
Do you have a filter installed at work? You could put one in, and that would give you an audit of the URLs being visited. Otherwise, the only way I know of is to periodically view the .gifs and .jpegs from the user's cache. Really, your first step is to have a proxy or filter set up (after having a written policy about what's appropriate and inappropriate in the workplace).
As written, your question is, in my opinion and experience, more a social/HR problem than a tech problem. The best answer I can give is to audit using a proxy or filter that records web traffic. That's not too hard: there are plenty of commercial appliances out there as well as free solutions like Squid and SquidGuard, depending on the features you want and how fine-grained you want to get.
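If you go the Squid + SquidGuard route, the rough shape of the setup looks like the fragments below. This is only a sketch: the file paths, list names, and redirect URL are assumptions you'd adjust for your own install, and the "porn" blacklist itself has to come from somewhere (the lists distributed for SquidGuard, or a commercial one).

    # squid.conf (fragment) -- log every request and hand URLs to squidGuard
    http_port 3128
    access_log /var/log/squid/access.log squid
    url_rewrite_program /usr/bin/squidGuard -c /etc/squidguard/squidGuard.conf

    # squidGuard.conf (fragment) -- block and log anything on the "porn" list
    dbhome /var/lib/squidguard/db
    logdir /var/log/squidguard
    dest porn {
        domainlist porn/domains
        urllist    porn/urls
        log        pornaccess.log
    }
    acl {
        default {
            pass !porn all
            redirect http://intranet.example.com/blocked.html
        }
    }

The SquidGuard log then gives you exactly the kind of text report you asked for (who requested what, when, and which category it hit) without anyone having to open the pages themselves.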
There are some 'nanny' services that classify and rate sites. What I would try first is OpenDNS; by default they filter 'questionable' sites. I haven't checked whether the DNS response includes any kind of 'report' or just the redirection.
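One way to check this programmatically is to compare how a neutral resolver and OpenDNS's FamilyShield resolvers answer for the same domain. The sketch below uses the dnspython library; the FamilyShield addresses are OpenDNS's published ones, but the assumption that a filtered domain comes back with a different (block-page) address is something you'd want to verify yourself, and sites behind CDNs can legitimately return different addresses from different resolvers, so treat a mismatch as a hint rather than a verdict.

    # Requires dnspython (pip install dnspython); resolve() needs dnspython >= 2.0
    import dns.resolver

    NEUTRAL = ["8.8.8.8"]                                 # Google public DNS
    FAMILYSHIELD = ["208.67.222.123", "208.67.220.123"]   # OpenDNS FamilyShield

    def addresses(domain, nameservers):
        """Return the set of A-record addresses a given resolver hands back."""
        resolver = dns.resolver.Resolver(configure=False)
        resolver.nameservers = nameservers
        return {rr.address for rr in resolver.resolve(domain, "A")}

    def looks_filtered(domain):
        # Assumption: a domain OpenDNS classifies as adult resolves to their
        # block page instead of its real address, so the two answers differ.
        return addresses(domain, NEUTRAL) != addresses(domain, FAMILYSHIELD)

    if __name__ == "__main__":
        print(looks_filtered("example.com"))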
You could have a look for PICS/POWDER ratings - but in my experience most sites don't bother with them. Alternatively, you could implement your own crawler to pull back the HTML content of a site and look for dodgy keywords in the text; a rough sketch of that idea is below.
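This is only a minimal sketch of the keyword-scan approach: the keyword list and hit threshold are placeholders you'd have to tune, the tag stripping is deliberately crude, and a real crawler would need error handling, robots.txt checks, and proper HTML parsing. It also checks for the voluntary RTA self-rating label that some adult sites embed.

    import re
    import urllib.request

    KEYWORDS = {"porn", "xxx", "adult content"}    # illustrative only
    THRESHOLD = 3                                  # arbitrary cut-off

    def fetch(url):
        """Download a page and return its body as text."""
        with urllib.request.urlopen(url, timeout=10) as resp:
            return resp.read().decode("utf-8", errors="replace")

    def report(url):
        html = fetch(url)
        # The RTA label is a voluntary self-rating some adult sites include.
        self_rated = "RTA-5042-1996-1400-1577-RTA" in html
        text = re.sub(r"<[^>]+>", " ", html).lower()   # crude tag stripping
        hits = {kw: text.count(kw) for kw in KEYWORDS if kw in text}
        return {"url": url,
                "self_rated": self_rated,
                "keyword_hits": hits,
                "flagged": self_rated or sum(hits.values()) >= THRESHOLD}

    if __name__ == "__main__":
        print(report("http://example.com/"))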
But, speaking as someone whose 11-year-old daughter's website about her pets has been branded unacceptable by the thought police running the internet proxies at my work (actually they just install a block list sent to them by a very dumb search engine maintained by the software supplier): without going and looking at a site yourself, you won't know whether it really contains any dubious content. You are going to look very silly if someone challenges an action you took against a user and you never established for yourself whether that action was justifiable.