Is being paranoid considered an (unspoken) 'requirement' for a sys/net admin (obviously for security reasons)?
Is there such a thing as being overly paranoid, or should we be trusting of others and not dwell on questioning every scenario through schizophrenic goggles?
Is there a 'middle ground' for this characteristic when it comes to security? (Basically, what I'm asking is: who would YOU hire?)
UPDATE: I didn't expect people to put so much emphasis on the word "PARANOIA". Please don't dwell too much on it; I could have used another word, but paranoia is a word we commonly use in security. I've heard both "too paranoid" and "need to be more paranoid" from a bunch of IT folk.
Paranoia is a dysfunctional personality trait where an individual is suspicious or untrusting without reason. Acting without reason is the antithesis of a good SA.
A system administrator needs to deeply understand the systems they support and be able to quickly analyze problems against business requirements, assess risks, and prescribe action to mitigate problems/risks/etc. An SA also needs to understand the systems enough to quickly develop theories to guide the problem troubleshooting process, but also needs to make decisions based upon facts gathered.
Sometimes those duties make one appear paranoid on the surface.
You're only paranoid until it HAPPENS... after that you were just "well prepared". ;-)
Critical thinking is a required quality for a good SA. Obviously the clinical definition of paranoia is not what the OP was asking, but even the common definition is not "required".
To the unskilled eye, there may be little difference between a paranoid SA and one who thinks critically about issues like security.
Example: I block outbound SSH because I understand what you can do with SSH tunneling. I know of SAs who block it because "it's a security risk", without knowing what the specifics of that risk are. Am I a better SA for understanding the risk? Perhaps, but at the end of the day both of us took the same action.
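To make the tunneling risk concrete: SSH dynamic port forwarding turns any reachable external SSH server into a general-purpose SOCKS proxy, so a single open outbound port can carry arbitrary traffic past your egress controls. A minimal sketch (the host name is a placeholder, and the blocking rule assumes a Linux gateway running iptables):

```shell
# From an internal machine, assuming outbound TCP/22 is allowed and the
# user controls some external box (host.example.com is hypothetical):
ssh -N -D 1080 user@host.example.com
# Any application pointed at the local SOCKS proxy (localhost:1080) now
# reaches the internet through an encrypted tunnel, bypassing most
# egress filtering and content inspection.

# The corresponding egress block on a Linux gateway:
iptables -A FORWARD -p tcp --dport 22 -j REJECT
```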
Part of the art of being a SA is to know when something that you've been told requires more investigation before you act and when the information is trustworthy enough to act upon immediately.
I believe that pragmatic paranoia is a healthy trait in a sysadmin. Thinking about bad things that might happen and how to avoid them can be extremely useful: thinking about security and other potential problems makes a system more robust.
The trick is being able to assign weights and probabilities to possible outcomes. You have to be able to estimate the probability of a problem, the severity of the outcome if it occurs, and the cost of avoiding it, and then make pragmatic decisions based on those estimates. Being reasonably paranoid about the company's core data is smart. Being unreasonably paranoid about someone getting to the company's list of corporate holidays seems unhealthy.
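The weighing described above is essentially an expected-loss calculation. A minimal sketch in Python (the function name and every number below are invented for illustration):

```python
def worth_mitigating(annual_probability: float,
                     loss_if_it_happens: float,
                     annual_mitigation_cost: float) -> bool:
    """Crude annualized-loss-expectancy check: mitigate when the
    expected yearly loss exceeds what prevention would cost."""
    expected_annual_loss = annual_probability * loss_if_it_happens
    return expected_annual_loss > annual_mitigation_cost

# Core customer database: 5% yearly breach chance, $2M loss, $40k to protect.
print(worth_mitigating(0.05, 2_000_000, 40_000))   # True -- pay for it
# The corporate holiday list: even a certain leak costs next to nothing.
print(worth_mitigating(1.0, 100, 40_000))          # False
```

The point is not the arithmetic but the discipline: the holiday list fails this test no matter how certain the "breach" is.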
You have to balance security with usability.
If you run a bank's network infrastructure, you need more security, but you can also afford to have more security, since it costs money to train users, to purchase and install new technology, and so forth. If you're running a university student network, you can easily afford, say, not to hand out RSA SecurID (time tokens) to students to log in. It's just not necessary.
Yes, I use full-disk encryption on all my (work, non-server) machines with available data destruction features enabled, even on my iPod. Why? I have a sensitive contact list, emails, trade secrets and material covered by non-disclosure agreements on some of these machines.
When I was an undergraduate student, however, with nothing but my (not-for-publication) papers to preserve, I would never have gone to such lengths. In grad school, on the other hand, with possibly novel, patentable, or for-publication papers, you might want to take a slightly more secure approach.
Soapbox: I also know a few people who use big tools like 256-bit full-disk encryption, and then use a backup mechanism that stores their data in the clear, or worse, on some random untrusted remote server. The whole chain is important!
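Keeping the chain consistent can be as simple as encrypting the archive before it ever leaves the machine. A sketch using GnuPG symmetric encryption (the paths and destination host are placeholders):

```shell
# Encrypt locally; only ciphertext ever reaches the backup server.
tar -cz ~/docs | gpg --symmetric --cipher-algo AES256 -o docs.tar.gz.gpg
scp docs.tar.gz.gpg backup@backup.example.com:archives/
```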
It does require the ability to think in terms of what can go wrong instead of what you want to go right. This style of thinking often seems paranoid to those who don't need to engage in it. Occupational hazard.
If your systems administration practices are actually predicated on the idea that people are actively conspiring to harm you personally, though, you may be too paranoid. :)
Yes.
I think what you were actually getting at is whether a person's brain defaults to trust or to distrust. A sysadmin fights a concentrated, never-ending stream of confidence scam artists. From the site trying to serve out malware to your users to the patter of bots and script kiddies on your firewall, it's all about keeping entities from convincing your systems and users that they're trustworthy.
We do not install the default, we hit the "custom" button. We do not give access and then narrow down the "known bad" ports, we shut it all down and then open up what's necessary until it works. We do not click 'Yes' unless there's a compelling reason to do so. We opt out.
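The "shut it all down, then open what's necessary" stance is just a default-deny policy. In iptables terms it might look like this (the allowed services and LAN range are examples, not a recommendation):

```shell
iptables -P INPUT DROP                                        # deny by default
iptables -A INPUT -i lo -j ACCEPT                             # loopback
iptables -A INPUT -m state --state ESTABLISHED,RELATED -j ACCEPT
iptables -A INPUT -p tcp --dport 22 -s 10.0.0.0/8 -j ACCEPT   # SSH from LAN only
iptables -A INPUT -p tcp --dport 443 -j ACCEPT                # public HTTPS
```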
There are a lot of fields where you have to assume the worst. Law and medical professionals can't take what people say at face value, either.
Our polar opposite is the dear trusting user who sees a box pop up with dire warnings and assumes that the box is intended to help them.
And when one wonders whether it's required - how many other business functions get to deny prime access to the owners/VPs of the company? It would be entirely reasonable for our owner to have the keys to every door and file cabinet in this building, but he can't have domain admin rights. To me, that defines appropriate 'paranoia'.
disclaimer: it's possible that there may be trusting types who are perfectly stellar admins, but those I've met who really stand out have all had a very healthy tendency in the opposite direction
In any organization of significant size, trust is unavoidably delegated away from the sysadmin for practical (and sometimes other) reasons. Such as giving the help-desk the ability to handle password resets and account lock-outs, or allowing the identity-management automation to handle account enable/disable which requires delegating that ability to HR types. When bringing a new admin on-board it is a good thing to see how comfortable they are with your organization's level of delegation.
Over all, a sysadmin should have enough of a security mind-set to call a hard stop on something that sounds suspicious, even if it does come from a higher level manager. What we do is part of the information security apparatus of where we work, and that should be part of our job[1]. There is a level of trust that needs to be established between decision makers and implementers, otherwise things can descend into the hard-lock of paranoia.
Admins that don't trust worth beans probably shouldn't be in larger organizations where the technology is handled by multiple people.
[1] Unless it isn't. Some organizations have delegated InfoSec to a dedicated department, from which marching orders are issued to the relevant parties.
What you are calling paranoia is probably related to what Bruce Schneier calls the Security Mindset, which he describes in a blog post of the same name.
If you are considering security, then there is no such thing as too much paranoia.
Other than that, I try for "slightly paranoid realism" over pessimism (I'll be optimistic when there is reason to be so, pessimistic otherwise, and might give benefit-of-the-doubt on occasion and upgrade mild pessimism to neutrality or neutrality to mild optimism).
Though the old adage that a pessimist is never disappointed is usually not wrong.