Today, one of our developers had his laptop stolen from his house. Apparently, he had a full svn checkout of the company's source code, as well as a full copy of the SQL database.
This is one massive reason why I'm personally against allowing company work on personal laptops.
However, even if this had been a company-owned laptop, we'd still have the same problem, although we would be in a slightly stronger position to enforce whole-disk encryption (WDE).
Questions are these:
- What does your company do about company data on non company owned hardware?
- Is WDE a sensible solution? Does it produce a lot of overhead on reads/writes?
- Other than changing passwords for things that were stored/accessed from there, is there anything else you can suggest?
The problem is that letting people do unpaid overtime on their own kit is very cheap, so managers aren't willing to stop it, though they'll of course be happy to blame IT when there's a leak. Only a strongly enforced policy is going to prevent this. It's down to management where they want to strike the balance, but it's very much a people problem.
I've tested WDE (TrueCrypt) on laptops with admin-level workloads and, performance-wise, it's really not that bad; the I/O hit is negligible. I have several developers keeping ~20GB working copies on it, too. It's not a 'solution' in itself (it won't stop the data being slurped off an unsecured machine while it's booted, for instance), but it certainly closes a lot of doors.
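If you want to quantify the overhead on your own hardware rather than take anyone's word for it, one rough approach is to time a large sequential write on the encrypted volume and compare it with a plain one. A minimal sketch (the mount paths in the comment are placeholders):

```python
import os
import tempfile
import time


def write_throughput_mb_s(directory, size_mb=64):
    """Write size_mb of random data to a temp file in `directory`
    and return the sequential write throughput in MB/s."""
    chunk = os.urandom(1024 * 1024)           # 1 MiB of incompressible data
    fd, path = tempfile.mkstemp(dir=directory)
    try:
        start = time.perf_counter()
        with os.fdopen(fd, "wb") as f:
            for _ in range(size_mb):
                f.write(chunk)
            f.flush()
            os.fsync(f.fileno())              # make sure it really hit the disk
        elapsed = time.perf_counter() - start
        return size_mb / elapsed
    finally:
        os.unlink(path)


# Run once per volume and compare, e.g.:
# print(write_throughput_mb_s("/mnt/encrypted"), write_throughput_mb_s("/mnt/plain"))
```

Random data is used so the crypto layer can't cheat via compression; run it a few times and average, since caches and background I/O add noise.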
How about a blanket ban on all externally held data, followed by some investment in remote desktop services, a decent VPN and the bandwidth to support it? That way all code stays inside the office, the users get a session with local network access to resources, and home machines just become dumb terminals. It won't suit all environments (intermittent access or high latency might be a deal-breaker in your case), but it's worth considering if home working is important to the company.
Our company requires whole-disk encryption on all company-owned laptops. Sure, there's an overhead, but for most of our users this isn't an issue -- they're running web browsers and office suites. My MacBook is encrypted, and it hasn't really impacted things enough that I've noticed, even when running VMs under VirtualBox. For someone who spends much of their day compiling large trees of code it might be more of an issue.
You obviously need a policy framework for this sort of thing: you need to require that all company owned laptops are encrypted, and you need to require that company data cannot be stored on non-company owned equipment. Your policy needs to be enforced for technical and executive staff, too, even if they complain, otherwise you're just going to run into the same problem again.
I would focus less on the equipment itself and more on the data involved. This will help avoid the problems you're running into now. You may not have the leverage to mandate policy on personally owned equipment, but you had better have the leverage to mandate how company-owned data is handled. At a university, we have issues like this come up all the time. Faculty may not be funded in such a way that their department is able to buy a computer, or they may buy a data-processing server on a grant. In general, the solution to these problems is to protect the data, not the hardware.
Does your organization have a Data Classification policy? If so, what does it say? How would the code repository be classified? What requirements would be placed on that category? If the answer to any of those is either "No" or "I don't know", then I would recommend talking to your Information Security office, or whoever in your organization is responsible for developing policies.
Based on what you say was released, were I the data owner I would likely classify it as High, or Code Red, or whatever your highest level is. Typically that would require encryption at rest, in transit, and may even list some restrictions on where the data is allowed to be housed.
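In practice that classification-to-requirements mapping can be as simple as a lookup table that reviewers (or policy tooling) consult. The level names and control labels below are illustrative placeholders, not any standard scheme:

```python
# Hypothetical data-classification policy table: level -> required controls.
CONTROLS = {
    "public":   set(),
    "internal": {"access-control"},
    "high":     {"access-control", "encrypt-at-rest", "encrypt-in-transit",
                 "approved-locations-only"},
}


def required_controls(level):
    """Return the set of controls a given classification level demands."""
    try:
        return CONTROLS[level.lower()]
    except KeyError:
        raise ValueError(f"unknown classification level: {level!r}")
```

A source tree classified "high" would then require encryption at rest and in transit plus restrictions on where it may live, which a checkout on an unencrypted personal laptop clearly violates.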
Beyond that, you may be looking at implementing some secure programming practices: something that codifies a development life cycle and expressly disallows developers from coming into contact with a production database except in rare and unusual circumstances.
1.) Working remotely
For developers, remote desktop is a very good solution unless 3D is required. The performance usually is good enough.
In my eyes, remote desktop is even safer than VPN, because an unlocked notebook with VPN active allows quite a bit more than a view to a terminal server would.
VPN should only be given to people who can prove they need more.
Moving sensitive data out of house is a no-go and should be prevented where possible. Working as a developer without network access is prohibitively inefficient anyway, because the lack of access to source control, issue tracking, documentation systems and communications makes efficiency iffy at best.
2.) Usage of non-company hardware in a network
A company should have a standard for what is required of hardware attached to the LAN. Foreign hardware should either meet that standard or stay off the network; you could set up network access control (NAC) to enforce that.
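The NAC idea boils down to a simple decision: devices that are registered and pass a posture check (disk encrypted, AV current, and so on) go on the normal VLAN; everything else lands in quarantine. The MAC addresses, VLAN IDs and posture checks below are illustrative, not any particular product's API:

```python
# Illustrative NAC decision logic: registered + compliant -> LAN, else quarantine.
REGISTERED = {"aa:bb:cc:dd:ee:01", "aa:bb:cc:dd:ee:02"}  # company-owned MACs
LAN_VLAN, QUARANTINE_VLAN = 10, 666                      # made-up VLAN IDs


def assign_vlan(mac, disk_encrypted, av_up_to_date):
    """Place a connecting device on a VLAN based on registration and posture."""
    compliant = disk_encrypted and av_up_to_date
    if mac.lower() in REGISTERED and compliant:
        return LAN_VLAN
    return QUARANTINE_VLAN
```

Real deployments do this with 802.1X and a RADIUS server rather than hand-rolled scripts, but the policy logic is the same: unknown or non-compliant hardware never sees the internal network.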
3.) Little can be done about the spilled milk, but steps can be taken to avoid a recurrence.
If the above steps are taken, and notebooks are little more than mobile thin clients, not much more is necessary. Hey, you can even buy cheap notebooks (or use old ones).
You should definitely only have company data stored on company devices, nowhere else, unless it has been encrypted by your IT department.
Any disk-encryption software will have some overhead, but it is worth it, and all laptops and external USB drives should be encrypted.
You can also get remote-wipe software, like you would have in a BlackBerry Enterprise Server (BES) environment for BlackBerrys.
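The phone-home part of such a tool is conceptually simple: on boot, and periodically thereafter, the laptop checks a server-side flag and, if it is set, destroys the local data. The sketch below injects the server check as a callable so it stays self-contained; a real agent would make an authenticated HTTPS request and use secure deletion rather than a plain directory removal:

```python
import shutil
from pathlib import Path


def wipe_if_flagged(is_wipe_ordered, data_dir):
    """Poll the (injected) wipe flag and delete `data_dir` if it is set.

    `is_wipe_ordered` stands in for an authenticated call to a hypothetical
    remote-wipe service; real agents should also overwrite free space.
    Returns True if a wipe happened.
    """
    if not is_wipe_ordered():
        return False
    shutil.rmtree(Path(data_dir), ignore_errors=True)
    return True
```

As the answers below note, this only helps if the thief actually connects the machine to the internet, so treat it as a complement to encryption, not a substitute.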
Computers not under your company's control shouldn't be allowed on the network. Ever. It's a good idea to use something like VMPS to put rogue equipment in a quarantined VLAN. Likewise, company data has no business outside company equipment.
Hard disk encryption is pretty easy these days, so encrypt anything that leaves the premises. I've seen some exceptionally careless handling of laptops which would be a disaster without full disk encryption. The performance hit isn't that bad, and the benefit far outweighs it. If you need blazing performance, VPN/RAS into the appropriate hardware.
To go in another direction from some of the other answers here:
While protecting and securing the data is important, the probability that the person who stole the laptop knows what they have, or cares about it, is pretty low. The most likely scenario is that the thief is a regular old thief, not a corporate spy bent on stealing your company's source code in order to build a competing product and get it to market before your company, thereby driving your company out of business.
That being said, it would probably behoove your company to put some policies and mechanisms in place to prevent this in the future, but I wouldn't let this incident keep you up at night. You've lost the data on the laptop, but presumably it was only a copy and development will continue without interruption.
Corporate-owned laptops should, of course, be using encrypted disks and so on, but you ask about personal computers.
I don't see this as a technical problem but rather a behavioural one. There is very little you can do from a technology viewpoint to make it impossible for someone to take code home and hack away at it. Even if you can prevent them from checking out all the source to a project on a formal basis, they can still take snippets home if they are determined to do so, and if one 10-line "snippet" of code (or any data) happens to be the bit that contains your secret sauce / valuable and confidential customer information / location of the holy grail, then you're still potentially just as boned by losing those 10 lines as you would be by losing 10 pages.
So what does the business want to do? It's perfectly possible to say that people absolutely must not work on company business from non company computers and make it a "gross misconduct" dismissal offence for people who break that rule. Is that an appropriate response to someone who is the victim of a burglary? Would it go against the grain of your corporate culture? Does the company like it when people work from home in their own time and is therefore prepared to balance the risk of property loss against the perceived gains in productivity? Is the code that was lost used to control nuclear weapons or bank vaults or life saving equipment in hospitals and as such a security breach can't be countenanced under any circumstances? Do you have a legal or regulatory obligation with regards to the security of the code "at risk" because of this loss?
Those are some of the questions I think you need to be considering, but no-one here can actually answer them for you.
In a situation where there is source code involved, and especially where the machine used can't be controlled by the company IT department, I would only ever allow the person to develop in a remote session hosted on a machine on company premises, through a VPN.
How about remote-wiping software? This would of course only work if the thief is dumb enough to connect the computer to the internet, but there are tons of stories of people who have even found their stolen laptops this way, so you might get lucky.
Timed wiping might be an option too: if you haven't entered your password in X hours, everything is deleted and you have to check out again. I haven't heard of this being done before, maybe because it requires more work from the user than encryption does. It might work well in combination with encryption for those worried about performance. Of course you have the power-up problem here too, but at least no internet connection is required.
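The timed variant could key off the age of a token file that gets touched on every successful password entry; if the token is older than the cutoff, the working copy is deleted and must be checked out again. Everything below (the file names, the cutoff, the helper names) is a hypothetical sketch of that idea:

```python
import os
import shutil
import time


def wipe_if_stale(token_path, checkout_dir, max_age_hours=24):
    """Delete the checkout if the password token hasn't been refreshed
    within max_age_hours. Returns True if a wipe happened."""
    try:
        age = time.time() - os.path.getmtime(token_path)
    except OSError:                  # no token at all -> treat as stale
        age = float("inf")
    if age <= max_age_hours * 3600:
        return False
    shutil.rmtree(checkout_dir, ignore_errors=True)
    return True


def refresh_token(token_path):
    """Call after each successful password entry to reset the clock."""
    with open(token_path, "a"):
        os.utime(token_path, None)   # bump mtime to "now"
```

Note the same caveat as with encryption applies: a checked-out, unwiped copy on a running machine is still exposed, so this is a mitigation layer rather than a fix on its own.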