Resisting the PGP Whole Disk Encryption Craze
alaederach writes "I run a lab in a non-profit academic life sciences research institute. Our IT department recently decided it would be a good idea to use whole disk encryption on all of our computers, laptops and servers, and picked PGP's suite of software. The main reason is that a small subset of our researchers work with patient information, which we obviously are mandated to keep confidential. My lab does a lot of high-performance computational work (on genes from Tetrahymena; no humans here) and I am concerned that the overhead of complying with our IT's new security policy will be quite detrimental to my research program. For example, dynamically reallocating a partition on a PGP-encrypted disk is apparently not possible. Furthermore, there is some evidence that certain forms of compression are also incompatible with PGP whole disk encryption. Interestingly, it is hard to find any negative articles on PGP, probably because most of them are written by IT pros who are focused only on the security and not the usability. I therefore ask the Slashdot community: what are the disadvantages of PGP in terms of performance, Linux, and high-performance computational research?"
Encryption is good for security, bad for performance (Score:5, Insightful)
Whole disk encryption is excellent for security, but it will bog down your disk access times. It depends on a lot of things, but reading and writing files can slow down by up to 50%, though the slow-down is usually much less. If you are doing something that involves a lot of disk access and it doesn't need to be encrypted, create a special, non-encrypted partition for that.
Policy fundamentalism (Score:4, Insightful)
Policy Exception (Score:3, Insightful)
You've got a good case for an exception from this policy. Just follow the exceptions process and have your management sign off on the risk. Case closed.
From experience (Score:2, Insightful)
It took 2 days to encrypt an entire 160 GB IDE hard disk with a K6-2 400 MHz processor, and afterwards the computer could only serve files at about 400 KB per second. With a 2 GHz processor the performance difference is negligible, and it could serve at full speed with only tiny CPU usage. So I think full disk encryption overhead is irrelevant on modern CPUs.
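A quick sanity check of those figures (the disk size and timings are as reported in the comment, not anything I have measured):

```python
# Encrypting 160 GB in 2 days works out to roughly 0.93 MB/s, which
# is consistent with the 400 MHz CPU, not the disk, being the
# bottleneck on that machine.
GB = 1000 ** 3

def throughput_mb_s(total_bytes, seconds):
    """Average throughput in decimal MB/s."""
    return total_bytes / seconds / 1e6

initial_pass = throughput_mb_s(160 * GB, 2 * 86400)
print(f"{initial_pass:.2f} MB/s")  # -> 0.93 MB/s
```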
As for not being able to resize a partition, that's just as well: if your hard disk contains anything of importance, you would have to be inept to resize partitions and expect the data to maintain its integrity, no matter what the file system format or the brochures for partitioning programs tell you.
feel-good actions (Score:3, Insightful)
If you want to actually protect your data, you need to encrypt only what's sensitive and mount it only when necessary. Also, PGP is closed source; what are you going to do if they stop supporting it? Use TrueCrypt or LVM, etc. And don't neglect network protection, which is where the real data theft happens.
Re:Isolate sensitive data (Score:5, Insightful)
Surely what is required is to isolate the sensitive information, so that it can be protected.
That's a great idea that in practice will leak your information. The reason is that _every_ application that touches your data needs to know that it should keep your data confidential.
Browsers know not to cache data transferred over HTTPS. The browser knows the data was encrypted, and it knows to be smart with it [for "protective" values of smart].
When you have a program that reads a file through a transparent layer of encryption, it never sees the "please-be-careful-with-this" label, and so the desktop search engine will index all the strings, the editor will write backups to . or /tmp, and so forth. All the apps think they need to do is respect what you meant by your mode bits (if you're on *nix), so it'll chmod/umask the /tmp copy the right way. If someone grabs your disk and you didn't encrypt /tmp, you lose.
And no, encrypting /tmp won't fix it: you need to know that everything the user of the data can write to is encrypted if you want to be sure. I only know one way that I can somewhat confidently say solves the problem: encrypt everything. [and then there's the network, but we'll save that for another decade ;)]
Only encrypting the sensitive data is like carrying water in a bucket used for target practice: stuff will leak.
What are you trying to prevent? (Score:3, Insightful)
Their product doesn't seem [pgp.com] to run on Linux.
There is better, cheaper F/OSS software to do the same thing, though; Ubuntu and FC9 already include a whole disk encryption option at install. (It's better because it's much less likely to have an NSA back door, although obviously one can never be completely certain.)
As for performance, when I tried it (LUKS encryption) on a desktop machine, it wasn't noticeable; but I wasn't moving hundreds of gigs around.
The question now is what they are trying to protect. Encrypting laptops is sensible; in fact, given how easy and cheap it now is, it's rather stupid not to do it. On desktop PCs, it's not so clear. Whole disk encryption will only protect you against someone with physical access to the machine while it's turned off. It certainly won't protect you against trojans or browser-based vulnerabilities. So the question is, do random strangers roam your offices?
And encrypting servers/clusters? That's just silly, unless you expect the men in black to storm into your building.
Re:Isolate sensitive data (Score:4, Insightful)
Someone will write the passphrase down anyway. Isolate the data.
Re:From experience (Score:2, Insightful)
How could it possibly be in Microsoft's interest to allow or facilitate the resizing of partitions?
They want your hard drive to be one big C: NTFS partition with no room for a Linux partition.
If you run defrag on a fairly empty NTFS partition it's noticeable that some data will get shoved to the end of the partition and probably won't get moved back to the beginning.
If I were to be unkind, I would suggest that this is deliberate behaviour to prevent third party partition resizing applications reclaiming enough space to make a partition for a competing operating system install.
This has happened (Score:4, Insightful)
I've worked with people during various research projects who decided to encrypt, for some very good reasons. I've had one admin die, and one researcher have a stroke. In both cases they had information necessary for the project that nobody else could get to, even when their hard drives were retrieved. The results are that after several years, the stuff is still sitting somewhere unusable because the people who attempted to get to it were stymied. Enforcing PGP on an entire network could multiply this problem. I would think that enforcing PGP on users not needing it would be a royal pain for them.
What we've done and thought of since:
Have only those with sensitive information encrypt. Have them work on machines not connected to the net. If they need net access, have them connect only for the time necessary, and mandate pre-encryption back ups prior to connecting.
Preferred, but resisted: keep the sensitive machines off the net and have the researchers connect to the net via a different machine without the sensitive info on it. If they want to use it for transfers of such info, make them use sneakernet between the sensitive and connected machines. In this scenario, they only need PGP for what they're going to transfer to the connected machine and thus to the outside. Both admins and researchers expect full connectivity throughout their net, but the best security is a severed line.
I use the sneakernet method exclusively. What I transfer when necessary is hundreds of MB to tens of GB of data. It takes me 10 to 30 minutes to encrypt, burn the data to DVDs and carry it to the connected machine. Like most researchers, I'm busy and don't want to spend my time doing this, but I have assistants I can put the task on.
I've another fear (Score:1, Insightful)
For my company, data has to be kept secret. Yet we do not use encryption. The fear of data corruption is far bigger than the chance that a cracker gets access.
As for personal experience: with TrueCrypt, switching between accounts (on a Mac) with TrueCrypt open can wreak havoc. The data can be copied out, but the secure thingie has to be re-created from scratch. We cannot have encryption working properly 99% of the time. It must be 100.00%.
Bert
Re:Here's a quick experiment (Score:3, Insightful)
"But: do the measurement in your own world. My software, hardware and artificial measured usage pattern may differ from yours, subtly but enough that my conclusion doesn't transfer. Be scientific about it :)"
Best advice I've seen: try to build up a representative sample of a day's work (or just a random sample if that's not easily determined), copy it, and run one copy on unencrypted disks and one under the mandated encryption.
If there's a significant difference take the evidence to your IT dept. or supervisor and hope for a favourable decision.
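A rough harness for the experiment described above could look like this: run the same workload against an encrypted and an unencrypted directory and compare wall-clock times. The paths and file counts are placeholders, and this is only a sketch, not a rigorous benchmark.

```python
import os
import time

def time_workload(path, n_files=50, size=4 * 1024 * 1024):
    """Write, read back, and delete n_files files of `size` bytes
    under `path`; return elapsed wall-clock seconds."""
    block = os.urandom(size)
    start = time.perf_counter()
    for i in range(n_files):
        name = os.path.join(path, f"bench_{i}.dat")
        with open(name, "wb") as f:
            f.write(block)
        with open(name, "rb") as f:
            f.read()
        os.remove(name)
    return time.perf_counter() - start

def overhead_pct(t_plain, t_encrypted):
    """Relative slowdown of the encrypted run, in percent."""
    return (t_encrypted - t_plain) / t_plain * 100

# e.g. overhead_pct(time_workload("/plain"), time_workload("/encrypted"))
```

Take the resulting percentage (averaged over several runs) to your IT department instead of a gut feeling.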
Drive Errors? (Score:5, Insightful)
My concern with encrypting an entire disk would be fault tolerance. If a sector goes bad on a non-encrypted drive, you might lose a file. If it goes bad on an encrypted drive, do you risk losing more data or even the entire drive?
Of course, one could say that's why you make backups. But presumably the backups would also be using encryption. Therefore, they would be susceptible to the same effect. If there is a greater chance of total data loss on each device, the chance of multiple device failures leading to unrecoverable data also increases.
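A back-of-envelope model of that worry, assuming independent device failures (the probabilities are purely illustrative, not measured failure rates):

```python
# If encryption turns a single bad sector into whole-volume loss,
# the per-device probability of total loss (p_device) rises, and the
# chance that every copy is lost rises with its power.
def p_all_copies_lost(p_device, n_copies):
    """Chance that the original and every backup are all total losses."""
    return p_device ** n_copies

plain = p_all_copies_lost(0.01, 2)      # ~1e-4 with two copies
encrypted = p_all_copies_lost(0.05, 2)  # ~2.5e-3: 25x more likely
```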
Re:Encryption is good for security, bad for performance (Score:5, Insightful)
I'm not sure that assuming that just because something's done in hardware it happens in zero time (or even near-zero time) is at all accurate. A review I read of a different encrypted drive said it was 5-10% slower than its non-encrypted equivalent. It wasn't the Seagate you're talking about, but I doubt that even hardware encryption can do it instantly, so I think your "zero" is an exaggeration.
Re:Policy Exception (Score:5, Insightful)
If there really is a performance loss, and you can quantify it, then you can attack it from another angle, e.g. an impact statement to management along the lines of "This will introduce a %% performance loss to our workloads, at a cost of $$$. In order to maintain the same level of productivity we will require upgraded hardware at a cost of $$$".
Having a manager who is concerned about his department's budget on your side can help your case too :)
Re:Encryption is good for security, bad for performance (Score:5, Insightful)
Linux software RAID 5 uses 2% CPU under heavy load.
The fact that you can always recover your data with any Linux live CD gives it a definite edge over a hardware RAID solution, where you need a similar controller model to read the data.
Why are you writing to slashdot? (Score:5, Insightful)
In the time you spent writing this post to Slashdot, you could have written a friendly letter to your IT department stating that you want some machines to not use this encryption, because these machines need maximum performance and anyway do not store any kind of personal information.
Think about the purpose of Full Disk Encryption (Score:5, Insightful)
People misunderstanding the question... (Score:5, Insightful)
The submitter is in a research institute. Some labs in that institute have patient data, and therefore require significant security like disk encryption.
His lab works with a protozoan, and has massive computational requirements. There will never be any patient data near his lab, because the people who work with patients are in a different lab (think of a different department in a business). They do not need disk encryption.
You say TrueCrypt has "1% overhead"; PGP presumably has some other "% overhead." The submitter is asking what the details of that overhead are for PGP, TrueCrypt, etc. What's the CPU usage? Memory usage? Are disk performance penalties constant, or do they depend on average file size, number of files, format of those files, and so on? "1% overhead" may hide whopping huge performance penalties for specialist users.
Re:Encryption is good for security, bad for performance (Score:5, Insightful)
I have serious doubt we even need hardware RAID anymore with current CPU speeds.
At some point in time I believed the same thing. I did a test a few years ago to see if it's still worth bothering with hardware RAID, and configured a system with Linux and software RAID.
This was for a fileserver in a high-performance cluster, so speed mattered. I don't have the exact figures here right now, but from what I remember, two years ago the software RAID solution was between 7 and 15% slower. Once you start hitting the performance limit, your processes hit I/O wait and your performance goes down. When I added LVM on top back then, performance got shot to hell.
Now, it's not as bad as it seems, you still get decent performance (especially considering that your setup suddenly costs a lot less and can be done on commodity hardware), and with a fair bit of tinkering with blockdev and your read-ahead buffer (provided you have enough RAM, and your usage fits that particular pattern) you can still get some very nice performance.
The reason that we went with hardware RAID in the end was because hardware RAID isn't all that expensive, and the performance gains were noticeable especially on systems that have to run 24/7 at maximum throughput.
Again, for consumer systems and services where performance isn't a primary concern software RAID is an attractive option, especially if you're on a budget.
As for overhead with encryption: it would make a nice experiment, but I think 1% overhead is very optimistic, especially on a busy system. The only way to be sure is to compare your performance now to the performance when you encrypt the entire disk. The only time I tested TrueCrypt I got a throughput of 80 MByte/s, while unencrypted I got 120 MByte/s, and it's been a while since I tested this. Those TrueCrypt tests weren't fine-tuned either; it was basically a test to see if it was easy to implement.
Anything I mention here has to be taken with a grain of salt since a lot of time has passed and a lot has changed since those tests.
If policy dictates that you have to set up X, the best way to become an exception to that policy is to prove that the policy is detrimental to your project and might end up costing a lot of money. Policy doesn't care about performance, but it cares greatly about money and lost time. Do your tests, do the math, add a price tag, and talk with your manager.
Re:Why are you writing to slashdot? (Score:2, Insightful)
Another option is to go higher than IT, to some administrator type. State that your research project is in jeopardy because of the new rules, and if you cannot get the project done (but you could if the restrictions were removed for your lab), there won't be any more grants. Administrators might be more concerned with the prestige of the lab than IT, so they'll pass a decree to the IT "automatons" (as EmagGeek said), which in turn will help you.
Re:Here's a quick experiment (Score:4, Insightful)
"There is lots of cpu cycles left"
Uhm. You are losing 30% of your CPU cycles; that is quite a lot. Yes, there is ample power left for your office apps and such, but the original poster says he is doing high-performance computing. Losing 30% of your throughput for reading data is a lot!
Re:Policy fundamentalism (Score:5, Insightful)
Sounds to me like the IT department in question knows what it's doing, and who its clients are. It's rarely mentioned outside an IT department, but I'll share one of the big secrets: 98% of the job of any IT department is to protect users from their own stupidity. The smartest users are the ones who realize this and give the IT department enough space to operate, while learning enough about what IT does to understand how to follow the rules and still get everything done.
It's not impossible at all.
Re:Encryption is good for security, bad for performance (Score:5, Insightful)
It depends a lot on what you're doing with the data. If you've got a single-threaded process that's consuming 50MB/s and you can read 100MB/s from the disk and run 100MB/s decodes on the other core, you won't notice the speed difference. If you're doing random access then you will have, say, a 9ms seek time to get the data and then a few more ms to decompress it. If your process is already I/O bound (many scientific computing tasks are) then a 9ms decode per block will halve the speed of your computation.
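A sketch of that random-access argument: per-block time is seek plus decode, so an I/O-bound job slows by the ratio of the two per-block times. The 9 ms seek figure mirrors the comment; the 9 ms decode cost is an assumption for illustration only.

```python
# Blocks served per second for an I/O-bound random-access workload,
# given a per-block seek time and an optional per-block decode time.
def blocks_per_second(seek_ms, decode_ms=0.0):
    return 1000.0 / (seek_ms + decode_ms)

plain = blocks_per_second(9)         # ~111 blocks/s
encrypted = blocks_per_second(9, 9)  # ~55.6 blocks/s: about half
```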
The correct solution for this lab seems to be to borrow a policy from most defence-related sites. Have a secure and an insecure network. The secure network is allowed to access confidential data; the insecure network isn't. Run encryption on the machines on the secure network, and don't bother with it on the insecure machines. If one of the insecure machines is compromised or stolen, nothing confidential is lost.
Re:People misunderstanding the question... (Score:5, Insightful)
Oh, so -you're- the type of network administrator who implements policies and software for the good of the network, software that's detrimental to the productivity of the people who the network is supposed to be good for, without consulting the users about their needs prior to the rollout?
I'm glad we met. Have you ever considered a career in sales?
Re:People misunderstanding the question... (Score:5, Insightful)
Run an analysis on the performance hit, document it, make a report and give the report to the persons who want the analysis done, and also the persons who pay the bills. (They might be different people).
The report has a summary that says: I must install this software to comply with policy. I will then be accomplishing my work at only X% of the speed I was before. If that is not ok, then I will need to spend $Y to upgrade the equipment in order to maintain the previous rate of work. End of story. If they deny the upgrades then... that's their decision. If they approve the upgrades - hey, new equipment!
The only potential problem I see is this: if the submitter has his own budget, i.e. he pays the bills, yet must still both maintain his rate of work AND comply with the encryption policies... hmmmm, well, not so easy. Then there needs to be a report showing, with proof, that his lab won't ever see patient data. Assuming the budget isn't there.
Re:People misunderstanding the question... (Score:2, Insightful)
I think the GP was intended as $SARCASM, but I'm not sure. That, of course, makes it the best kind of $SARCASM.
Re:People misunderstanding the question... (Score:5, Insightful)
If you think that was sarcasm, head over to the ars forums and check the rabid response elicited when someone asks a question about plugging a switch into the drop in a conference room because multiple presenters need a wired connection.
Professional IT staff seem to get more bitter and hostile to the users daring to question their all-knowingness the more years in the industry they get. I'm glad I got out and into coding before I ever hit that level.
Re:This has happened (Score:2, Insightful)
Re:What are you trying to prevent? (Score:2, Insightful)
Hard disks fail. I've had several disks fail while under warranty, and I wouldn't be sending them in for replacement if the data on them wasn't encrypted. Consider a NAS with several disks in a RAID array: replacing faulty disks isn't all that uncommon, let alone within the five-year period that some manufacturers provide warranty for.
People misunderstanding words like 'require'. (Score:5, Insightful)
The submitter is in a research institute. Some labs in that institute have patient data, and therefore require significant security like disk encryption.
Repeat after me: "The first line of security is physical."
If the servers are locked in a room with limited access (like, oh, say, 95+% of servers in the corporate world), then probably not.
Data security is about securing the data using reasonable compensating controls. If no one can get to the disks, and those who can comprise a limited list of, say, trusted sysadmins, then it doesn't matter whether they're encrypted or not.
Requirements, if properly written, never specify implementation details -- the means. They only specify what is needed. How that is achieved is irrelevant so long as the requirement is met completely.
So other than for devices that are not in access-controlled environments (like laptops or, in some cases, workstations), the need for whole disk encryption at most places is nil.
May be a mandate from outside the company... (Score:2, Insightful)
Re:Policy fundamentalism (Score:2, Insightful)
You are confusing policies with guidelines. Guidelines are often optional and serve as a "rule of thumb" or "best practices" for employees; policies are not. Policies (especially security policies) are, or should be, established with the advice of legal counsel, and should be issued and enforced from an executive level.
If you don't want policies, do not issue them, otherwise you are just confusing employees and encouraging them to disregard issues which are important to the organization.
Re:People misunderstanding the question... (Score:5, Insightful)
Have you ever worked for a medium to large company? This is the norm in such companies. Management doesn't care how much productivity was lost, and of course they still expect you to get your work done on time.
Also, just to correct you: it's management enforcing recommendations from a security analyst. The network admin couldn't give a rat's ass; they just implement some of the policies. You forgot to mention Unix/Windows administrators, DBAs, etc. Share the hate. (BTW, I'm a DBA.)
It's the security analyst's job to try to prevent breaches, IT's job to implement the measures, and management's job to weigh the cost of security against productivity. The problem is that management is too scared to set realistic security policies. All they care about is CYA.
Re:People misunderstanding the question... (Score:5, Insightful)
Those would be the lazy, bad admins and, unfortunately, they're usually the only ones left after a period of time, as the better ones all get better jobs/pay as soon as they can.
A good IT staffer knows their network and the various ways they can implement the policy's goals. (Note: policy should be abstract, like "keep bad guys out of the network," not specific, like "install single Firewall Brand A and cut all other connections between the network and the internet.") They would also know how to accommodate changing needs.
Re:People misunderstanding the question... (Score:4, Insightful)
There are a lot of good reasons to disallow non-managed network equipment. What if one of the devices behind that switch starts killing the network? The admin's only option is to disable that port, which kills everyone's connection, and then everyone will start bitching about it. What if someone brings in a router and plugs it in wrong? Now they're serving DHCP out to the building? People with unlimited IT budgets might say, "Get gear that kills unauthorized DHCP servers." People with limited IT budgets will get bitter and hostile at this point.
IT is there to support the users, but it's perfectly reasonable for IT to be the ones adding on to the network. Need more ports in the conference room? Let me put one of my switches in there.
Re:People misunderstanding the question... (Score:3, Insightful)
You're either the worst type of admin (and it certainly sounds like it with your ham-fisted policy statement) or your business has no real negative impact from your policy, in which case it's ok.
In the presented case, it sounds like there is significant disk I/O. Adding an encryption layer to disk I/O that's not hardware driven is going to slow down disk access, possibly significantly. The type of modeling discussed generally uses huge amounts of resources and can strain all current systems to near breaking points. I used to do similar work modeling large structures, and even the Crays and Convexes I used would take many hours to run highly optimized code that reduced memory requirements as much as possible. Output was measured in GBs, even compressed.
An encryption layer on a fully utilized machine would have significantly slowed down processing, as disk I/O was already a bottleneck.
Re:People misunderstanding the question... (Score:3, Insightful)
Re:People misunderstanding the question... (Score:5, Insightful)
Fine, as long as you work my hours. I work in a job where I may be setting up at 0500 for a multi-person network-heavy presentation scheduled to go at 0630, and I have zero time for argument. I've had great support and lousy support, and yes, I bring my own network hardware in case the local admin doesn't have what I need.
That said, I almost never have a problem, because good network admins do indeed work with me, and lousy ones either (a) aren't there to complain, or (b) trust me far more than they should. Oh, and I ask (and explain and discuss and compromise) long before any equipment sees power. It's only polite.
I've never (ten years or so) had a local hardware issue extend into the host network. It seems to be fairly hard to do that if you're not an idiot (and if your own equipment is truly solid, which mine is).
Re:People misunderstanding words like 'require'. (Score:3, Insightful)
It has limited access - until a larger drive needs to be installed, and the old one ends up in the spare parts bin, eventually gets sold as surplus, and somebody takes it home and finds your medical records on it.
Or, the server is in a locked room with limited access - but the DVD-Rs with the backups get lost on the way to the off-site storage facility.
Confidential data has been exposed in the past via both of these sorts of scenarios. Yes, perfect physical control would prevent the need for encryption. But it's a poor sort of security plan that relies on one layer being perfect.
The same reasoning applies to why a lab doing non-sensitive work would be subject to the same controls: it's more reliable to say "X for every server" than to say "X for every server for which Y and Z are true." Because Y and Z might not be true of that server today, but might be tomorrow. Hardware gets moved around; servers get consolidated.
Re:Isolate sensitive data (Score:1, Insightful)
Only encrypting the sensitive data is like carrying water in a bucket used for target practice: stuff will leak.
Yes, that's why he said isolate.
Example: you have a research lab. All the equipment in the lab is NOT connected to any outside network, period. The only people allowed in the lab have access to the data anyhow; just password-protect the terminals.
Now, once you need to store the data for backup, or access it outside the lab, then is when you encrypt the data.
Re:It's impossible to compress encrypted data (Score:3, Insightful)
Re:People misunderstanding the question... (Score:3, Insightful)
I hope you don't also consider yourself to be some kind of capitalist or something. Excessive security that reduces your efficiency is not going to help your business compared to another company with more flexible security policies that lock up the actually sensitive data while letting those with non-sensitive data work at maximum computational efficiency. Yours is the attitude of a Soviet bureaucrat: policy consistency trumps actual on-the-ground working conditions. It will only be to your loss when your competitor gets their non-sensitive computational work done more quickly.
Re:Random Experiences with disk encryption (Score:3, Insightful)
You pop in the boot/recovery CD the encryption app forced you to create before it would encrypt the drive? At least that's how it works with TrueCrypt.
Re:Repeat after me (Score:5, Insightful)
RTFA FTW!!!
The submitter him/herself doesn't work with sensitive info; just other departments do. IT is enforcing an overly broad solution on everyone, without considering the downside. I agree with you that sensitive data needs to be secured, but rolling out disk encryption to everyone in a company when only a subset deals with sensitive info is maybe overkill, and the impact on the primary activity of other departments needs to at least be quantified and considered.
Re:People misunderstanding the question... (Score:3, Insightful)
The problem is when the IT guys think that prohibiting a switch in a conference room to provide connectivity for a few hours falls under the heading of "reasonable".
Dilbert's concept of "preventer of information services" is more truth than farce. Most IT guys I've encountered are more interested in "keeping the network healthy" than actually letting people get work done. If you have a better way to let six people all connect to the network from that room, then fire away. But "sorry, you can't plug that in, and there's no other way to get what you want" is quite simply the wrong answer.
Re:People misunderstanding the question... (Score:3, Insightful)
I was mostly talking about actual, reasonable measures, not measures that are in place because the admins can't figure out something if it doesn't connect to the AD.
But that's quite possibly the situation the OP is in.
I've no idea what he is doing but, hypothetically, he's just spent $10m on a set of high end PCs to run his simulations.
Someone now proposes to reduce his computing capacity by some arbitrary amount because he has to run disk encryption.
It's perfectly reasonable to have a rule that, by default, every disk must be encrypted. But making that a blanket rule with no exceptions is foolish.
"I'm sorry. I realize that this is going to hamper your research and may require you to buy additional hardware to maintain performance but as you're processing medical X-rays that can, potentially, be linked back to a patient, you're going to have to use disk encryption on your compute farm because there are potentially too many people who have physical access to the machines and there's too much risk of someone walking off with one or two machines together with their disks."
"Ah, I see. Although you're processing medical X-rays, your compute farm is in the controlled server room. We already have processes in place to ensure that disks etc. are not removed without being securely wiped. Yes, I think we can allow an exemption for those machines."
"Ah, yes. You're analysing the genome of a protozoan[1]. Yes, we'll give an exemption for your compute farm. This exemption will only apply until your current research is complete. We'll reassess whether the exemption is still appropriate when your next project is in planning."
[1] I looked up Tetrahymena - I'd never heard of it before
Tim.
Also a crash recovery nightmare (Score:1, Insightful)
Encrypted partitions are excellent for their intended purpose: to safeguard the confidentiality of sensitive data. But an "across the board" policy of encrypting every HD in a whole shop is simply nuts. In addition to the performance problems for intensive computation requiring constant read/write, the way to recover from file system corruption on an encrypted partition is very often "Kiss your data goodbye, it is gone forever, period." So, on top of whatever the PGP license costs, add the cost of a major upgrade to your offsite backup process (in money, network, and in-the-box processing overhead) and the vastly increased risk of major data loss.
Re:Repeat after me (Score:4, Insightful)
After you have done your analysis as to how much productivity is lost, be -certain- to equate that to a dollar figure, so it can be extrapolated over the quarter and over the year. Nothing will make or break a project more than being able to assign a hard-dollar figure to it.
If it takes you an additional hour a week to perform tasks, and your value is $100/hour, then you effectively cost an additional $5,200 a year in lost productivity. Multiply that by all the users in your lab. Managers understand cost and budget impact more than passionate resistance.
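The arithmetic above, as a reusable estimate (the rate and hours are the comment's example figures, not data from any real lab):

```python
# Annual cost of lost productivity: hours lost per week, times the
# hourly rate, times 52 weeks, times the number of affected users.
def annual_cost(hours_lost_per_week, hourly_rate, n_users=1, weeks=52):
    return hours_lost_per_week * hourly_rate * weeks * n_users

print(annual_cost(1, 100))      # 5200 -- the figure quoted above
print(annual_cost(1, 100, 10))  # 52000 for a ten-person lab
```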
Good luck!
Why whole disk? (Score:3, Insightful)
Considering that much of your hard drive consists of non-private data, like the operating system, program install files, and spurious user junk, why bother encrypting the whole disk? Why does anyone bother? Just have an encrypted directory, partition, or even a second small hard drive and save all super-top-secret files in there.
Re:People misunderstanding the question... (Score:3, Insightful)
Your data point is irrelevant. Because today's processors are much faster and because the cipher implementations have been improved, it is now much less costly to encrypt data. Here are some "openssl speed" benchmarks of the RC4 symmetric cipher on a current processor and on one released 8 years ago, with a version of OpenSSL almost as old:
This is a 17x improvement in performance! Run the quad-core processor in 64-bit mode and it would probably be 20x faster. By comparison, disk throughput has increased by only about 2x over the last 8 years (50 MB/s vs. 100 MB/s). So run the same test today, but replace your 8 Xeon 800 MHz processors with 8 quad-cores and 12 disks, and you should see almost no speed decrease caused by a well-designed disk encryption app (I can vouch for dm-crypt).
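The comparison above, made explicit: an encrypted volume's usable throughput is bounded by the slower of the disk and the cipher. The figures below are illustrative MB/s values in the spirit of the comment, not benchmarks.

```python
# Whichever of the disk and the cipher is slower sets the ceiling
# on what an encrypted volume can actually deliver.
def effective_throughput(disk_mb_s, cipher_mb_s):
    return min(disk_mb_s, cipher_mb_s)

then = effective_throughput(50, 30)    # cipher-bound: 30 MB/s
now = effective_throughput(100, 500)   # disk-bound: the full 100 MB/s
```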
Re:Repeat after me (Score:2, Insightful)