Encryption Security

Resisting the PGP Whole Disk Encryption Craze 480

alaederach writes "I run a lab in a non-profit academic life sciences research institute. Our IT recently decided it would be a good idea to use PGP whole disk encryption on all of our computers, laptops and servers and picked PGP's suite of software. The main reason is that a small subset of our researchers work with patient information which we obviously are mandated to keep confidential. My lab does a lot of high-performance computational work (on genes from Tetrahymena, no humans here) and I am concerned that the overhead of complying with our IT's new security policy will be quite detrimental to my research program. For example, dynamically reallocating a partition on a PGP encrypted disk is apparently not possible. Furthermore, there is some evidence that certain forms of compression are also incompatible with PGP whole disk encryption. Interestingly, it is hard to find any negative articles on PGP, probably because most of them are written by IT pros who are only focused on the security, and not usability. I therefore ask the Slashdot community, what are the disadvantages of PGP in terms of performance, Linux, and high-performance computational research?"
This discussion has been archived. No new comments can be posted.


Comments Filter:
  • by WK2 ( 1072560 ) on Thursday October 30, 2008 @05:04AM (#25566449) Homepage

    Whole disk encryption is excellent for security, but it will bog you down in disk access times. The impact depends on many factors; reading and writing files can slow down by up to 50%, though the slowdown is usually much less. If you are doing something that involves a lot of disk access and doesn't need to be encrypted, create a separate, non-encrypted partition for it.
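A rough way to put a number on that slowdown is to time the same write/read on each filesystem. This is only a sketch: the `TARGET` path is an assumption (point it at the encrypted and plain mount points in turn), and without dropping caches it mostly measures best-case sequential throughput.

```shell
# Write and read back a 64 MiB file on the filesystem under test.
# Run once with TARGET on the encrypted volume and once on a plain
# one, then compare the MB/s figures dd reports.
TARGET="${TARGET:-/tmp}"
TESTFILE="$TARGET/fde_bench.$$"

dd if=/dev/zero of="$TESTFILE" bs=1M count=64 conv=fsync 2>&1 | tail -n 1
dd if="$TESTFILE" of=/dev/null bs=1M 2>&1 | tail -n 1
rm -f "$TESTFILE"
```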

  • by Pikiwedia.net ( 1392595 ) on Thursday October 30, 2008 @05:07AM (#25566457) Homepage
    An IT policy is a general rule which has to be interpreted and adapted. It's not supposed to be followed to the letter. Ask your IT department what they want to accomplish with the policy, and how you can help them accomplish that without having your work ruined.
  • Policy Exception (Score:3, Insightful)

    by Anonymous Coward on Thursday October 30, 2008 @05:10AM (#25566467)

    You've got a good case for an exception from this policy. Just follow the exceptions process and have your management sign off on the risk. Case closed.

  • From experience (Score:2, Insightful)

    by Anonymous Coward on Thursday October 30, 2008 @05:18AM (#25566515)

    It took 2 days to encrypt an entire 160GB IDE hard disk with a K6-2 400MHz processor, and afterwards the computer could only serve files at about 400k per second. With a 2GHz processor the performance difference is negligible, and it could serve at full speed with only tiny CPU usage. So I think full disk encryption overhead is irrelevant on modern CPUs.

    As for not being able to resize a partition, that's just as well: if your hard disk contains anything of importance, you would have to be inept to resize partitions and expect the data to maintain its integrity, no matter what the filesystem format or the partitioning programs' brochures tell you.

  • feel-good actions (Score:3, Insightful)

    by scientus ( 1357317 ) <instigatorircNO@SPAMgmail.com> on Thursday October 30, 2008 @05:27AM (#25566547)
    In these types of departments all the computers are on all the time anyway, and whole-disk encryption is completely vulnerable to cold-boot attacks. It may be remotely useful on laptops, but for desktops it's entirely useless.

    If you want to actually protect your data you need to encrypt only what's sensitive and only mount it when necessary. Also, PGP is closed source; what are you going to do if they stop supporting it? Use TrueCrypt or LVM, etc. And don't neglect network protection, which is where the real data theft happens.
  • by jonaskoelker ( 922170 ) <`jonaskoelker' `at' `yahoo.com'> on Thursday October 30, 2008 @05:35AM (#25566595)

    Surely what is required is to isolate the sensitive information, so that it can be protected.

    That's a great idea that in practice will leak your information. The reason is that _every_ application that touches your data needs to know that it should keep your data confidential.

    Browsers know not to cache data transferred over HTTPS. The browser knows the data was encrypted, and it knows to be smart with it [for "protective" values of smart].

    When you have a program that reads a file through a transparent layer of encryption, it never sees the "please-be-careful-with-this" label, and so the desktop search engine will index all the strings, the editor will write backups to . or /tmp, and so forth. All the apps think they need to do is respect what you meant by your mode bits (if you're on *nix), so it'll chmod/umask the /tmp copy the right way. If someone grabs your disk and you didn't encrypt /tmp, you lose.

    And no, encrypting /tmp won't fix it: you need to know that everything the user of the data can write to is encrypted if you want to be sure. I only know one way that I can somewhat confidently say solves the problem: encrypt everything. [and then there's the network, but we'll save that for another decade ;)]

    Only encrypting the sensitive data is like carrying water in a bucket used for target practice: stuff will leak.

  • by Nicolas MONNET ( 4727 ) <nicoaltiva@gmai l . c om> on Thursday October 30, 2008 @05:37AM (#25566615) Journal

    Their product doesn't seem [pgp.com] to run on Linux.
    There is better, cheaper F/OSS software to do the same thing, though; Ubuntu and FC9 already include a whole-disk encryption option at install. (It's better because it's much less likely to have an NSA back door, although obviously nothing is ever completely certain.)
    As for performance, when I tried it (luks encryption) on a desktop machine, it wasn't noticeable; but I wasn't moving hundreds of gigs around.
    The question now is what are they trying to protect. Encrypting laptops is sensible, and in fact, given how easy & cheap it now is, it's rather stupid not to do it. On desktop PCs, it's not that clear. Whole disk encryption will only protect you against someone with physical access to the machine turned off. It certainly won't protect you against trojans or browser based vulnerabilities. So the question is, do random strangers roam your offices?
    And encrypting servers/clusters? That's just silly, unless you expect the men in black to storm into your building.
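For reference, the F/OSS route the parent describes is only a few commands with dm-crypt/LUKS. The sketch below is a dry run that just prints the sequence: `/dev/sdXN` is a placeholder, and running the commands for real requires root and destroys whatever is on the target partition.

```shell
# Dry run: collect and print the LUKS setup sequence instead of
# executing it. Substitute a real partition (and triple-check it).
DEV=/dev/sdXN          # placeholder -- the partition to encrypt
NAME=cryptdata         # device-mapper name for the unlocked volume

CMDS="cryptsetup luksFormat $DEV
cryptsetup open $DEV $NAME
mkfs.ext4 /dev/mapper/$NAME
mount /dev/mapper/$NAME /mnt/secure"
echo "$CMDS"
```

Once the mapping is open, the encrypted volume behaves like any other block device, which is why install-time whole-disk options are so cheap to enable.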

  • by calmofthestorm ( 1344385 ) on Thursday October 30, 2008 @05:45AM (#25566671)

    Someone will write the passphrase down anyway. Isolate the data.

  • Re:From experience (Score:2, Insightful)

    by Alpha Whisky ( 1264174 ) on Thursday October 30, 2008 @05:49AM (#25566689)

    How could it possibly be in Microsoft's interest to allow or facilitate the resizing of partitions?

    They want your hard drive to be one big C: NTFS partition with no room for a Linux partition.

    If you run defrag on a fairly empty NTFS partition it's noticeable that some data will get shoved to the end of the partition and probably won't get moved back to the beginning.

    If I were to be unkind, I would suggest that this is deliberate behaviour to prevent third party partition resizing applications reclaiming enough space to make a partition for a competing operating system install.

  • This has happened (Score:4, Insightful)

    by DynaSoar ( 714234 ) on Thursday October 30, 2008 @05:50AM (#25566695) Journal

    I've worked with people during various research projects who decided to encrypt, for some very good reasons. I've had one admin die, and one researcher have a stroke. In both cases they had information necessary for the project that nobody else could get to, even when their hard drives were retrieved. The results are that after several years, the stuff is still sitting somewhere unusable because the people who attempted to get to it were stymied. Enforcing PGP on an entire network could multiply this problem. I would think that enforcing PGP on users not needing it would be a royal pain for them.

    What we've done and thought of since:

    Have only those with sensitive information encrypt. Have them work on machines not connected to the net. If they need net access, have them connect only for the time necessary, and mandate pre-encryption backups prior to connecting.

    Preferred, but resisted: keep the sensitive machines off the net and have the researchers connect to the net via a different machine without the sensitive info on it. If they want to transfer such info, make them use sneakernet between the sensitive and connected machines. In this scenario, they only need PGP for what they're going to transfer to the connected machine and thus to the outside. Both admins and researchers expect full connectivity throughout their net, but the best security is a knackered line.

    I use the sneakernet method exclusively. What I transfer when necessary is hundreds of MB to tens of GB of data. It takes me 10 to 30 minutes to encrypt, burn the data to DVDs and carry it to the connected machine. Like most researchers, I'm busy and don't want to spend my time doing this, but I have assistants I can put the task on.

  • I've another fear (Score:1, Insightful)

    by kanweg ( 771128 ) on Thursday October 30, 2008 @05:50AM (#25566699)

    For my company, data has to be kept secret. Yet we do not use encryption. Our fear of data corruption is far bigger than the chance that a cracker gets access.

    As to personal experience: switching between accounts (on a Mac) with TrueCrypt open can wreak havoc. The data can be copied, but the secure thingie has to be re-created from scratch. We cannot have encryption that works properly 99% of the time. It must be 100.00%.

    Bert

  • by LiSrt ( 742904 ) on Thursday October 30, 2008 @05:51AM (#25566703) Homepage

    "But: do the measurement in your own world. My software, hardware and artificial measured usage pattern may differ from yours, subtly but enough that my conclusion doesn't transfer. Be scientific about it :)"

    Best advice I've seen - try and build up a representative sample of a day's work (or just a random sample if that's not easily determinable), copy it, run one copy on unencrypted disks and one on the mandated encryption.

    If there's a significant difference take the evidence to your IT dept. or supervisor and hope for a favourable decision.
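Turning those two timed runs into a headline figure for the report is one line of arithmetic. A minimal sketch (the wall-clock times here are made-up placeholders; substitute your own measurements):

```shell
# Wall-clock seconds for the same representative workload, run once
# on the unencrypted disk and once on the encrypted one.
# These numbers are assumptions for illustration only.
PLAIN=412
ENCRYPTED=489

SLOWDOWN=$(awk -v p="$PLAIN" -v e="$ENCRYPTED" \
    'BEGIN { printf "%.1f", (e - p) / p * 100 }')
echo "slowdown: ${SLOWDOWN}%"
```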

  • Drive Errors? (Score:5, Insightful)

    by CopaceticOpus ( 965603 ) on Thursday October 30, 2008 @05:55AM (#25566709)

    My concern with encrypting an entire disk would be fault tolerance. If a sector goes bad on a non-encrypted drive, you might lose a file. If it goes bad on an encrypted drive, do you risk losing more data or even the entire drive?

    Of course, one could say that's why you make backups. But presumably the backups would also be using encryption. Therefore, they would be susceptible to the same effect. If there is a greater chance of total data loss on each device, the chance of multiple device failures leading to unrecoverable data also increases.

  • by nmg196 ( 184961 ) on Thursday October 30, 2008 @06:01AM (#25566733)

    I'm not sure that assuming that just because something's done in hardware it happens in zero time (or even near-zero time) is at all accurate. A review I read of a different encrypted drive said it was 5-10% slower than its non-encrypted equivalent. It wasn't the Seagate you're talking about, but I doubt that even hardware encryption can do it instantly, so I think your "zero" is an exaggeration.

  • by jamesh ( 87723 ) on Thursday October 30, 2008 @06:03AM (#25566739)

    If there really is a performance loss, and you can quantify it, then you can attack it from another angle, e.g. an impact statement to management along the lines of: "This will introduce a %% performance loss to our workloads, at a cost of $$$. In order to maintain the same level of productivity we will require upgraded hardware at a cost of $$$."

    Having a manager who is concerned about his department's budget on your side can help your case too :)
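A back-of-envelope sketch of how the dollar figure in such an impact statement might be produced. Every input below is an assumption for illustration, not a measurement:

```shell
# Annual cost of a measured slowdown. All inputs are illustrative:
# a 20% slowdown on a 6-hour job run 200 times a year, priced at an
# assumed $50/hour of combined cluster and staff time.
OUT=$(awk 'BEGIN {
    slowdown = 0.20
    hours    = 6
    runs     = 200
    rate     = 50
    extra = hours * slowdown * runs
    printf "extra hours/yr: %.0f, cost: $%.0f", extra, extra * rate
}')
echo "$OUT"
```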

  • Linux software RAID 5 uses 2% CPU under heavy load.

    The fact that you can always recover your data with any Linux live CD gives software RAID a definite edge over a hardware RAID solution, where you need a similar controller model to read the data.

  • by Idaho ( 12907 ) on Thursday October 30, 2008 @06:15AM (#25566805)

    In the time you spent writing this post to Slashdot, you could have written a friendly letter to your IT department stating that you want some machines to not use this encryption, because these machines need maximum performance and anyway do not store any kind of personal information.

  • by hAckz0r ( 989977 ) on Thursday October 30, 2008 @06:24AM (#25566859)
    The only protection that full disk encryption gives is that someone who physically gets their hands on the machine cannot boot it and read its contents. This makes perfect sense for laptops but makes little sense for permanently fixed-location workstations. A laptop will physically leave the premises, so it is open to theft, but a workstation (assuming you have some decent form of physical security) is much less likely to need this protection. Once a workstation is booted and the disk drive unlocked, any hacker who gets a foothold on the system has access to it, so all that overhead of full disk encryption does no good unless the encryption is done per user session. When you need access to the data, you authenticate and start decrypting then, and keep it encrypted across the network. Yes, the data you speak of should be encrypted, but you must encrypt it at the correct level to actually increase its security rather than just slowing down the machine. Anything short of that level of control and you are just fooling yourself into thinking you have protected the data. Fool-Disk-Encryption is not always the answer.
  • by wanax ( 46819 ) on Thursday October 30, 2008 @06:26AM (#25566883)

    The submitter is in a research institute. Some labs in that institute have patient data, and therefore require significant security like disk encryption.

    His lab works with a protozoan, and has massive computational requirements. There will never be any patient data near his lab, because the people who work with patients are in a different lab (think different departments in a business). They do not need disk encryption.

    You say TrueCrypt has "1% overhead"; PGP presumably has some other "% overhead." The submitter is asking what the details of that overhead for PGP, TrueCrypt, etc. are. What's the CPU usage? Memory usage? Are disk performance penalties constant, or do they depend on average file size, number of files, format of those files, and so on? "1% overhead" may hide whopping huge performance penalties for specialist users.

  • by discord5 ( 798235 ) on Thursday October 30, 2008 @06:36AM (#25566931)

    I have serious doubt we even need hardware RAID anymore with current CPU speeds.

    At some point in time I believed the same thing. I did a test a few years ago to see if it's still worth it to bother with hardware RAID, and configured a system with Linux and software RAID.

    This was for a fileserver in a high-performance cluster, so speed mattered. I don't have the exact figures here right now, but from what I remember two years ago the software RAID solution was between 7 and 15% slower. Once you start hitting the performance limit, your processes hit I/O wait and your performance goes down. When I added LVM on top back then, performance got shot to hell.

    Now, it's not as bad as it seems, you still get decent performance (especially considering that your setup suddenly costs a lot less and can be done on commodity hardware), and with a fair bit of tinkering with blockdev and your read-ahead buffer (provided you have enough RAM, and your usage fits that particular pattern) you can still get some very nice performance.

    The reason that we went with hardware RAID in the end was because hardware RAID isn't all that expensive, and the performance gains were noticeable especially on systems that have to run 24/7 at maximum throughput.

    Again, for consumer systems and services where performance isn't a primary concern software RAID is an attractive option, especially if you're on a budget.

    As for overhead with encryption: it would make a nice experiment, but I think 1% overhead is very optimistic, especially on a busy system. The only way to be sure is to compare your performance now to the performance when you encrypt the entire disk. The only time I tested TrueCrypt I got a throughput of 80MByte/s, while unencrypted I got 120MByte/s, and it's been a while since I tested this. Those TrueCrypt tests weren't fine-tuned either; it was basically a test to see if it was easy to implement.

    Anything I mention here has to be taken with a grain of salt since a lot of time has passed and a lot has changed since those tests.

    If policy dictates that you have to set up X, the best way to become an exception to this policy is to prove that the policy is detrimental to your project and might end up costing a lot of money. Policy doesn't care about performance, but it cares greatly about money and lost time. Do your tests, do the math, add a price tag, and talk with your manager.

  • by TuaAmin13 ( 1359435 ) on Thursday October 30, 2008 @07:17AM (#25567109)
    Bargain with IT. In exchange for not encrypting the drive, you can physically secure the machine, with stuff like door codes.

    Another option is to go higher than IT, to some administrator type. State that your research project is in jeopardy because of the new rules, and if you cannot get the project done (but you could if the restrictions were removed for your lab), there won't be any more grants. Administrators might be more concerned with the prestige of the lab than IT, so they'll pass a decree to the IT "automatons" (as EmagGeek said), which in turn will help you.
  • by Splab ( 574204 ) on Thursday October 30, 2008 @07:18AM (#25567119)

    "There is lots of cpu cycles left"
    Uhm. You are losing 30% of your CPU cycles; that is quite a lot. Yes, there is ample power left for your office apps and the like, but the original poster says he is doing high-performance computing, and losing 30% of your read throughput is a lot!

  • by yttrstein ( 891553 ) on Thursday October 30, 2008 @07:55AM (#25567303) Homepage
    I'm in agreement with Smertrios as well. It's easy to see why such a blanket policy is necessary: have you ever worked with scientists? While possibly quite brilliant, most of them seem to have trouble remembering to keep sensitive data encrypted. The only logical solution is to write a policy which requires everything to be encrypted.

    Sounds to me like the IT department in question knows what it's doing, and who its clients are. It's rarely mentioned outside an IT department, but I'll share one of the big secrets: 98% of the job of any IT department is to protect users from their own stupidity. The smartest users are the ones who realize this and give the IT department enough space to operate, while at the same time learning as much as they can about what IT does, so they have a real understanding of how to follow the rules while still getting everything done.

    It's not impossible at all.
  • by TheRaven64 ( 641858 ) on Thursday October 30, 2008 @08:16AM (#25567399) Journal

    It depends a lot on what you're doing with the data. If you've got a single-threaded process that's consuming 50MB/s and you can read 100MB/s from the disk and run 100MB/s decodes on the other core, you won't notice the speed difference. If you're doing random access then you will have, say, a 9ms seek time to get the data and then a few more ms to decompress it. If your process is already I/O bound (many scientific computing tasks are) then a 9ms decode per block will halve the speed of your computation.

    The correct solution for this lab seems to be to borrow a policy from most defence-related sites: have a secure and an insecure network. The secure network is allowed to access confidential data; the insecure network isn't. Run encryption on the machines on the secure network, and don't bother with it on the insecure machines. If one of the insecure machines is compromised or stolen, nothing confidential is lost.

  • by mikkelm ( 1000451 ) on Thursday October 30, 2008 @08:17AM (#25567407)

    Oh, so -you're- the type of network administrator who implements policies and software for the good of the network, software that's detrimental to the productivity of the very people the network is supposed to serve, without consulting the users about their needs prior to the rollout?

    I'm glad we met. Have you ever considered a career in sales?

  • by flappinbooger ( 574405 ) on Thursday October 30, 2008 @08:22AM (#25567443) Homepage
    The solution for the story submitter is simple, then.

    Run an analysis on the performance hit, document it, make a report and give the report to the persons who want the analysis done, and also the persons who pay the bills. (They might be different people).

    The report has a summary that says: I must install this software to comply with policy. I will then be accomplishing my work at only X% of the speed I was before. If that is not ok, then I will need to spend $Y to upgrade the equipment in order to maintain the previous rate of work. End of story. If they deny the upgrades then... that's their decision. If they approve the upgrades - hey, new equipment!

    The only potential problem I see is this: if the submitter has his own budget, i.e. he pays the bills, yet must still both maintain his rate of work AND comply with the encryption policies... hmmm, not so easy. Then there needs to be a report showing that his lab won't ever see patient data, with proof. Assuming the budget isn't there.
  • by Software Geek ( 1097883 ) on Thursday October 30, 2008 @08:30AM (#25567479)

    I think the GP was intended as $SARCASM, but I'm not sure. That, of course, makes it the best kind of $SARCASM.

  • by b96miata ( 620163 ) on Thursday October 30, 2008 @08:36AM (#25567515)

    If you think that was sarcasm, head over to the ars forums and check the rabid response elicited when someone asks a question about plugging a switch into the drop in a conference room because multiple presenters need a wired connection.

    Professional IT staff seem to get more bitter and hostile to the users daring to question their all-knowingness the more years in the industry they get. I'm glad I got out and into coding before I ever hit that level.

  • by Growlor ( 772763 ) on Thursday October 30, 2008 @08:40AM (#25567535)
    Exactly. This is one of the reasons for using a mature solution (such as PGP or one of its successors, and not something like BitLocker) which offers centralized key management and recovery. It is EXTREMELY difficult to trace all the possible places a Windows OS might write data to (and maybe a *NIX one too) and then make sure all that data is deleted and overwritten to prevent forensic recovery. This gets MUCH harder if you start copying the data to thumb drives (especially drives shared with other people who keep all their MP3s on them, and thus don't want them completely wiped, or worse, pirated games that might contain malware!)
  • by Lupu ( 815408 ) on Thursday October 30, 2008 @08:41AM (#25567539)
    You seem to be forgetting one very important aspect: hardware failure.

    Hard disks fail. I've had several disks fail while under warranty, and I wouldn't be sending them in for replacement if the data on the disk wasn't encrypted. Consider a NAS with several disks in a RAID array: replacing faulty disks isn't all that uncommon, certainly within the five-year warranty period some manufacturers provide.
  • by morgan_greywolf ( 835522 ) on Thursday October 30, 2008 @08:45AM (#25567567) Homepage Journal

    The submitter is in a research institute. Some labs in that institute have patient data, and therefore require significant security like disk encryption.

    Repeat after me: "The first line of security is physical."

    If the servers are locked in a room with limited access (like, oh, say, 95+% of servers in the corporate world), then probably not.

    Data security is about securing the data using reasonable compensating controls. If no one can get to the disks, and those who can comprise a limited list of, say, trusted sysadmins, then it doesn't matter whether they're encrypted or not.

    Requirements, if properly written, never specify implementation details (the means). They only specify what is needed. How that is achieved is irrelevant so long as the requirement is achieved completely.

    So other than for devices that are not in an access-controlled environment (like laptops or, in some cases, workstations), the need for whole disk encryption at most places is nil.

  • by Firstoni ( 1078997 ) on Thursday October 30, 2008 @09:14AM (#25567873)
    Considering the type of data that the OP is working with and the choice of the product to use, it may be that they fall under the government mandate of using encryption and that it HAS to be FIPS 140-2 approved. In this case TrueCrypt (as much as I like it) is not a valid choice, as it is NOT FIPS 140-2 approved.
  • by vvaduva ( 859950 ) on Thursday October 30, 2008 @09:20AM (#25567965)

    You are confusing policies with guidelines. Guidelines are often optional and serve as a "rule of thumb" or "best practices" for employees; policies are not. Policies (especially security policies) are, or should be established with the advice of legal counsel, and should be issued and enforced from an executive level.

    If you don't want policies enforced, do not issue them; otherwise you are just confusing employees and encouraging them to disregard issues which are important to the organization.

  • by Stone316 ( 629009 ) on Thursday October 30, 2008 @09:31AM (#25568117) Journal

    Have you ever worked for a medium to large company? This is the norm in such companies. Management doesn't care how much productivity was lost and of course, they still expect you to get your work done on time.

    As well, just to correct you: it's management enforcing recommendations from a security analyst. The network admin couldn't give a rat's ass; they just implement some of the policies. You forgot to mention Unix/Windows administrators, DBAs, etc. Share the hate. (BTW, I'm a DBA.)

    It's the security analyst's job to try to prevent breaches, IT's job to implement, and management's job to weigh the cost of security against productivity. The problem is that management is too scared to set realistic security policies. All they care about is CYA.

  • by Gr8Apes ( 679165 ) on Thursday October 30, 2008 @09:41AM (#25568243)

    Those would be the lazy, bad admins; unfortunately, they are usually the only ones left after a while, as the better ones all get better jobs/pay as soon as they can.

    A good IT staffer knows their network and the various ways they can implement the policy's goals. (Note, policy should be abstract things like keep bad guys out of network, not specific things like install single Firewall Brand A and cut all other connections between network and internet.) They would also know how to accommodate changing needs.

  • by Sancho ( 17056 ) * on Thursday October 30, 2008 @09:44AM (#25568311) Homepage

    There are a lot of good reasons to disallow non-managed network equipment. What if one of the devices behind that switch starts killing the network? The admin's only option is to disable that port, which kills everyone's connection, and then everyone will start bitching about it. What if someone brings in a router and plugs it in wrong? Now they're serving DHCP out to the building? People with unlimited IT budgets might say, "Get gear that kills unauthorized DHCP servers." People with limited IT budgets will get bitter and hostile at this point.

    IT is there to support the users, but it's perfectly reasonable for IT to be the ones adding on to the network. Need more ports in the conference room? Let me put one of my switches in there.

  • by Gr8Apes ( 679165 ) on Thursday October 30, 2008 @09:49AM (#25568399)

    You're either the worst type of admin (and it certainly sounds like it from your ham-fisted policy statement), or your business suffers no real negative impact from your policy, in which case it's OK.

    In the presented case, it sounds like there is significant disk I/O. Adding an encryption layer to disk I/O that's not hardware driven is going to slow down disk access, possibly significantly. The type of modeling discussed generally uses huge amounts of resources and can strain all current systems to near breaking points. I used to do similar work modeling large structures, and even the Crays and Convexes I used would take many hours to run highly optimized code that reduced memory requirements as much as possible. Output was measured in GBs, even compressed.

    An encryption layer on a fully utilized machine would have significantly slowed down processing, as disk I/O was already a bottleneck.

  • by Wovel ( 964431 ) on Thursday October 30, 2008 @09:55AM (#25568497) Homepage
    Interesting, but the absence of an event does not mean the burden your policies place on the end users was necessary. So you are saying you would not grant a waiver from a blanket full disk encryption policy for a lab that had higher performance needs and no sensitive data? Perhaps your policies are written better than those at the institute where the submitter works. Blanket security policies with no procedure for obtaining waivers are nearly always bad, and are generally indicative of an IT organization that is poorly managed and not designed to meet the needs of the user community.
  • by TheMohel ( 143568 ) on Thursday October 30, 2008 @10:12AM (#25568787) Homepage

    Fine, as long as you work my hours. I work in a job where I may be setting up at 0500 for a multi-person network-heavy presentation scheduled to go at 0630, and I have zero time for argument. I've had great support and lousy support, and yes, I bring my own network hardware in case the local admin doesn't have what I need.

    That said, I almost never have a problem, because good network admins do indeed work with me, and lousy ones either (a) aren't there to complain, or (b) trust me far more than they should. Oh, and I ask (and explain and discuss and compromise) long before any equipment sees power. It's only polite.

    I've never (ten years or so) had a local hardware issue extend into the host network. It seems to be fairly hard to do that if you're not an idiot (and if your own equipment is truly solid, which mine is).

  • If the servers are locked in a room with limited access (like, oh, say, 95+% of servers in the corporate world), then probably not.

    It has limited access, until a larger drive needs to be installed, and the old one ends up in the spare parts bin and eventually gets sold as surplus, and somebody takes it home and finds your medical records on it.

    Or the server is in a locked room with limited access, but the DVD-Rs with the backups get lost on the way to the off-site storage facility.

    Confidential data has been exposed in the past via both of these sorts of scenarios. Yes, perfect physical control would remove the need for these protections. But it's a poor sort of security plan that relies on one layer being perfect.

    The same reasoning applies to why a lab doing non-sensitive work would be subject to the same controls: it's more reliable to say "X for every server" than to say "X for every server of which it's true that Y and Z". Because Y and Z might not be true of that server today, but will be tomorrow. Hardware gets moved around, servers get consolidated.

  • by Anonymous Coward on Thursday October 30, 2008 @10:41AM (#25569295)

    Only encrypting the sensitive data is like carrying water in a bucket used for target practice: stuff will leak.

    Yes, that's why he said isolate.

    Example- you have a research lab. All the equipment in the lab is NOT connected to any outside network, period. The only people allowed in the lab have access to the data anyhow, just password protect the terminals.
    Now, once you need to store the data for backup, or access it outside the lab, then is when you encrypt the data.

  • by Sloppy ( 14984 ) on Thursday October 30, 2008 @10:42AM (#25569315) Homepage Journal
    Everyone compresses before they encrypt. Everyone. That's why I think the whole compression issue is bogus.
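    A quick sketch of why the order matters (this uses Python's zlib, with os.urandom standing in for real ciphertext): well-encrypted data is statistically random, so once a disk block has been encrypted there is essentially no redundancy left for a compressor to exploit.

    ```python
    import os
    import zlib

    # Highly redundant plaintext, typical of real files.
    plaintext = b"patient record: Tetrahymena thermophila\n" * 1000

    # Compress-then-encrypt works: the redundancy is still present.
    compressed = zlib.compress(plaintext)
    print(len(compressed) / len(plaintext))  # tiny ratio, well under 0.05

    # os.urandom stands in for ciphertext, which looks random.
    fake_ciphertext = os.urandom(len(plaintext))
    recompressed = zlib.compress(fake_ciphertext)
    print(len(recompressed) / len(fake_ciphertext))  # about 1.0: no gain
    ```

    So a filesystem-level compressor layered on top of an already-encrypted volume buys nothing; compression has to happen before (or inside) the encryption layer.
    
    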
  • by mrraven ( 129238 ) on Thursday October 30, 2008 @11:00AM (#25569603)

    I hope you don't also consider yourself to be some kind of capitalist. Excessive security that reduces your efficiency is not going to help your business compared to a competitor with more flexible security policies that lock up the actual sensitive data while letting those working with non-sensitive data run at maximum computational efficiency. Yours is the attitude of a Soviet bureaucrat: policy consistency trumps actual on-the-ground working conditions. It will only be to your loss when your competitor gets its computational work, which poses no security hazard, done more quickly.

  • by STrinity ( 723872 ) on Thursday October 30, 2008 @11:16AM (#25569911) Homepage

    Guess what happens if the corruption doesn't allow you to run the encryption app's boot loader?

    You pop in the boot/recovery CD the encryption app forced you to create before it'd encrypt the drive? At least that's how it works on TrueCrypt.

  • Re:Repeat after me (Score:5, Insightful)

    by Trails ( 629752 ) on Thursday October 30, 2008 @11:17AM (#25569919)

    RTFA FTW!!!

    The submitter him/herself doesn't work with sensitive info, just other departments do. IT is enforcing an overly broad solution on everyone without considering the downside. I agree with you that sensitive data needs to be secured, but rolling out disk encryption to everyone in a company when only a subset deals with sensitive info is maybe overkill, and the impact on the primary activity of the other departments needs to at least be quantified and considered.

  • Comment removed (Score:3, Insightful)

    by account_deleted ( 4530225 ) on Thursday October 30, 2008 @11:30AM (#25570129)
    Comment removed based on user account deletion
  • by Free the Cowards ( 1280296 ) on Thursday October 30, 2008 @11:40AM (#25570277)

    The problem is when the IT guys think that prohibiting a switch in a conference room to provide connectivity for a few hours falls under the heading of "reasonable".

    Dilbert's concept of "preventer of information services" is more truth than farce. Most IT guys I've encountered are more interested in "keeping the network healthy" than actually letting people get work done. If you have a better way to let six people all connect to the network from that room, then fire away. But "sorry, you can't plug that in, and there's no other way to get what you want" is quite simply the wrong answer.

  • by locofungus ( 179280 ) on Thursday October 30, 2008 @12:14PM (#25570865)

    I was mostly talking about actual, reasonable measures, not measures that are in place because the admins can't figure out something if it doesn't connect to the AD.

    But that's quite possibly the situation the OP is in.

    I've no idea what he is doing but, hypothetically, he's just spent $10m on a set of high end PCs to run his simulations.

    Someone now proposes to reduce his computing capacity by some arbitrary amount because he has to run disk encryption.

    It's perfectly reasonable to have a rule that, by default, every disk must be encrypted. But making that a blanket rule with no exceptions is foolish.

    "I'm sorry. I realize that this is going to hamper your research and may require you to buy additional hardware to maintain performance but as you're processing medical X-rays that can, potentially, be linked back to a patient, you're going to have to use disk encryption on your compute farm because there are potentially too many people who have physical access to the machines and there's too much risk of someone walking off with one or two machines together with their disks."

    "Ah, I see. Although You're processing medical X-rays, your compute farm is in the controlled server room. We already have processes in place to ensure that disks etc are not removed without being securely wiped. Yes, I think we can allow an exemption for those machines."

    "Ah, yes. You're analysing the genome of protozoa[1]. Yes, we'll give an exemption for your compute farm. This exemption will only apply until your current research is complete. We'll reassess whether the exception is still be appropriate when your next project is in planning."

    [1] I looked up tetrahymena - I'd never heard of it before

    Tim.

  • by Anonymous Coward on Thursday October 30, 2008 @12:29PM (#25571115)

    Encrypted partitions are excellent for their intended purpose: safeguarding the confidentiality of sensitive data. But an "across the board" policy of encrypting every HD in a whole shop is simply nuts. In addition to the performance problems with intensive computation requiring constant read/write, the way to recover from file system corruption on an encrypted partition is very often "Kiss your data goodbye, it is gone forever, period." So, add the cost of a major upgrade to your offsite backup process, in money, network bandwidth, and processing overhead, to whatever the PGP license costs /and/ the vastly increased risk of major data loss.

  • Re:Repeat after me (Score:4, Insightful)

    by lionchild ( 581331 ) on Thursday October 30, 2008 @01:24PM (#25572095) Journal

    After you have done your analysis as to how much productivity is lost, be -certain- to equate that to a dollar figure, so it can be extrapolated over the quarter and over the year. Nothing will make or break a project more than being able to assign a hard-dollar figure to it.

    If it takes you an additional hour a week to perform tasks, and your value is $100/hour, then you effectively cost an additional $5,200 a year in lost productivity. Multiply that by the number of users in your lab. Managers understand cost and budget impact more than passionate resistance.
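    That back-of-the-envelope math generalizes easily; a tiny sketch (the rate, hours, and head count are of course placeholders for your own numbers):

    ```python
    def lost_productivity_per_year(hours_per_week, hourly_rate, users=1, weeks=52):
        """Annual dollar cost of overhead hours across a group of users."""
        return hours_per_week * hourly_rate * weeks * users

    # One extra hour a week at $100/hour, as in the example above.
    print(lost_productivity_per_year(1, 100))             # 5200
    # The same overhead across a ten-person lab.
    print(lost_productivity_per_year(1, 100, users=10))   # 52000
    ```
    
    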

    Good luck!

  • Why whole disk? (Score:3, Insightful)

    by sherriw ( 794536 ) on Thursday October 30, 2008 @01:40PM (#25572351)

    Considering that much of your hard drive consists of non-private data, like the operating system, program install files, and spurious user junk, why bother encrypting the whole disk? Why does anyone bother? Just have an encrypted directory, partition, or even a second small hard drive and save all super-top-secret files in there.
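    On Linux, one common way to do exactly that is a LUKS/dm-crypt partition for just the sensitive files. A rough sketch follows; the device name and mount point are placeholders, the commands need root, and luksFormat destroys whatever is on the target partition.

    ```shell
    # Format a spare partition (placeholder /dev/sdb1) as a LUKS container.
    cryptsetup luksFormat /dev/sdb1

    # Unlock it; the decrypted block device appears as /dev/mapper/secret.
    cryptsetup open /dev/sdb1 secret

    # Put a filesystem on it and mount; only files placed here are encrypted.
    mkfs.ext4 /dev/mapper/secret
    mount /dev/mapper/secret /mnt/secret

    # When finished, unmount and relock the container.
    umount /mnt/secret
    cryptsetup close secret
    ```

    Everything else on the machine (OS, applications, scratch data) then runs at full, unencrypted disk speed.
    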

  • by this great guy ( 922511 ) on Thursday October 30, 2008 @09:12PM (#25578117)

    Your datapoint is irrelevant. Because today's processors are much faster and because cipher implementations have been improved, it is now much less costly to encrypt data. Here are some "openssl speed" benchmarks of the RC4 symmetric cipher on a current processor and on one released 8 years ago, with a version of OpenSSL almost as old:

    • 32-bit OpenSSL 0.9.8e (Feb 2007), quad-core 1.9 GHz Opteron 2347: 1024 MByte/s (256 MB/s/core)
    • 32-bit OpenSSL 0.9.6e (Jul 2002), single-core 1 GHz Athlon: 60 MByte/s

    This is a 17x improvement in performance! Run the quad-core processor in 64-bit mode and it would probably be 20x faster. By comparison, disk throughput has increased by only about 2x over the last 8 years (50 MB/s vs. 100 MB/s). So run the same test today, but replace your 8 Xeon 800 MHz processors with 8 quad-cores (keeping the 12 disks), and you should see almost no speed decrease caused by a well-designed disk encryption app (I can vouch for dm-crypt).
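    The ratios quoted above are easy to check from the poster's own benchmark figures (the MB/s numbers below are theirs, not independently measured):

    ```python
    # Poster's figures: RC4 throughput in MB/s on each machine.
    opteron_2347_mb_s = 1024   # quad-core 1.9 GHz, OpenSSL 0.9.8e
    athlon_1ghz_mb_s = 60      # single-core 1 GHz, OpenSSL 0.9.6e

    cipher_speedup = opteron_2347_mb_s / athlon_1ghz_mb_s
    print(round(cipher_speedup))   # 17

    # Disk throughput over the same period grew far less.
    disk_speedup = 100 / 50
    print(disk_speedup)            # 2.0
    ```

    Since cipher throughput has outpaced disk throughput by roughly 8x, the CPU cost of encryption matters much less relative to I/O than it did on 2000-era hardware.
    
    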

  • Re:Repeat after me (Score:2, Insightful)

    by neonKow ( 1239288 ) on Thursday October 30, 2008 @09:17PM (#25578149) Journal
    So the best solution is to encrypt every drive on the campus? You can have security policies that are more specific than "PGP on every machine" vs "no disk encryption at all."
