Encryption Security

Resisting the PGP Whole Disk Encryption Craze 480

Posted by samzenpus
from the what-do-you-think dept.
alaederach writes "I run a lab in a non-profit academic life sciences research institute. Our IT recently decided it would be a good idea to use PGP whole disk encryption on all of our computers, laptops and servers and picked PGP's suite of software. The main reason is that a small subset of our researchers work with patient information which we obviously are mandated to keep confidential. My lab does a lot of high-performance computational work (on genes from Tetrahymena, no humans here) and I am concerned that the overhead of complying with our IT's new security policy will be quite detrimental to my research program. For example, dynamically reallocating a partition on a PGP-encrypted disk is apparently not possible. Furthermore, there is some evidence that certain forms of compression are also incompatible with PGP whole disk encryption. Interestingly, it is hard to find any negative articles on PGP, probably because most of them are written by IT pros who are focused only on the security, and not usability. I therefore ask the Slashdot community, what are the disadvantages of PGP in terms of performance, Linux, and high-performance computational research?"


  • Overhead (Score:5, Interesting)

    by Anonymous Coward on Thursday October 30, 2008 @05:04AM (#25566447)

    Truecrypt Whole Disk Encryption has less than 1% overhead. I can't see the problem. Surely the patent and IP information security outweighs this minimal overhead.

    • Repeat after me (Score:5, Interesting)

      by MosesJones (55544) on Thursday October 30, 2008 @05:43AM (#25566657) Homepage

      "Marketing is not a science even if it's an Open Source project"

      Run some tests on a drive. Run TrueCrypt, re-run the tests, look at the difference in CPU load and performance, and then try to work out where the 1% number comes from.

      Personally I think it's based on averaging across time when you aren't using the machine.

      • by Anonymous Coward on Thursday October 30, 2008 @07:37AM (#25567201)

        "Marketing is not a science even if it's an Open Source project"

      • Re:Repeat after me (Score:5, Informative)

        by Trails (629752) on Thursday October 30, 2008 @09:17AM (#25567907)

        Parent is on the right track, imo. Submitter should work with the IT dept to assess the impact of this.

        Set up two machines running the same processing task, actual work that he does, one with encryption and one without. Compare the difference in processing. If the performance loss is acceptable, all done. If it's not acceptable, submitter needs to start agitating now that this will seriously hamper his/her ability to do the job, and push IT to come up with a different solution.

        A previous employer rolled this out, and after my work productivity got killed, I found their assessment consisted of two guys opening MS Word, making some edits, saving, and exiting Word.
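The two-machine comparison suggested above can be scripted so the encrypted and unencrypted volumes see an identical workload. A minimal sketch in Python; the mount points in the comments are hypothetical placeholders, and a real assessment should use the lab's actual compute job rather than this synthetic write/read pass:

```python
import os
import tempfile
import time

def timed_write_read(dirpath, file_mb=64, runs=3):
    """Average seconds to write, fsync, and read back file_mb MiB in dirpath."""
    payload = os.urandom(1024 * 1024)  # 1 MiB of incompressible data
    total = 0.0
    for _ in range(runs):
        fd, path = tempfile.mkstemp(dir=dirpath)
        start = time.perf_counter()
        with os.fdopen(fd, "wb") as f:
            for _ in range(file_mb):
                f.write(payload)
            f.flush()
            os.fsync(f.fileno())  # force the data through the (possibly encrypted) block layer
        with open(path, "rb") as f:
            while f.read(1024 * 1024):
                pass
        total += time.perf_counter() - start
        os.remove(path)
    return total / runs

# Hypothetical mount points, one plain and one on the encrypted volume:
# plain = timed_write_read("/scratch/plain")
# crypt = timed_write_read("/scratch/encrypted")
# print(f"slowdown: {crypt / plain:.2f}x")
```

Running the same function against both mount points gives a single slowdown ratio that can go straight into the report to IT.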

        • Re: (Score:3, Insightful)

          by MikeURL (890801)
          I don't need to read any further. This is the most efficient and demonstrably accurate course that the researcher could use. I know asking /. is fun but really, in this case all you need is a controlled experiment where one variable is changed. It isn't that difficult.
        • Re:Repeat after me (Score:4, Insightful)

          by lionchild (581331) on Thursday October 30, 2008 @01:24PM (#25572095) Journal

          After you have done your analysis as to how much productivity is lost, be -certain- to equate that to a dollar figure, so it can be extrapolated over the quarter and over the year. Nothing will make or break a project more than being able to assign a hard-dollar figure to it.

          If it takes you an additional hour a week to perform tasks, and your value is $100/hour, then you effectively cost an additional $5,200 a year in lost productivity. Multiply that by the number of users in your lab. Managers understand cost and budget impact more than passionate resistance.

          Good luck!
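The arithmetic above generalizes to a one-liner; a sketch (the rates and head-counts are whatever your lab's numbers actually are):

```python
def annual_cost_of_slowdown(hours_lost_per_week, hourly_rate, users, weeks=52):
    """Dollar cost of lost productivity, extrapolated over a year."""
    return hours_lost_per_week * hourly_rate * weeks * users

# One extra hour a week at $100/hour, as in the example above:
print(annual_cost_of_slowdown(1, 100, 1))   # 5200
print(annual_cost_of_slowdown(1, 100, 10))  # 52000 for a ten-person lab
```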

    • by wanax (46819) on Thursday October 30, 2008 @06:26AM (#25566883)

      The submitter is in a research institute. Some labs in that institute have patient data, and therefore require significant security like disk encryption.

      His lab works with a protozoa, and has massive computational requirements. There will never be any patient data near his lab, because the people who work with patients are in a different lab (think different department in business). They do not need disk encryption.

      You say TrueCrypt has "1% overhead"; PGP presumably has some other "% overhead." The submitter is asking what the details of that overhead are for PGP, TrueCrypt, etc. What's the CPU usage, memory usage? Are disk performance penalties constant, or are they dependent on average file size, number of files, format of those files, etc.? "1% overhead" may hide whopping huge performance penalties for specialist users.

      • by kefa (640985) on Thursday October 30, 2008 @07:40AM (#25567211) Journal

        His lab works with a protozoa, and has massive computational requirements. There will never be any patient data near his lab...

        Crikey Alaederach! Get that encryption software installed pronto. Your personal details are already being leaked on to the web!

      • by Lumpy (12016) on Thursday October 30, 2008 @07:54AM (#25567295) Homepage

        I can tell you that when we ran a PGP-encrypted disk partition on a 12-disk RAID 50 I had MAJOR performance losses compared to a standard RAID 50. This was on older hardware (I had tested it on an 8-processor Xeon PIII 800 system with only 4 gig of RAM installed), but it had a significant impact on data transfer rate.

        and yes I like re-purposing the Killer SQL servers of yester-year into a "Holy CRAP THAT's YOUR NAS??!??!"

        The hit was NOT on the drives, it was on the processors. It was enough of a hit to slow down data transfer rate out the GB connection to be as slow as a consumer single disk NAS.

        • by flappinbooger (574405) on Thursday October 30, 2008 @08:22AM (#25567443) Homepage
          The solution for the story submitter is simple, then.

          Run an analysis on the performance hit, document it, make a report and give the report to the persons who want the analysis done, and also the persons who pay the bills. (They might be different people).

          The report has a summary that says: I must install this software to comply with policy. I will then be accomplishing my work at only X% of the speed I was before. If that is not ok, then I will need to spend $Y to upgrade the equipment in order to maintain the previous rate of work. End of story. If they deny the upgrades then... that's their decision. If they approve the upgrades - hey, new equipment!

          The only potential problem I see is this: If the submitter has his own budget, IE he pays the bills, yet still must both maintain rate AND comply with the encryption policies... Hmmmm, well, not so easy. Then there needs to be a report that says his lab won't ever see patient data, with proof. Assuming the budget isn't there.
          • Re: (Score:3, Interesting)

            by twostar (675002)
            Don't forget to calculate the increased personnel time required to meet current output rates. This is often many times more $$$ than what some hardware upgrade costs. Then if they're ok with decreased productivity, and therefore higher man-hours, then it's purely a management decision. You can also point to this report at a performance review as a reason goals were not met.
        • by Agar (105254) on Thursday October 30, 2008 @02:21PM (#25572993)

          Did you know that PGP WDE isn't officially supported on RAID configurations? I think it says a lot that the product worked in your environment, but a 12-disk RAID 50 configuration isn't exactly the sweet spot for a product targeted at laptop users.

          No surprise that performance would be poor given that WDE is neither tested nor optimized for that use case. ...yes, I work for PGP.

      • by morgan_greywolf (835522) on Thursday October 30, 2008 @08:45AM (#25567567) Homepage Journal

        The submitter is in a research institute. Some labs in that institute have patient data, and therefore require significant security like disk encryption.

        Repeat after me: "The first line of security is physical."

        If the servers are locked in a room with limited access (like, oh, say, 95+% of servers in the corporate world), then whole-disk encryption is probably not needed.

        Data security is about securing the data using reasonable compensating controls. If no one can get to the disks, and those who can comprise a limited list of, say, trusted sysadmins, then it doesn't matter whether they're encrypted or not.

        Requirements, if properly written, never specify implementation details -- the means. They only specify what is needed. How that is achieved is irrelevant so long as the requirement is achieved completely.

        So other than for devices that are not in an access-controlled environment (like laptops or, in some cases, workstations), the need for whole disk encryption at most places is nil.

        • Especially when we're talking about security.

          If your requirement is "I need my disks to be encrypted", for example, and your requirements go no further than that, then you may find your vendor of choice decides to encrypt your disk by XORing it with "TheSuperSecretPasskey". Technically encrypted, but not very useful.

          Think that's unlikely? That's how some eBook DRM schemes work.

          In the US and Canada, there's something called FIPS 140-2 [wikipedia.org] which describes in painful detail exactly what constitutes a "secure system".

        • Re: (Score:3, Insightful)

          by Mr. Slippery (47854)

          If the servers are locked in a room with limited access (like, oh, say, 95+% of servers in the corporate world), then whole-disk encryption is probably not needed.

          It has limited access - until a larger drive needs to be installed, and the old one ends up in the spare parts bin and eventually gets sold as surplus, and somebody gets it home and finds your medical records on it.

          Or, the server is in a locked room with limited access - but the DVD-Rs with the backups get lost on the way to the off-site storage facility.

          Confident

      • by mysidia (191772) on Thursday October 30, 2008 @08:53AM (#25567631)

        I think the strategy should be to perform some speed comparison tests, to see if your research can be done with full disk crypto. Set up some VMware or other virtual machines, and a test physical server: plug in a spare hard drive, install a fresh OS, and test some virtual machines _with_ and without full disk encryption (on both host and VM). Then tell them that full disk encryption is slow if it reduces the effectiveness of the disk cache, wastes memory, or bogs down the CPU of machines that need to be used at 100%, and that better hardware is needed to run full disk encryption.

        You're in research, and such a major change to your environment deserves to be looked at a little before you implement it...

        I suspect with full disk crypto on your hardware backing the virtual disk, VM I/O performance will tank.

        Show them nice graphs of research computing productivity on the same equipment WITHOUT full disk encryption, and WITH it.

        Use "full disk encryption" policy as immediate justification for additional better hardware to compensate for the fact that the encryption is parasitic.

        And note the migration costs and loss of research time that results in having to make such drastic changes.

        Once you show the extra cost involved, perhaps they'll rethink the full-disk encryption blanket policy.

        Just make sure the cost you show is high... (much higher than any imagined savings through simplified policy and assured security)

        If you can't so much as justify a position against it, then why is PGP such a problem? If it doesn't hurt you... it certainly makes your research more secure from being stolen.

        1% overhead is still a hit if you are using your equipment 100%.

        But actually, I don't believe for a second that TrueCrypt or PGP is limited to merely 1% overhead; the figure is deceptive in that running disk encryption has effects other than measurable disk I/O slowdown.

        There is also the CPU usage of the encryption itself, plus memory overhead that reduces page-cache effectiveness.

        I.e., the heavy cost of encryption must now in all likelihood be paid before data can be written to the page cache. This reduces system throughput.

        You may measure simple operations as only impacted by 1%, but in reality, there are certain write patterns that this will hurt severely.

        Just plain SELinux has overhead in excess of 10%.

        I would expect full-disk encryption overhead of 30% or higher.

        It may be difficult to measure its true overhead if you don't fully use your hardware.
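The point running through this comment, that per-block crypto CPU cost caps throughput even when raw disk numbers look fine, can be illustrated with a toy measurement. Python's standard library has no AES, so this sketch uses SHA-256 as a stand-in for per-block cipher work (an assumption for illustration, not a model of PGP's actual cipher):

```python
import hashlib
import os
import time

def copy_mb_per_s(per_block_work, data_mb=16):
    """MB/s of a loop where each 1 MiB block first passes through per_block_work
    (a stand-in for the cipher on the write path)."""
    block = os.urandom(1024 * 1024)
    start = time.perf_counter()
    for _ in range(data_mb):
        per_block_work(block)
    return data_mb / (time.perf_counter() - start)

# Identity = unencrypted write path; SHA-256 = stand-in per-block CPU cost.
plain = copy_mb_per_s(lambda b: b)
worked = copy_mb_per_s(lambda b: hashlib.sha256(b).digest())
print(f"no per-block work: {plain:.0f} MB/s; with per-block work: {worked:.0f} MB/s")
```

The gap between the two numbers is pure CPU cost, which is exactly the component a "1% disk overhead" figure hides on a machine that is already pegged at 100%.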

    • Re:Overhead (Score:5, Interesting)

      by stranger_to_himself (1132241) on Thursday October 30, 2008 @06:26AM (#25566885) Journal

      Truecrypt Whole Disk Encryption has less than 1% overhead. I can't see the problem. Surely the patent and IP information security outweighs this minimal overhead.

      I work in a similar environment and we use TrueCrypt when transferring between labs and for data collection. For all other purposes we don't encrypt at all. What we do is keep medical information on a secure network but stored with no personal identifiers, only a study id. The personal data, as far as we need it, is kept in a separate location on a machine that is not networked and is physically protected so that only the study admin team can use it (i.e. the same level of security as the paper records). The medical records and the personal identifiers do not usually need to be kept together for research purposes.
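The split described here, de-identified records on the networked systems and the identifier key map on the isolated machine, can be sketched in a few lines of Python; the field names are invented for illustration:

```python
import secrets

def pseudonymize(records):
    """Split records into a de-identified study table and a separate key map.

    Only the study table goes on the networked research systems; the key map
    stays on the isolated, physically secured machine.
    """
    key_map, study_table = {}, []
    for rec in records:
        study_id = secrets.token_hex(8)  # random id links the two tables
        key_map[study_id] = {"name": rec["name"], "dob": rec["dob"]}
        study_table.append({"study_id": study_id, "genotype": rec["genotype"]})
    return study_table, key_map

patients = [{"name": "J. Doe", "dob": "1970-01-01", "genotype": "TT"}]
table, keys = pseudonymize(patients)
print(table[0].keys())  # only study_id and genotype leave the secure room
```

Since the networked copy carries no identifiers, the heavy compute side never handles data that would trigger the encryption mandate.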

    • Re: (Score:3, Informative)

      by N Monkey (313423)

      Truecrypt Whole Disk Encryption has less than 1% overhead. I can't see the problem. Surely the patent and IP information security outweighs this minimal overhead.

      That's what we got told when our laptops were "whole disk encrypted" with a competing product.... but it now means that a Windows hibernate and restore takes on the order of several minutes(!) rather than tens of seconds.**

      I have not experienced PGP so maybe it has a much more efficient system, but I have my doubts.

      **Yes I know that MS make it impossible for these systems (apart from their own 8-|) to guarantee security of the hibernate file but I can't see how that would affect the performance.

    • Re:Overhead (Score:5, Informative)

      by wireloose (759042) on Thursday October 30, 2008 @07:54AM (#25567299)

      The patient information is a pretty serious concern. Any breach or loss of data covered under HIPAA, SOX, FERPA, or Privacy Act can result in some pretty severe expenses. The cost of notification to the individuals whose data was lost or exposed can run to more than $1,500 per individual, depending on the size of the breach. Base expenses start at $1-2M and go up fast. Litigation and fines can cost millions more. Anything that gets hacked or breached, that has information that should be protected, could put a company these days on the wrong side of the balance sheet.

  • by WK2 (1072560) on Thursday October 30, 2008 @05:04AM (#25566449) Homepage

    Whole disk encryption is excellent for security, but it will bog you down in disk access times. It depends on a lot of things, but reading and writing files can slow down by up to 50%, though usually the slow-down is much less. If you are doing something that involves a lot of disk access and it doesn't need to be encrypted, then create a special, non-encrypted partition for that.

    • Re: (Score:3, Interesting)

      by QuantumG (50515) *

      Do you have any numbers to back this up or are you just repeating common knowledge from decades ago?

      TrueCrypt claim a 1% overhead. With multi-processor machines, I doubt that's even accurate anymore.

      • by imsabbel (611519) on Thursday October 30, 2008 @05:35AM (#25566597)

        Sorry, they "claim" that.

        But on my Core 2 2.4 GHz machine, Windows boot time more than doubled after encrypting the system partition.

        Yeah, I can get 100 Mbyte/s linear reads and writes.
        But for some reason, random or semi-random access gets hosed quite a bit.
        Maybe it messes with the command queueing, or the internal prefetch algorithms, I don't know. Never had a problem on data partitions, but the performance impact on the system drive was enormous (up to the point that even with 6 Gbyte of RAM, it wasn't fun anymore).

        Ah, and I forgot one thing: the 100 Mbyte/s is nearly 100% CPU load on both cores. I don't know where you get 1% overhead from... Even the in-memory benchmark only gets about 150 Mbyte/s under full load on two cores.

        • by TheRaven64 (641858) on Thursday October 30, 2008 @08:16AM (#25567399) Journal

          It depends a lot on what you're doing with the data. If you've got a single-threaded process that's consuming 50MB/s and you can read 100MB/s from the disk and run 100MB/s decrypts on the other core, you won't notice the speed difference. If you're doing random access then you will have, say, a 9ms seek time to get the data and then a few more ms to decrypt it. If your process is already I/O bound (many scientific computing tasks are) then a 9ms decrypt per block will halve the speed of your computation.
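The random-access arithmetic above can be written out as a small model. This sketch uses the same assumed numbers (9 ms seek, 100 MB/s media) plus a hypothetical fixed per-block decrypt latency; real decrypt cost varies with CPU and cipher:

```python
def effective_throughput(block_kb, seek_ms, transfer_mb_s, decrypt_ms):
    """Effective MB/s for random reads: seek + transfer + per-block decrypt."""
    block_mb = block_kb / 1024
    transfer_ms = block_mb / transfer_mb_s * 1000
    total_ms = seek_ms + transfer_ms + decrypt_ms
    return block_mb / (total_ms / 1000)

# 64 KiB random reads, 9 ms seek, 100 MB/s media:
print(round(effective_throughput(64, 9.0, 100, 0), 1))    # 6.5 MB/s, no decryption
print(round(effective_throughput(64, 9.0, 100, 9.0), 1))  # 3.4 MB/s, roughly halved
```

The model makes the comment's point concrete: once per-block decrypt latency rivals seek time, an I/O-bound job runs at about half speed regardless of how good the sequential numbers look.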

          The correct solution for this lab seems to be to borrow a policy from most defence-related sites. Have a secure and an insecure network. The secure network is allowed to access confidential data, the insecure network isn't. Run encryption on the machines on the secure network; don't bother with it on the insecure machines. If one of the insecure machines is compromised or stolen then nothing confidential is lost.

      • Do you have any numbers to back this up?

        Here's some numbers: http://ask.slashdot.org/comments.pl?sid=1012285&cid=25566509 [slashdot.org]

        Make of them what you will :)

      • Re: (Score:3, Interesting)

        by GoulDuck (626950)

        TrueCrypt claim a 1% overhead. With multi-processor machines, I doubt that's even accurate anymore.

        Yeah - with version 6 of TrueCrypt, they introduced support for multiple cores, with almost double the speed on a dual-core system over a single-core system.

        I use a TrueCrypt encrypted USB disk to store and run VMWare virtual machines and I see no difference in speed over using a non-encrypted USB disk (same model).

        • by IWannaBeAnAC (653701) on Thursday October 30, 2008 @05:57AM (#25566713)

          That is interesting - if the overhead was really 1%, then why even bother with optimizations for multi cores?

          The other thing I cannot understand is why anyone would want to run whole-disk encryption on a compute server. Even the US DoD machines that are used for classified research do not do this!

          • by KStrike155 (1242390) on Thursday October 30, 2008 @06:26AM (#25566877)
            I work with the DoD on a classified program. You're right, we don't use encryption on any of our desktops, but the only reason is because you go through 2 security gates with guards, then finally enter a closed room with a giant digital lock with a badge swipe and keypad on the door, not to mention a giant separately digitally controlled deadbolt in addition to the digital lock.

            You better bet your ass that we use whole-disk encryption on any machine that would leave the building, though (such as laptops). And those are unclassified!
            • Re: (Score:3, Interesting)

              by IWannaBeAnAC (653701)
              I wasn't even talking about desktops though, I was talking about compute servers! I have used a few clusters at LANL, and yeah they have separate classified and unclassified machines (or sometimes, sections of machines) that are partitioned off for classified work, but even the classified part never (as far as I know) uses whole-disk encryption. The original question specifically said that they were intending to encrypt their servers as well.
    • by calmofthestorm (1344385) on Thursday October 30, 2008 @05:43AM (#25566665)

      The numbers on my machine are about 20% slower read and 30% slower write. I'm using 256 bit LUKS with serpent-xts-essiv:sha256.

      Might I also suggest hardware encryption? Seagate (and others, I believe) make drives that do AES-128 (good enough for this sort of thing, I believe) in hardware. Zero performance hit. No software required. Set a drive password and go.

      • by nmg196 (184961) on Thursday October 30, 2008 @06:01AM (#25566733)

        I'm not sure that assuming something happens in zero time (or even near-zero time) just because it's done in hardware is at all accurate. A review I read of a different encrypted drive said it was 5-10% slower than its non-encrypted equivalent. It wasn't the Seagate you're talking about, but I doubt that even hardware encryption can do it instantly, so I think your "zero" is an exaggeration.

        • by xouumalperxe (815707) on Thursday October 30, 2008 @06:10AM (#25566773)

          Presumably, he meant that encryption done on the disk itself is transparent to the rest of the computer. What you see is a comparatively slow hard drive, not the existing resources (i.e., CPU) being eaten up by the encryption job and low disk throughput. Same as with all other dedicated controllers: you're offloading processing to a dedicated chip, so, for the purpose of generic programs on the CPU, you can assume there's no performance hit.

        • by calmofthestorm (1344385) on Thursday October 30, 2008 @06:10AM (#25566775)

          It may incur overhead but it need not. Consider that you don't need "instant" encryption; you simply need a device inside the hard drive, between the computer interface and the actual storage medium, that is capable of encrypting and decrypting at or above the drive's maximum throughput speed. This need not be "instant"; it merely needs to be fast enough, block by block, to pass the data along. Consider that hard disks store data in blocks, not streams.
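The design point in the exchange above reduces to a one-line model: a self-encrypting drive shows zero hit only if its crypto engine runs at or above media speed; otherwise the engine is the bottleneck. A sketch with made-up rates:

```python
def drive_throughput(media_mb_s, crypto_engine_mb_s=None):
    """Observed throughput of a (possibly self-encrypting) drive.

    If the on-disk crypto engine keeps pace with the platters, the host sees
    no slowdown; otherwise the engine caps the rate.
    """
    if crypto_engine_mb_s is None:  # no encryption at all
        return media_mb_s
    return min(media_mb_s, crypto_engine_mb_s)

print(drive_throughput(100))       # 100: plain drive
print(drive_throughput(100, 150))  # 100: engine outruns the media, zero hit
print(drive_throughput(100, 60))   # 60: underprovisioned engine shows up as a slow disk
```

This reconciles the two comments: "zero hit" is true when the engine is fast enough, and the 5-10% slowdowns reviewers measured are simply drives where it isn't.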

      • Re: (Score:3, Interesting)

        by bhima (46039) *

        The FBI has already demonstrated that it is extremely easy to bypass the security on those drives. I would not use them.

        • "those drives"? as if they're all the same?

          Come come. Software encryption is trivially vulnerable to a coldboot attack.

          In any case I'd want to see a link, you can't lump all "encrypting drives" together as if they use the same method (unless they do).

          Of course, I wouldn't really be surprised if it were breakable, I'd simply like to see support. Me, I use hard and soft because I'm just tinfoil like that. Don't even have anything to hide either, just like my privacy.

          • Re: (Score:3, Interesting)

            by gr8dude (832945)

            A coldboot attack is trivial on paper and in a controlled environment, but not in real-world scenarios.

            First you need to get hold of an unattended machine that is running with its disks mounted. You can minimize the probability of that by enforcing certain policies, such as never leaving the machine unsupervised, closing off access to the computer's case or even locking the case, etc.

            Trust me, cold boot attacks are not the greatest concern.

    • I used to have my laptop hard disk encrypted (using LUKS) but the hardware is getting pretty old now and I was starting to have problems with timing-sensitive applications such as audio and video. I think it was more bad timing interaction between the crypto layer, LVM, ext3 and the memory cache than raw throughput issues. I had a lot of layers and they weren't quite talking to each other right. Most of the time this was fine but occasionally it would add a tiny bit of latency to a disk request and audio wo

  • by mlts (1038732) * on Thursday October 30, 2008 @05:06AM (#25566453)

    Perhaps one answer for storing data securely, while allowing it to expand dynamically, is to create a dynamically expanding PGPDisk. Then the data can be kept safe, but the file can be moved to bigger RAID arrays if need be.

  • by Pikiwedia.net (1392595) on Thursday October 30, 2008 @05:07AM (#25566457) Homepage
    An IT policy is a general rule which has to be interpreted and adopted. It's not supposed to be followed by the letter. Ask your IT department what they want to accomplish with the policy, and how you can help them accomplish that without having your work ruined.
    • by Smertrios (550184) on Thursday October 30, 2008 @07:01AM (#25567045)
      I hate to disagree, but I have to. IT policy is a law that must be followed. The problem here is that the people creating the law see only the end goal and not the road that needs to be traveled. Talk to them and show them what is required of you during the research. Tell them that other ways of achieving the goal need to be looked at before this is implemented. More harm is done and time lost by people trying to circumvent the policies than by sitting down with them, walking through the procedures that are done, and stating why a different method is needed.
    • by whoppo (218875) * on Thursday October 30, 2008 @07:31AM (#25567179)

      I'm with Smertrios on this one.. IT policy is just that.. a corporate policy. It's not subject to end-user interpretation, it's a definition of how IT resources are to be deployed and utilized. The written policy itself is what gives the company the "teeth" to discipline employees who choose to make their own interpretations and NOT comply.

      Now back on topic: Whole disk encryption? For removable / transportable media, ABSOLUTELY! For enterprise data backups, ABSOLUTELY! For live data on active servers, meh.. not as critical. If your data center employs appropriate physical, network and host security, your data is reasonably safe. If someone compromises your network -> system security, they've got your data.. encrypted or not. It's wonderful that your IT department has the desire to achieve the highest level of security possible, but there is always a balance that needs to be struck between the holy grail of ultimate security and the ability to do business. The OP needs to help everyone find that balance. A good place to start would be his local neighborhood HIPAA expert, to make sure that no "business needs" prevent the company from maintaining regulatory compliance. Once the specific requirements for his continued compliance have been identified, then anything beyond that becomes somewhat negotiable.

    • by yttrstein (891553) on Thursday October 30, 2008 @07:55AM (#25567303) Homepage
      I'm in agreement with Smertrios as well. It's easy to see why such a blanketing policy is necessary--have you ever worked with scientists? While possibly quite brilliant, most of them seem to have the same problem remembering to keep sensitive data encrypted. The only logical solution to this is to write a policy which requires everything to be encrypted.

      Sounds to me like the IT department in question knows what it's doing, and who its clients are. It's rarely mentioned outside an IT department, but I'll share one of the big secrets: 98% of the job of any IT department is to protect users from their own stupidity. The smartest users are the ones who realize this and give the IT department enough space to operate, while at the same time learning as much as they can about what they do, so they have a real understanding of how to specifically follow the rules while still getting everything done.

      It's not impossible at all.
  • by sugarmotor (621907) on Thursday October 30, 2008 @05:10AM (#25566465) Homepage

    Surely what is required is to isolate the sensitive information, so that it can be protected.

    Blanket encryption may impress some people, but it hardly solves the problem.

    Details of how to implement isolation and protection would depend on the data, and which subsets are used in the calculations.

    Stephan

    • by Anonymous Coward on Thursday October 30, 2008 @05:29AM (#25566559)

      You really want blanket encryption because you have to worry about such things as swap space, scratch copies made and then deleted, and people forgetting to encrypt files.
      If the encryption is done at the block device level (such as dm-crypt on Linux), the impact on how things work is minimal, the overhead is low, and you are fairly well protected (unless the machine is accessed while powered up by someone who wants the data, as opposed to just the machine).
      Fedora can make all partitions except /boot encrypted during install.

      • by postbigbang (761081) on Thursday October 30, 2008 @05:47AM (#25566681)

        I second that.

        If you're looking for an excuse not to protect the data, that's one thing. But TrueCrypt has lots of support and does a good job. PGP in general is well-known and has been refined frequently. That's the reason you don't find a lot of negative criticism -- there isn't any, because it works fairly seamlessly. You'll find hard disk controllers don't help the process much, but if the machine does work in batches, and you back up frequently (presuming you're backing up an encrypted partition), and you use a UPS (or your controller supports battery-backed write cache), you can use various write-caching driver options and techniques to boost performance dramatically. What write caching *can* also do is cause transactional integrity problems if there's a machine hiccup. Otherwise, writes are queued up and get batched onto disk. Performance can be 10x, so long as you understand the potential evils involved. It takes the sting out of the disk I/O degradation, but how much will vary with the duty cycle of your application's I/O profile.

    • by jonaskoelker (922170) <jonaskoelker@gnu. o r g> on Thursday October 30, 2008 @05:35AM (#25566595) Homepage

      Surely what is required is to isolate the sensitive information, so that it can be protected.

      That's a great idea that in practice will leak your information. The reason is that _every_ application that touches your data needs to know that it should keep your data confidential.

      Browsers know not to cache data transferred over https. The browser knows the data was encrypted; it knows to be smart with it [for "protective" values of smart].

      When you have a program that reads a file through a transparent layer of encryption, it never sees the "please-be-careful-with-this" label, and so the desktop search engine will index all the strings, the editor will write backups to . or /tmp, and so forth. All the apps think they need to do is respect what you meant by your mode bits (if you're on *nix), so it'll chmod/umask the /tmp copy the right way. If someone grabs your disk and you didn't encrypt /tmp, you lose.

      And no, encrypting /tmp won't fix it: you need to know that everything the user of the data can write to is encrypted if you want to be sure. I only know one way that I can somewhat confidently say solves the problem: encrypt everything. [and then there's the network, but we'll save that for another decade ;)]

      Only encrypting the sensitive data is like carrying water in a bucket used for target practice: stuff will leak.
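The leak described above can be pictured as a toy audit: given a list of paths an application touched (hypothetical paths here, invented for illustration), anything outside the encrypted mount points is a potential plaintext copy:

```python
import os

def files_outside(encrypted_mounts, touched_paths):
    """Return touched paths that fall outside the encrypted mount points."""
    safe_prefixes = tuple(os.path.join(m, "") for m in encrypted_mounts)
    return [p for p in touched_paths if not p.startswith(safe_prefixes)]

# Hypothetical audit: only /home is encrypted, yet the editor and the
# search indexer wrote scratch copies elsewhere.
touched = ["/home/alice/data.csv", "/tmp/data.csv.swp", "/var/cache/index.db"]
leaks = files_outside(["/home"], touched)
print(leaks)  # ['/tmp/data.csv.swp', '/var/cache/index.db']
```

The only way the list comes back empty for every program a user might run is the comment's conclusion: encrypt everything.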

      • Re: (Score:3, Interesting)

        by mdielmann (514750)

        Surely what is required is to isolate the sensitive information, so that it can be protected.

        Only encrypting the sensitive data is like carrying water in a bucket used for target practice: stuff will leak.

        He said isolate, not encrypt. It changes the context a bit, doesn't it?

      • Re: (Score:3, Interesting)

        by Karellen (104380)

        I agree. I do things the other way around, and have almost everything encrypted.

        I then have a unencrypted disk mounted at /mnt/unencrypted/ with per-user subdirectories (usually symlinked from ~/unencrypted for each user) which can be used for data - usually large files - that are known to be non-sensitive.

        Everything users work on is protected by default. Users need to make a conscious decision to put stuff in the unencrypted partition, so they tend to only do it after they've noticed a performance problem.

    • by calmofthestorm (1344385) on Thursday October 30, 2008 @05:45AM (#25566671)

      Someone will write the passphrase down anyway. Isolate the data.

  • Policy Exception (Score:3, Insightful)

    by Anonymous Coward on Thursday October 30, 2008 @05:10AM (#25566467)

    You've got a good case for an exception from this policy. Just follow the exceptions process and have your management sign off on the risk. Case closed.

    • by jamesh (87723) on Thursday October 30, 2008 @06:03AM (#25566739)

      If there really is a performance loss, and you can quantify it, then you can attack it from another angle, eg an impact statement to management along the lines of "This will introduce a %% performance loss to our workloads, at a cost of $$$. In order to maintain the same level of productivity we will require upgraded hardware at a cost of $$$".

      Having a manager who is concerned about his department's budget on your side can help your case too :)

  • by jonaskoelker (922170) <jonaskoelker@gnu. o r g> on Thursday October 30, 2008 @05:18AM (#25566509) Homepage

    what are the disadvantages of PGP in terms of high-performance computational research?

    O(1) ;)

    Here's a brief experiment I ran: dd if=/dev/zero of=/home/jonas/zeroes bs=1048576 count=1024; that is, writing one gig of zeroes to a disk encrypted with ubuntu's disk encryption from the 8.04 alternative installer.

    I saw a roughly constant ~30% CPU usage from kcryptd, going from 25% to 35%, on a 2.13GHz Pentium M (in a thinkpad t43p). So I have 1.5 GHz worth of cycles left.

    Hard disk write speed was about 30 megs per second, but oscillating in big leaps. I did my observations with conky, sampling at one-second intervals, but conky is known to sometimes merge two samples. That's probably not the only factor: disk writes are most efficient when clumped together into one big (much preferably sequential) write, so I'd assume the kernel does this.

    You haven't told us what your disk usage patterns are. But if you're doing one big read, one big computation, and then one big write, there's going to be zero impact (almost): there was lots of CPU capacity left.

    Another low-impact scenario is a server that reads work units from disk, hands them to clients, gets results, and writes the results back [I assume clients don't need any disk activity]. There you can read a bunch of work units in advance while the server is idle, then hand them out instantaneously when needed.

    Aside: bugger, fault in my experiment: I didn't look at the CPU usage of kernel code that's not in the process table. Take what I say with a grain of salt.

    But: do the measurement in your own world. My software, hardware and artificial measured usage pattern may differ from yours, subtly but enough that my conclusion doesn't transfer. Be scientific about it :)
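    For anyone repeating that experiment, here's a minimal sketch of a repeatable version. The path and the 64 MiB size are placeholders, not from the post above; point TARGET at the filesystem under test and run it once on an encrypted mount and once on a plain one:

```shell
#!/bin/sh
# Sketch: time a sequential write and report rough throughput.
# TARGET and the 64 MiB size are illustrative placeholders.
TARGET="${TARGET:-/tmp/enc-bench.dat}"
MB=64

start=$(date +%s)
dd if=/dev/zero of="$TARGET" bs=1048576 count="$MB" conv=fsync 2>/dev/null
end=$(date +%s)

elapsed=$((end - start))
[ "$elapsed" -eq 0 ] && elapsed=1   # guard against divide-by-zero on fast disks
echo "wrote ${MB} MiB in ${elapsed}s (~$((MB / elapsed)) MB/s)"
rm -f "$TARGET"
```

    Watching kcryptd's CPU usage (e.g. in top) during the two runs gives the encryption overhead directly, which avoids the "kernel code not in the process table" caveat above only partially -- so treat the numbers as rough.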

    • Re: (Score:3, Insightful)

      by LiSrt (742904)

      "But: do the measurement in your own world. My software, hardware and artificial measured usage pattern may differ from yours, subtly but enough that my conclusion doesn't transfer. Be scientific about it :)"

      Best advice I've seen - try and build up a representative sample of a day's work (or just a random sample if that's not easily determinable), copy it, run one copy on unencrypted disks and one on the mandated encryption.

      If there's a significant difference take the evidence to your IT dept. or supervisor

    • by Splab (574204) on Thursday October 30, 2008 @07:18AM (#25567119)

      "There is lots of cpu cycles left"
      Uhm, you are losing 30% of your CPU cycles; that is quite a lot. Yes, there is ample power left for your office apps etc., but the original poster says he is doing high-performance computing -- losing 30% of your throughput for reading data is a lot!

    • Re: (Score:3, Interesting)

      Are you sure your gig of zeroes is treated exactly like any other data? If I screw up my simulations they usually end up processing lots of zeros. It's obvious when this happens because they finish very quickly. Maybe you should generate a gig of random numbers instead?
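      One way to act on that suggestion, sketched with placeholder paths and a placeholder 32 MiB size (run it against both an encrypted and a plain mount):

```shell
#!/bin/sh
# Sketch: use incompressible random data so zero-page shortcuts or
# compression anywhere in the stack can't flatter the numbers.
SRC=/tmp/random-sample.bin
DST="${DST:-/tmp/bench-copy.bin}"

# 32 MiB of random bytes (placeholder size; scale up for a real test).
dd if=/dev/urandom of="$SRC" bs=1048576 count=32 2>/dev/null

t0=$(date +%s)
dd if="$SRC" of="$DST" bs=1048576 conv=fsync 2>/dev/null
t1=$(date +%s)

verified=no
cmp -s "$SRC" "$DST" && verified=yes
echo "copied 32 MiB in $((t1 - t0))s; verified: $verified"
rm -f "$SRC" "$DST"
```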
  • feel-good actions (Score:3, Insightful)

    by scientus (1357317) <.instigatorirc. .at. .gmail.com.> on Thursday October 30, 2008 @05:27AM (#25566547)
    in these types of departments all the computers are on all the time anyway, and whole-disk encryption is vulnerable to cold-boot attacks while the machine is running. It may be remotely useful on laptops, but for desktops it's entirely useless

    if you want to actually protect your data you need to encrypt only what's sensitive and only mount it when necessary. Also, PGP is closed source -- what are you going to do if they stop supporting it? Use TrueCrypt or LVM, etc. Also, don't neglect network protection, which is where the real data is stolen
  • Incompatible? (Score:4, Interesting)

    by Bromskloss (750445) < ... <at> <gmail.com>> on Thursday October 30, 2008 @05:30AM (#25566563)

    Furthermore, there is some evidence that certain forms of compression are also incompatible with PGP whole disk encryption.

    What do you mean by "incompatible"? At first glance, you seem to mean that there are certain file formats, making use of compression, that cannot be stored on the encrypted drive. That certainly can't be true.

  • by Nicolas MONNET (4727) <nicoaltiva@nOSPAM.gmail.com> on Thursday October 30, 2008 @05:37AM (#25566615) Journal

    Their product doesn't seem [pgp.com] to run on Linux.
    There is better, cheaper F/OSS software to do the same thing though; Ubuntu and FC9 already include a whole disk encryption option at install. (It's better because it's much less likely to have an NSA back door, although obviously never completely certain).
    As for performance, when I tried it (luks encryption) on a desktop machine, it wasn't noticeable; but I wasn't moving hundreds of gigs around.
    The question now is what are they trying to protect. Encrypting laptops is sensible, and in fact, given how easy & cheap it now is, it's rather stupid not to do it. On desktop PCs, it's not that clear. Whole disk encryption will only protect you against someone with physical access to the machine turned off. It certainly won't protect you against trojans or browser based vulnerabilities. So the question is, do random strangers roam your offices?
    And encrypting servers/clusters? That's just silly, unless you expect the men in black to storm into your building.

  • by ekran (79740) * on Thursday October 30, 2008 @05:38AM (#25566617) Homepage

    Positive:
    - added security

    Negative:
    - worse performance
    - you may forget the password (it has happened before.)
    - has to be mounted manually (or at least type in password each time you need access to the data.)
    - it's painful to backup
    - it's painful to do a proper file systems check
    - if the discs are somehow taken by the authorities you might have to give up your password (or be sentenced for whatever they think you have on the discs.)
    - discs are only secure if they are not mounted.

    There are a few negative sides, but the positives usually outweigh them; i.e., if you really need the security then of course this is the way to go. Also remember to secure the other aspects of the machine, like physical access (including fire/theft), software protection (anti-malware and antivirus) and network protection (firewalls, etc.)

  • Disk Encryption (Score:3, Interesting)

    by mseeger (40923) on Thursday October 30, 2008 @05:40AM (#25566629)
    Hi,

    we're selling a different solution, but some remarks from our real life experiences:

    • Performance is not the problem. Compared to other problems, this one is insignificant. It gets even more insignificant with multi-core CPUs.
    • Encryption is also not the problem; it's the decryption that gives you headaches. Users lose passwords, tokens, certificates, etc. You must be able to help them when they are somewhere in Africa and need to recover their lost password for the disk encryption.
    • Encrypted disks are significantly harder to recover from a head crash or other HW-related problems. Try the procedures your manufacturer gives you at least once before you need them...
    • There are a lot of other issues to consider:
      • You need to check the compatibility of your disk encryption with each new OS release and new hardware. As for all enterprise projects: try to use as little different hardware/software as possible.
      • Service and helpdesk personnel need to be trained
      • Think about how to do the rollout
      • Do people with encrypted notebooks travel to countries where it might be illegal?
      • How do you handle requests from law enforcement if they suspect one of your users?
    • General rule: Every hour you put into the project before the rollout saves you 10 hours of support :-)

    Sincerely yours, Martin

    • Re:Disk Encryption (Score:4, Informative)

      by TheRaven64 (641858) on Thursday October 30, 2008 @08:24AM (#25567459) Journal

      Performance is not the problem. Compared to other problems, this one is insignificant. It gets even more insignificant with multi core CPUs.

      I'm sorry, but this is just wrong. Encryption, with a sufficiently fast CPU, will not affect your throughput. It will, however, affect your latency. I know, from the results of part of my PhD, that in an I/O bound scientific computation process, a 0.5% decrease in average latency can give around a 20% better running time. If decrypting a block takes 1ms, added to the 9ms for seeking, then you can easily be slowing down the kind of task that the original poster is talking about by 50% or more.

      Most users won't notice encryption because most users don't do much that's I/O limited, and when they do it's often limited by throughput, not latency. Try running full-disk encryption on your database server, or on a scientific computing machine, and you will see serious performance problems.
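      To make the arithmetic in that argument concrete, here is a sketch using the parent's figures (9 ms seek, 1 ms per-block decrypt); the operation count is an arbitrary placeholder:

```shell
#!/bin/sh
# Illustrative arithmetic only: how per-operation latency compounds in an
# I/O-bound job. The 9 ms / 1 ms figures come from the comment above;
# the operation count is a made-up placeholder.
seek_ms=9
decrypt_ms=1
ops=1000000   # random reads in a hypothetical I/O-bound run

plain_s=$(awk "BEGIN { print $ops * $seek_ms / 1000 }")
enc_s=$(awk "BEGIN { print $ops * ($seek_ms + $decrypt_ms) / 1000 }")
awk "BEGIN { printf \"plain: %ss  encrypted: %ss  slowdown: %.1f%%\n\", $plain_s, $enc_s, ($enc_s / $plain_s - 1) * 100 }"
```

      The simple additive case works out to roughly an 11% per-read penalty; the much larger slowdowns the parent describes come from second-order effects in real I/O pipelines (queueing, lost overlap between computation and I/O), which this arithmetic deliberately ignores.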

  • by CuteSteveJobs (1343851) on Thursday October 30, 2008 @05:43AM (#25566659)
    If you do encrypt, why use PGP? It costs money and it's proprietary. Use TrueCrypt, which is free and open source and does whole disk encryption, which according to this can sometimes actually *boost* performance. I use TrueCrypt daily and it's awesome. http://en.wikipedia.org/wiki/Truecrypt#Performance [wikipedia.org] http://www.truecrypt.org/ [truecrypt.org]
  • This has happened (Score:4, Insightful)

    by DynaSoar (714234) on Thursday October 30, 2008 @05:50AM (#25566695) Journal

    I've worked with people during various research projects who decided to encrypt, for some very good reasons. I've had one admin die, and one researcher have a stroke. In both cases they had information necessary for the project that nobody else could get to, even when their hard drives were retrieved. The results are that after several years, the stuff is still sitting somewhere unusable because the people who attempted to get to it were stymied. Enforcing PGP on an entire network could multiply this problem. I would think that enforcing PGP on users not needing it would be a royal pain for them.

    What we've done and thought of since:

    Have only those with sensitive information encrypt. Have them work on machines not connected to the net. If they need net access, have them connect only for the time necessary, and mandate pre-encryption back ups prior to connecting.

    Preferred, but resisted: keep the sensitive machines off the net and have the researchers connect to the net via a different machine without the sensitive info on it. If they want to transfer such info, make them use sneakernet between the sensitive and connected machines. In this scenario, they only need PGP for what they're going to transfer to the connected machine and thus to the outside. Both admins and researchers expect full connectivity throughout their net, but the best security is a knackered line.

    I use the sneakernet method exclusively. What I transfer when necessary is hundreds of MB to tens of GB of data. It takes me 10 to 30 minutes to encrypt, burn the data to DVDs and carry it to the connected machine. Like most researchers, I'm busy and don't want to spend my time doing this, but I have assistants I can put the task on.

  • 5 reasons (Score:3, Interesting)

    by Knightman (142928) on Thursday October 30, 2008 @05:53AM (#25566705)

    There are several reasons why a policy of having all disks encrypted is bad:

    1. Sensitive data should not be stored on a computer that can be carried away or easily accessed, with or without encryption.
    2. Blanket security measures just mean that employees will find ways around them, which usually means you end up with bigger security problems.
    3. Failing or failed disks go from a serious problem to a critical problem for recovering data.
    4. If you are running I/O-"happy" software you are going to take a performance hit.
    5. It's not a "green" solution, since the encryption is done in software and the computer is going to use more power.

    Oh, and let me reiterate: sensitive data should not be stored on a computer that can be carried away or easily accessed, with or without encryption. Just look at how MI5 left laptops all over the place.

    The policy we use when working on sensitive data is that it's all stored centrally with rigorous security measures for accessing it and the only way to access the data is through a Sun Ray thin client. That way we minimize the risks for electronic information leakage, ie. someone mailing information etc.

  • Drive Errors? (Score:5, Insightful)

    by CopaceticOpus (965603) on Thursday October 30, 2008 @05:55AM (#25566709)

    My concern with encrypting an entire disk would be fault tolerance. If a sector goes bad on a non-encrypted drive, you might lose a file. If it goes bad on an encrypted drive, do you risk losing more data or even the entire drive?

    Of course, one could say that's why you make backups. But presumably the backups would also be using encryption. Therefore, they would be susceptible to the same effect. If there is a greater chance of total data loss on each device, the chance of multiple device failures leading to unrecoverable data also increases.

  • by quietwalker (969769) <pdughi@gmail.com> on Thursday October 30, 2008 @06:11AM (#25566781)

    My workplace recently mandated that all laptops/portable media be encrypted. The impact to the system cpu usage isn't that significant to be honest, except when attempting to access, say, USB drives.

    What's more important is the reliability of the disk itself.

    As everyone knows, drivers shipped with laptops tend to be the first casualties of boot-sector-loading programs, like disk encryption and certain virus scanners.

    Guess what happens when your encrypted disk can't be booted? You can't boot from a Windows/emergency restore disk, because your partition is not readable, and you can't boot the installed system off anything other than the hard drive. Guess what happens if the corruption doesn't allow you to run the encryption app's boot loader? The only solution is to format the disk.

    Some of us who have been hit by this already have gone through the trouble of ensuring that any data we want to keep is stored on a shared drive, and that all work is done in a VM, which is occasionally uploaded to the shared drive as well. Since any given windows or driver-affecting update could kill our machine at any minute and make it entirely unrestorable, that's what's required.

    So in essence, we're switching back to storing the media on a non-encrypted device because the loss of the data is more important than the security of the data.

    This reminds me of the policies surrounding passwords I've seen at many companies; limiting the set of choices by making password creation requirements, and forcing them to change so often that people end up writing them down and leaving them on their desk. Defeats much of the purpose of having them in the first place.

    • Re: (Score:3, Insightful)

      by STrinity (723872)

      Guess what happens if the corruption doesn't allow you to run the encryption app's boot loader?

      You pop in the boot/recovery CD the encryption app forced you to create before it'd encrypt the drive? At least that's how it works on TrueCrypt.

  • by Idaho (12907) on Thursday October 30, 2008 @06:15AM (#25566805)

    In the time you spent writing this post to Slashdot, you could have written a friendly letter to your IT department stating that you want some machines to not use this encryption, because these machines need maximum performance and anyway do not store any kind of personal information.

  • by hAckz0r (989977) on Thursday October 30, 2008 @06:24AM (#25566859)
    The only protection that Full Disk Encryption gives is that if someone physically gets their hands on the machine, they cannot boot it and read its contents. This makes perfect sense for laptops but makes little sense for permanently fixed-location workstations. A laptop will physically leave the premises, so it leaves itself open to theft, but a workstation (assuming you have some decent form of physical security) is much less likely to need this protection. Once a workstation is booted and the disk drive unlocked digitally, any hacker who gets a foothold on the system has access to it, so all that overhead of full disk encryption does no good unless the encryption is done per user session. When you need access to the data, you authenticate and start decrypting then, and keep it encrypted across the network. Yes, the data that you speak of should be encrypted, but you must encrypt it at the correct level to actually increase its security rather than just slowing down the machine. Anything short of that level of control and you are just fooling yourself into thinking you have protected the data. Fool-Disk-Encryption is not always the answer.
  • by ACK!! (10229) on Thursday October 30, 2008 @06:48AM (#25566991) Journal

    My Windows work laptop went from a fast little Core Duo -- a fairly recent Dell that was quick but felt pretty damn cheap -- to a complete slow dog.

    Not sure if it's the software my company used, or the disk I/O overhead, or what.

    I do know after encrypting my entire disk I now get the PGP login screen immediately after the CMOS screen and before the Windows loader. No Problem.

    The real problem is after that. The minute Windows loads up the disk starts churning and barely ever stops.

    It just churns and churns and that little hd light just keeps going and going.

    And everything just slows right down after that. Oh yes, I am not the only one saying this either; almost everyone is reporting the same sort of results.

    I actually thought it was a good idea - considering the amount of travel many deployment personnel in my company commit to in a year.

    But do your research.

    Try out whatever solution with your heavy hitting power users.

    Don't settle for security that hampers performance.

  • by segedunum (883035) on Thursday October 30, 2008 @06:53AM (#25567013)
    I don't understand people who think that if they encrypt something it automatically becomes secure. For that data to be of any use, someone will need to decrypt it and relevant people will need access, so that destroys the notion of de facto encryption for security right there.

    Encryption assumes that bad people are going to get access to your data whatever happens, and if you are using whole disk encryption then you really need to be seriously asking yourself who has physical access to your disks and where your data is located. That needs to be sorted out first, and once it is, with data held centrally, I doubt whether disk encryption will be needed. You will probably need some form of encryption between the data and the remote users though. Using full disk encryption gives you something else to go wrong, is a variable in performance impairment you probably can't account for, is something else to support, and will almost certainly be unnecessary once you've taken other steps first.

    If you're keeping confidential patient information where it would be a Bad Thing(tm) if it ever got mislaid (even if it is encrypted, you don't want a computer with stuff on it lost I assume), in the name of all that is holy, please centralise your data and vet access. Stop people from passing around Excel spreadsheets of data, regardless of when and how it is encrypted.

    I really am aghast as to how stupid people are about how and where their data needs to be protected. PGP is the wrong solution here, if you can call it a solution.
  • by Zashi (992673) on Thursday October 30, 2008 @07:51AM (#25567279) Homepage Journal
    If you've got enough money lying around, you could get a Blet--er.. probably shouldn't use the code name. You could get a MR10is "VAULT" RAID adapter [ibm.com] from LSI and IBM (for SAS and SATA drives). I got to QA test it, put it through its paces. It seems to be pretty decent (now) and lets you fully, transparently encrypt your hard drives.

    They're over $1,000, but if performance and security are that important to you it may be worth it. The VAULT only supports internal drives, but I think a morg--er.. I don't even know what the non-code name for those cards are... I think an encrypted version of the MR10m, which is for external SAS/SATA hard drive enclosures, is in the works.
  • by cheros (223479) on Thursday October 30, 2008 @08:20AM (#25567423)

    If there is one HUGE problem with whole disk encryption it is recovery from disk failure. I'm not talking about your average Windows crash, PGP whole disk crypto is OK with that. I'm talking about a more massive failure which makes a mess of the NTFS indexing (Windows can do that too).

    Normally, you have three options:

    - restart and pretend you don't have a problem. Rather hard if you're missing a lot of files :-).

    - permit CHKDSK to clean up the disk. In my experience that is a sure way to guarantee you will never be able to access your files in a sensible state ever again. No idea why, but Microsoft doesn't appear to have focused on file recovery with CHKDSK, more on returning the disk to a consistent state. Or maybe I need to do some RTFM :-)

    - use another tool to access the disk that doesn't need a 100% clean NTFS layout to still scrape the files off. This can typically be done with a Linux live CD, as the read-only NTFS mount on Linux is substantially less picky about how consistent the file system is. I introduced this idea to a large consultancy when I worked there and they have saved a good amount of data this way over the years.

    When you use full disk crypto, forget about booting up another OS to recover data. Installing full disk crypto without adding a good backup solution (encrypted, of course) is asking for trouble.

    What I like about PGP is the ability to use additional keys which are split, so you need to involve multiple people before you can backdoor it. However, it always makes me wonder if there isn't an additional key of which we don't know anything..

  • Let's be clear (Score:3, Interesting)

    by ratboy666 (104074) <fred_weigel@ h o t m a i l .com> on Thursday October 30, 2008 @08:31AM (#25567481) Homepage Journal

    USE THE CRYPTO

    and yes, I'm shouting. This has been resisted for too long -- it's kind of like garbage collection in programming systems.

    The slow part is the physical drive. The crypto can be done FASTER than the physical drive, which means, at worst, that an additional processor needs to be assigned.

    Indeed, the ONLY time I recommend crypto not be used is when dealing with the US border. There I use file crypto on specific projects, along with dd to overwrite free space and swap (before crossing).

    But, if everyone (looking at US citizens) starts USING crypto, a fully encrypted laptop would not raise suspicion.

  • by managerialslime (739286) on Thursday October 30, 2008 @09:02AM (#25567725) Journal
    "Performance" is only a valid topic after addressing reliability.

    In my company, we gave up on PGP's whole disk encryption after it consistently locked up (but was OK after multiple reboots) on both Panasonic and Lenovo laptops.

    For the last few months, we have been trying TrueCrypt on the above brands of laptop and also on HP desktops, with no issues (as of yet).

    If you load up RAM by opening a bunch of simultaneous windows and then run some mathematical loops that represent the kind of calculations your environment demands, you can then determine whether the overhead of TrueCrypt (or whatever) is worth the security benefit.
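    A crude sketch of such a test (the loop count, file path, and file size are arbitrary placeholders): run it once before and once after enabling encryption, and compare the wall-clock times.

```shell
#!/bin/sh
# Sketch: a combined CPU + disk pass for before/after-encryption comparison.
# WORKFILE, the loop count, and the 32 MiB size are placeholders.
WORKFILE="${WORKFILE:-/tmp/crypt-test.dat}"

t0=$(date +%s)

# CPU-bound stand-in for "mathematical loops".
i=0; sum=0
while [ "$i" -lt 100000 ]; do
    sum=$((sum + i * i))
    i=$((i + 1))
done

# Disk-bound pass through the (possibly encrypted) filesystem.
dd if=/dev/zero of="$WORKFILE" bs=1048576 count=32 conv=fsync 2>/dev/null

t1=$(date +%s)
echo "combined pass: $((t1 - t0))s (checksum $sum)"
rm -f "$WORKFILE"
```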

    Good luck.

    No matter where you go . . . there you are. - Buckeroo Bonzai
  • No problems (Score:3, Interesting)

    by octaene (171858) <bswilson@gmail . c om> on Thursday October 30, 2008 @09:49AM (#25568391) Homepage
    I'm a big fan of TrueCrypt, and have used it for about 3 years. In that same time, my company has rolled out PGP Full Disk encryption. Honestly, we've had nothing but positive feedback about how unintrusive the product is. This is easily the security tool with the highest positive employee feedback we've ever deployed, hands down.
  • Risk vs Reward (Score:3, Informative)

    by PrivacyDeath (1186903) on Thursday October 30, 2008 @11:35AM (#25570209)

    alaederach wasn't looking for a sales pitch on Truecrypt. The decision has been made. He is looking to the slashdot community to empower him with a good argument to resist encryption. I hope that he chooses to embrace encryption, while recognizing that it is not applicable to every environment or computer. He can still make an informed argument against it in his case, provided he is correct in his assessment.

    POLICY

    alaederach, I believe the folks who posted advice about resolving this through the proper channels to get an exception to the policy have pointed you to your best route. Don't start argumentatively. Explain your concerns and keep an open mind about them. Start with a member of the team that is deploying PGP and ask what the proper procedure is to get an exception to the policy. If there is a project manager assigned, that is the person to start with. Project managers are usually more open to the needs of your area, and have the power to address issues that are raised during the implementation process. Kindly explain your concern, and ask if a high-performance system can be benchmarked and tested prior to the rollout of PGP.

    PERFORMANCE

    As a proud tin-foil-hat-wearing network administrator who has rolled out PGP, I did not find a performance hit big enough to justify an exception in our environment. However, the identified risk of data loss and theft was a concern for the traveling laptops. The servers were less of a risk due to the physical security controls that were in place, so PGP was only rolled out to laptops in my environment. I would recommend extensive testing prior to the rollout for high-performance machines. Boot times were slower, but the difference was measured in seconds, not minutes. In every case where performance was an issue, it was the typical problems one might find on a Windows machine, unrelated to the encryption.

    SECURITY

    Every time I have worked as a member of a team deploying a security measure, someone makes the same argument: "There is no reason to do X, as it can be subverted." That goes for policy, physical access controls, software, and hardware, and encryption is no exception. Yes, warm- and cold-boot attacks are possible. Yes, highly motivated individuals, groups, and governments may have the ability to access your data. Security is best applied in many layers; it can be highly effective at reducing risk, and it keeps a higher percentage of the population from accessing or corrupting your data. alaederach, your best argument here is risk vs. reward. This is where you kindly make your claim that the risk is low due to the low impact of data loss in your environment. At the same time, if you have good physical security controls, you might want to include that in your argument. If the data your work produces is valued more highly by the decision makers than what you are sharing with us, then you may want to request the performance testing and explain the risk of lower production due to performance. Geeks love performance testing, and if the highest risk is determined to be to your computing performance, you just might find an exception to the policy.

    MYTHS

    A network administrator who gets hit by a bus will cause your data to be lost. FALSE. The majority of organizations that have the funds to implement a project such as this will also have arranged off-site storage of encryption keys, as well as any other data that would be backed up. Usually it is a different geographical location that utilizes high physical security controls. Yes, there will be members of the staff that have access. That is why there are Human Resource controls in place to vet the administrators, i.e. background checks.

    An encrypted drive cannot be accessed to retrieve data. FALSE. Encrypted or unencrypted, proper data backup methods should be in place. With PGP specifically, I created a BartPE CD that allowed retrieval of data on a hard drive.

  • by Todd Knarr (15451) on Thursday October 30, 2008 @12:34PM (#25571193) Homepage

    Your IT people need to remember that whole-disk encryption only protects against some threats, not all. It's mainly going to protect against physical theft of the drives themselves, or the computer they're in. That means it's going to mainly benefit laptops that're out in the world where they can be easily stolen. Office desktops, if they're stolen that means someone had physical access to the building to take them. If the IT department can't name the last time a desktop was stolen from the building, theft is probably not an issue. Servers aren't likely to be stolen at all, they're locked up in a presumably secured data center and I just don't see an outsider being able to get in there let alone unrack a server and walk out with it under their arm. Again, if IT can't name the last time a server was stolen it's probably a non-issue.

    And even in the case of a laptop, the encryption only protects the disk while the computer's powered off or in a state where the encryption software's discarded the key and won't decrypt the disk again without you re-entering the password. We found where I work that the standard suspend mode of the laptops does not trigger PGP to prompt for the password on resume, for instance. Since most of our people leave their laptop suspended while carrying it around rather than turning it completely off (to speed up start-up), the PGP encryption essentially isn't protecting the disk at all since the thief won't need the password to get the data decrypted. I don't count the normal screen lock, since if that were sufficient you'd just force password lock on the screen saver and not need encryption at all.

    And of course whole-disk encryption won't protect you at all from viruses, trojans and other malware that gets onto the system and starts sending data back home. That stuff's running after you've helpfully given PGP the password and it's cheerfully decrypting data for you, and it's running as you so PGP thinks it's you accessing the data. Again, for office desktops and servers remote access by malware's probably a bigger concern than physical access to the machines and you need something other than whole-disk encryption to protect against those threats.

    To be honest, I'm much more of a fan of removable media. Put the patient data on a USB stick, then plug the stick in to access the data and remove it when you're done. If the sensitive data isn't on the computer then nobody can get it by stealing the computer. Just don't fall victim to those "encrypted" USB sticks; many of them either use algorithms that are trivial to break or fail miserably at some point (e.g. leaving the encryption key in unencrypted, unprotected space where it can be extracted and used by a thief). It's much easier to lock some USB sticks or CDs/DVDs up in a secure drawer than it is to protect a computer.
