Should Companies Share Criminal Blame In ID Theft? 328

snydeq writes "Deep End's Paul Venezia criticizes the lack of criminal charges for corporate negligence in data breaches in the wake of last week's Best Western breach, which exposed the personal data of 8 million customers. 'The responsibilities attached to retaining sensitive personal identity information should include criminal charges against the company responsible for a leak, in addition to the party that receives the information,' Venezia writes. 'Until the penalties for giving away sensitive information in this manner include heavy fines and possibly even jail time for those responsible for securing that information, we'll see this problem occur again and again.' As data security lawyer Thomas J. Smedinghoff writes, data security law is already shifting the blame for data breaches onto IT, thanks to an emerging framework of complex regulations that could result in grave legal consequences should your organization suffer a breach. To date, however, IT's duty to provide security and its duty to disclose data breaches do not include criminal prosecution. Yet, with much of the data security framework being shaped by 'IT negligence' court cases over 'reasonable' security, that could very well be put to the test some day in court." It's a slippery slope to be sure, but where should the buck stop?
  • Yes/No (Score:5, Insightful)

    by HappySqurriel ( 1010623 ) on Monday August 25, 2008 @04:08PM (#24741551)

    I think it is entirely appropriate to investigate a company when large amounts of personal information end up being 'stolen' ... If it turns out that the company did not take the necessary steps to protect people's personal information, they should face some consequences. At the same time, there has to be an understanding that even the best available technologies and best practices may not prevent all personal information theft, so a company should not face harsh consequences if they took the necessary steps to protect people's information.

    • Re:Yes/No (Score:5, Insightful)

      by penix1 ( 722987 ) on Monday August 25, 2008 @04:12PM (#24741611) Homepage

      I've got a better idea. Ban the collection of personal information beyond the time required for the transaction. I don't buy it that companies somehow need to store all this info on people, especially years after the transaction has occurred. If you are going to be light on them when they lose it, then be heavy on what they can keep.

      • Re:Yes/No (Score:5, Interesting)

        by kannibal_klown ( 531544 ) on Monday August 25, 2008 @04:18PM (#24741743)

        I've got a better idea. Ban the collection of personal information beyond the time required for the transaction. I don't buy it that companies somehow need to store all this info on people, especially years after the transaction has occurred. If you are going to be light on them when they lose it, then be heavy on what they can keep.

        Well what about long-term services like Life Insurance? A service like that would need to keep your Name, Birthday, Social Security Number, address, next of kin, etc until you died and someone collected. And what about Banks and Loan offices?

        A friend of mine got a notice from his life insurance firm saying that a laptop was stolen that probably had his records on it. Why it would be on a flippin LAPTOP I have no idea, something like that should be a server only accessible from the company's encrypted network (but I digress).

        I could also see the benefit of some stores keeping some light data on you (name, address, phone) so they can contact you but I think they should get rid of your credit card info after X days/weeks.

        In all, it's a mixed bag of blame. Personally I think the government and law enforcement should take identity theft a lot more seriously, with major penalties against these fraudulent jerks.

        • Re:Yes/No (Score:5, Interesting)

          by thesolo ( 131008 ) * on Monday August 25, 2008 @04:41PM (#24742087) Homepage

          A friend of mine got a notice from his life insurance firm saying that a laptop was stolen that probably had his records on it. Why it would be on a flippin LAPTOP I have no idea, something like that should be a server only accessible from the company's encrypted network (but I digress).

          $10 says someone was either creating top-line reports or other such nonsense based on spreadsheets full of live data, and they brought it home/outside of the office to continue working on it past business hours.

          I can't even tell you how many times I've seen people in insurance companies take live data home with them so they can whip up statistical reporting. People don't follow IT protocol when it becomes inconvenient for them to do so. (i.e. staying late at the office vs going home & working there.)

          • Re: (Score:3, Insightful)

            by baggins2001 ( 697667 )
            This happens all the time. I've had VPs admit it to me, and when I tell the CEO he doesn't really care.
            So therefore I don't care anymore.
            Security becomes a business cost that they didn't anticipate or aren't willing to accept.
            In fact, during the latest briefing, we were told that we were looking to go public on a foreign exchange where the regulations weren't as strict.
        • Re: (Score:3, Interesting)

          by nine-times ( 778537 )

          Well what about long-term services like Life Insurance?...A friend of mine got a notice from his life insurance firm saying that a laptop was stolen that probably had his records on it.

          It seems like you could have a rule to dispose of data after the transaction except in businesses/industries where it's necessary, and then regulate those businesses/industries better than we do now. How about it's illegal for a company to put that sort of data onto a laptop?

          • Re: (Score:3, Insightful)

            by jd ( 1658 )

            Yes, and often far more data is held than is necessary. Also, if you subscribe to the notions of grid computing and cloud computing, why store the data at all? All you need to do is tell an authorized holder of the data what operation you wish to perform, and get the results, entirely black-box. You need never see the data at all.

            In terms of liability, I would argue that the rule should be a generic one: if you assume control of data, you assume responsibility for that data - its accuracy, its security and

        • Re:Yes/No (Score:5, Insightful)

          by Foofoobar ( 318279 ) on Monday August 25, 2008 @04:52PM (#24742257)
          Nobody needs to store SSNs except the government that issues them. The fact that people made the MISTAKE of standardizing on SSNs as primary keys for users to begin with is their own fault. Mainly because SSNs are horrible primary keys since they REPEAT!!! Yes, look it up... they DO get reissued after death, and with long-term storage this will only cause issues for storage of personal data.

          Second, data loss is a quick route to a lawsuit as a result of storing SSNs. People and companies need to stop this bad practice, especially since laws in several states have been passed banning it. Good security can only do so much, as human error is inevitably your final point of failure. And do you want to have a couple million Social Security numbers relying on the security of a backup tape in the back of your junior sysadmin's Pinto overnight?
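          The fix the parent describes is standard database practice: give each customer a meaningless surrogate key and keep the SSN out of the schema entirely (or confine it to a separately secured table where a legal mandate requires it). A minimal sketch using SQLite from Python; the table and column names here are hypothetical, not from any real system:

          ```python
          import sqlite3

          conn = sqlite3.connect(":memory:")
          conn.execute("""
              CREATE TABLE customer (
                  customer_id INTEGER PRIMARY KEY AUTOINCREMENT,  -- surrogate key
                  name        TEXT NOT NULL,
                  address     TEXT
                  -- no SSN column: the key is internal and never needs to be secret
              )
          """)
          conn.execute("INSERT INTO customer (name, address) VALUES (?, ?)",
                       ("Alice Example", "123 Main St"))
          conn.execute("INSERT INTO customer (name, address) VALUES (?, ?)",
                       ("Bob Example", "456 Oak Ave"))
          for row in conn.execute("SELECT customer_id, name FROM customer"):
              print(row)  # opaque integer keys, unique by construction, never reissued
          ```

          Unlike an SSN, a surrogate key costs nothing if it leaks, and it never collides or gets reissued.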
          • Re: (Score:3, Interesting)

            by mpe ( 36238 )
            Nobody needs to store SSNs except the government that issues them. The fact that people made the MISTAKE of standardizing on SSNs as primary keys for users to begin with is their own fault.

            It's also rather daft since it complicates matters if they need to deal with customers who don't have SSNs, e.g. corporations.

            Mainly because SSNs are horrible primary keys since they REPEAT!!! Yes look it up... they DO get reissued after death and with long-term storage, this will only cause issues for storage of pe
          • Re: (Score:3, Interesting)

            by arminw ( 717974 )

            ...Nobody needs to store SSN's except the government that issues them...

            Tell that to your friendly DMV, who are now mandated to collect this information by the federal government. It so happens that in any computerized database, a unique record identifier is needed. For any database that could contain information on potentially anyone in any state, the SSN is more likely to be unique than any other number currently assigned to nearly everyone.

            Instead of making the legitimate owner of the identity responsible

            • Re: (Score:3, Insightful)

              by hedwards ( 940851 )

              Since a state issued ID is considered to be valid identification for the federal government, and the federal government uses SSNs to identify people, it seems fine to me that they'd use that information.

              If one is going to use it to board a plane, as identification for a passport, or to register to vote in federal elections, it seems fair for the federal government to expect that state issued IDs are going to be recorded against SSNs.

        • Re: (Score:3, Insightful)

          Getting rid of the credit card data after X weeks seems like an excellent idea.

          It's not easy to get a room at a decent hotel without a credit card. Certainly in some places you can pay cash in advance--but you can't use the phone, order a meal, connect to their network. If they require a credit card, or make it too difficult to procure their services without one, then they should absolutely be held accountable for the safety of the information.

          Organizations of all sorts--retail, airlines, hotels, hospitals,

        • Re: (Score:3, Interesting)

          by dfm3 ( 830843 )

          I could also see the benefit of some stores keeping some light data on you (name, address, phone) so they can contact you but I think they should get rid of your credit card info after X days/weeks.

          Indeed, this is the heart of the problem: when X = 52 weeks, or 2 years, or forever. I can understand why a hotel would want to keep my information on file for a short while, say a week or two, to ensure that I've been charged for my visit or held responsible if I happened to break a lamp or a window, but I see absolutely NO REASON why a company has to keep my credit card details on file for an entire year after I have concluded a business transaction with them.

          Less critical information such as my name, ad
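          The retention window the parent describes could be enforced mechanically: purge card data older than the cutoff and keep only the non-sensitive contact fields. A minimal sketch in Python; the record fields and two-week window are illustrative assumptions, not any company's actual policy:

          ```python
          from datetime import datetime, timedelta

          RETENTION = timedelta(weeks=2)  # the "X days/weeks" window from the thread

          def purge_expired(records, now):
              """Keep only card records whose transaction is within the window.
              Anything older is dropped so a breach can't expose it."""
              return [r for r in records if now - r["charged_at"] <= RETENTION]

          now = datetime(2008, 8, 25)
          records = [
              {"card": "4111...1111", "charged_at": now - timedelta(days=3)},
              {"card": "5500...0004", "charged_at": now - timedelta(days=90)},
          ]
          kept = purge_expired(records, now)
          print(len(kept))  # 1: only the recent transaction's card data survives
          ```

          Run on a schedule, a purge like this caps the blast radius of any leak to the retention window.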

      • Re: (Score:3, Interesting)

        by jellomizer ( 103300 )

        Great idea, let's throw business back two decades. This data is used beyond just advertising and marketing; it is used to improve the business as a whole.

        E.g., when you call your credit card company you can usually get your balance and access the most commonly used features right away. I bet if you call them a few times and don't go that route, the phone system may change for you to get you on and off the line quicker, making you happy as you are spending less time on the line and them happy not having to pay

    • Code violations (Score:3, Insightful)

      by Brain-Fu ( 1274756 )

      Most forms of construction must adhere to a code. Why should software be any different?

      It would be nice, IMO, if we could formulate a set of minimum requirements for any kind of personal-data-handling software (including codes for operating procedures). Things like "all passwords in the system must use strong encryption" and "backups of the data cannot be stored on personal laptops" and the like.

      Then legally require businesses to hire some ratio of software developers who have passed a code certificatio

      • by blueg3 ( 192743 )

        Things like "all passwords in the system must use strong encryption" and "backups of the data cannot be stored on personal laptops" and the like.

        Sadly, that sounds about accurate for the results if such a code was written.

        Passwords don't use encryption of any sort, and data backups shouldn't be stored on any laptop, personal or not (nor on an individual user's work desktop, nor on any personal machine...).
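        The parent's correction is worth making concrete: passwords should be salted and hashed with a slow one-way key-derivation function, never encrypted reversibly. A minimal sketch in Python using the standard library's `hashlib.pbkdf2_hmac`; the iteration count and salt length here are illustrative, not any mandated code:

        ```python
        import hashlib
        import hmac
        import os

        def hash_password(password, salt=None):
            """Derive a one-way digest with PBKDF2-HMAC-SHA256 and a random salt."""
            salt = salt or os.urandom(16)  # unique salt per password defeats rainbow tables
            digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
            return salt, digest

        def verify_password(password, salt, expected):
            """Re-derive the digest and compare in constant time."""
            _, digest = hash_password(password, salt)
            return hmac.compare_digest(digest, expected)

        salt, stored = hash_password("correct horse battery staple")
        print(verify_password("correct horse battery staple", salt, stored))  # True
        print(verify_password("wrong guess", salt, stored))                   # False
        ```

        Because the derivation is one-way, a database breach yields digests that cannot be decrypted back into passwords, only guessed at slowly.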

    • Re: (Score:3, Funny)

      by zappepcs ( 820751 )

      So all is ok if the stolen laptop had everything encrypted? That would seem legally equivalent to someone hacking at a server in the company's data center but not getting in. Then what kind of paperwork etc. is required for a contractor to use laptops from the company contracting them? The point being, how far can culpability be extended through the food chain? If an employee is not a security expert and does what IT told them to do but a compromise still happens, is the company or an employee guilty? If my

    • Re:Yes/No (Score:5, Interesting)

      by Sylver Dragon ( 445237 ) on Monday August 25, 2008 @04:32PM (#24741975) Journal
      I think there is a way to go about it that would work.
      The first thing that would have to be done is that we would need some guidelines as to what a "reasonable" level of security is, and even that might be scaled based on the type of information stored. This should then be re-evaluated yearly by a commission of qualified IT managers from industry. There are other limitations which should be placed on the commission, but that's outside the scope of this uninformed rant.

      Just as an example:
      Storing customer names and addresses - Database encryption and basic perimeter security may be considered reasonable. Losing data without those measures in place should result in fines and maybe some jail time.

      Storing Credit Card info - Same as above, but add backup encryption, laptop hard-disk encryption, internal firewall for DB servers and source code audit on all applications with DB connections. Failure to comply and losing data would be hefty fines, jail time for those responsible for the systems, and civil liability to those people affected.

      Storing Social Security Numbers - All the above, but damages increase substantially, as does jail time, with c-level execs getting in on the PMITA action. And civil liability is increased to "the affected customers now own your ass" level.

      The problem, of course, is that it would be the government doing it, so they would invariably screw it up.
      • Re:Yes/No (Score:4, Informative)

        by __aagmrb7289 ( 652113 ) on Monday August 25, 2008 @04:46PM (#24742165) Journal
        The credit card industry has mandatory PCI compliance. This basically covers your concerns. Supposedly, those companies not compliant will not be allowed to process credit cards - and the requirements must be audited and proven by an outside firm. It's QUITE expensive. The problem is whether or not these rules are being enforced. They ARE getting more stringent as time goes forward.
    • Re:Yes/No (Score:4, Insightful)

      by CodeBuster ( 516420 ) on Monday August 25, 2008 @05:10PM (#24742565)
      The issue is one of negligence, not the relative efficacy of the available security technologies. If a company is found, upon discovery, to have exhibited a complete or reckless disregard for the potential consequences of a breach, then some liability is in order. The "reasonable man" test can be used by juries to decide whether or not the circumstances surrounding the breach amount to negligence and what the appropriate remedy should be. The negligence tort has already been well litigated in common law countries (like the US, UK, and Australia), so the only thing different here are the details (IT technical details), which might require expert witnesses to testify or offer their opinions, but the basic law in negligence is well settled (at least as far as I understand it, but IANAL so please do not take this as formal legal advice) once the details or facts of a particular matter have been determined.
  • civil not criminal (Score:5, Interesting)

    by v(*_*)vvvv ( 233078 ) on Monday August 25, 2008 @04:09PM (#24741559)

    This would be a great civil class action case, but criminal? The slope is quite slippery, and like previous posters have said, the cops don't do much when it comes to non-violent, non-domestic, non-street crimes.

    Of course, some would argue that the banks and lenders behind the whole sub-prime mortgage crisis deserve to be criminally punished for causing a global recession and for the number of lives they've destroyed.

    • Re: (Score:3, Insightful)

      by sm62704 ( 957197 )

      the cops don't do much when it comes to non-violent, non-domestic, non-street crimes.

      I know a man who was charged with home invasion and attempted murder for breaking into a man's home and trying to kill him with a butcher knife, and plea bargained down to two weeks in the county jail.

      A woman I know spent four months in Dwight Correctional Center for a non-violent drug offense (possession). It seems to me that being careless with thousands of people's lives, let alone attempted murder, should carry a far he

    • by diodeus ( 96408 )

      In the movie The Corporation we learn that corporations have pretty well the same legal rights as people, but they lack the personal responsibility that goes with it.

      Perhaps we should round up all those Best Western hotels and put them in prison.

    • This would be a great civil class action case, but criminal? ...

      When I was doing standards work, I was introduced to the notion that only "must" and "shall" (i.e., imperative words) mean something you have to do. Words like "should" are really synonymous with "don't really have to at all" in standards lingo. They just mean you have to answer for something in words when someone calls you on it, but ultimately that no one can force you.

      So too the real difference between civil and criminal is that ci

  • by religious freak ( 1005821 ) on Monday August 25, 2008 @04:09PM (#24741561)
    If a COMPANY is being prosecuted criminally, you obviously cannot have jail time, because it's a non-person.

    However, you can (and IMO should) have much stiffer penalties than civil courts allow. When a data security breach is so bad as to harm society itself, it should be prosecuted criminally - this is the doctrine for criminal prosecution of companies. Criminal penalties can range from massive monetary damages, to shutting the entire company down, or forcing changes in management. This is the correct route to go.

    Obviously, if the implication is that the IT workers themselves should be thrown in jail, this is absurd and would cause all kinds of damage, both foreseeable and unintended.
    • by sm62704 ( 957197 ) on Monday August 25, 2008 @04:24PM (#24741841) Journal

      Freezing a company's assets and disallowing any business for two years would be the equivalent of putting a human in prison for two years. So you could, in fact, "jail" a corporation. You could shield its employees (at least the ones not responsible) by forcing the company to pay them anyway. If it goes bankrupt, well, people go bankrupt after incarceration; why shouldn't businesses?

      Or conversely, put its CEO and Board of Directors in a maximum security prison with the other criminals, many of whom caused far less damage to people, or none at all.

      The thing is, the corporations are deemed too valuable to be punished. THIS is what should change.

      • [Citation needed] (Score:3, Insightful)

        by jabithew ( 1340853 )

        The thing is, the corporations are deemed too valuable to be punished. THIS is what should change.

        Seriously, what the hell? Consider the HSWA (1974), the Environmental Protection Act (1990), and the Data Protection Act (1998), all of which carry the possibility of fines and a jail term if breached.

    • by TubeSteak ( 669689 ) on Monday August 25, 2008 @04:47PM (#24742171) Journal

      If a COMPANY is being prosecuted criminally, you obviously cannot have jail time, because it's a non-person.

      It is tiring that this line of reasoning keeps getting trotted out.
      WTF do you think executive officers are for?

      "The Company" doesn't do anything illegal, the corporate officers & various (vice) presidents are the ones in charge and they have always born the responsibility of the company's actions.

  • by frith01 ( 1118539 ) on Monday August 25, 2008 @04:10PM (#24741591)

    You have a choice, allow organizations to report the data breach, or have them cover it up to avoid the penalty.

    [ Why would anyone report a data breach when that means they would face jail time ? ]

    Remember, the odds of an external entity finding out about the data breach are extremely small (except for the ones taking the data, of course).

    • Re: (Score:3, Interesting)

      by MozeeToby ( 1163751 )

      Easy: make the penalty dependent upon the company's handling of the situation. If the company comes clean, the penalty is X dollars per victim. If the company attempts to hide the situation, the penalty is 100 * X dollars per victim.

      • That just motivates them to either cover it up really well, or else maintain some level of plausible deniability. You just can't make something illegal with a stiff penalty and then expect that people will come forward and report themselves.
    • by sampson7 ( 536545 ) on Monday August 25, 2008 @04:49PM (#24742183)
      I completely disagree with your assertion that a company would not self-report. As a compliance officer with a major international corp (albeit in a different field), we are often faced with the difficult question of whether to self-report a potential violation. We are generally faced with three options when a potential violation arises:

      1. Self-report the violation, fix the problem/install appropriate controls, get the "credit" for active compliance, take the medicine and move on.

      2. Document the potential violation internally, fix the problem/install the appropriate controls, establish the paper record documenting the potential violation, but explaining why it is arguably not a violation or that there is no affirmative duty to self-report.

      3. Actively attempt to conceal the violation or ignore a clear legal requirement to self-report.

      Pop quiz! Which of these three "options" could lead to massive fines by the appropriate governmental regulator, share-holder lawsuits, top managers being fired and even the destruction of your company?

      Anybody who thinks a potential release of information could not bite you in the ass needs to imagine the type of risk/reward analysis the company goes through. I can easily envision the following scenario. Company loses critical personal information. Company actively hides the loss and/or actively ignores legal obligation to self-report. The thief attempts to use the stolen credit card numbers/whatever. Thief is caught. Thief tells police where he acquired the information. Police investigate the breach. Internal emails/IMs reveal that the company knew about the breach but did nothing. Company faces multiple class action lawsuits from: (1) the people harmed by the breach of their personal information; and (2) shareholders who should have been informed in the quarterly SEC-required disclosures that the Company faced a potential liability.

      Now some fly-by-night company might reach a different cost-benefit analysis. But any large company should immediately recognize the potential harm of trying to cover something like this up. When you're talking about a bank or large medical company? Would you as CEO or internal compliance officer risk millions or even billions on something that is so likely to be discovered? Even if the chances are 10,000-to-1 against the breach ever coming to light? Frankly, the rewards are simply not worth the risk.
  • Hard to say (Score:2, Insightful)

    by Anonymous Coward

    Almost any system can be hacked by someone sooner or later. If a crack was found in SSH that allowed a root shell, would the person responsible for the code be held responsible? Or the guy who admins the server?

    • Re:Hard to say (Score:5, Insightful)

      by hairyfeet ( 841228 ) on Monday August 25, 2008 @04:44PM (#24742141) Journal

      The problem ISN'T hackers and thieves, the problem is rampant King Kong sized stupidity. How about we only bust them for gross negligence? Let's face it, it is these morons that have thousands of customer records on unencrypted laptops, or leave an unencrypted backup tape sitting in the parking lot in their car, or the idiots at my local phone company who put a bunch of machines on the curb without bothering to wipe the drives first.

      I think we can all agree that there is a BIG difference between taking precautions and getting hacked, and these brain trusts that don't even bother to show the tiniest bit of common sense. We need to have penalties for the ones that don't even bother to try; otherwise why would they spend the money on security when they aren't really going to be punished when they screw everybody? And I agree with the earlier poster that there needs to be a time limit for most of this stuff. While a previous poster used the example of an insurance company, the simple fact is there are way too many companies that hang onto every scrap of information that comes their way for years. We should come up with a set of criteria that has to be met before you are allowed to keep data for longer than the transaction requires. But as always this is my $0.02, YMMV.

  • Yes (Score:5, Insightful)

    by sm62704 ( 957197 ) on Monday August 25, 2008 @04:12PM (#24741621) Journal

    Not only should there be criminal damages, but attempting to keep the theft secret should carry an even heavier penalty.

    There should also be, upon conviction in criminal court, monetary redress for the poor slobs whose data was compromised, and it should be a LOT more than it cost the compromised person. Say, enough to buy a new car.

    Why can't we have the death penalty for corporations? The standard answer is "all those people who get thrown out of work", but there IS a death penalty for corporations; ENRON suffered the death penalty, but the people in charge (at least the ones that didn't go to prison) suffered no penalty at all.

    How about a "death penalty" where the victims are given the company itself?

    • by nasor ( 690345 )

      Not only should there be criminal damages, but attempting to keep the theft secret should carry an even heavier penalty.

      Although I can appreciate the sentiment behind this, I think a better solution would be for companies to stop pretending that something like a Social Security number can act as a magic password that proves people are who they claim to be on a credit card or cell phone application. Then it wouldn't particularly matter if our "personal information" got out.

      • Of course pointing at the problem and solving it are completely different tasks.

        What foolproof identification system do you propose?

        I've always figured going with all three identifying items.

        1) Something you have (e.g. the credit card)
        2) Something you know (e.g. the PIN)
        3) Something you are (e.g. a fingerprint, retina scan, DNA, etc.)

        1 & 2 can easily enough be changed or updated if a breach happens, 3 is something you can always have verified by some kind of identification authority.
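        As a toy illustration of how the three factors combine, here is a Python sketch of the idea only; the stored values and helper names are hypothetical, and a real system would use proper credential storage and biometric matching rather than simple hashes:

        ```python
        import hashlib
        import hmac

        def sha256(data):
            """One-way digest so the server never stores the raw PIN or template."""
            return hashlib.sha256(data).digest()

        REGISTERED_CARD_ID   = "card-001"                         # 1: something you have
        STORED_PIN_HASH      = sha256(b"4921")                    # 2: something you know
        ENROLLED_FINGERPRINT = sha256(b"ridge-pattern-template")  # 3: something you are

        def authenticate(card_id, pin, fingerprint_template):
            """All three factors must match; one leaked factor is useless alone."""
            have = hmac.compare_digest(card_id.encode(), REGISTERED_CARD_ID.encode())
            know = hmac.compare_digest(sha256(pin.encode()), STORED_PIN_HASH)
            are = hmac.compare_digest(sha256(fingerprint_template), ENROLLED_FINGERPRINT)
            return have and know and are

        print(authenticate("card-001", "4921", b"ridge-pattern-template"))  # True
        print(authenticate("card-001", "0000", b"ridge-pattern-template"))  # False
        ```

        The design point is the conjunction: a breach that leaks any one stored value, like the SSN leaks this thread is about, does not let an attacker pass the check.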

        • by nasor ( 690345 )
          I don't know of any fool-proof identification schemes, but "Ah, you know a social security number, so CLEARLY you are the person who that social security number belongs to!" is about as idiotic as you can get.
    • That reminds me of when Bart owned a factory downtown, which made Frank Grimes hate Homer even more.

      Is this the fate you wish to subject me to?
    • There seems to be something being forgotten here. In any pure game of cat and mouse, the cat always wins. The game ends when the cat catches the mouse. There is no end-game scenario for the mouse "gets away". When it comes to securing something, physical or electronic, the game of cat and mouse becomes the game of cops and robbers.

      In any pure game of cops and robbers, the better funded group always wins. When it comes to physical property, robbers need to break locks, sneak in, sneak out, and escape ca

    • by mstahl ( 701501 )

      How about a "death penalty" where the victims are given the company itself?

      What would they do with it?

      Not only should there be criminal damages, but attempting to keep the theft secret should carry an even heavier penalty.

      And the famous part of the Fifth Amendment hits that head on:

      "... nor shall [any person] be compelled in any criminal case to be a witness against himself, ..."

      So it's not going to happen in the US. Give it up.

      = = = =

      The people harmed are easily identified. It makes more sense for this to be a civil matter, with heavy financial penalties being paid by the company to the victims of the i

    • Re:Yes (Score:4, Insightful)

      by oyenstikker ( 536040 ) on Monday August 25, 2008 @05:20PM (#24742727) Homepage Journal

      Won't fly. The shareholders will then claim to be victims as well.

  • by Brigadier ( 12956 ) on Monday August 25, 2008 @04:12PM (#24741625)

    If you're going to store my private data without my expressed permission, in other words I didn't specifically request it (as opposed to having it thrown in as a caveat on some user agreement), then you are responsible for all mishaps that may be incurred by your actions.

    If I ask you to save my data, then I accept that I am giving permission to said company as is. In other words, it is now my responsibility to look over all disclosures.

    The inherent problem, however, is that there is no means of specifically identifying a person. First and last name no longer work. You can assign them a unique code, but most people get tired of carrying around an ID card for every business they deal with. Thus you are forced to use a) a phone number, which is subject to change, b) a Social Security number, or c) a credit card number.

    So though I do believe they should be held responsible for negligence and for saving information without expressed permission, I do think the credit industry as a whole is responsible. There needs to be a fixed ID system which is separate from the credit system (as in credit score) and governmental ID systems.

    This one-ID bullshit needs to stop. Each person should have a superficial ID which can be changed at request, a credit ID which requires in-person transactions (loans etc.), a government ID, and a health care ID, all of which should be maintained by different independent agencies.

    • Re: (Score:3, Insightful)

      by nine-times ( 778537 )

      There needs to be a fixed ID system which is separate from the credit system (as in credit score) and governmental ID systems.

      Part of the problem is just that everyone wants everything to be easy, and "easy" doesn't get along well with "secure". Like with social security numbers-- they're being treated as a piece of secure information in order to identify people (which it wasn't intended to do). But then as a result, you have to give it to people *all the time*. Because so many things require your social security number and people are encouraged to give it so freely, it's effectively out in the open, and not a piece of secure i

      • Re: (Score:3, Interesting)

        by Todd Knarr ( 15451 )

        What I don't understand is why ID is needed in the first place. It seems to be tied to the idea of the merchant making a charge against the purchaser's bank account, which means the merchant needs to identify the purchaser to make the charge. But why does the merchant need to make the charge? Instead, have the merchant provide a merchant ID and transaction number to the consumer, who then logs into their bank's site and initiates a payment to the merchant for the transaction. Nobody can initiate a payment w
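        The flow described above can be sketched in a few lines. This is a hypothetical toy model in Python (the class and method names are invented, not any real bank's API), just to show the key property: the merchant only ever hands out a reference, and the account holder initiates the payment.

```python
# Hypothetical sketch of the "push payment" flow described above.
# The merchant never sees the buyer's account details; it only
# publishes a (merchant_id, transaction_id, amount) triple that the
# buyer quotes to their own bank.

class Bank:
    def __init__(self):
        self.accounts = {}           # account_id -> balance
        self.merchant_accounts = {}  # merchant_id -> account_id

    def pay(self, payer_account, merchant_id, transaction_id, amount):
        """Payment is initiated by the account holder, never the merchant."""
        if self.accounts[payer_account] < amount:
            raise ValueError("insufficient funds")
        self.accounts[payer_account] -= amount
        self.accounts[self.merchant_accounts[merchant_id]] += amount
        return (merchant_id, transaction_id, amount)  # receipt

bank = Bank()
bank.accounts = {"alice": 100, "shop-acct": 0}
bank.merchant_accounts = {"SHOP-42": "shop-acct"}

# Merchant hands the buyer a reference; the buyer pushes the payment.
receipt = bank.pay("alice", "SHOP-42", "txn-0001", 30)
print(bank.accounts["alice"], bank.accounts["shop-acct"])  # 70 30
```

        Under this model, a merchant-side breach yields nothing a thief could use to pull money, because merchants never hold pull authority in the first place.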

  • by lena_10326 ( 1100441 ) on Monday August 25, 2008 @04:13PM (#24741649) Homepage

    Stop giving out credit to every person who walks up to a cash register. Stop warehousing critical information that can be used to apply for credit. Stop approving credit based on only Name/SSN/Address. Stop this culture of unlimited, unchecked credit to anyone, any time, any place.

    The problem is the lending system, not the fact your data is leaked. In web terms, credit applications need to be double opt-in, not single opt-in.

    • Re: (Score:3, Funny)

      by db32 ( 862117 )
      Clearly you are confused. If we take away the ability for people to spend themselves into oblivion with easy credit, the terrorists win! I want the prices of everything on the market artificially inflated by people's spending habits of imaginary money. I am simply not satisfied until I have to pay $50 for a $5 item because the supply and demand curve is completely screwed due to the massive influx of imaginary money into consumers' hands!

      You must be some kind of dirty pinko commie bedwetter if you wan
    • by lena_10326 ( 1100441 ) on Monday August 25, 2008 @04:30PM (#24741933) Homepage
      100% on-topic. Data breach => identity theft => credit and lending fraud. Fix it at the tail end by making the data useless to fraudsters. Think it through next time, mod. Just think it through.
  • Criminal Charges? (Score:5, Insightful)

    by db32 ( 862117 ) on Monday August 25, 2008 @04:13PM (#24741653) Journal
    Sure... while we're at it, let's put a cop in jail every time someone in their city gets mugged, murdered, raped, etc.

    I will be exiting the field the moment some kind of stupidity like what is suggested goes into place. I have a family, and I have no intention of spending time in jail as a scapegoat for something like this. It is stupid to expect an individual to be held criminally accountable for something like this. Why should I spend time in jail or face fines personally because Vendor X couldn't be bothered to employ better programmers or test their stuff? Never mind that there will ALWAYS be vulnerabilities. Or maybe I go to jail because some worker brought in an infected USB photo frame. The only way you can really secure a desktop computer completely from the user is to cut the power cable and give them a pad of paper and a pen.

    That said... I think there should be something to "encourage" companies to actually invest the resources in protecting that data, or just to stop collecting it. Seems to me not collecting it is far easier and more viable in many, many cases. I agree that there is a problem in the value that data provides the company versus their lack of "encouragement" to protect it. The notion of holding already overtaxed administrators criminally liable will only make the problem worse. The field will shrink even further, and I imagine many of the competent ones will find work elsewhere, not wanting to be a whipping boy under idiotic laws like this.
    • Re: (Score:3, Informative)

      by blindd0t ( 855876 )

      That said...I think there should be something to "encourage" companies to actually invest the resources in protecting that data, or just to stop collecting it.

      Chargebacks (card holders disputing charges with their credit card company) are a good incentive. Ultimately, it is the vendor that loses money when a user claims a charge is unrecognized and the vendor is unable to provide sufficient proof that it was a legitimate purchase (though the CVV2 number helps the vendors here). To add to that, even more in

    • Perhaps we should indeed hold law enforcement responsible when, for example, they leave a cell door unlocked and a criminal escapes and commits crimes.

      That really is the better analogy.

      I wonder how many of the security breaches really come down to bad IT, and how many can be traced to individual users. In my experience, the biggest danger is from people putting data where they should not, leaving their laptops lying around, leaving their passwords on pieces of paper, etc.

  • Not IT, but business (Score:5, Informative)

    by Ohrion ( 814105 ) on Monday August 25, 2008 @04:15PM (#24741677) Journal
    I disagree with the prospect of placing blame directly on IT/IS. I do believe, however, that much of the blame needs to be placed at the company level. Many times the risks are known ahead of time by both IT and the business, but the business has decided not to spend the money to fix the problem and has signed off on the risk. Sometimes there is nothing further the IT department can do without the express permission of the business. In fact, this is fairly frequent.

    I also disagree with this blame being in the form of a crime, unless it is negligence or gross negligence. Fines maybe, but jail time no. The exception to this is if the theft is an inside job. Of course, there are already laws to deal with that.
  • Possibly too far (Score:2, Interesting)

    by avatar4d ( 192234 )

    I am not sure that criminal charges are necessarily needed. Who would get the jail time? I mean does the SA have to prove that he recommended better security to the PHB? Does management automatically go directly to jail?

    I might be happy enough with the company being responsible for any identity theft of the people listed in their data. Maybe only for the next 5 or 10 years, but if their credit starts getting messed up, then the company which lost the data should be responsible to take the blame and also par

  • Worrisome... (Score:2, Insightful)

    by tekiegreg ( 674773 ) *

    Forgive me for not RTFA in advance but...

    I'm a developer; I've worked on many an app that has stored credit cards, social security numbers, and other pieces of juicy data. I've always acted with integrity, and you'll never find a credit card or social security number posted on the Internet of my own free will. Generally I make best efforts to secure this information, using appropriate technology such as hashing, encryption, access controls, and authentication as appropriate for the information, etc. Docum
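    The hashing technique this comment mentions can be illustrated with a small stdlib-only Python sketch (the function names and the sample SSN are invented for illustration): when an identifier only ever needs to be *matched*, store a salted, stretched hash of it rather than the value itself, so a leaked table doesn't directly expose the numbers.

```python
import hashlib
import hmac
import secrets

def hash_identifier(value: str, salt: bytes) -> str:
    """Store this digest instead of the raw identifier."""
    return hashlib.pbkdf2_hmac("sha256", value.encode(), salt, 100_000).hex()

def matches(candidate: str, salt: bytes, stored_digest: str) -> bool:
    # Constant-time comparison to avoid timing side channels.
    return hmac.compare_digest(hash_identifier(candidate, salt), stored_digest)

salt = secrets.token_bytes(16)  # per-record salt, stored alongside the hash
digest = hash_identifier("078-05-1120", salt)

print(matches("078-05-1120", salt, digest))  # True
print(matches("123-45-6789", salt, digest))  # False
```

    One caveat worth stating: SSNs have little entropy, so even a salted hash can be brute-forced offline by a determined attacker; this reduces exposure rather than eliminating it. Data that must be recovered verbatim (a card number for refunds, say) needs encryption with separate key management instead.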

  • by ScentCone ( 795499 ) on Monday August 25, 2008 @04:20PM (#24741769)
    Leaked data, by itself, isn't a crime in this regard. No harm comes to anyone until someone with criminal intent actually does something to it. Not counting, of course, the harm of feeling appropriately uneasy as you wonder if/when someone will do something with it following a leak - but I'm not sure that sort of anxiety rises to the level of crime on the part of the hotel chain... you could have the same anxiety about whether or not someone holding your data will at some point have a leak that hasn't even happened yet, and likely never will.

    There's a reason that someone who sues McDonalds over the hot coffee she dumps in her own lap doesn't ask a DA to go after them criminally. Likewise with slipping on a wet restroom floor that doesn't have one of those "caution" signs put up by the maintenance crew. Being bad (or even, unlucky) at your job could well be grounds for a civil suit, but it isn't usually - and shouldn't usually - be considered an actual crime. That's pretty dangerous stuff, there.

    When some wackadoo in full-on tinfoil hat mode brings a gun or a knife to work and kills the PHB he's hated for years, and is now convinced is working for Alien Overlords... is the employer who didn't see that coming an accessory to the crime that was committed, for having failed to prevent it?

    If data is leaked, and no crime (based on the use of that data) is ever committed, and the laptop gets recovered with no expectation of it having been compromised... did a crime take place, not counting the one committed by the person who ripped off the laptop from an employee's luggage? Is the employer actually a criminal because that happened? The opportunities for Really Bad Precedents here are vast.
  • by Todd Knarr ( 15451 ) on Monday August 25, 2008 @04:23PM (#24741819) Homepage

    I'm of the opinion that liability should depend in part on whether the data is being kept longer than needed for the transaction or purpose it was provided for. For instance, if I buy something from an on-line merchant, they need to keep my name and address on file at least long enough to ship my item, and almost certainly for the length of time I'm allowed to return the item for a refund or replacement. They need to keep my credit-card number on file long enough to authorize it, possibly long enough to settle the charges (depending on how they're set up with their clearing house), and possibly as long as I'm allowed to ask for a refund (if, for instance, the clearing house requires the card number to credit the money back). When a company keeps information around longer than needed, it should be held to a higher standard, since now it's the company's choice that the data is being kept. And "needed" should be determined by the purpose or transaction the data was provided for, not by what the company wants to do. When I provide a billing/shipping address for a purchase, I'm not providing it so the company can do better advertising later. If they insist that I create a profile and leave that information on file permanently for their convenience or benefit, they should be taking more responsibility for its security than if they're keeping it just long enough to do what I asked of them and then discarding it.
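    The retention principle in the comment above, keeping data no longer than its collection purpose requires, is mechanical enough to enforce in code. A hypothetical Python sketch (the purposes and retention windows are invented for illustration):

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

# Hypothetical retention windows, keyed by the purpose the data was
# collected for -- not by what the company would like to do with it.
RETENTION = {
    "shipping_address": timedelta(days=90),  # until the return window closes
    "card_number": timedelta(days=7),        # until charges settle
    "marketing_profile": timedelta(days=0),  # opt-in only; default is "don't keep"
}

@dataclass
class Record:
    purpose: str
    collected_at: datetime
    payload: str

def purge(records, now):
    """Drop every record older than its purpose allows."""
    return [r for r in records
            if now - r.collected_at <= RETENTION.get(r.purpose, timedelta(0))]

now = datetime(2008, 8, 25)
records = [
    Record("shipping_address", now - timedelta(days=30), "123 Main St"),
    Record("card_number", now - timedelta(days=30), "4111..."),
]
kept = purge(records, now)
print([r.purpose for r in kept])  # ['shipping_address']
```

    The design choice matters: an unknown purpose defaults to a zero-day window, so data the company can't justify keeping is the first thing purged.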

  • Best Western claims that it was a single hotel [], and that they purge older data when it's not needed.

    Of course, since it's been so widely reported, people are unlikely to believe anything other than the worst-case scenario; how many blogs are going to post an "oh, never mind, I was wrong" article? (And the newspapers would bury it somewhere on page 24.)

  • You can't just say the IT department is at fault in all cases. It would have to be looked at on a case by case basis and it certainly wouldn't just be IT. The company as a whole can determine how well an IT department runs.

    If a company flat out does something stupid then of course there should be some sort of compensation or punishment for the company.
  • No taxation without representation.

    And its converse: No profit without responsibility.

    The latter also covers cases like Monsanto, which wants to profit when the wind blows its GM seeds into other fields (sue the farmer for using the seeds without paying), but denies responsibility when those same seeds cause problems (contaminating the crops of organic farmers). If you want to be the beneficiary of a product or mechanism, then you must also be liable for any negative consequences of that product or mechanism.

    • The whole point of exporting Intellectual Property through trade agreements and so on is to own the brains of the poorer countries - recolonise them without having to actually maintain force of arms there.

      I'm sure Rudyard Kipling would have called it "the corporate man's burden." [] It's for their own good, I'm sure.

  • by RobertB-DC ( 622190 ) * on Monday August 25, 2008 @04:31PM (#24741953) Homepage Journal

    In Texas (and in other states, it seems), it is against the law to leave your keys in the ignition []. I haven't yet figured out exactly what the purpose is for that law, except to remind people that leaving your keys in the car invites theft. I certainly haven't heard of anyone being prosecuted for the "crime".

    Perhaps a similar nominal criminal sanction should be in place for the company that leaves the keys to my identity in their corporate "ignition"? The penalty would be a slap on the wrist, or less -- because a stiff penalty would lead to coverups. But the law would still be on the books.

    That would allow the bean counters to add an item on the balance sheet for "secure client data -- compliance required by law". That would carry more weight than "secure client data -- compliance with 'best practices' guidelines".

    • Wonder how that works if my car is started with a toggle switch because the real ignition switch went bad... Is it illegal to leave my toggle switch on the harness?

    • by Ungrounded Lightning ( 62228 ) on Monday August 25, 2008 @05:44PM (#24743053) Journal

      In Texas (and in other states, it seems), it is against the law to leave your keys in the ignition. I haven't yet figured out exactly what the purpose is for that law, ...

      It reduces car theft, thus reducing the load on law enforcement and insurance rates. It also makes it harder to steal getaway cars and increases the likelihood of catching the perps before they do something like rob a bank, reducing that victimization.

      Or at least that's the sort of theory I'd expect to be behind the rule.

      (At least one rural western state has had a requirement that any gun carried in a car must be loaded - so it can be used by the driver to defend against its own theft. They'd had a lot of trouble with walkaways from prison jacking good samaritans who rescued them in the desert.)

  • What the UK needs [] is for the government to get the bill for breaches [] ;-)

    Seriously, the Information Commissioner has actually served enforcement notices on the most incompetent departments [] and the Conservative opposition has called for prosecutions [].

  • The vast majority of computer security "incidents" we hear about, and most of the ones we don't hear about, would never have taken place if this was the stance adopted 10 or 15 years ago. Not IT liability... corporate liability. Ultimately it's the corporate level where goals and policies are set and approved, and budget decisions reign supreme.

    If the first large-scale data security breach that happened to a retailer or a bank had been made into an example, we wouldn't be seeing what we see today.

  • have some sort of confidentiality agreement. If they do not live up to that agreement then they should be held liable. If they promise to keep my data confidential then it is their responsibility to implement the necessary security to actually keep that data confidential. I especially think hotels, car rental agencies, airlines or anyone else that requires that I transmit a cc number in some form or another, need to be audited and approved for security on a regular basis.

  • And the concept of IT security negligence is little different from bank physical security or workplace safety negligence.

    If a bank is robbed, of course you go after the robbers. But if the robbers cleaned out your safety deposit box, and it is shown that the bank was failing to use best practices with respect to security, you have an action against the bank as well.

    If you suffer a workplace injury, and it can be shown that the company was not following safety regulations and requirements, then you can go a

  • Erm... we already do (Score:5, Informative)

    by jimicus ( 737525 ) on Monday August 25, 2008 @04:40PM (#24742079)

    In the UK (and, I believe, Europe), anyway.

    The Data Protection Act briefly states:

    • Data may only be used for the specific purposes for which it was collected.
    • Data must not be disclosed to other parties without the consent of the individual whom it is about, unless there is legislation or other overriding legitimate reason to share the information (for example, the prevention or detection of crime). It is an offence for Other Parties to obtain this personal data without authorisation.
    • Individuals have a right of access to the information held about them, subject to certain exceptions (for example, information held for the prevention or detection of crime).
    • Personal information may be kept for no longer than is necessary.
    • Personal information may not be transmitted outside the EEA unless the individual whom it is about has consented or adequate protection is in place, for example by the use of a prescribed form of contract to govern the transmission of the data.
    • Subject to some exceptions for organisations that only do very simple processing, and for domestic use, all entities that process personal information must register with the Information Commissioner.
    • Entities holding personal information are required to have adequate security measures in place. Those include technical measures (such as firewalls) and organisational measures (such as staff training).

    It's not clear in which country the Best Western incident took place, but if the systems were hosted in the UK and they processed bookings from UK customers, it looks like a fairly cut-and-dried breach of that law to me.

    There is, however, the minor issue that I don't think anyone's ever been successfully prosecuted for having inadequate security systems in place...

  • I don't think criminal prosecution is the way to go. It's bad, but typically I'm not a fan of making incompetence in private matters criminal.

    What I do believe should happen is twofold:

    1) Any breach should come with mandatory disclosure and civil liability. Basically, we should be able to get a class action suit going for the time and effort necessary to change all of our card numbers, etc. in the event of a breach, plus costs for checking credit reports, etc. I'm sorry, but my credit card company change

  • What's a crime is that companies which issue credit cards, auto loans, mortgages, etc. will accept your name, SSN, and mother's maiden name as proof of identity.

    These items just aren't secrets anymore, so there's no reason for banks (etc.) to go on thinking that only the "real" John Smith would know them.

    Banks that lend out money in my name should be forced to absorb the resultant losses themselves. Equifax and TransUnion should be targets for libel lawsuits when they ding your credit rating because of ID theft.

  • The blame should be shifted to the companies who lose the data. Hopefully doing that will get them to question their procedures of collecting the data in the first place. What really needs to happen is a serious reform in the way credit is issued. It's one thing to have a data breach. The real problem comes in when that data is then used to open accounts. The financial institutions need to do a better job of identifying the people who are asking for credit. If a company wants to give me $10,000 worth
  • Better than fining companies for security breaches, why not require a certain amount of security based on the type of data the business is collecting. Allow for periodic and random inspections and issue fines if the company isn't up to the required level. If theft occurs, a more detailed inspection is conducted until the cause of the theft is identified and fines can be issued if the theft should have been avoided by following the required security measures.

    This is essentially what would happen if you allo

  • As all decisions end up being their responsibility in the long run.

    Crap may run downhill, but legal responsibility runs uphill.

    At least in a world set in reality, that's how it should be...
    Of course, they'd claim "we didn't know" and try to weasel their way out of it....

  • Data Protection (Score:4, Interesting)

    by Antony T Curtis ( 89990 ) on Monday August 25, 2008 @05:07PM (#24742501) Homepage Journal

    The USA needs something like the Data Protection Act [] which the UK has... It gives individuals rights to access and correct data held about them and it mandates that organizations must take adequate steps to protect and secure the data. Failure to do so is a criminal offense.

    IANAL... If any of Best Western's compromised data details reservations at any of Best Western's hotels in the United Kingdom, they may have opened themselves up for prosecution under this law. All organizations and businesses in the UK which may store details on more than around 500 individuals must register and adhere to the DPA. I am sure that Best Western has had more than 500 customers in their UK operations!

  • by fuzzyfuzzyfungus ( 1223518 ) on Monday August 25, 2008 @05:33PM (#24742907) Journal
    Attempting to legally define responsibility for "reasonable" security is tricky. You don't want a situation where corporate can, say, consistently shirk on security implementation, then hang out to dry the poor bastard who had to make the best of a bad job when the time comes (not that that would ever happen, no, definitely not, never). On the other hand, having a checklist of "OMG Industry Best Practice!!!1!" ass-covering steps is pretty much writing the script for security theatre.

    I suspect that going after the type, quantity, and duration of data storage is a much more productive avenue. For any given commercial relationship, certain data storage will be necessary, for a certain amount of time. Not much we can do about that. Anything beyond that level, though, should be open to stiff liability in the event of a breach. You want the advantage of storing extra data? You take the risks, like it or shove off. The trouble (particularly bad in the US, though hardly good elsewhere) is that there is essentially nothing, other than the low and falling cost of storage, counterbalancing the desire to hoard as much customer (no, I'm not going to say "consumer") data as possible. Make anybody who stores more than the necessary minimum of data liable for damage caused by breach or inaccuracy, and the problem should be considerably reduced.

    Even if the above seems, shall we say, unrealistic, there are some basic steps we should have taken ages ago. FFS, companies that have data stolen aren't even obligated to warn people in some jurisdictions! (See the ChoicePoint debacle a while back: they warned California customers, because the evil commie nanny state had the crazy idea that people ought to be warned when somebody fucks up and gives their data to criminals, but everybody else just had to puzzle it out.) That is absolutely insane.
  • by erroneus ( 253617 ) on Monday August 25, 2008 @05:38PM (#24742973) Homepage

    It's the responsibility of the people who created this system that people cannot reasonably opt out of.

    With "drug laws" as they are, there are limits to the amount of cash anyone can carry without it potentially being seized by cops. You can't pay for everything in gold, can you? With the majority of banks out there simply refusing to do business with you for not having a Social Security number, it is essentially impossible to exist in society without allowing your identity to be entered into various systems and databases. The credit and banking system has created this potential for abuse of our identities, and it is the credit and banking system that should be held accountable for the abuse of a system that we are all but involuntarily required to participate in.

    Furthermore, since so many businesses feel it is in their interest to collect our information and put it at risk, they should also retain responsibility for its abuse when it leaves their control. Once again, as a condition of doing business and ultimately of leading a "normal" mainstream life, we are essentially powerless to opt out and are otherwise defenseless, unable to protect ourselves from what may happen when mismanagement and abuse of our trust occur.

    What a great system they have, where they reap all the benefits and we bear all the risk. I think it's more appropriate that they bear the risk along with the benefit. If they want the benefit of collecting private information, they should bear the consequences when the information is abused as a result of their own abuse or negligence.

  • by kbahey ( 102895 ) on Monday August 25, 2008 @05:42PM (#24743007) Homepage

    Part of the issue is storing identifying information, the other issue is storing credit card info. There should be no excuse for storing credit card info.

    I was at Home Depot (Canada), returning something I bought earlier, and I reached for my wallet to give the guy the credit card to refund the item. He said, "Oh, we don't need that, Sir, it is all stored in our system." I said, "You store credit card data on your computer?" He said, "Oh, we don't have access to it."

    The point is not the employees having access to it, but the data getting copied or stolen by criminals, as in the Best Western case.

    Some credit card gateways provide a token based approach to recurring charges, such as monthly subscriptions, but it is not a standard that can be used everywhere with any card, and any merchant.
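    The token-based approach mentioned here can be sketched as a toy model (this is not any real gateway's API; the names and flow are simplified for illustration). The gateway keeps the only copy of the card number and hands the merchant an opaque token, so a breach of the merchant's database exposes nothing directly chargeable elsewhere.

```python
import secrets

# Toy model of gateway-side tokenization. The merchant's database
# only ever contains the opaque token, never the card number.

class Gateway:
    def __init__(self):
        self._vault = {}  # token -> card number, held only at the gateway

    def tokenize(self, card_number: str) -> str:
        token = "tok_" + secrets.token_hex(8)
        self._vault[token] = card_number
        return token

    def charge(self, token: str, amount: int) -> bool:
        # The gateway resolves the token internally; the card number
        # never travels back to the merchant.
        return token in self._vault and amount > 0

gateway = Gateway()

# Merchant stores only the token for recurring billing.
merchant_db = {"customer_42": gateway.tokenize("4111111111111111")}

print(gateway.charge(merchant_db["customer_42"], 999))  # True
print("4111111111111111" in merchant_db.values())       # False: no PAN stored
```

    Schemes along these lines typically also bind a token to a single merchant and gateway, which is part of why, as the comment notes, they aren't yet usable everywhere with any card and any merchant.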

  • by CopaceticOpus ( 965603 ) on Monday August 25, 2008 @06:17PM (#24743521)

    Wouldn't this lead to all companies needing to purchase a data loss insurance policy, much like doctors need malpractice insurance? The end result would be richer lawyers and insurance companies, more wasted time in court, and companies not needing to change because they have insurance.

    I do think these companies need to be held responsible, but I think that they are already afraid of the PR hit from losing data, and their IT managers should already be afraid for their jobs if a data breach occurs. I really doubt that this sort of law is going to help.

  • by cdrguru ( 88047 ) on Monday August 25, 2008 @06:34PM (#24743759) Homepage

    Well, according to the FBI, this includes all forms of credit card fraud. This is mostly why "identity theft" is getting so much attention and seems to be growing by leaps and bounds.

    I have been subjected to credit card fraud many times, as have many people I have known. I have yet to meet anyone who has experienced any loss, even the supposed $50 that you might be liable for. Zero loss: get a new card and move on. Sometimes a minor hassle.

    The sort of "identity theft" that most people associate with the term is where someone obtains credit under false pretenses. I don't know what the actual incidence of this is and because of the FBI combining it with credit card fraud, we will probably never know the true impact of this. What I want to know is how often this is really happening and has anyone, ever, been a victim of something beyond credit card fraud because of one of these disclosures.

    I don't see any point to trying to make a bigger deal out of it if there have in fact been zero occurrences where this information has been used to someone's detriment.

  • by suck_burners_rice ( 1258684 ) on Tuesday August 26, 2008 @03:32AM (#24748381)
    That makes NO sense! I know that theoretically it's the company's responsibility to secure the data, but if some 1337z h4x04z figure out some crazy way into the system, then why should the company's top people face criminal charges? If you don't want to risk your information getting stolen, then don't give it to anyone. The company is also a victim in this case. Charging the victim is like this: You have bars on your windows and locks on your door. One night, a burglar busts in somehow and jacks your PS3. You get charged with a crime. Does that make sense? No. And neither does this.
  • by rew ( 6140 ) <> on Tuesday August 26, 2008 @08:22AM (#24749865) Homepage

    If you try to jail the CEO, he will say it's the CTO's job to secure the systems. He in turn blames the head-of-IT-ops, who in turn blames the lonely sysop. So who's going to jail? All of them? The top? The bottom?

    If YOU do something bad, YOU have to pay the price. We've got several gradations here: pay a fine, go to jail, both in different amounts.

    If a company does something bad, what can we do to make it pay? Well, exactly that: Make it pay.

    Now, if YOU know that the fine for XYZ is $1, and it's easier for you to do XYZ than something else, then you'll easily do XYZ. Besides the fact that the chances of getting caught are usually small, the fine is one you can easily pay. If instead you had to pay a $10,000 fine, most of us would think twice and be really careful.

    In the case of a big company, $10,000 is nothing. So fines imposed on companies should be proportional to their size. Faking profits or losses is easy, so the fine should be proportional to turnover.

    Here in Europe, Microsoft got fined EUR 1 billion for ignoring antitrust laws. That is an amount that even a company like Microsoft feels.
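    The deterrence arithmetic behind this point is easy to make concrete. A toy Python sketch (the 2% rate and the turnover figures are invented, not taken from any statute):

```python
def flat_fine(amount: float) -> float:
    """A fixed-dollar penalty, identical for every company."""
    return amount

def turnover_fine(turnover: float, rate: float = 0.02) -> float:
    """Fine as a fixed share of turnover (the 2% rate is invented)."""
    return turnover * rate

small_co = 2_000_000     # $2M turnover
big_co = 50_000_000_000  # $50B turnover

# A flat $10,000 fine is 0.5% of the small company's turnover but
# 0.00002% of the big one's -- effectively noise. A turnover-based
# fine stings both proportionally.
for t in (small_co, big_co):
    print(flat_fine(10_000) / t * 100, turnover_fine(t))
```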

    In several situations, legally someone is responsible, but after they have "paid" in whatever way, they might then be able to hold someone else responsible. For example, if I buy a stereo here in The Netherlands, I've got warranty service from the shop. They can claim "factory warranty: 1 year" all they want, but the law gives me the right to ask the shop to fix problems with the product during a "reasonable time", no matter what they claim. (I.e., "warranty: 1 week" will not work either!)

    So, if a company pays a fine and finds that this was evidently the fault of a certain employee, it can sue that employee afterwards.

    The problem of scale then kicks in. If the company pays a $1M fine, but this is evidently the fault of precisely one employee (say he was told not to do X, but did so anyway, finding clever ways to escape the company's regular compliance checks), then how can that single employee pay the $1M in "damages" to the company?

Any sufficiently advanced technology is indistinguishable from a rigged demo.