Security IT

Should Companies Share Criminal Blame In ID Theft? 328

snydeq writes "Deep End's Paul Venezia criticizes the lack of criminal charges for corporate negligence in data breaches in the wake of last week's Best Western breach, which exposed the personal data of 8 million customers. 'The responsibilities attached to retaining sensitive personal identity information should include criminal charges against the company responsible for a leak, in addition to the party that receives the information,' Venezia writes. 'Until the penalties for giving away sensitive information in this manner include heavy fines and possibly even jail time for those responsible for securing that information, we'll see this problem occur again and again.' As data security lawyer Thomas J. Smedinghoff writes, data security law is already shifting the blame for data breaches onto IT, thanks to an emerging framework of complex regulations that could result in grave legal consequences should your organization suffer a breach. To date, however, IT's duty to provide security and its duty to disclose data breaches does not include criminal prosecution. Yet, with much of the data security framework being shaped by 'IT negligence' court cases over 'reasonable' security, that could very well be put to the test some day in court." It's a slippery slope to be sure, but where should the buck stop?
This discussion has been archived. No new comments can be posted.

Should Companies Share Criminal Blame In ID Theft?

  • Yes/No (Score:5, Insightful)

    by HappySqurriel ( 1010623 ) on Monday August 25, 2008 @04:08PM (#24741551)

    I think it is entirely appropriate to investigate a company when large amounts of personal information end up being 'stolen' ... If it turns out that the company did not take the necessary steps to protect people's personal information, they should face some consequences. At the same time, there has to be an understanding that even the best technologies available and best practices may not prevent all personal information theft, so a company should not face harsh consequences if they took the necessary steps to protect people's information.

  • by religious freak ( 1005821 ) on Monday August 25, 2008 @04:09PM (#24741561)
    If a COMPANY is being prosecuted criminally, you obviously cannot have jail time, because it's a non-person.

    However, you can (and IMO should) have much stiffer penalties than civil courts allow. When a data security breach is so bad as to harm society itself, it should be prosecuted criminally - this is the doctrine for criminal prosecution of companies. Criminal penalties can range from massive monetary damages, to shutting the entire company down, or forcing changes in management. This is the correct route to go.

    Obviously, if the implication is that the IT workers themselves should be thrown in jail, this is absurd and would cause all kinds of damage, both foreseeable and unintended.
  • by frith01 ( 1118539 ) on Monday August 25, 2008 @04:10PM (#24741591)

    You have a choice, allow organizations to report the data breach, or have them cover it up to avoid the penalty.

    [ Why would anyone report a data breach when that means they would face jail time ? ]

    Remember, the odds of an external entity finding out about the data breach are extremely small (except for the ones taking the data, of course).

  • Hard to say (Score:2, Insightful)

    by Anonymous Coward on Monday August 25, 2008 @04:11PM (#24741601)

    Almost any system can be hacked by someone sooner or later. If a crack was found in SSH that allowed a root shell, would the person responsible for the code be held responsible? Or the guy who admins the server?

  • Re:Yes/No (Score:5, Insightful)

    by penix1 ( 722987 ) on Monday August 25, 2008 @04:12PM (#24741611) Homepage

    I've got a better idea. Ban the collection of personal information beyond the time required for the transaction. I don't buy it that companies somehow need to store all this info on people, especially years after the transaction has occurred. If you are going to be light on them when they lose it, then be heavy on what they can keep.
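
A retention rule like the parent proposes is easy to mechanize. Here is a minimal Python sketch (stdlib only; the table, column names, and 90-day window are hypothetical) that purges transaction records older than a fixed window:

```python
import sqlite3
import time

RETENTION_SECONDS = 90 * 24 * 3600  # hypothetical 90-day retention window

def purge_expired(conn):
    """Delete transaction records older than the retention window."""
    cutoff = time.time() - RETENTION_SECONDS
    cur = conn.execute("DELETE FROM transactions WHERE created_at < ?", (cutoff,))
    conn.commit()
    return cur.rowcount  # number of records purged

# Demo with an in-memory database: one stale record, one fresh one.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE transactions (id INTEGER PRIMARY KEY, card_number TEXT, created_at REAL)")
now = time.time()
conn.execute("INSERT INTO transactions VALUES (1, 'XXXX-1111', ?)", (now - 200 * 24 * 3600,))
conn.execute("INSERT INTO transactions VALUES (2, 'XXXX-2222', ?)", (now,))
print(purge_expired(conn))  # prints 1: the 200-day-old record is gone
```

Run as a scheduled job, a rule like this means a breach can only ever expose the last few weeks of transactions rather than years of them.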

  • Re:Yea! (Score:5, Insightful)

    by corsec67 ( 627446 ) on Monday August 25, 2008 @04:12PM (#24741619) Homepage Journal

    Next step:
    Actually punishing companies that break laws, in such a way they can't just dissolve the front and start with a new name and the same people.

  • Yes (Score:5, Insightful)

    by sm62704 ( 957197 ) on Monday August 25, 2008 @04:12PM (#24741621) Journal

    Not only should there be criminal damages, but attempting to keep the theft secret should carry an even heavier penalty.

    There should also be, upon conviction in criminal court, monetary redress for the poor slobs whose data was compromised, and it should be a LOT more than it cost the compromised person. Say, enough to buy a new car.

    Why can't we have the death penalty for corporations? The standard answer is "all those people who get thrown out of work", but there IS a death penalty for corporations; ENRON suffered the death penalty, but the people in charge (at least the ones that didn't go to prison) suffered no penalty at all.

    How about a "death penalty" where the victims are given the company itself?

  • by lena_10326 ( 1100441 ) on Monday August 25, 2008 @04:13PM (#24741649) Homepage

    Stop giving out credit to every person who walks up to a cash register. Stop warehousing critical information that can be used to apply for credit. Stop approving credit based on only Name/SSN/Address. Stop this culture of unlimited, unchecked credit to anyone, any time, any place.

    The problem is the lending system, not the fact your data is leaked. In web terms, credit applications need to be double opt-in, not single opt-in.
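
The "double opt-in" idea can be sketched in a few lines of Python (stdlib only; the flow and names are hypothetical): a credit application is approved only after the applicant echoes back a token sent out-of-band, so a leaked Name/SSN/Address alone is not enough to open an account:

```python
import hmac
import hashlib
import secrets

SERVER_KEY = secrets.token_bytes(32)  # hypothetical server-side secret

def issue_token(application_id: str) -> str:
    """First opt-in: mail or SMS this token to the applicant's address on file."""
    return hmac.new(SERVER_KEY, application_id.encode(), hashlib.sha256).hexdigest()

def confirm(application_id: str, token: str) -> bool:
    """Second opt-in: approve only if the applicant echoes the token back."""
    expected = issue_token(application_id)
    return hmac.compare_digest(expected, token)

t = issue_token("app-42")
print(confirm("app-42", t))        # True: applicant confirmed out-of-band
print(confirm("app-42", "guess"))  # False: no second opt-in, no credit
```

The point of the sketch is the shape of the flow, not the crypto: an identity thief who only holds warehoused data never receives the out-of-band token.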

  • Criminal Charges? (Score:5, Insightful)

    by db32 ( 862117 ) on Monday August 25, 2008 @04:13PM (#24741653) Journal
    Sure...while we are at it, let's put a cop in jail every time someone in their city gets mugged, murdered, raped, etc.

    I will be exiting the field the moment some kind of stupidity like what is suggested goes into place. I have a family, and I have no intention of spending time in jail as a scapegoat for something like this. It is stupid to expect an individual to be held criminally accountable for something like this. Why should I spend time in jail or face fines personally because Vendor X couldn't be bothered to employ better programmers or test their stuff? Never mind that there will ALWAYS be vulnerabilities. Or maybe I go to jail because some worker brought in an infected USB photo frame. The only way you can really secure the desktop computer completely from the user is to cut the power cable and give them a pad of paper and a pen.

    That said...I think there should be something to "encourage" companies to actually invest the resources in protecting that data, or just to stop collecting it. Seems to me not collecting it is far easier and more viable in many many cases. I agree that there is a problem in the value that data provides the company and their lack of "encouragement" to protect it. The notion of holding already overtaxed administrators criminally liable will only make the problem worse. The field will shrink even further and I imagine many of the competent ones will find work elsewhere not wanting to be a whipping boy under idiotic laws like this.
  • by Anonymous Coward on Monday August 25, 2008 @04:15PM (#24741689)

    I'm a professional engineer (PE). My wife is a physician. If we screw up, ruining somebody's life, we get sued.

    IT is not more complicated than medicine, yet seems to fail at security all the time. Perhaps it's time for malpractice/negligence to whip companies into shape.

  • Code violations (Score:3, Insightful)

    by Brain-Fu ( 1274756 ) on Monday August 25, 2008 @04:17PM (#24741733) Homepage Journal

    Most forms of construction must adhere to a code. Why should software be any different?

    It would be nice, IMO, if we could formulate a set of minimum requirements for any kind of personal-data-handling software (including codes for operating procedures). Things like "all passwords in the system must use strong encryption" and "backups of the data cannot be stored on personal laptops" and the like.

    Then legally require businesses to hire some ratio of software developers who have passed a code certification and logged sufficient hours under the apprenticeship of a certified master, and cite them if any such developers blow the whistle on them.

    It is not a perfect solution. It has problems with implementation. And of course M$ will do its darndest to ensure that codes require the use of its software. But it is still better than the situation we have now.
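
As a concrete example of what one such code requirement could look like in practice, here is a sketch of salted, stretched password storage using only the Python standard library. (Note: a "strong encryption" requirement for passwords is conventionally satisfied with one-way hashing like this, not reversible encryption; the iteration count here is an illustrative choice, not a mandated one.)

```python
import hashlib
import hmac
import os

def hash_password(password: str, iterations: int = 200_000):
    """Store a random salt, the iteration count, and the stretched digest."""
    salt = os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, iterations)
    return salt, iterations, digest

def verify_password(password: str, salt: bytes, iterations: int, digest: bytes) -> bool:
    """Recompute the digest and compare in constant time."""
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, iterations)
    return hmac.compare_digest(candidate, digest)

salt, n, d = hash_password("hunter2")
print(verify_password("hunter2", salt, n, d))  # True
print(verify_password("wrong", salt, n, d))    # False
```

A code inspector could audit exactly this: is there a per-user salt, is the hash stretched, is the comparison constant-time, rather than arguing about vague "best efforts".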

  • Worrisome... (Score:2, Insightful)

    by tekiegreg ( 674773 ) * <tekieg1-slashdot@yahoo.com> on Monday August 25, 2008 @04:18PM (#24741735) Homepage Journal

    Forgive me for not RTFA in advance but...

    I'm a developer, and I've worked on many an app that has stored credit cards, social security numbers, and other pieces of juicy data. I've always acted with integrity, and you'll never find a credit card or social security number posted on the Internet of my own free will. Generally I make best efforts to secure this information: using appropriate technology such as hashing, encryption, access controls and authentication as appropriate for the information, etc., and documenting as thoroughly as possible to make sure that nothing happens, and what to do to further protect things.

    Despite all this, if my programming is ever compromised, am I now jail potential? I'm finding a new job...

  • by sm62704 ( 957197 ) on Monday August 25, 2008 @04:19PM (#24741749) Journal

    the cops don't do much when it comes to non-violent, non-domestic, non-street crimes.

    I know a man who was charged with home invasion and attempted murder for breaking into a man's home and trying to kill him with a butcher knife, and plea bargained down to two weeks in the county jail.

    A woman I know spent four months in Dwight Correctional Center for a non-violent drug offense (possession). It seems to me that being careless with thousands of people's lives, let alone attempted murder, should carry a far heavier burden than a crime with no victim.

  • by sm62704 ( 957197 ) on Monday August 25, 2008 @04:24PM (#24741841) Journal

    Freezing a company's assets and disallowing any business for two years would be the equivalent of putting a human in prison for two years. So you could, in fact, "jail" a corporation. You could shield its employees (at least the ones not responsible) by forcing the company to pay them anyway. If it goes bankrupt, well, people go bankrupt after incarceration; why shouldn't businesses?

    Or conversely, put its CEO and Board of Directors in a maximum security prison with the other criminals, many of whom caused far less damage to people, or none at all.

    The thing is, the corporations are deemed too valuable to be punished. THIS is what should change.

  • by lena_10326 ( 1100441 ) on Monday August 25, 2008 @04:30PM (#24741933) Homepage
    100% on-topic. Data breach => identity theft => credit and lending fraud. Fix it at the tail end by making the data useless to fraudsters. Think it through next time, mod. Just think it through.
  • Re:Yes/No (Score:5, Insightful)

    by Zironic ( 1112127 ) on Monday August 25, 2008 @04:42PM (#24742117)

    Tell me again what part of those features requires my personal data? Seriously, learn to use a serial number.

  • by TubeSteak ( 669689 ) on Monday August 25, 2008 @04:47PM (#24742171) Journal

    If a COMPANY is being prosecuted criminally, you obviously cannot have jail time, because it's a non-person.

    It is tiring that this line of reasoning keeps getting trotted out.
    WTF do you think executive officers are for?

    "The Company" doesn't do anything illegal; the corporate officers & various (vice) presidents are the ones in charge, and they have always borne the responsibility for the company's actions.

  • Re:Yes/No (Score:5, Insightful)

    by Foofoobar ( 318279 ) on Monday August 25, 2008 @04:52PM (#24742257)
    Nobody needs to store SSNs except the government that issues them. The fact that people made the MISTAKE of standardizing on SSNs as primary keys for users to begin with is their own fault. Mainly because SSNs are horrible primary keys since they REPEAT!!! Yes, look it up... they DO get reissued after death, and with long-term storage this will only cause issues for storage of personal data.

    Second, data loss is a quick route to a lawsuit as a result of storing SSNs; people and companies need to stop this bad practice, especially since laws banning it have been passed in several states. Good security can only do so much, as human error is inevitably your final point of failure. And do you want a couple million social security numbers relying on the security of a backup tape in the back of your junior sysadmin's Pinto overnight?
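
The fix the parent describes, a surrogate key so the SSN is never a join key, looks like this as a sketch (the schema is hypothetical, shown via Python's sqlite3):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
# Surrogate integer key: the rest of the schema references customer_id,
# never an SSN. If an SSN column existed at all, it would be a plain
# attribute that nothing joins on and that could be dropped or encrypted
# without touching any other table.
conn.execute("""
    CREATE TABLE customers (
        customer_id INTEGER PRIMARY KEY AUTOINCREMENT,
        name        TEXT NOT NULL
    )
""")
conn.execute("""
    CREATE TABLE orders (
        order_id    INTEGER PRIMARY KEY AUTOINCREMENT,
        customer_id INTEGER NOT NULL REFERENCES customers(customer_id),
        total_cents INTEGER NOT NULL
    )
""")
cid = conn.execute("INSERT INTO customers (name) VALUES ('Alice')").lastrowid
conn.execute("INSERT INTO orders (customer_id, total_cents) VALUES (?, ?)", (cid, 1999))
print(conn.execute("SELECT COUNT(*) FROM orders WHERE customer_id = ?", (cid,)).fetchone()[0])  # prints 1
```

Because the key is a meaningless integer, reissued or mistyped SSNs can't corrupt relationships, and a leaked orders table exposes no identity data.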
  • by nine-times ( 778537 ) <nine.times@gmail.com> on Monday August 25, 2008 @04:58PM (#24742337) Homepage

    There needs to be a fixed ID system which is separate from the credit system (as in credit score) and governmental ID systems.

    Part of the problem is just that everyone wants everything to be easy, and "easy" doesn't get along well with "secure". Take social security numbers--they're being treated as a piece of secure information in order to identify people (which they weren't intended to do). But then, as a result, you have to give yours out *all the time*. Because so many things require your social security number and people are encouraged to give it so freely, it's effectively out in the open, and not a piece of secure information.

    But then what ID can I give someone online for an incidental purchase that won't effectively be "out in the open" after a couple of purchases? The only thing I can think of is if there were some sort of public key encryption signature that was issued to each person. That would possibly be cool, but then you'd have to come up with a trustworthy system to issue those keys/certificates, and you have to trust someone to administer that system.

    It gets complicated fast. And ultimately, most people won't put up with anything that inconveniences them or requires them to be vigilant.

  • Re:Yes/No (Score:3, Insightful)

    by beadfulthings ( 975812 ) on Monday August 25, 2008 @05:09PM (#24742527) Journal

    Getting rid of the credit card data after X weeks seems like an excellent idea.

    It's not easy to get a room at a decent hotel without a credit card. Certainly in some places you can pay cash in advance--but then you can't use the phone, order a meal, or connect to their network. If they require a credit card, or make it too difficult to procure their services without one, then they should absolutely be held accountable for the safety of the information.

    Organizations of all sorts--retail, airlines, hotels, hospitals, insurance companies, banks, potential employers, not to mention government agencies--are ravenous for your personal information. They go to great lengths to get hold of as much of it as they can, whenever they can, using whatever methods they can. If they want it that badly, they should be responsible for its safety and security, and they should be held accountable when it's compromised.

    We received a letter from our bank a couple of years ago saying that my husband's debit card (never used online) had been compromised, that he should stop using it, and that a new one would be issued. It arrived in due course, but they would never reveal who had screwed up or what had happened. It had to have been a local entity, but it could've been a supermarket, a restaurant, a gas station--we will never know. We don't even have the recourse of not giving them more business and further opportunities to screw up.

  • Re:Yes/No (Score:4, Insightful)

    by CodeBuster ( 516420 ) on Monday August 25, 2008 @05:10PM (#24742565)
    The issue is one of negligence [wikipedia.org], not the relative efficacy of the available security technologies. If a company is found, upon discovery, to have exhibited a complete or reckless disregard for the potential consequences of a breach, then some liability is in order. The "reasonable man" test can be used by juries to decide whether or not the circumstances surrounding the breach amount to negligence and what the appropriate remedy should be. The negligence tort has already been well litigated in common law countries (like the US, UK, and Australia), so the only thing different here is the details (IT technical details), which might require expert witnesses to testify or offer their opinions; but the basic law of negligence is well settled (at least as far as I understand it, but IANAL, so please do not take this as formal legal advice) once the details or facts of a particular matter have been determined.
  • Re:Yes/No (Score:4, Insightful)

    by Qzukk ( 229616 ) on Monday August 25, 2008 @05:16PM (#24742661) Journal

    Data is key for a successful company

    I never hear about a company having the laptop containing its inventory records stolen. Is that a function of nobody but the company caring, or do companies take better care of their "keys" than their customers'?

  • Re:Yes (Score:4, Insightful)

    by oyenstikker ( 536040 ) <slashdot@sbyrne.org> on Monday August 25, 2008 @05:20PM (#24742727) Homepage Journal

    Won't fly. The shareholders will then claim to be victims as well.

  • Re:Yes/No (Score:2, Insightful)

    by VEGETA_GT ( 255721 ) on Monday August 25, 2008 @05:24PM (#24742793)

    OK, first off: if you are an IT person whose job includes dealing with a server holding user records, you could be held responsible for even a simple mistake. That I don't mind, but criminal charges for said mistake seem a little over the top. If this actually became law, watch how many IT people decide it's not worth the risk, or decide their salary needs a big jump because of the risk.

    This is similar to what my father has to deal with in Ontario, Canada. He is a maintenance manager at a production plant, so he has to make sure the machines are fixed and the plant is safe. Now here is the kicker: if someone gets hurt and he did not do EVERYTHING (I stress that word) he possibly could do to prevent the injury, he could face criminal charges. OK, now define everything. Say a guy gets hit by a forklift: did my father have caution tape along the entire stretch of the building the forklift was driving through, were there 10 people in front of the forklift making sure no one was there, were there 15 different noises coming from the thing, an announcement over the PA saying it's on the way? Um, OK, the answer is no. He has to be reasonable, but in a court of law people have been screwed over for less. It's to the point where anyone above, I believe, 5 feet off the ground needs to be tied off--which begs the great question, tied off to what? How do you tie off without getting up and tying off to something above you? It gets to the point of being just dumb. Right now, to go up on the roof he has to fill out a form; it takes him 30 minutes of paperwork to check one thing on the roof. I agree with safety, but that's to the point where even Darwin is shaking his head.

    So back to IT. Let's go to court: define EVERYTHING you could have done to prevent a hacker from getting data from a server. Well, I can unplug it, beat the living crap out of it, encase it in concrete, drop it in a lake, and then it's 99% safe. There is always a new hole. But the patch came out for it yesterday and someone took the data 5 minutes before it got patched--why was it not patched 5 minutes earlier? Why were you not running the newest 2.0.3.4.66.3.11 instead of 2.0.3.4.66.3.10? That's the sort of thing someone may try in court, and how do you defend yourself, since in court the common-sense approach doesn't always seem to work?

    Basically, how far can you go to say you did everything, and was everything enough? Well, it never is. Let's bring the guy who came up with safety devices for cars to court: people still died in your car, why is that, how come you did not provide a trained driver with the car, full-body airbags, and so on? Yes, slippery slope.

  • by erroneus ( 253617 ) on Monday August 25, 2008 @05:38PM (#24742973) Homepage

    It's the responsibility of the people who created this system that people cannot reasonably opt out of.

    With "drug laws" as they are, there are limits to the amount of cash anyone can carry without it potentially being seized by cops. You can't pay for everything in gold can you? With the majority of banks out there simply refusing to do business with you for not having a social security number, it is essentially impossible for people to exist in society without allowing your identity to be entered into various systems and databases. The credit and banking system has created this potential for abuse of our identities and it is the credit and banking system that should be held accountable for the abuse of the system that we are all but involuntarily required to be a participant in.

    Furthermore, since so many businesses feel it is in their interests to collect our information and put it at risk, they should also maintain responsibility for its abuse when it leaves their control. Once again, as a condition of doing business and ultimately of leading a "normal" mainstream life, we are essentially powerless to opt out and are otherwise defenseless, unable to protect ourselves from what may happen when mismanagement and abuse of our trust occurs.

    What a great system they have, where they reap all the benefits and we bear all the risk. I think it's more appropriate that they bear the risk along with the benefit. If they want the benefit of collecting private information, they should bear the consequences when the information is abused as a result of their own misuse or negligence.

  • Re:Yes/No (Score:3, Insightful)

    by baggins2001 ( 697667 ) on Monday August 25, 2008 @05:42PM (#24743011)
    This happens all the time. I've had VP's admit it to me and when I tell the CEO he doesn't really care.
    So therefore I don't care anymore.
    Security becomes a business cost that they didn't anticipate or aren't willing to accept.
    In fact, during the latest briefing, we were told that we were looking to go public on a foreign exchange where the regulations weren't as strict.
  • [Citation needed] (Score:3, Insightful)

    by jabithew ( 1340853 ) on Monday August 25, 2008 @05:45PM (#24743069)

    The thing is, the corporations are deemed too valuable to be punished. THIS is what should change.

    Seriously, what the hell? Consider the HSWA (1974) [wikipedia.org], the Environmental Protection Act (1990) [wikipedia.org] and the Data Protection Act (1998) [wikipedia.org], all of which carry the possibility of fines and a jail term if breached.

  • by wshwe ( 687657 ) on Monday August 25, 2008 @05:59PM (#24743251)

    Companies will only stop allowing mass identity theft if there are definite consequences for their failures.

  • The real question (Score:1, Insightful)

    by Anonymous Coward on Monday August 25, 2008 @06:00PM (#24743273)

    The real question is, who in the company do you punish? Responsibility could lie with management, IT personnel, the end user, or any combination thereof, and once the cat is out of the bag, they will all point the finger at each other (and management will happily fire IT personnel or the end user to deflect blame). How do you determine who to drop the hammer on?

  • This would be a great civil class action case, but criminal? ...

    When I was doing standards work, I was introduced to the notion that only "must" and "shall" (i.e., imperative words) mean something you have to do. Words like "should" are really synonymous with "don't really have to at all" in standards lingo. They just mean you have to answer for something in words when someone calls you on it, but ultimately that no one can force you.

    So too, the real difference between civil and criminal is that civil means you can buy your way out of doing the wrong thing, and criminal means you really have to do the right thing. So people can choose.

    Asking whether civil or criminal law applies isn't the thing to do. The thing to do is to ask whether this is really something that has to be done, or whether it's OK to just let people do the wrong thing and then occasionally pay a fine. If you don't mind having your identity stolen, and you think maybe courts will operate efficiently in your favor to reimburse you with extra dollars to spare for your trouble whenever it happens, you definitely want the civil penalty approach. Or if you have a magic way to have the problem not happen to you, and you just don't care that it happens to someone else in the unfortunate set that you have excluded yourself from. But otherwise, I see no option other than to say criminal.

    That doesn't mean I think criminal law should be retroactively applied. It just means I think business people take the criminal law very seriously, and that if this is on that level of magnitude, then that's the approach. But I'd decide first just the question of whether this is a "should" or a "must". The rest will just follow from that. Present attitudes in business tell you businesses think it's a "should" (meaning "don't really have to at all"). The question is, does the public agree? For the public to establish "civil penalties only" is, I suspect, the same as saying the public agrees it's a "should"--a mere cost to be managed, often after the fact.

  • by LtTickles ( 930947 ) on Monday August 25, 2008 @06:13PM (#24743463)
    Seriously, I have worked in IT security for some time and done numerous "compliance" projects. Compliance takes time and costs money. Too many times I have been told "we just don't have the money for that this year." Corporations commonly engage in the 'risk' game, where they risk it for as long as they can. Until the bank stops taking their credit cards (in the case of PCI compliance) or there is an actual public breach, the risk is quite low. I'm not against criminal charges, but they should be levied on a corporate officer and not the rank-and-file IT person. That person has zero responsibility for the financial decisions required to keep data safe. I make recommendations until I am blue in the face, but until management realizes the risk to them, they won't touch it with someone else's ten-foot pole.
  • by CopaceticOpus ( 965603 ) on Monday August 25, 2008 @06:17PM (#24743521)

    Wouldn't this lead to all companies needing to purchase a data loss insurance policy, much like doctors need malpractice insurance? The end result would be richer lawyers and insurance companies, more wasted time in court, and companies not needing to change because they have insurance.

    I do think these companies need to be held responsible, but I think that they are already afraid of the PR hit from losing data, and their IT managers should already be afraid for their jobs if a data breach occurs. I really doubt that this sort of law is going to help.

  • Re:Yes/No (Score:1, Insightful)

    by Anonymous Coward on Monday August 25, 2008 @07:30PM (#24744393)

    Personally I think the government and law enforcement should take identity theft a lot more seriously, with major penalties against these fraudulent jerks.

    They have less to worry about. I'm sure they keep a separate, and better protected (read: more expensive to get hold of), database on government and law enforcement entities. And amidst all this wiretapping, mail-reading, internet-snooping, etc., it's a safe bet that this 'stolen' information is turning up in their hands or the highest bidder's.

    Why it would be on a flippin LAPTOP I have no idea

    You can't very well hand over giant servers rigged with a secure encrypted database, can you? You have to copy it all down on a laptop for convenience.

    Like the GP wrote, if you want to stop them from 'losing' your personal information then they can't be storing it. Either that or it needs some serious open regulation by the people who are supposed to be watching them, not the other way around. /tinfoil

  • Re:Yea! (Score:4, Insightful)

    by greenbird ( 859670 ) * on Monday August 25, 2008 @07:46PM (#24744601)

    The first step is to financially ruin, and to hand real "pound me in the ass" prison terms to, the executive staff that cut the IT department's security budget.

    The only problem is that the executive staff won't be the ones going to jail. I guarantee it won't be any executives. It'll be the poor overworked IT guy who does 6 different jobs and is on call 24/7/364 (he gets Christmas off) who ends up with all the blame. And then the executive staff will give themselves a raise for doing such a good job getting to the bottom of the security breach and taking such decisive action in making sure it'll never happen again.

  • by Anonymous Coward on Monday August 25, 2008 @08:39PM (#24745203)

    Oh for fuck's sake. If you're going to blame anyone, how about blaming the people responsible?

    Some jackass shows up at a bank, gives my name and social security number, gets a loan, and then the bank harasses me for their money. Sounds like the bank is the one to blame. They're the dumbasses who didn't adequately determine who they were dealing with, and they're the ones who sought to ruin me financially by trying to collect money I didn't owe them.

    The problem isn't that companies are leaking my social security number; the problem is that I can't tell everyone my social security number, because a lot of dumbass companies treat it as a PIN and will make my life hell if anyone else happens to know it.

  • Re:Yea! (Score:3, Insightful)

    by HiThere ( 15173 ) <charleshixsn@earthlink.net> on Monday August 25, 2008 @09:18PM (#24745637)

    Major investors should be punished, yes. Minor stock-holders...no more than losing their investment. Directors, yes. Corporate executives, yes.

    It should be handled analogously to fiscal malfeasance. ... Or rather as fiscal malfeasance should be handled.

  • Re:Yea! (Score:1, Insightful)

    by Anonymous Coward on Monday August 25, 2008 @09:20PM (#24745659)

    Which is why the executives need to be PERSONALLY responsible for any security breaches. I don't care if Dave in IT support is to blame; the CEO gets his ass in the grinder for it.

    By doing that, things get done. For an example, see Sarbanes-Oxley. They get off their asses and make sure it's done, because with SOX the executives are personally responsible for it.

    P.S. They deserve it: they get the outrageous pay for being the ones in charge, so they should get the risk as well.

  • Re:Yes/No (Score:3, Insightful)

    by jd ( 1658 ) <imipak@yahoo.com> on Monday August 25, 2008 @09:45PM (#24745905) Homepage Journal

    Yes, and often far more data is held than is necessary. Also, if you subscribe to the notions of grid computing and cloud computing, why store the data at all? All you need to do is tell an authorized holder of the data what operation you wish to perform, and get the results, entirely black-box. You need never see the data at all.
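
The black-box idea above could look something like this as a sketch (a toy in-process stand-in for a remote service; the operation names and record fields are made up): the holder keeps the raw records and answers only specific, pre-approved queries, so the caller gets results without ever seeing the data.

```python
class DataHolder:
    """Hypothetical authorized holder: keeps raw records private and only
    answers whitelisted operations, never exporting the data itself."""

    def __init__(self, records):
        self._records = records  # private; never returned to callers

    def run(self, operation, **params):
        if operation == "count_over":
            # How many records exceed a threshold -- an aggregate, not the data.
            return sum(1 for r in self._records if r["balance"] > params["threshold"])
        if operation == "mean_balance":
            return sum(r["balance"] for r in self._records) / len(self._records)
        raise ValueError("operation not authorized")

holder = DataHolder([{"balance": 120}, {"balance": 80}, {"balance": 300}])
# The caller gets results, entirely black-box; it never sees the records.
print(holder.run("count_over", threshold=100))  # prints 2
print(holder.run("mean_balance"))               # prints 166.66...
```

In a real deployment the `run` call would cross a network boundary with authentication and audit logging, but the liability point stands either way: the querying party holds no copy of the data, so it has nothing to lose.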

    In terms of liability, I would argue that the rule should be a generic one: if you assume control of data, you assume responsibility for that data - its accuracy, its security and its legitimacy. The distinction should come in the degree of reasonableness. It is reasonable for a non-mathematical corporation to trust RSA and Elliptic Curve public-key encryption, AES and the SHA-256, Tiger and Whirlpool cryptographic hashes. It is not reasonable for any corporation to trust unencrypted and unsigned sources - they wouldn't trust unsigned paperwork and physical signatures are easier to forge. Organizations which can be reasonably assumed to be aware of security bulletins, the assorted cryptographic lounges and other such sources should be held to the higher standard of being expected to discontinue additional use of vulnerable methods with a migration of legacy data in circulation within a sensible period.

    It is never reasonable to hold data a corporation cannot use in future, cannot be sure is authentic or accurate, and/or cannot be sure is serving any legitimate purpose on the system. Since there is no excuse to hold such data, there is less of an excuse to lose it. You can't lose what you don't have, so any loss of such data - regardless of method - can never be passed off as unavoidable. It was easily avoidable. Don't keep such data. Likewise, if an individual within that corporation is provided access to information they didn't actually need, and that data is subsequently lost as a result, that should be an automatic crime even if every precaution was taken, simply because it was an unnecessary gamble and therefore not entitled to any protection or justification.

    Data that is accurate, legitimate, and in active use should be considered highly sensitive, and companies that do not treat the data with the respect and maturity they are capable of, and which the data deserves, should find themselves not so much in hot water as in boiling oil. As I said earlier, this depends on what the company can be regarded as being aware of. All companies can be deemed aware of published security patches, common security software (Tripwire, RSA, and PGP are hardly obscure!), and software equivalents of practices already in place for physical documents. Government (including military and veterans affairs) and computationally advanced organizations, as I said, should be aware of relatively mainstream peer-reviewed discoveries, not just pre-packaged solutions, and should also be aware of vulnerability scanners (Nessus, nCircle, SARA/TARA, and so on) and advanced access controls, where the size and type of organization is going to dictate what sort of preventative measures are cost-effective.

    Where a company falls below what can reasonably be expected of it, and loses data, that's boiling oil time. Where a company meets or exceeds a rational, sane level of protection and still loses data it needed to have, it should still be responsible for contributing towards cleaning up the mess (same as you would in a truly no-fault car accident) but shouldn't be punished for what was beyond its abilities to deal with. (That "needed to have" qualifier really is important.)

    If a company deliberately places data in a dangerously exposed context (e.g. pushing personal data onto insecure systems overseas to avoid any national laws on data security), then it deserves not only the boiling-oil treatment but a loss of the right to operate. Dodging the law or evading responsibility is not a helpful way to tackle data insecurity, even if it looks like a cheap way to solve the problem for the company.

    To those who argue that this is a slippery slope, I'd say that reasonable conduct can never be a slope, nor can it be slippery. If anything, it is a great leveler and a superb provider of grip and balance.

  • Re:Yes/No (Score:3, Insightful)

    by hedwards ( 940851 ) on Monday August 25, 2008 @10:48PM (#24746509)

    Since a state-issued ID is considered valid identification by the federal government, and the federal government uses SSNs to identify people, it seems fine to me that they'd use that information.

    If one is going to be using it to board a plane, as identification for a passport, or to register to vote in federal elections, it seems fair to me for the federal government to expect that state-issued IDs are going to be recorded against SSNs.

  • Not on IT (Score:1, Insightful)

    by Anonymous Coward on Tuesday August 26, 2008 @02:03AM (#24747955)

    There should be a stiff fine for the company - an incentive to think hard about security. Blaming IT is not good; it's actually wrong. Is IT responsible for decision making in IT? RARELY! Management usually insists on making the decisions, and very often makes really, really poor ones. They then make IT suffer for their bad decisions: "We picked it, you fix it!" Management at TJ Maxx picked the security method. A management lackey was in charge of security. He didn't know any better and wasn't qualified, but was the boss anyway. Blaming an IT staffer for his bungling and incompetence doesn't prevent problems (stupidity will continue, because the stupid will go unpunished). The company must be fined. Heavily. It will at least make the incompetent think twice before failing to listen to advice.

Two can Live as Cheaply as One for Half as Long. -- Howard Kandel
