Ask Slashdot: Share Your Security Review Tales

New submitter TreZ writes: If you write software, you are most likely subject to a "security review" at some point. A large portion of this is common sense: don't put plaintext credentials into GitHub, don't write your own encryption algorithms, etc. Once you get past that, there is a "subjective" nature to these reviews.

What is the worst "you can't do" or "you must do" that you've been subjected to in a security review? A fictitious example would be: you must authenticate all clients with a client certificate, plus basic auth, plus MFA token. Tell your story here, omitting incriminating details.

  • Fooled ya! (Score:5, Funny)

    by 140Mandak262Jamuna ( 970587 ) on Wednesday October 04, 2017 @02:43PM (#55310151) Journal

    If you write software, you are most likely subject to a "security review" at some point

    Wrong! My code has never been subjected to any such stupid security review.

    Disclaimer: Opinions expressed here are mine, not those of my employer, Equifax.

    Disclaimer to disclaimer: Nah! I'm not really working for Equifax.

  • >> What is the worst "you can't do" or "you must do" that you've been subjected to in a security review?

    Anything the client didn't pay for. (Threats to suspend a very large support payment count as payment, however.) Likewise, whenever a customer wanted to pay extra for an MFA, SSO or other integration, we were all ears.

    Or were you looking for whining?
  • Working for a semi-well-known mesh networking company in Seattle, I was hired for DevOps because, despite 20+ years of experience with C/C++, the gaseous CTO didn't believe I was qualified to do development. About a month into the job, I got called into a code review for one of the senior developers, and I quickly caught several buffer overruns on the "rookie mistake" level with strncpy overflowing the allocated space.

    Gotta wonder how many of those Mr. Senior Developer committed to the code base.

    • by Anonymous Coward

      Never trust management. They oversimplify and are totally defenseless against hype.

      My five cents.

    • Re:Buffer Overruns (Score:4, Insightful)

      by pr0fessor ( 1940368 ) on Wednesday October 04, 2017 @03:20PM (#55310421)

      You were hired for a lower-paying position because they felt you weren't qualified enough for the position you applied for, and then had you doing it anyway for less pay because your title was still something else.

      This is standard practice.

    • Man, I had the exact same experience nearly. I was a n00b C coder in an embedded shop. I thought "These guys are veteran coders who can put me up on some real design patterns." Turned out that was mostly right, I met some badass C++ and TCL coders (the TCL guy was hyper-smart, he wrote a huge part of the ATC code still out there in a lot of airports). However, the place had two bosses (Bob) and one of them was a self-proclaimed "20-year veteran". He really had been coding in C for most of that time, but MY
    • ...with strncpy overflowing the allocated space.

      I'm impressed by that developer's skill, but not in a good way. That's exactly the kind of brain fart error strncpy is designed to prevent. I presume, then, that this developer never bothered to make sure that the destination string was large enough for the job, and I hope that his next performance review reflected his carelessness.
      • ...with strncpy overflowing the allocated space.

        I'm impressed by that developer's skill, but not in a good way. That's exactly the kind of brain fart error strncpy is designed to prevent. I presume, then, that this developer never bothered to make sure that the destination string was large enough for the job, and I hope that his next performance review reflected his carelessness.

        strncpy is defective by design. The carelessness comes from allowing code with it to ever be checked in.

        When I want to find bugs, strncpy is one of the first things I look at exploiting. Very few people understand that the null terminator is omitted when the source doesn't fit.

      • Unlike other strn- functions, strncpy is not just a bounds-checked strcpy. It writes exactly the specified number of characters to the destination, padding with '\0' if the source is shorter; but if there is no null terminator in the source within that count, none goes into the destination either. Code that later treats the result as an ordinary C string then runs off the end of the buffer, leading to overflow vulnerabilities.
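
        A minimal C sketch of the pitfall, using made-up buffer sizes rather than anything from the code bases mentioned above:

        #include <stdio.h>
        #include <string.h>

        int main(void) {
            char dst[8];
            const char *src = "0123456789";   /* longer than dst */

            /* strncpy copies exactly sizeof dst bytes and then stops:
             * no '\0' is ever written into dst. */
            strncpy(dst, src, sizeof dst);
            /* printf("%s\n", dst); would now read past the end of the
             * buffer -- undefined behavior that ASan/valgrind will flag. */

            /* Common fix: leave room and terminate explicitly... */
            strncpy(dst, src, sizeof dst - 1);
            dst[sizeof dst - 1] = '\0';
            printf("%s\n", dst);              /* "0123456" */

            /* ...or use snprintf, which always null-terminates. */
            char dst2[8];
            snprintf(dst2, sizeof dst2, "%s", src);
            printf("%s\n", dst2);             /* "0123456" */
            return 0;
        }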

  • by ahziem ( 661857 ) on Wednesday October 04, 2017 @03:00PM (#55310247) Homepage
    I was a high-ranking official in the state department. The FBI sent me a subpoena for my private email server because I used it to discuss classified government business, so I had my IT guy wipe my private email server before I handed it over to the FBI. Later he was discovered on Reddit and confessed to the FBI, but I made sure they couldn't trace the decision back to me.
    • by Major_Disorder ( 5019363 ) on Wednesday October 04, 2017 @03:48PM (#55310647)

      I was a high-ranking official in the state department. The FBI sent me a subpoena for my private email server because I used it to discuss classified government business, so I had my IT guy wipe my private email server before I handed it over to the FBI. Later he was discovered on Reddit and confessed to the FBI, but I made sure they couldn't trace the decision back to me.

      I bet you would have gotten away with it too, if it wasn't for those meddling kids.

    • I was a high-ranking official in the state department. The FBI sent me a subpoena for my private email server because I used it to discuss classified government business, so I had my IT guy wipe my private email server before I handed it over to the FBI. Later he was discovered on Reddit and confessed to the FBI, but I made sure they couldn't trace the decision back to me.

      Wipe? Like, with a cloth?

  • by gweihir ( 88907 ) on Wednesday October 04, 2017 @03:03PM (#55310279)

    This happened to a customer of ours: they were told by an auditor that they absolutely must have anti-virus on all machines, as per policy. Hence they built a tunnel into a completely isolated environment with absolutely no malware vectors, just to get updated AV signatures to the AV they had installed on those machines. The really bad thing was that they did not seem to understand when we explained to them that they no longer had an isolated environment, and that the AV vendor, as well as anybody successfully attacking the AV vendor, could now attack them and exfiltrate data at their leisure. What they should have done was get an exception.

    • I'm not sure there is such a thing as a completely isolated environment anymore. There are too many air-gap bridging attacks. (See also Stuxnet).

      Now, those attacks require far more work than the anti-virus vector. And it's not likely to be used. But it should be expected that something valuable enough (to a nation state) will be breached.

      • by davidwr ( 791652 )

        I'm not sure there is such a thing as a completely isolated environment anymore. There are too many air-gap bridging attacks. (See also Stuxnet).

        In practical terms, an isolated environment is one where the only way anything gets into the system is by a human being manually entering it, and the only way anything gets out is by what a human being carries away with him, either in his head or in his pocket/briefcase/other.

        I would count a system that has a keyboard or mouse for input, a video screen, printer, and maybe a "write only" media-writing tool (see below) that is in a room where electronic- or even look-at-the-screen-through-the-window eavesdropping

      • by gweihir ( 88907 )

        This one was. Sorry, I cannot get into details.

    • Should have built your own open source anti-virus as a 'side' project. It could scan for a few signatures or something. It doesn't even have to work, that's not a requirement for anti-virus: all you need is a website that looks really snazzy.

      No one buys anti-virus because it works, they buy it because of marketing.
      • You jest, but that's what we did.

        Very similar setup: a completely isolated network with no automated way to bring data into it. Data was entered manually only and extracted on CD-ROMs. Similar problem: getting any kind of data line in would have required unacceptable security breaches. And similar requirements, i.e. all machines needed antivirus software.

        So we wrote one. What that software basically did was to routinely check all hashes of all files on the machine (with the exception of the data fil
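
        For what it's worth, the core of that kind of baseline checker fits on a page of C. This is only an illustration, not the actual tool -- the baseline path is hypothetical and it leans on OpenSSL for SHA-256 (build with -lcrypto):

        #include <stdio.h>
        #include <string.h>
        #include <openssl/evp.h>

        /* Compute the SHA-256 of a file as a 64-char hex string. */
        static int sha256_file(const char *path, char hex[65]) {
            FILE *f = fopen(path, "rb");
            if (!f) return -1;
            EVP_MD_CTX *ctx = EVP_MD_CTX_new();
            EVP_DigestInit_ex(ctx, EVP_sha256(), NULL);
            unsigned char buf[4096];
            size_t n;
            while ((n = fread(buf, 1, sizeof buf, f)) > 0)
                EVP_DigestUpdate(ctx, buf, n);
            fclose(f);
            unsigned char md[EVP_MAX_MD_SIZE];
            unsigned int len = 0;
            EVP_DigestFinal_ex(ctx, md, &len);
            EVP_MD_CTX_free(ctx);
            for (unsigned int i = 0; i < len; i++)
                sprintf(hex + 2 * i, "%02x", md[i]);
            return 0;
        }

        int main(void) {
            /* Baseline format: "<hex-digest> <path>" per line (hypothetical location). */
            FILE *baseline = fopen("/var/lib/baseline/hashes.txt", "r");
            if (!baseline) { perror("baseline"); return 1; }
            char expected[65], path[4096], actual[65];
            while (fscanf(baseline, "%64s %4095[^\n]", expected, path) == 2) {
                if (sha256_file(path, actual) != 0)
                    printf("MISSING  %s\n", path);
                else if (strcmp(expected, actual) != 0)
                    printf("CHANGED  %s\n", path);
            }
            fclose(baseline);
            return 0;
        }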

        • Oh that makes me so happy. Not only did you work around the problems created by bad bureaucracy, but you ended up making something really great. If there were a Diogenes looking for a man who cared about security, he could rest when he found you.
  • by Anonymous Coward

    At my office, no one has physical access to their machines. They are all locked in shielded cabinets. We get a keyboard, mouse, and monitor. No access to USB, Network, or any other ports.

    No network connection by any means to the Internet, and no cell phones are allowed in the building, period.

    Place is pretty tight.

    • That's actually really cool.

    • you mean, no way for users to break the machine???

      Must be a sysadmin's wet dream!

    • by davidwr ( 791652 )

      We get a keyboard, mouse, and monitor.

      BWUHAHAHAHA says the disgruntled soon-to-be-ex-employee who happens to have a photographic memory.

    • by PCM2 ( 4486 )

      Are you using Zero Clients (i.e. the Teradici PCoIP protocol, probably via VMware Horizon)? Because that stuff actually is pretty cool.

    • At my office, no one has physical access to their machines. They are all locked in shielded cabinets. We get a keyboard, mouse, and monitor. No access to USB, Network, or any other ports.

      No network connection by any means to the Internet, and no cell phones are allowed in the building, period.

      Place is pretty tight.

      Don’t worry, next year, you’ll finally make it out of kindergarten.

    • Yeah, but do the cables run through pressurized conduit with pressure monitors on each one that signals a security alert if the pressure drops? If not, then you may as well be housing your servers on a rock in Central Park. </sarcasm>

  • by Anonymous Coward

    I worked for a startup as a contractor, and they were readying a security product to hit the market. It used public and private keys. Instead of generating and using a 4096-bit key (for securing files), it used sixty-four 64-bit RSA keys. The reason for this is that the CEO wanted the ability to decrypt stuff in case a customer locked themselves out.
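
    To put 64-bit RSA in perspective: a modulus that small falls to Pollard's rho in milliseconds on any laptop. A rough, self-contained sketch -- the modulus below is a made-up product of two well-known primes, not anything from that product:

    #include <stdio.h>
    #include <stdint.h>

    static uint64_t mulmod(uint64_t a, uint64_t b, uint64_t m) {
        return (uint64_t)((__uint128_t)a * b % m);
    }

    static uint64_t gcd(uint64_t a, uint64_t b) {
        while (b) { uint64_t t = a % b; a = b; b = t; }
        return a;
    }

    /* Pollard's rho with f(x) = x^2 + c (mod n); returns a factor (possibly n itself). */
    static uint64_t pollard_rho(uint64_t n, uint64_t c) {
        uint64_t x = 2, y = 2, d = 1;
        while (d == 1) {
            x = (mulmod(x, x, n) + c) % n;       /* tortoise: one step */
            y = (mulmod(y, y, n) + c) % n;       /* hare: two steps */
            y = (mulmod(y, y, n) + c) % n;
            d = gcd(x > y ? x - y : y - x, n);
        }
        return d;
    }

    int main(void) {
        /* (2^32 - 5) * (2^31 - 1): both prime, product is a ~63-bit "RSA modulus". */
        uint64_t n = 4294967291ULL * 2147483647ULL;
        uint64_t p;
        for (uint64_t c = 1; ; c++) {            /* retry if rho degenerates */
            p = pollard_rho(n, c);
            if (p != 1 && p != n) break;
        }
        printf("n = %llu = %llu * %llu\n", (unsigned long long)n,
               (unsigned long long)p, (unsigned long long)(n / p));
        return 0;
    }

    Against a proper 2048- or 4096-bit modulus this approach is hopeless, which is the whole point of a sane key size; if the CEO really needs recovery, that's what key escrow is for, not tiny keys.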

  • Worst:

    I used to work for a company, about a year ago, where no one had even the most basic concept of data security. During my time there I implemented MFA on all Servers, programmed in Data Encryption, Data Validation, Client Verification, DB Security and other such improvements. Well, I was showing the three other existing employees how the software worked and how the new infrastructure worked, they didn't like that it was now "hard" to log into the servers and that they would now have to use password a
    • Re:Worst and Best (Score:4, Interesting)

      by Murdoch5 ( 1563847 ) on Wednesday October 04, 2017 @03:35PM (#55310537) Homepage
      Worst:

      I used to work for a company, about a year ago, where no one had even the most basic concept of data security. During my time there I implemented MFA on all Servers, programmed in Data Encryption, Data Validation, Client Verification, DB Security and other such improvements. Well, I was showing the three other existing employees how the software worked and how the new infrastructure worked, they didn't like that it was now "hard" to log into the servers and that they would now have to use passwords and keys to access Data. They told me to revert to how it was before, as they knew better than I did, so I quit. They reverted all my changes and claim it's now more secure and better!

      The software product they're developing, without a developer (they still don't have one), is an iSCSI-based Desktop Protection System, but it's so riddled with holes and has such a massive lack of security that they're committing fraud by selling what they have as a security solution.

      Best:

      The best security I've ever seen and been involved with developing had multilayer client authentication, certificate binding, and transaction-queue verification. It had a routine that went through the software and tweaked its ports and accesses. Every piece of data was run through an AES-192-GCM based function that signed all the transactions and messages. The infrastructure this software was running on was just as impressive: every server had at least 3FA+ turned on for logging in, plus active port-based monitoring, which used MongoDB clusters to validate logins, clients and pretty much everything you could imagine.
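
      For readers who haven't used it, here is roughly what an AES-192-GCM "authenticate every message" primitive looks like with OpenSSL's EVP API -- a minimal sketch with simplified key/IV handling, not the poster's actual code; note GCM's tag gives integrity/authenticity, not a public-key signature (build with -lcrypto):

      #include <stdio.h>
      #include <openssl/evp.h>
      #include <openssl/rand.h>

      int main(void) {
          unsigned char key[24], iv[12];           /* 192-bit key, 96-bit IV */
          RAND_bytes(key, sizeof key);
          RAND_bytes(iv, sizeof iv);               /* must be unique per message */

          const unsigned char msg[] = "transaction #42: debit $10.00";
          unsigned char ct[sizeof msg], pt[sizeof msg], tag[16];
          int len, ct_len, ok;

          /* Encrypt and produce the 16-byte authentication tag. */
          EVP_CIPHER_CTX *ctx = EVP_CIPHER_CTX_new();
          EVP_EncryptInit_ex(ctx, EVP_aes_192_gcm(), NULL, key, iv);
          EVP_EncryptUpdate(ctx, ct, &len, msg, sizeof msg);
          ct_len = len;
          EVP_EncryptFinal_ex(ctx, ct + len, &len);
          ct_len += len;
          EVP_CIPHER_CTX_ctrl(ctx, EVP_CTRL_GCM_GET_TAG, sizeof tag, tag);
          EVP_CIPHER_CTX_free(ctx);

          /* Decrypt; the final call fails if the ciphertext or tag was tampered with. */
          ctx = EVP_CIPHER_CTX_new();
          EVP_DecryptInit_ex(ctx, EVP_aes_192_gcm(), NULL, key, iv);
          EVP_DecryptUpdate(ctx, pt, &len, ct, ct_len);
          EVP_CIPHER_CTX_ctrl(ctx, EVP_CTRL_GCM_SET_TAG, sizeof tag, tag);
          ok = EVP_DecryptFinal_ex(ctx, pt + len, &len);
          EVP_CIPHER_CTX_free(ctx);

          printf("%s\n", ok > 0 ? "tag verified" : "tag mismatch -- message rejected");
          return 0;
      }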
  • by 140Mandak262Jamuna ( 970587 ) on Wednesday October 04, 2017 @03:27PM (#55310475) Journal
    I needed a long string to test some of my encryption/decryption code. Some local test string for debugging and testing. It was just after 9/11. Naturally I wrote a long rant against Osama Bin Laden and used that as the test string. Encrypted, decrypted, round-tripped, compared the strings, checked in the code. But I forgot to #ifdef out the testing code.

    Some nosyparker busybody customer did a "strings" on our product, found the string, and ratted me out to our CTO. Nothing serious happened, just a slap on the wrist. But another colleague told me the same customer found the full "man from Nantucket" in his test strings for the stringutil library he wrote. And another said that customer also found the "Fuck! Got null pointer again!" in his code.

    We think he was looking for some kind of debug switches and env settings that would disable the license check.

    • by rew ( 6140 )

      A long time ago, I needed to develop something on an FPGA from Xilinx. Told them we needed the software to work by then-and-then (weeks in advance) and weeks past said date we still didn't have anything. So after some angry phone calls they agreed to send us a licence for the software on a Unix server for the time until our PC dongle would arrive. They planned a day overlap between our dongle arriving and the end of the unix-workstation-licence. With a week additional delay in the dongle I suddenly had not

  • by Seven Spirals ( 4924941 ) on Wednesday October 04, 2017 @03:28PM (#55310481)
    I was summoned by a contract firm to a 500-person company that had been a victim of an inside job. They wanted a security review and fixes for "whatever that guy did". Turns out the guy was a half-assed developer. The client had spotty and in some cases non-existent backups. They wanted to pass a SOX audit (hahahaha!) while 20-30 machines were completely pwned and backdoored. He'd used everything from sub7 to more modern remote access & control tools. Some of the tools looked like ones he'd cobbled together himself from other tools. He'd also got in and falsified and buried a bunch of code hacks in their version control repo. Luckily, I was able to get that off tape and they only lost about a MONTH of code/work.

    The FBI got involved because the guy was out of state. I spent about 3 weeks gathering evidence and rebuilding servers, routers, print-servers, and other devices he'd hacked or otherwise tainted. My fees amounted to around $30k. A federal DA charged him with about 10 different hacking-related and felony vandalism charges. After a pretty short trial (no jury) he was found guilty and he's still in the same federal prison in Louisiana. He actually has a cell near Bernie Ebbers. I had to talk to him once while he was in prison to get some passwords. The whole thing was surreal.

    Now get this, on the SOX audit? They passed! They got dinged for the hack but they still passed even before I was done cleaning up. That's when I realized that CISSP/SOX/GLBA/PCI security and *actual* IT security aren't always aligned. Audit all you like, but ... "stay frosty and alert. You can't afford to let one of those bastards in here."
    • by sad_ ( 7868 )

      Now get this, on the SOX audit? They passed! They got dinged for the hack but they still passed even before I was done cleaning up. That's when I realized that CISSP/SOX/GLBA/PCI security and *actual* IT security aren't always aligned.

      an audit is nothing more than just checking off a few boxes. are you doing this, are you doing that. yes? then everything is fine. how you actually do it doesn't matter.

  • Years ago, I was making a website for a company that shall remain nameless. They wanted an online ordering system built. No problem. I can do that. Then, they told me they wanted the order information to not be saved into a database, but e-mailed to them. I pushed back as much as I could, but finally had to build it for them. (It's complicated as to why I couldn't just say "I refuse" and walk away. Trust me, had it been up to me, I would have.)

    A few years later, they came to me saying they had a complaint f

  • by darkain ( 749283 ) on Wednesday October 04, 2017 @04:41PM (#55311051) Homepage

    False positives from automated audit tools are my own personal hell. PCI compliance demands these audits be run every quarter. And every quarter, our Windows 2012r2 server, which is only used for a couple of people to work remotely, fails the audit. Which test does it fail? The audit claims it is vulnerable to a Windows NT4 terminal services exploit. The exploits have long been patched by Microsoft, plus the affected ciphers have also long been disabled. Yet every single goddamn quarter, we fail the audit, and it is usually a month-long battle with one-way messaging to the audit company to let them know they're still a bunch of morons. And guess what? The quarter just started this week!

    • False positives from automated audit tools are my own personal hell. PCI compliance demands these audits be run every quarter. And every quarter, our Windows 2012r2 server, which is only used for a couple of people to work remotely, fails the audit. Which test does it fail? The audit claims it is vulnerable to a Windows NT4 terminal services exploit. The exploits have long been patched by Microsoft, plus the affected ciphers have also long been disabled. Yet every single goddamn quarter, we fail the audit, and it is usually a month-long battle with one-way messaging to the audit company to let them know they're still a bunch of morons. And guess what? The quarter just started this week!

      I look at people who post things like this and I think to myself: is there only one company on the planet selling automated audit services?

      Which is worse? A company raking in $$$ for being extraordinarily lazy and getting away with failing to address even known obvious shortcomings... or paying "a bunch of morons"?

      • A tool is only as good as the people using it, and an automated tool is only as good as the validity of its configuration. The GP's problem is one of faulty configuration, and he's not the one who could (or should) change it.

    • Invite your CISO to a meeting. Tell him about this problem, explain to him that the exploitable ciphers are not used by your company (bring proof!) and that the automated test needs to be remodeled to fit your case.

      It's likely that your CISO doesn't even know about it, and he's the only one that can sensibly end this madness.

      • by darkain ( 749283 )

        Auditing is done by an outside company that is associated with the payment processor. Every single quarter I remind them that their testing suite is broken, with exact details as to why and how. They've yet to do anything about it. Not much else can be done at this point. If I had the choice to switch payment providers to one with sensible testing, I would, but sadly that's not my call.

        • You can't, but the CISO can, or at the very least he can put it on the table the next time the C-levels meet and discuss why the audit failed. This will also probably not result in you switching payment providers, but at the very least it will move the problem out of your hands.

  • So we had a third-party audit team come in to ensure we were in compliance with the appropriate security regulations.
    My app is essentially an internal scripting service to make it easier to connect various functions together. We don't generate data, we only take data from inputs or pull from encrypted databases (if it's sensitive data) and we only store working data for as long as the script runs. We're a web service so we use SSL for all communications and any temp storage is stored in an encrypted state t
  • The Binder of Doom (Score:5, Interesting)

    by rjh ( 40933 ) <rjh@sixdemonbag.org> on Wednesday October 04, 2017 @05:47PM (#55311493)

    In 1999 I was hired by a Midwestern telco -- in the interests of not getting sued I won't say which: I'll just say their market cap used to be in the billions and now you could buy them with the lint in your pocket -- to do security remediation on their billing system. I spent weeks poring over architectural diagrams, going through source code, examining protocols. After a while I realized I had some really scary information, so I asked my manager for a safe.

    "Just put it all in a binder," she said. "We trust you to keep an eye on it."

    The Binder of Doom was a nondescript black binder about three inches thick. It had no cover page and no markings: I didn't want anyone to realize the secrets that were in it. I carried it around with me everywhere. I slept with it in bed with me. That's how terrified I was these secrets would come out.

    Then the Binder of Doom got worse. Having completed my survey, I now devised attacks on the system. I found ways enterprising individuals could fleece the company out of truly mind-boggling sums, and how difficult it would be to detect these attacks with the then-current security infrastructure. By the end of six months the Binder of Doom was stuffed to bursting and I was giving serious thought to filing for a concealed-carry permit. I wondered if the sheriff's department would understand if I told them I was routinely carrying around a binder with a *conservative* worth to a criminal syndicate of $100 million.

    I went back to my manager. I told her I was done. It was time to remediate the risks. "Oh, excellent," she told me, "because we just ran out of money for the remediation."

    Uh. What?

    "Management has decided the main risk is in unsecured communications links, so just ensure we're using PGP on everything and we'll call it good."

    I asked if she wanted the Binder of Doom.

    "No, you hold onto it for a while."

    So I became increasingly disgruntled, bitter, and sarcastic. I told everyone I worked with that I'd been retasked to "secure" our network using PGP -- and even old-school PGP 2.6, not GnuPG (which had just reached 1.0), either -- and oh God this is awful and if this company lasts another year it'll be a miracle and...

    I was shortly thereafter cashiered for having a toxic attitude towards work. I walked into the parking lot, got into my car, and tossed the Binder of Doom into the passenger seat. As I drove away I realized something was horribly wrong, but didn't realize what until I was pulling out of the lot:

    I HAD THE BINDER OF DOOM IN MY PASSENGER SEAT.

    I returned to the office and tried to walk inside, but was met by an HR rep at the door who told me if I didn't leave they'd call the police and file a trespass charge. I held up the Binder of Doom to the HR rep. "Do you want this back?" I asked.

    "No," she told me clearly. "Keep it. We just want you to leave."

    I turned around, gobsmacked, and left the company holding detailed plans for how to embezzle $100 million or more... which the company had just thoughtfully delivered into the hands of a disgruntled former employee.

    (And if you're wondering what I did with the Binder of Doom, it sat on my bookshelf for a few days tempting me before I threw it into an incinerator and threw the ashes into a strong wind.)

    • I guess you escaped easy.

      If anyone stole any significant amount of money from that telco, you would be the prime suspect. Assuming they can even tell when significant sums of money are stolen...

    • It's good that there are still people that are willing to do the right thing, and not fall for the temptation to embezzle those hundreds of millions nudge nudge wink wink ;-)

  • Most folks, including many so-called "experts", lack both the knowledge and ability to do anything close to a "real" security check. So the best route is to rely on "canned" testing that has been created and is maintained by a reputable group.

    First, scan the platform (with the application installed) for known vulnerabilities, including updates, configuration (CVRs, STIGs), rootkits and antivirus.

    Second, scan the source code with all available static analysis tools. Start with lint, then do as many more as

  • by imidan ( 559239 ) on Wednesday October 04, 2017 @07:01PM (#55311955)

    I worked at a place where we had a lot of disk (~2TB) with data that were accessible to the public. We also had a web site in place where users could upload new data, which would then be vetted by staff and then published to the public. This was all okay.

    The bad part starts when we hired a new guy who, among other duties, wound up redoing the upload interface. So he redesigns and implements the system. I wasn't part of that process, and I wasn't paying any attention to how he was doing it. Later on, he quit, and his codebase was passed to me to maintain. That's when I started looking at the code and discovered that he had implemented a server-side API for uploading data that required no credentials whatsoever--he had set up a password authentication on the web front-end, but the API itself was open to the world. Oh, and the new API also stored uploaded data directly in the publicly accessible disk space. Any rando on the Internet who discovered this API would be able to upload hundreds of GB of whatever porn and warez they wanted, and just pass the URLs out freely. This code had been running in production for months.

    Luckily, apparently nobody noticed. I audited the file system and its contents were exactly the files we expected to be there, and with the correct hashes. But it all made me wish we had a better review process, if this was the kind of coder they were going to hire.

  • I work for a pretty big bank with thousands and thousands of servers. I doubt 10% would pass a PCI audit, but there are so many incompetent and non-technical people between the assessor and those who know what they are doing that it doesn't matter.

    I've heard the manager of sysadmins say "I've never been told any of our servers fall under PCI" -- this specifically in reference to systems that comb over CC use in search of fraud...

    Shared system accounts running processes with full sudo access using a password forced using

  • Standing security guard on an operation involving [can neither confirm nor deny] during an inspection. One of the inspectors was standing outside the security area next to some pipes that ran along the bulkhead. He put his hand on them and slowly started inching it along them towards the plane defined by the ropes that marked the security area. When his hand was just short of breaking the plane, I took my nightstick out and laid the tip on the pipe just touching his fingertips. He took his hand back, mad

  • Not even in a "secure" environment with an air-gap to the internet in a position that required a security clearance.

    Some years earlier, though, I had a job doing B2 security auditing at Data General. For those of you who don't remember Data General, they had their own line of high end workstations and their own variant of UNIX. Their thing was making secure versions of UNIX and they wanted a B2 cert for it. So I got to read a good chunk of the original AT&T C standard library, which they'd licensed. W

  • by holophrastic ( 221104 ) on Wednesday October 04, 2017 @11:46PM (#55312969)

    Through a client referral, I was introduced to a company that was in sudden need of a new web host. Their current Australian host was shutting down, and they had two weeks (by the time I was referred) to move their small Canadian site elsewhere.

    When I say "small Canadian site", I mean the site was a small, promotional site with little more than five pages and a signup form.

    Little did I know...

    This was ultimately the consumer brand of a large telecom provider -- a very large, national, telecom provider. This "small" site was a mass-market site allowing consumers to sign up, and also to pay their monthly long-distance bill. This was circa 2010.

    We shook hands, I said: "sure, I can move your site in the two weeks, just give me the credentials to it, and I'll figure it out."

    Wow, was that a mistake. Anyone heard of CakePHP? I had to figure it out pretty fast.

    It was late one evening, when I discovered the page that allowed customers to pay their bill online -- something no one had told me was a part of this tiny site. There was no https/ssl to even hint at it. And then I saw the MySQL insert statement, and the variable "card_number". And I was scared.

    I said, to myself, "no, it can't be!" There must be some part of the platform wrapping the database call that must mask-out the card number. Or this must not be the actual card number. Or maybe it's not used anymore. Or something.

    Then I logged into the phpMyAdmin, with the credentials given to me.

    So, this is when you need to understand something. I'm a small independent web developer. At the time, I was teeny tiny. I had no written contract. The e-mailed and in-person job discussions said nothing of sensitive information of any kind. No money would be transferred until the job was done. So at this point, there is effectively zero legal agreement between us.

    I looked at the table, I saw over forty-thousand records, each with real, live, credit card numbers... and expiry dates, and card holder names, and purchase amounts, and confirmation/approval codes.

    I was stunned.

    Obviously, being the non-criminal that I was at the time, I told them. I told them that I was appalled. I told them that it couldn't stay this way. I told them that I was going to charge them a few hundred dollars to encrypt the field, at the very least -- I was too young to know that I should have been charging way more.

    They said they didn't care, I should just leave it as-is.

    That was over a decade ago. Ever since then, I've learned that there are very few clients who will pay five cents towards security, backup, or encryption of any kind. In my entire 25-year career (so far), I've met only two clients who'll invest in that kind of safety.

    So I no longer bother even suggesting that security or backup is a good idea. My legal contracts ensure that I'm not legally liable for the consequences of doing anything that they've explicitly told me to do, and that's good enough for me, I guess.

    So to all those youngin's not yet jaded by failed efforts to be good: enjoy having the hero-skills to save people; but if your career is anything like mine, you'll quickly learn that those skills carry a value of exactly zero dollars.

    In the days of Equifax, riddle me this: where's the law that says you can't store millions of archived records all in one place, forever, online? Some of these 40,000 records hadn't been charged in over a year -- clearly old/former customers. And aside from those from the current day, all of them were old records that were no longer needed at all. Equifax had e-mails from ten years ago. How about a very simple law saying that things get taken offline eventually? Your ten-year-old e-mail can be accessible from that machine in the corner of the office, or through a request for the tape backup, and that's good enough 99% of the time.

    But hey, where's the law that says one model of gun is illegal?

    Thanks for the freedom.
