


Network Webcurity Wishlist?

breillysf asks: "I am a California-based network security attorney who has been asked by a senior US Senator to compile a list of the most important legal concerns facing network security administrators. He has a good feel for the government security issues (and lack there of), but he is concerned about what is going on in the front lines in the private sector. I thought the Slashdot crowd would have the best feel on the pulse of the current situation. Specifically, if you could ask Congress for help in the area of network and information security, what would you ask for? Or would you tell them to get out of the way?"

"For example, I tried to push for tax incentives for upgrades in network security measures, but the Senator replied that is dead in the water because we are now spending into a deficit. He would rather see insurance companies reward firms with lower premiums for enhanced security. But there are International legal issues, compliance issues, privacy complications, potential negligence liability exposure, lack of federal incident response, FOIA and anti-trust issues with info sharing, conflicting state and federal cybercrime and privacy laws, USA Patriot Act concerns, etc."



  • Don't ban tools! (Score:5, Insightful)

    by pete-classic ( 75983 ) <hutnick@gmail.com> on Wednesday December 05, 2001 @12:34PM (#2660005) Homepage Journal
    To borrow a phrase; if you outlaw nmap, only outlaws will have nmap.
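    The point is easy to demonstrate: the core of what nmap does -- probing whether a TCP port answers -- is a few lines of standard-library Python, so a ban would only inconvenience the law-abiding. A hypothetical sketch, not a replacement for nmap:

```python
# Hypothetical sketch: the heart of a port scanner in stdlib Python.
# Anyone can write this, which is why banning the tool accomplishes nothing.
import socket

def port_open(host: str, port: int, timeout: float = 0.5) -> bool:
    """Return True if a TCP connection to host:port succeeds."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
        s.settimeout(timeout)
        # connect_ex returns 0 on success instead of raising an exception.
        return s.connect_ex((host, port)) == 0
```

    Loop that over a port range and you have rewritten the contraband.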

    • Re:Don't ban tools! (Score:5, Interesting)

      by Bonker ( 243350 ) on Wednesday December 05, 2001 @12:48PM (#2660092)
      This is probably the most important thing any network professional can ask for.

      Outlaw evil behavior, not the tools that enable that behavior. In many cases the tools have many, many more positive and educational uses than negative uses. In a lot of cases, the tools can be used to stop or examine criminal (cracking) behavior.
      Say what you will about Steve Gibson [grc.com], but the
      guy knows a little about network security. He gives an extended discussion [grc.com] on how he used the tools of the IRC-based DDOS trade to help oust some script k1dd13's that were hammering his site.

      Without tools like L0phtCrack, the NT password cracker, I couldn't have convinced my execs that a company password policy was necessary and that passwords like 'password01' were unacceptable.
      Just like we don't ban sledgehammers and bolt-cutters even though they can be used to break padlocks, we shouldn't ban network tools either.
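      For what it's worth, the first-pass check such a password policy enforces is equally trivial to automate. A sketch, where the deny-list and minimum length are made-up illustrative values, not a recommendation:

```python
# Sketch of a password-policy check; the deny-list and minimum length
# are illustrative values only.
COMMON_PASSWORDS = {"password", "password01", "letmein", "123456", "qwerty"}

def acceptable(password: str, min_len: int = 8) -> bool:
    """Reject passwords that are too short or on the common-password list."""
    return len(password) >= min_len and password.lower() not in COMMON_PASSWORDS
```

      'password01' fails instantly -- which is exactly the argument L0phtCrack makes to management in seconds.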
    • As others have said, the government should keep its hands off and not take tools away. Let the script kiddies have their scripts and password cracking software. We can use it against them if we're allowed to have it too.

      The main problem with all government laws is the inability of government to adequately enforce them. Take the DMCA. That was put in place, wrongly if you ask me, to protect digital copyrighted works, like CDs. Yet it would be trivially easy for me to download an entire CD of mp3s, burn a CD, and then sell it for half-price to my friends. And I would most likely never be caught. I most definitely do not think that this happens the vast majority of the time in America, but in China, software and copyright 'pirating' is rampant. But we can't crack down on them 'cause our government doesn't have the willpower to protect its own citizens' interests. Instead, it enacts even more ridiculous rules to try and stop the criminals, when really all they're doing is hurting free enterprising individuals within their own borders.

      So PLEASE! for the love of everything decent and holy, DO NOT start banning everything that *could* be used as a computer security breaking tool. Just enforce the rules already on the books. Hire more computer security experts, not more thugs in black suits. An ounce of prevention is worth a thousand pounds of security in our digital world.

    • by Hizonner ( 38491 ) on Wednesday December 05, 2001 @03:00PM (#2660930)
      There's an attitude out there that says people should have to justify their access to information about security... not just network security. You hear a lot of bleating in the press about how "just anybody" can get access to information about how to do dangerous things, and how we (whoever "we" are) need to clamp down on that in various ways.

      The problem with that attitude is that, to get real security, you have to do things in a secure way everywhere. That means that everybody has to be thinking in terms of security... and not only that, but thinking in terms of things that will actually help, rather than just giving a false sense of security. That takes a certain mindset, and the only way to develop that mindset is to think about ways to break security, to see examples of how security is broken, and to see how existing security measures work, both so you can improve them and so you can avoid screwing them up.

      If you restrict access to information, you end up with only two sets of people who have a clue:

      1. A small group of overworked security specialists. These people can't do it all, and, if the rest of the world is poorly informed, they won't be listened to. In addition, in an environment where information is tightly restricted, it's very difficult to recruit and educate new security specialists.

      2. The bad guys. Being more motivated than the general population, the bad guys will get most or all of the "restricted" information through their own networks.

      Security is everybody's problem, and that means everybody has to understand it. When you release information widely, you educate 100 good guys for every bad guy. When you try to keep everything secret, you hold the good guys back more than the bad guys.

      I'm not saying that there's never a reason to keep anything secret, but there should be a presumption in favor of openness. You should try to keep something secret only when:

      1. It describes the details of an actual vulnerability that hasn't been fixed, and provides information useful in exploiting that vulnerability, AND

      2. Having information about the vulnerability would not, in itself, permit people to protect themselves, AND

      3. You're reasonably sure that large numbers of bad guys don't already know about it. In network security, large numbers of bad guys will definitely find out about it within a few months, if they haven't already found it independently. That means that keeping anything secret for a long time will never work.

      In government, the sorts of things we need to watch out for are:

      1. Excessive classification. It would be nice to see more legislative sunsets on classification, and more requirements for review of the decision to classify something. Patent secrecy orders are especially suspect.

      2. Programs where government information is shared only with "trusted private sector partners". Not only is this intrinsically bad, but it encourages cronyism and corruption, and can create economic problems by raising barriers to entry in security-related industries.

      3. Misguided weakening of "sunshine laws" like the FOIA. Because information is power even more in the Federal bureaucracy than in most places, there's an incentive for agencies to hoard it for political reasons. When all else fails, these laws often serve, not so much to free the underlying information, as to expose the illegitimate reasons it's being held secret.

      4. The occasional calls for outright banning the release of scientific or engineering information, in the style of the idiotic Feinstein "bomb making information" law.

  • by Anonymous Coward on Wednesday December 05, 2001 @12:34PM (#2660006)
    How about holding various companies whose products are exploited the most (re: MS) liable for their lack of security?
    • by eclectric ( 528520 ) <bounce@junk.abels.us> on Wednesday December 05, 2001 @12:45PM (#2660068)
      Being exploited the most doesn't mean they're the least secure.

      Exploits are still made against products that Microsoft secured over a year ago. And indeed, Microsoft gets exploited the most because its products are used by the vast majority of non-technical users. Can you imagine what would happen if 90% of computer-owning people used Linux? Every single hole in the OS would not only be exploited, but you could count on it being a LOT less likely that the average-joe user would *ever* update his software to fix the hole.
      • by Daniel Dvorkin ( 106857 ) on Wednesday December 05, 2001 @01:43PM (#2660478) Homepage Journal
        Apache has more than twice the marketshare of IIS, but gets hacked less than a tenth as much. Now, it may be true that it takes more technical knowledge to set up and run an Apache server than an IIS server that is enabled by default in the OS ... but it doesn't take that _much_ knowledge, and it's certainly possible for inexperienced admins to make dumb mistakes that leave Apache servers open to attack. And yet Apache is much more secure in the real world. This isn't just a difference in the quality of the users; it's a difference in the quality of the products.
    • by jspey ( 183976 ) on Wednesday December 05, 2001 @12:52PM (#2660123)
      More specifically, if you pay for some software and it has security holes that a reasonable and prudent check should have found before it went on sale, and those security holes cause you problems (like lost time, lost money, lost business, whatever), then you can at least try to get the purchase price of the software back from the publisher. Seriously. Lots of software has holes in it. But if I buy win2k and install it, and the default install turns on IIS, and IIS has enormous holes in it that should never have made it past quality control, then I should be able to get the cost of the software back from Microsoft when I suffer problems from their poorly designed software.

      If you make the penalties for unsafe software too large, no one will write software. But there needs to be some sort of incentive for companies with so large a market share that they don't care how crappy their software is to make their software safe.

      Mr. Spey
      • So today I get a vulnerability announcement from RedHat. Seems Apache will expose files if you install certain mod_* packages. Isn't this a flaw in the design of the mod plugins?

        Shouldn't this have been caught before release? Who do I get to go after, since Apache is free? Should RedHat pay the penalty since they shipped the product? Or the Apache developers?

        Obviously we need some penalty, otherwise they will not have an incentive to make their software safe.

        Furthermore you need to define "reasonable and prudent". You rely a lot upon 20/20 hindsight, but could you have predicted the 9/11 attack? I'll bet you can now, but could you on 9/10?

        Just making a point...
        • Shouldn't this have been caught before release? Who do I get to go after, since Apache is free? Should RedHat pay the penalty since they shipped the product? Or the Apache developers?

          Good point. What I guess I should have said was "commercially developed software" or something like that. If you use something someone wrote for free, you're on your own. I have no incentive to make software I give away for free safe so long as I don't go around making guarantees that it's completely safe. Basically, if a reasonable user does reasonable things with software that cost money and suffers because the software's really insecure, the company that made a profit from the programming of the software should be at least slightly liable.

          And I used the word "reasonable" because that's the word that is most often used in laws and court decisions. It's a vague standard but it's used an awful lot.

          Mr. Spey
          • I can see your point. But consider the fallout... If you thought using free software in business was hard now, it'd be absolutely impossible after such a bill was passed.

            This is a tough one. I've always been rather upset that software includes a disclaimer that says they are not responsible for whether or not the software works. I think that's bullshit. But, on the other hand, am I willing to pay more to get that disclaimer taken away? That's another part of the reality. If companies are more financially responsible, the prices are going to go up. That's what has happened in every other industry, for example automobiles, private planes, etc.

            Maybe that's a side effect of a maturing industry. But it also means the small mom & pop shops (aka Free Software) are going to die. Funny thing is that usually big businesses push for these regulations for exactly this reason. It's pretty easy for a company like GM to pay to follow all the government safety regulations on cars. It's difficult for a new startup who has to build all that testing and reporting infrastructure from scratch.
    • How about holding various companies whose products are exploited the most (re: MS) liable for their lack of security?

      There was a recent security seminar sponsored by the Georgia Tech Information Security Center [gatech.edu], given by Gene Spafford [purdue.edu], director of the Purdue CERIAS (Center for Education and Research in Information Assurance and Security), in which he discussed the problems with security and the software industry. One of the slides in his presentation [gatech.edu] showed that Windows NT and Windows 2000 (combined), RedHat Linux and Solaris were respectively first, second and third on the list of OSes that have had vulnerabilities discovered in the past five years.

      Legislation that aims to punish companies for writing insecure software would harm almost every company that writes any software that is aimed at being used in a server/multi-user environment since security is an absolute that most non-trivial software does not reach.

      Secondly, who will be forced to pay when it comes to Open Source vulnerabilities? wu-ftpd is notoriously broken [redhat.com], as are telnetd [cert.org], sendmail [llnl.gov] and BIND [ciac.org], and some would consider recent bugs in the Linux kernel [slashdot.org] to be OS vulnerabilities. Opening the door to lawsuits against software developers would probably kill a number of projects rather quickly.

      I'd rather that we let capitalism take its course. If customers want secure products then they should stop buying insecure products or they should communicate to the vendors that security is of importance to them. As long as consumers (both individuals and corporate entities) continue to accept the status quo then no change will be made but I don't believe that lawsuits will solve anything except make some lawyers rich and significantly increase the cost of software as the effects of the lawsuits are passed on to consumers.
  • Wishlist... (Score:5, Funny)

    by gowen ( 141411 ) <gwowen@gmail.com> on Wednesday December 05, 2001 @12:35PM (#2660011) Homepage Journal
    My wishlist:
    1. Never ever ever use the so-called-word "Webcurity" again.
    2. ...
    3. Err ...
    4. That's it.
    (apologies to Private Eye)
  • well... (Score:2, Troll)

    by turbine216 ( 458014 )
    My network-security wishlist for presentation before Congress:
    • Try all Microsoft engineers as domestic terrorists in one of those military tribunals.
    • Kindly ask Larry Ellison to get bent.
    • Outlaw any Passport and .NET services.

    Whaddya think, mr. attorney? Can we make this happen??
  • What I Really Want (Score:5, Insightful)

    by twoflower ( 24166 ) on Wednesday December 05, 2001 @12:36PM (#2660015)
    The number-one item on my wishlist would be for the government to keep completely out of network security issues -- the government should ensure security on its own networks, of course, but it shouldn't be concerned about anything else.

    There are already enough laws to deal with DoS attacks and such -- more laws just mean more expense for those who have to deal with them.

    • by Flower ( 31351 )
      Actually, one thing that I currently like seeing the government doing is creating publications on security best practices. Like what the NSA distributes here. [conxion.com]

      A lot more useful than any regulation or a thousand laws IMO.

  • The obvious (Score:5, Insightful)

    by heyeq ( 317933 ) on Wednesday December 05, 2001 @12:36PM (#2660018) Homepage
    Well, for starters, don't let Microsoft's Chief Security Advisor work as a security advisor for the White House.
  • by curtis ( 18867 ) on Wednesday December 05, 2001 @12:37PM (#2660021) Homepage Journal
    This is a great chance to get our concerns as a community out into the public sector.

    Consider this: ONE person/organization has EVERYONE'S personal and financial data online. This goes against all design architectures in both security AND engineering. A single point of failure. Imagine one bank in real life, with Barney Fife guarding it. Would you put your life savings there?

    With more and more commerce occurring on the internet, the more important it is that there is some scheme to protect this important market. I am particularly concerned with one private company holding the public trust in their hands -- I am also very concerned about the government, for that matter, also holding this information!
    • I understand everyone's concerns with Microsoft and their Passport technology. But what would you have the government do to change it? I think this is more of a case where if you don't want to use it, don't. And if a company you deal with requires its use, talk to them.

      You can't have the government put a stop to a perfectly legal business practice by Microsoft just because you don't like it. I'm not sure government oversight would be a good thing either. I'm interested to know what you would want the government to do about it.
      • What to do? No, don't ban the business practice. Just ensure responsibility.

        We have a company (not just MS, but anyone) that holds user data (passwords, credit card info, whatever) accessible online (the proof of the pudding is in the eating: if some cracker is able to access it, then it was accessible). Make that company liable for any real or consequential damages to users due to leakage of that data. Damages including the value of time lost changing passwords, dealing with credit card companies, whatever. Liable regardless of whatever EULA or click-through smokescreens disclaiming liability they may have.

        Don't mandate *how* they should stay secure. Just make it clear that if they blow it, it's going to be very very expensive.
        • by chris_mahan ( 256577 ) <chris.mahan@gmail.com> on Wednesday December 05, 2001 @02:49PM (#2660877) Homepage

          I work at a bank. We have bi-annual audits, and if we screw up, the FDIC and other FEDERAL government agencies can shut us down. Literally. They can take away our charter as a bank, they can fine us, etc.

          I would say that leaving customer credit card information out in the open (meaning where hackers can get to it) is not only irresponsible, but also criminal. Make it a federal crime punishable by 10 years imprisonment and $100,000 fine per infraction, and then audit the hell out of anyone who accepts credit cards.

          This will force companies who want to trade online to REQUIRE their software vendors to CONTRACTUALLY guarantee that their software offerings cannot, under any circumstances, be breached by unauthorized personnel.
          This is already standard practice in the banking software industry, and it's usually one of the first things we talk about when reviewing potential software.

          Yes it's expensive, yes it's a pain, and yes it's required for the long-term stability of the banking industry.

          As far as what Congress can do now: give more money to the executive branch for cyber-crime law enforcement.

          Related: require shipping companies to include 100% insurance on all shipments. Maybe that way they'll be more careful. And make it a violation of Federal law not to insure all packages 100%. Also, fine them if they don't pay the insurance settlements immediately. Like to the tune of $1,000 per violation per day late.

          People in America should not have to have a law degree in order to not feel at the mercy of multi-billion dollar corporations.

          These companies will complain and say that this will hurt their industry and the economy as a whole, but I say that's the opposite: If you have reliable shipments and safe payment systems, the economy will just ooze along nicely.
      • But what would you have the government do to change it?
        Simple. Inform consumers of what we pros already know. Before using Passport, you must read the 24-point disclaimer on the web page:


        Call it truth in advertising or whatever, but be sure that NO ONE can call their product secure unless it is.
  • Egress Filtering (Score:5, Interesting)

    by jac ( 7157 ) on Wednesday December 05, 2001 @12:38PM (#2660024) Homepage
    "Coax" all carriers and providers to do egress filtering at the edges of their networks. This should help significantly in reducing DDoS attacks and should help make malicious network activity easier to trace.
    • ... Carnivore. No thanks. The goal is to promote sturdy, redundant communication networks. Anything that gets in the way has got to go!

      I've got a long list of things I do not want the government doing, and what they should do instead. They should not be reading my email; they should prosecute those who do, just as they prosecute those who tamper with the inherently insecure protocol known as US mail. They should not be collecting information they don't need to do the job of infrastructure development, military defense and welfare. They should not be buying insecure proprietary OSes such as M$ offers. I'd much rather have information kept on secure servers so that it will stay put. The government should not hand over the publicly built communications infrastructure to a cartel of greedy corporate interests. Redundancy should be encouraged and inexpensive anonymous public access assured.

      Security should not be an excuse to hand the internet over to either corporate or government censors. This is the future of publishing and it must remain free. The future freedom and prosperity of our country depends on free information interchange. Businesses cannot function without privacy in their plans. Individuals cannot be sure what is true if they cannot trust the media that brings them their news. Control of the internet by government or corporate censors will eliminate all the blessings of this new form of communication.

      How exactly do you do this? Mr. Senator, that is your job. Now get to work.

  • tell them (Score:5, Interesting)

    by elliotj ( 519297 ) <slashdot.elliotjohnson@com> on Wednesday December 05, 2001 @12:38PM (#2660028) Homepage
    the more crypto the better. and don't try to legislate backdoors into it or anything.

    people need to realize that crypto is available to anyone with the ability to use it... what it needs is help getting the average joe to use it.

    most people won't use PGP or something b/c it is too complicated. crypto needs to be built into office and internet apps from the ground up. strong crypto. stuff that can't be broken.

    people need to feel secure about these things. i think the govt has a lot to offer in promoting pki and such to get this in the hands of everyone.

    privacy is important. the govt needs to make a proactive effort to show that they believe in personal privacy and are willing to help make it happen online.
    • Re:tell them (Score:5, Insightful)

      by remande ( 31154 ) <remande@ b i g f o o t . com> on Wednesday December 05, 2001 @02:16PM (#2660693) Homepage
      I'll make a stronger statement on that. Any attempt to require back doors on encryption (e.g. the Clipper Chip) will significantly increase our risk exposure. Let me illustrate.

      A back door is really a master key. Government back door schemes require the encryption to have a back door key, and for the government to have that key.

      If you're paranoid about the government like I am, you can see where giving it the master key can ruin your day. But even assuming that the government is all white hat, you're still in deep trouble.

      That master key is worth hundreds of millions of dollars in the right hands. Organized crime could use that key to commit credit card fraud on millions of credit cards. This is also a great way for terrorists to get funding. Depending on the crypto scheme, it could be used to forge communications, rerouting shipments. If I had the Master Key and needed a couple of hundred pounds of plastic explosive, that would be my first idea.

      And that key can't be kept very secure if it's being used. Thousands of people, whether law enforcement officials or court officials, will have access to that key. Out of a thousand people, somebody's going to be bribable for a mere one or two million dollars. Or be required to hand over the key to get their loved ones back. Or write down their password and have their office computer broken into. It won't be too hard for a determined criminal to get that master key.

      I am a big fan of crypto, but I would honestly prefer no crypto to back door crypto. At least if you have no crypto you know you're not being spied on.

  • At the very least a free one like Tiny Software [www.tinysoftware]. I'm sick of getting DOS attacks looking for IIS from zombies on my subnet.
  • Two things (Score:4, Insightful)

    by Anonymous Coward on Wednesday December 05, 2001 @12:41PM (#2660045)
    First, stay out of the way. Don't meddle in things that you know nothing about. Don't place restrictions on security measures, a la encryption export. Don't mandate government backdoors and don't permit the likes of Carnivore and Magic Lantern.

    Second, concentrate on the government's own cyber security problems. Clean up your own house before you start trampling over mine.
  • IPv6 and IPSEC (Score:5, Interesting)

    by PineHall ( 206441 ) on Wednesday December 05, 2001 @12:41PM (#2660052)
    If the government would require IPv6 and IPsec on all their networks, that would go a long way toward IPv6 and IPsec being accepted and would improve network security. Nothing else needs to be done.
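    A first sanity check for half of that mandate -- whether a given host's stack will even open an IPv6 socket -- takes only a few lines of Python. A sketch; IPsec support sits below the socket API and can't be probed this simply:

```python
# Sketch: probe whether this host's stack can open an IPv6 socket.
# (This says nothing about IPsec, which operates below the socket API.)
import socket

def ipv6_available() -> bool:
    if not socket.has_ipv6:        # Python built without IPv6 support
        return False
    try:
        # Creating an AF_INET6 socket fails on stacks without IPv6.
        with socket.socket(socket.AF_INET6, socket.SOCK_STREAM):
            return True
    except OSError:
        return False
```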
  • by Cesaro ( 78578 ) on Wednesday December 05, 2001 @12:42PM (#2660053) Journal
    The most important and significant problem is not putting the proper resources into getting that security. Upper-level management are not technically minded folk, and they don't view computers as true tools. They don't understand the costs when you try to explain it to them. "I'd like to get around $200k so that I can physically separate our infrastructure and give us added security."
    Management: "I'll give you 2 un-trained contractors, a spool of thread, and a tin can."

    They just don't understand or appreciate what computers provide, and yet they get irate when something happens. Therefore the largest hurdle to overcome is getting the senior people up to snuff, or willing to dish out the resources for what needs to be done above and beyond a merely reactionary level. To them, pro-active computer security is like flushing money down the toilet.
  • Dictate that computing environments must employ a free mix of platforms and tools so that a single crack or worm can't be used to exploit the entire company/organization/network.
    • Nope, sorry. Bad idea. Won't work.

      Actually, it's not a bad idea. It makes good sense to have a variety of things going on in your environment, and not be able to take down an entire company with a single exploit. HOWEVER, the costs associated, the time involved, and the incompatibility issues go up stupidly once you've done this. Different hardware platforms mean you can't swap parts from non-critical machines when you can't get a replacement in an hour. Having multiple admins doing different work but only using 50% of each one's time is a huge waste.

      Mandating different systems is a great way to get people to ignore you completely, unfortunately. It just won't work in the real world.
  • ...to implement the death penalty for anybody using Outlook or Outlook Express on my internal networks? It would make my life a lot easier.
  • Egress filtering (Score:3, Informative)

    by cgleba ( 521624 ) on Wednesday December 05, 2001 @12:43PM (#2660060)
    A professor at the University of Massachusetts named Brian Levine pointed this out and I wholeheartedly agree:

    It should be regulated that every network only allow its allotted IP addresses to leave the network -- aka egress filtering.

    For example, if you have a subnet, you should only allow packets whose source addresses belong to that subnet to leave it.

    If everyone did this it would solve most of the IP spoofing problems and add a lot of accountability without infringing on people's privacy. Massive DoS attacks could be traced and stopped.
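    The rule itself is simple enough to express in a few lines. A sketch, using the documentation (TEST-NET-2) prefix 198.51.100.0/24 as a stand-in for a network's allotted block:

```python
# Sketch of an egress filter: forward an outbound packet only if its
# source address falls inside the network's allotted prefix.
# 198.51.100.0/24 is a documentation (TEST-NET-2) prefix, used here
# purely as an illustrative assigned block.
import ipaddress

ALLOTTED = ipaddress.ip_network("198.51.100.0/24")

def egress_ok(src: str) -> bool:
    """True if a packet with this source address may leave the network."""
    return ipaddress.ip_address(src) in ALLOTTED
```

    Any packet with a spoofed source outside the allotted block never makes it past the border router.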
    • Re:Egress filtering (Score:3, Interesting)

      by Agthorr ( 135998 )

      What about multihomed hosts where one ISP doesn't know about the other's addresses? I was administering such a setup once, and it was extremely useful that the ISPs didn't do egress filtering!

      Also, although I agree it's generally good practice, this isn't something I'd want the government regulating. It sets a bad precedent, and they'd try to regulate all sorts of other aspects of network administration where they should not be sticking their noses.

    • This possibly doesn't buy you much - many DDOS attacks utilise captured machines, and so there would be no requirement to spoof the source address - since it is not the attacker's own address.

      FWIW, nobody should allow RFC 1918 private source addresses to leave their network anyway.
    • Re:Egress filtering (Score:2, Interesting)

      by dcviper ( 251826 )
      Personally, I would have a problem with my home ISP (at this time, Insight RR) filtering anything. I pay for open and unfettered access to the net. However, that does not stop me from adopting a secure setup for my home network. My firewall is set up to automatically drop anything from the RFC 1918 addresses (10.0.0.0/8, 172.16.0.0/12, 192.168.0.0/16), because those should only be used between routers at the core and distribution layers, or behind NAT setups. This should also be implemented at the corporate enterprise level, as ACLs on the edge routers. In an ideal world, RFC 2827 would be implemented everywhere, but I'd hate to see it done by governmental regulation...

      -dcviper (Cisco Certified Network Associate)
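      The drop rule described above boils down to a membership test against the three RFC 1918 private blocks. A sketch:

```python
# Sketch of the firewall rule above: flag packets whose source is an
# RFC 1918 private address, since those should never arrive from the
# public internet.
import ipaddress

RFC1918_BLOCKS = [
    ipaddress.ip_network("10.0.0.0/8"),
    ipaddress.ip_network("172.16.0.0/12"),
    ipaddress.ip_network("192.168.0.0/16"),
]

def drop_bogon(src: str) -> bool:
    """True if the packet should be dropped (private source on a public link)."""
    addr = ipaddress.ip_address(src)
    return any(addr in block for block in RFC1918_BLOCKS)
```

      In a real edge router this lives in an ACL, not in Python, but the logic is identical.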
  • Is there an FOIA equivalent for private companies holding data on people, along with an obligation for speedy correction -- including a good-faith attempt at propagating corrections to other data-holding companies if the misinformation was propagated?

    If not, perhaps there should be.
  • by dfeldman ( 541102 ) on Wednesday December 05, 2001 @12:45PM (#2660069) Homepage
    A few years ago I worked as a sysadmin at a moderately large company. We had a pretty big turnover problem because our company's marketing efforts tended to attract job applicants who were "green" college grads, lazy, troublemakers, and looking for a "fun" workplace with foosball tables and free snacks. Needless to say, they did not fit in at the Fortune 500 company where I worked.

    One of these employees got bored with his coding tasks and, with no previous exposure to a broadband Internet connection, apparently decided to become a script kiddie on company time. From all outward appearances, he got pretty good at it, but one day it caught up with him: U.S. Marshals came into my office and served me with a court order that asked for many, many pieces of information that would tell them who had been cracking systems from our corporate network.

    I had no problem turning this information over, as the other choice was to go to jail and let the hacker go free. However, I was appalled at the way the marshals treated me: they knew that I was just the sysadmin, not the perpetrator, but they still treated me like a criminal. When I told them that our NAT setup didn't keep logs of every single outgoing connection from our network (as had been requested in the court order), they got really pissed off and started threatening me. At that point I told them that I was not going to do anything for them without talking to counsel, and they backed off.

    So, the moral of the story here is that law enforcement needs to show more respect for sysadmins, and learn the difference between a network admin and a criminal on the admin's network. Treating everybody as though they are all guilty will only build resentment and get in the way of getting their precious case solved.


    • by sting3r ( 519844 ) on Wednesday December 05, 2001 @12:59PM (#2660168) Homepage
      One of my co-workers was scamming people on eBay from home, and one of the disgruntled customers called our local police department to whine about it. The police came down to our place of employment and started talking with the managers, and the managers literally turned white and started handing over records. This was without a warrant or court order, mind you. Last I heard, they had turned over the employee's entire HR file, his entire mail spool, and his desktop computer. Needless to say they did not want him to work there anymore after that day.

      This brings up an interesting point, though: should Congress make it illegal for companies to give up your personal information to law enforcement without your consent (or a court order)?


  • Responsibility (Score:5, Insightful)

    by Alien54 ( 180860 ) on Wednesday December 05, 2001 @12:46PM (#2660073) Journal
    I do not know how you would do this, or what the right way to do it is, but I would like to see some responsibility for writing or creating secure systems.

    I am thinking specifically of Microsoft, and the Microsoft Outlook Email Viruses, but this could certainly apply to plenty of other companies.

    If companies are merely licensing the use of the software to us (and we do not own it), and charging the big bucks, shouldn't they be responsible and/or liable for the consequences - the damages from using it? Or is this a matter of them getting all of the benefits while we get all of the problems?

    • Responsibility may be better served through higher insurance rates for "known" buggy software, rather than Government red tape.

      Though there is always the "utterly secure today, completely broken tomorrow" problem with software as new attack methods come to light, which would complicate insurance/government penalties...
  • by Bob(TM) ( 104510 ) on Wednesday December 05, 2001 @12:46PM (#2660078)
    Congress doesn't regulate whether individuals or corporations lock their doors, install security alarms, or take any of a plethora of physical security measures. So why would I want them to step into the fray and regulate security responses and policies in cyberspace?

    To begin with, the government doesn't move fast. Given that the time scales associated with IT are becoming smaller and smaller, the iterations would go through many cycles before Congress knew what hit it. Attempting to regulate the arena would only get in the way.

    Secondly, Congress obfuscates rather than clarifies. Look at the DMCA - which causes more problems for the industry than it solves. It's great for the conventional copyright holder but has the effect of stifling digital advances. Congress moving to mandate information security policies or measures would be the same thing - the paradigm they are working under doesn't apply well to this technology or the time scales under which it operates.

    Let the industry that's used to the pace of things set the policies. Congress is better suited to time scales where change occurs in years, not days.
  • What I'd like to see is mail server default installs that are never open-relay configurations. One of the biggest pains right now is spam, largely enabled by open relays (not to mention clueless admins). Spam is theft of resources, can result in DoS, and should be outlawed.

    Oh yes, force producers of email clients to use secure default settings: deny *Script in emails, automatic opening of attachments even in preview mode, etc. (thinking of Outlook [Express]). This would massively reduce the damage done by email worms.

    Yet another point: get the ISPs to actually *do* something about abuse complaints [when they are reasonable].
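For what it's worth, MTAs can already ship with sane relay defaults; the poster's wish is mostly a configuration question. A sketch of the relevant knobs in Postfix (the parameter names are real Postfix settings; the network is an assumed local LAN):

```shell
# Fragment of /etc/postfix/ -- the anti-relay settings.
# Relay mail only for hosts on the local network and for our own
# destination domains; everyone else is rejected instead of relayed,
# which is exactly what prevents an "open relay".
mynetworks =
relay_domains = $mydestination
smtpd_recipient_restrictions =
    permit_mynetworks,
    reject_unauth_destination
```

A default install that shipped with something equivalent to the above would close the open-relay hole without any legislation at all.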

  • I would say the greatest issue is the response from ISPs and groups that appear to have been the source of an attack. I almost NEVER hear back from IP address block owners; out of maybe three or four HUNDRED emails, I have gotten only one response from a person. In all honesty, though, no amount of legislation or tax incentives can help that.

    I think it would be best if the US Government, my Government, took a hands-off approach while encouraging insurance companies to give incentives to customers who maintain high-security networks. Government control of technology, and outlawing of the tools, will only make things worse, because only the crooks, script kiddies, and outlaws will have the tools and technology.

    The internet is an international, boundless medium, and only a community effort, with the cooperation of ISPs and companies who hold massive networks, will keep the net free and allow net admins to hunt down and stop the people causing them trouble in their jobs. I mean, I would be much happier if one ISP out west would email me about the customer of theirs whose box scans one of my customers just about every three weeks.
  • by mikej ( 84735 ) on Wednesday December 05, 2001 @12:48PM (#2660093) Homepage
    There's an ongoing trend to criminalize the tools and speech used to conduct security research; this is the single most frustrating aspect of the government's involvement in network security. Lists like bugtraq and tools like nessus and nmap are absolutely vital to the health of a network-connected system. Some suggested legislation would make all security discussions criminal; some would allow such work to be conducted only by approved organizations. Both would shatter the ability of the individual administrator to effectively secure his systems. If I could make one and only one request, it would be to specifically disallow legislation that attempts to let companies involved with the internet take the security ball to their private court and bounce it around, leaving individual system administrators with no tools and no forums in which to discuss their own defenses. In short: keep public, individual security research legal.

    Thanks, and good luck.
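As a concrete example of the legitimate use such legislation would outlaw, here is the sort of routine self-audit an admin runs with nmap (the flags are standard nmap options; the hostnames are placeholders for machines you administer):

```shell
# SYN-scan the privileged ports on our own mail server to verify that
# only the services we intend to run are actually listening.
nmap -sS -p 1-1024

# Spot-check a few historically risky UDP services (DNS, portmap, SNMP).
nmap -sU -p 53,111,161

# Fingerprint the TCP stack -- the same view an attacker would get.
nmap -O
```

Run against your own hosts, this is due diligence; a law that cannot distinguish it from an attack criminalizes the defense along with the offense.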
  • by Electric Angst ( 138229 ) on Wednesday December 05, 2001 @12:49PM (#2660098)
    Federalize computer security. Make network admins another part of the executive branch, like the FBI, NSA, or ATF. Assign agents to every business with an internet connection (the more significant the connection, the more agents). Give them the authority to break down the doors of the script kiddie attempting to zombie users' workstations and point a gun at his head.
    • Make network admins another part of the executive branch, like the FBI, NSA, or ATF. Assign agents to every business with an internet connection

      This is a suggestion so terrible that I suspect it may be a troll. Nevertheless:

      Please, for the love of all that is good and right and doe-eyed cute, don't do this. Even I, a guy with somewhat leftist politics by any reasonable metric, think that this would be an awful idea.

      First: you're either assigning existing officers to these tasks (they're already over-committed), hiring new ones (spending assloads of tax money to hire tens of thousands of new field agents), or federalizing private positions (taking private sector employees and slashing their salaries to government rates). None of these sounds like a recipe for success to me.

      Second: law enforcement's supply of clueful computer people is shamefully low. I had the "pleasure" recently of attending a convention "CEO's dinner" at which an FBI agent working on computer security issues from our local branch office gave an "executive overview" of security issues from a law enforcement officer's point of view. In summary: it was terrible. It was clear that the guy was in over his head. For example, I usually accept it as a given that someone speaking on pirated software will understand that "warez" rhymes with "shares," and is not pronounced "Juarez."

      I could go on, but the thought of all these new agents, coupled with the wasabi green peas I'm eating, is bringing tears to my eyes. I'm gonna go have a lie-down.

  • Three vital needs (Score:2, Insightful)

    by PrimeEnd ( 87747 )
    There are at least three things we need:

    1. Wide deployment of IPSec.

    2. Open standards and full disclosure of vulnerabilities.

    3. Client diversity in the network ecosphere. A single species (can you say 'outlook') is extremely vulnerable.

  • Yes, network administrators have to be vigilant about their own security, and put in place whatever measures are necessary to ensure the integrity of their data (and their companies).

    My only wish would be specific legislation proposing limited liability in cases where a third-party piece of software was used, and an exploit was found and used against said software before a security warning was made known or a security patch made available by the vendor.

    If the administrators have done their job and have all their software up to the best spec they can, but are subjected to liability for an error in a piece of software they put their trust in... it's bad news. Especially if the client dictates the software to be used for securing the data... man, it's just bad karma.

    In the meantime, keep using multiple levels of security. Screw the overhead if you've got sensitive data...

  • Or would you tell them to get out of the way?

    Maybe that's a good idea: let the technologists work it out. Was it a politician who developed the first firewall, IPSec, NIDS, etc.? I don't think so.

    While there is a social element to breaking networks, the solutions to these problems should NOT be legislation (IMHO). Making something illegal or applying mandatory monitoring does nothing to stop those who intend to circumvent/ignore those measures.

    Network security should be left in the hands of those most capable. If any body or government should look to tackle the 'issues' - the real issues - of network security, I think it should be a body of technologists and people who really do have an understanding of what network security really means.

    Thank you.
  • Given Congress's track record of passing laws relating to computing which, in about 100% of cases, clearly demonstrate that the people who wrote the law have no concept of how the Internet works and are responding solely to what corporate lobbyists are telling them, I'd rather Congress kept their dirty mitts off of this issue.

    Yes, it sucks to essentially have to barricade your computers from the rest of the world and not be able to trust any external entity to help you effectively, but I'd rather have that than more weird laws making more innocuous actions criminal offenses for no apparent reason.
  • by moonboy ( 2512 ) on Wednesday December 05, 2001 @12:59PM (#2660166) Homepage

    • No New Laws - The government has a habit of throwing more laws at a problem (yes, and money too). We don't necessarily need more laws, just proper enforcement of the existing ones. (Or maybe I should say, no laws just for the sake of creating them... no hollow laws to appease the general public and press... if new laws are made, they must be effective!)
    • Crypto - No more restrictions on crypto.
    • Tools and Methods - The government shouldn't ban the tools and methods used to work in network security. These are very necessary to increase the level of security. Like another poster said, if you ban them (i.e., make their use, possession, etc. illegal), only the "bad guys" will have them.

  • Webcurity? Sounds like one dot-com too many. Among other problems, "curity" feels more like it belongs to *obscurity* than *security*. Besides the famous line separating the two, nobody wants an obscure website :-)

    Security-related phrases in the English language are usually combinations of initial syllables: Information Security gets compressed down to InfoSec, "Defense Condition" to DefCon, and "Strategic Forecasting" to StratFor, for example.

    WebSec...well, sounds like it'd be a phrase for the specific branch of Infosec dealing with external access to internal data through a tightly controlled interface. Certainly feasible, though you start hitting problems when protocols other than HTTP start getting used. (Is it a website if you don't get it over HTTP/HTTPS?)

    Of course, with everything imaginable getting piped over HTTP (as opposed to SSH *grins*), maybe WebSec is appropriate...

    Yours Truly,

    Dan Kaminsky
    DoxPara Research
  • by SecurityGuy ( 217807 ) on Wednesday December 05, 2001 @01:01PM (#2660179)
    The *LAST* thing I want is a legislative "solution" to a problem the so called experts can't even agree on. Full disclosure or not, is scanning illegal, should it be, etc. Legislative solutions are far too often nothing more than new problems. Copyright violation is a problem. The DMCA is supposedly the solution. Terrorism is a problem. The solution, apparently, is to pass laws undercutting privacy and liberty in the states. Crime via computers is a problem, their solution was key escrow (thankfully not implemented), and now the FBI is writing computer viruses (Magic Lantern).

    Thanks, but no thanks. I'd much rather stick to securing my boxes with the understanding that it's a hostile net out there than have my government tell me the One True Way to do so. Passing laws which only apply to less than 5% of the world's population will not make the net secure, and feel good legislation is something I can do without.

  • Maybe they could clear some things up...

    I'd like to have clear guidelines on mail. How long do I need to keep it? Can I just totally delete mail, or do I need to maintain backups?

    When can I monitor/read someone's email? It's mine (well, it's the company's), but if MGR A wants me to give her access to EMP K's mail, is that legal? Can I monitor how many times my boss checks his stocks? When is it OK to put a keystroke logger on someone's machine (don't ask, we ended up using a modified virus)?

    Is it OK to block Accounting from mail and internet access? To put a brick wall on their doorway so they are trapped in their damn Accounting offices forever? (OK, that's probably not legal.)

    PS -- I work for Lawyers' Travel... kinda ironic huh?
  • It is current practice of some US states to sell driver's license pictures and other personal data from their database to private firms, for various reasons. This practice should be illegal, or at the very least carefully monitored at the federal level.
  • by shanek ( 153868 ) on Wednesday December 05, 2001 @01:09PM (#2660235) Homepage
    1: Get out of our way WRT encryption and other secure technologies. We're not terrorists, we just want to keep our personal information secure. Installing "back doors" and other methods may, on the surface, seem like a good idea for national security, but in reality hackers can enter through those as easily as the government.

    2: Hold vendors responsible for security holes in their products. Currently, the EULAs prevent someone harmed by a security flaw from seeking liability, even if that security flaw was deliberately programmed into the software as a "feature."

    3: Recognize the role of antivirus firms such as McAfee and Symantec in protecting users. They should be unrestricted in their efforts to make and sell software that can protect computer users from harmful files, regardless of the source.

    4: Realize that the best way to catch criminals and terrorists is through the use of human intelligence, which history has proven to be much more effective than randomly reading private emails. Also, human intelligence doesn't involve threatening the liberty of normal, law-abiding Americans like many of the other proposed methods do.

    5: This is probably the most important one: Remember the words of Ben Franklin when he said, "They that would give up Essential Liberty in order to obtain Temporary Safety deserve neither liberty nor safety." I would also add that, in these cases, you usually don't get the safety you're seeking in the first place.
  • by jet_silver ( 27654 ) on Wednesday December 05, 2001 @01:09PM (#2660239)
    Encourage the Senator to remain aware that legislation about the Internet doesn't have crisp borders. Bits don't change color when they cross national boundaries.

    When you do that, you might get him to understand that such laws are not easy to enforce and will certainly involve a lot of jurisdictional disputes.
    And you might encourage him to realize that it is the lowest common denominator of behavior on the Internet that represents the cutting edge of security needs.

    In other words, passing legislation against US Internet users is tantamount to taking their guns away, when they can at any minute be involved in a virtual gun-fight with, for example, Chinese or Indian crackers who have no such laws hampering them.
  • A few ideas (Score:2, Interesting)

    by pdqlamb ( 10952 )
    First: make sure product liability applies to software products. That will, at some point, allow users to sue companies who foist lousy software on us, which in turn creates security headaches. Code Red and NIMDA are the worst examples of this to date. It could have been much worse.

    Second: Congress needs to do some serious thinking about common-carrier issues for the internet. It seems reasonable to say a phone or cable company, for instance, cannot preferentially transmit information while blocking traffic from another source. Problem is, this is what we count on to block probes and flood traffic. Please try to keep RIAA, MPAA, and other intellectual property thugs out of these deliberations!

    Third: it seems Dubya and his cronies don't have a really good idea how to handle security. Ask them for details on how a redundant govnet will increase security before giving them lots of money to hand out to their favorite contractors.

    Fourth: push available technology. NSA's SELinux is a great idea. How about pushing IPv6 and IPSEC, for instance by including them in communication RFPs? That would increase the availability (from virtually nil) and help work out the bugs. How about specific funding to increase the security of notoriously insecure government computers hooked up to the net? The GAO will tell you, after they finish laughing, how well secured government nets are.

    I also like the idea of computer security scholarships. Are these still around after the change in administration?
  • My Wishlist (Score:5, Insightful)

    by medcalf ( 68293 ) on Wednesday December 05, 2001 @01:13PM (#2660271) Homepage
    In no particular order:

    1) The Federal government should encourage, not discourage, the use of encryption, without key escrow or back doors, by not regulating encryption in any way. (The government should also invest heavily in the appropriate technology to break encryption when it needs to do so.) Without the fear of government intervention, application designers will be encouraged to add encryption to email and other software as a business advantage to themselves, thus allowing my business to communicate more securely with ease.

    2) The Federal government should encourage open source and open standards by requiring the use of open source software and open standards on all government systems (except possibly military/intelligence systems). This will get more eyes on the code, thus reducing vulnerabilities and fixing them faster, and will ensure that people are unable to take advantage of undisclosed holes in uncheckable software.

    3) The Federal government should generally *not* regulate the internet, as this can introduce holes that cannot be fixed because of regulatory requirements. In particular, the government should not use either legislation or funding to control the use of the internet by libraries, schools and other non-Federal government institutions, or by private individuals and organizations. There are a few exceptions I would be OK with:
    a) requiring "edge filtering" so that networks would not support denial of service attacks;
    b) allowing wire fraud charges against people/organizations who deliberately send email without proper and valid headers (or with forged headers), so as to obscure their identity while sending unsolicited commercial email and/or perpetuating scams (note that this should be allowed for the purpose of anonymously propagating a political opinion, for example, just not for commercial use);
    c) requiring organizations who control internet naming or numbering to have public accountability, as these organizations were largely granted a monopoly by the US government; opening up these processes to a standards-based system where everyone can participate; or allowing anti-trust legislation against such bodies if they attempt to coercively control internet access.

    4) The Federal government should designate ISPs and online communities as common carriers.

    5) The Federal government should require cable and telephone companies, as part of their FCC licensing requirements, to offer the option of access to the network for paying subscribers without mandatory membership in an ISP, and in particular an ISP should not be allowed to gain monopoly status by association with a government-granted monopoly such as a cable system. This would have reduced the @Home debacle, for example, to a trivial matter. The potential for AOL/Warner is even worse down the road if something is not done to guarantee choice in broadband access.

    OK, I guess I got a little away from security with some of those last points.

  • It's all well and good to propose holding Microsoft responsible for security holes in their software, but please keep in mind that this also means that Open Source Software authors will ALSO be held fiscally responsible for holes in THEIR software.

    Microsoft will be far more able to pay up for massive holes in IIS than, say, the author of BIND or Sendmail. I would imagine that one successful suit could take out RedHat altogether.

    Don't hurt community-oriented authors for making their code public.

  • by James Youngman ( 3732 ) <jay@NosPaM.gnu.org> on Wednesday December 05, 2001 @01:18PM (#2660302) Homepage
    Don't mandate key escrow. Key escrow will inhibit the adoption of encryption, and encryption is vital both to proper and secure authentication and to data privacy. Attempts by various parties to limit the widespread adoption of encryption might make their jobs easier but are not good for (internet) security. It is frequently said that if you outlaw encryption, only outlaws will use encryption - that is, making it illegal to use will not stop criminals from actually doing so.

    Re-think laws that make it possible to prosecute scientists for publishing the results of their research - i.e. the DMCA or parts of it.

    Encourage the adoption of IPv6 - perhaps by allocating budget for adoption of this by government agencies (I mean carrot here, not stick).
  • Some suggestions (Score:5, Insightful)

    by jd ( 1658 ) <imipak.yahoo@com> on Wednesday December 05, 2001 @01:19PM (#2660312) Homepage Journal
    • Security should fall under some form of "trade descriptions act" - e.g., what you're offered is what you get. A firewall that isn't, secure transactions that aren't, or privacy that's sold should be actionable. That isn't about the limits of technical skill; it's about fraud that merely happens to involve computer technology.
    • It should be illegal for an ISP to prohibit customers from implementing security on their machines (except where that security is, itself, a hazard to other machines)
    • Where the technology exists to prevent criminal abuse, and an ISP neglects to use it for reasons OTHER than financial or technical, then that ISP is an accessory to the crime, and should be held accountable as such.
    • Insurance companies should have the right to carry out periodic audits of computers belonging to customers they insure, and modify premiums according to the flaws encountered.
    • Customers of companies should have a similar right to scan the companies they deal with (and vice versa), so that neither side can claim ignorance of the status of the other, prior to transactions taking place.
    • As things stand, "important" web transactions are secure, and all others aren't. This is the same as placing a large neon sign over the hidden wall-safe. It is no longer hidden, or safe. I would like to propose that insecure, or only partially secure, websites be subject to penalties, where such a policy results in a breach of security.
    • Finally, where conscious and deliberate inaction results in an expense to any emergency service, security agency, etc., the organization responsible should be expected to reimburse those costs in full. (Note that this is for inaction alone. You can't sensibly penalize those who make a genuine effort, even when that effort fails.)

    Implementing even a few of these should deal with the national deficit quite nicely. Some of the biggest costs in both public and private spending are to fix serious problems after the fact. The burden should be shifted, as much as can realistically be done, to those responsible. A stitch in time saves nine. But, damn it, the taxpayers shouldn't have to pay for someone else's failure to stitch.

  • Cryptography is the strongest weapon we have against cyber-terror.

    Whatever is done, don't put limits on cryptography.

    I design secure cryptographic-based architectures for a living. I can't design a secure information system without strong cryptography.

    It's a shame that in the public eye cryptography became a "tool of terrorism" in the days following 9/11, when in reality it's our only hope for an attack-resilient Internet infrastructure.

    At the same time, it is a merit to Congress that crypto limits have NOT yet emerged in the reactionary aftermath.

  • by pubjames ( 468013 ) on Wednesday December 05, 2001 @01:29PM (#2660374)

    Make all government-funded development work open-sourced.
  • Sadly, I think the government can do little in the way of issuing new laws to help network security in the private sector. You can't prevent people from opening viruses "like you told me not to" with a new law. You can't prevent Microsoft from setting "user friendly" defaults in Outlook, Internet Explorer and SQL that violate the most basic security principles with a new law.

    However, the government can do some things:

    1. Deal with Microsoft's monopoly effectively. Microsoft's continued embrace, extend, kill the competition and then screw it up strategy doesn't help security one bit. They have no motivation whatsoever to fix even the simplest problems in Outlook and other swiss-cheese-like products. If there was a viable competitor in that market the two would probably attempt to one up each other on several points, including security.

    2. Use more secure and more reliable software inside the government (read Linux, et al). Refuse to use/purchase products where security flaws crop up every time you read slashdot.

    3. Use/support open standards and refuse to use/purchase products that rely on embraced and extended technology.

  • by JWhitlock ( 201845 ) <John-Whitlock.ieee@org> on Wednesday December 05, 2001 @01:30PM (#2660380)
    Computer intrusion is a cost of doing business. Because the Internet is not secure, and because it can be low-cost to break into computer systems, computer systems will be broken into. Making intrusion illegal will help when you catch someone, and may dissuade others, but more often than not other crackers will simply analyze the case for mistakes and blame the criminal for "being stupid". Making tools illegal will give sysadmins an irrational sense of security, since they won't be able to test their own networks with their own tools.

    One thing that may help is if there were some independent firm that could give a qualitative and quantitative measurement of a company's security. These independent firms could review patch logs, sysadmin procedures, backup procedures, and employee training materials. They could also perform more intrusive audits, using a standard set of tools (upgraded quarterly) to attempt to infiltrate the organization. At the end, they could give some sort of ranking, to let a company know what bases have been covered and how it ranks with others in the industry.

    This service is offered by many security firms, but there is no real standard. All the information is proprietary, and usually secret, because a company doesn't want to publicize what holes were found. Even then, there is no real motivation to get ongoing reviews, because if there are no visible hacking attempts, it seems like a waste of time and money.

    This might be changed by offering computer security insurance. This insurance would cover the cost of recovering after a successful cracking attempt, as well as any lost business. An insurance firm would evaluate the company's current security and ability to recover from a hacking attempt, and set a reasonable insurance rate based on the company's preparedness.

    This would help in several ways. First, even though the evaluation would be between the insurance company and the insurance purchaser, the insurance rate would show up on the financial reports. Investors and reporters could compare the rate and the coverage, and make a rough determination of the fitness of the company's security measures. The rate information should be included in the financial report, since this information would help an investor decide how likely a company is to suffer financial loss due to a hacking attempt. It may require a law to get this insurance information into financial reports.

    Second, it would give companies a forum to disclose successful hacks. Currently, companies keep all but the most damaging hacking attempts secret, because it makes them look bad in the eyes of investors. If there is a financial incentive to report hacking attempts (they could get some insurance money back), there may be motivation to share this critical information, and other companies may be able to secure their own systems against new methods.

    Third, damage claims would be more realistic. When a cracker is caught, many companies let their imagination soar when it comes to damages, assuming fantastical scenarios like, "What if he found our most prized trade secrets, and sold them to our direct competitor, thus making us lose all the profit from that product / service?", or "What is the sum of all the salaries of everyone who ever worked on that machine?". If the company had to actually file a claim, then the insurance company would dictate the terms of that claim, what is fair game for damages and what is not. This will help put the cracker's actions into better perspective.

    Fourth, once standards are formed, the government could use the standards for contractors. For instance, a contractor working with "Secret" documents may have to have a score of 90 out of 100 for the general company, and a score of 97 out of 100 for the division working with the secret data. The government may even demand scores of 100 - not unrealistic for a score based on repeatable and auditable tests.

    Fifth, the insurance companies would have an incentive to discover which security measures work and which don't. If they find that yearly employee training deters social-engineering attacks, they can make it part of the standard. If randomized one-use passwords work, they go in. If some widely believed precaution has little effect, it can come out of the standard. In general, we'll have a better idea of what makes a secured network, and more books will be written helping small businesses meet the insurance companies' demands.

    Sixth, we can develop labs like UL for computer security, which can rate software, operating systems, and hardware, giving them ratings for their out-of-the-box configurations. Vendors will work harder for better ratings, and auditors will have an idea how much patching needs to be done to keep a particular system up to date. Security will actually become a selling point.

    I'm not sure if there is a law that would make this happen. I'm sure you can talk to the insurance lobby, and get a rough idea why this doesn't exist yet.

  • Frankly, I don't see how network security is any of Congress' business.
    And regardless of whether it's a good idea or not, I don't see anything in the Constitution that would grant them authority to take any action in this arena.
  • All Aboard!!! (Score:2, Insightful)

    by A_Non_Moose ( 413034 )
    Thank you for using Cluetrain express, be seated and enjoy.

    I realize I am merely echoing what others have said, but having a 'fellow professional' ask our opinion/advice is always welcome.
    Add to that the fact that a US Senator is asking, and it becomes even more necessary to voice our opinion.

    (HELLOOOO! McFly!!! ...some of you /.'ers saying "you want us to do your job for you?" need to board the cluetrain as well...uh, Senator, law making, U.S. of A, Constitution, righting wrongs, fixing bad laws... mean anything to you?
    Apologies for the brow beating, someone had to say it)

    I realize it has little to do with security, but hear me out:
    Consider the eBook, DeCSS, Napster, DRM, Watermarking, DMCA, SSSCA, RIAA, MPAA, Microsoft, et al.

    What do all of these have in common? Bad laws, legislation, and corporations twisting and perverting the legal system to their own will, and succeeding in implementing new forms of Prohibition.

    You see, the 1920s provided a clue to a generation: you can NOT legislate morality.

    What these laws are saying is "Napster Baaad", "Fair use, Baaaad", "Freedom of speech Baaad!"...you get the idea.

    Trying to "outlaw computers, fair use, tools of the trade" is a bad idea, but it is one that seems to be advancing at an alarming rate.

    What is being ignored in the law making body is:
    The tools of the trade (any trade), be it a lock pick, gun, sledge, bolt cutters, or, yes, a computer...these things need to be available regardless of intent and use.

    It seems most corps/senators/congressppl are afraid of "what we might do/think" and are making it illegal.

    Wrong, wrong, wrong.

    I think a "Digital Boston Tea Party" is in order, protesting this "Digital Prohibition and Taxation w/o representation".

    But the only thing that comes to mind is lobbing modems and misc computer parts on the Whitehouse/Congressional 'doorsteps' in protest.

    Ok, I've gone on long enuf, but I'll leave you with this thought:
    The most powerful network security tool is called "a pair of wire cutters", after that is finding the offending wire and pulling as hard as you can :) .



  • Thanks to government regulations:
    Houses cost more than they need to
    Medical insurance/procedures/drugs cost more than they need to
    Automobiles cost more than they need to
    We have the DMCA

    No, I don't think we need any more of their "help"
  • by Syberghost ( 10557 ) <syberghost@sybergho[ ]com ['st.' in gap]> on Wednesday December 05, 2001 @01:44PM (#2660482) Homepage
    Mr. Senator, there is something you can actually do for us.

    It even involves you getting to pass a law, which I know is something you Senators greatly enjoy.

    It is:


    Thanks for taking my valuable time (because I pay for your time, too) to listen.
  • Part of the problem is that some confuse a tool with something that can only be used for "evil". A set of lockpicks in the hands of an honest locksmith is just a tool of the trade. In the hands of a crook, it is a tool of crime.

    The problem is distinction. Systems administrators are not (and should not be) required to be licensed. This makes having tools which could be used for testing or black hat hacking always open to targeting by unsophisticated law enforcement. We've seen this time and again on Slashdot.

    Our current internet is impossible to completely secure while still offering usable services. A big problem with security is ISPs that require you to uninstall any firewall software before they will support you. Firewall software on broadband should be required, not by law, but by the ISP being responsible. No firewall, no connection. Same for virus engines and current virus signature data files.

    The other big security hole on the internet is the constant stream of bugs found in software such as Outlook and Outlook Express by Microsoft. Other vendors are guilty too, but by far the most problems are with MS products, and they just keep turning up. Part is sloppy code; part is just the way simplistic programs have to be written for the (now) average user. Harry Homeowner doesn't understand a lot about computers, nor does he want to. He wants to get on AOL or MSN, cruise the internet, and get his e-mail. As long as the most common user is of this type, security of all types will be very difficult to implement.

    Another part of the problem is that many non-technical people keep looking for the magic bullet to fix all the security problems, and want to pass laws to make it so. They forget that a law in the United States has no effect in China, and vice-versa.

    We will always have rogues with us. That will never change. There are some simple things we can do to improve security, one being that outbound filtering be emplaced. This doesn't require a law, but a bit of effort on the part of a router owner.

    As simple as it is to use, the internet is far from simple. Most people who use telephones don't understand how they work, and the same is true for computer users. Any law requiring one thing or forbidding another will have very little long-term effect on computer security for the mid-level black hat. At most, you will make life a bit harder for script kiddies, but not for long and not by much. Conversely, you will be making our (honest administrators') lives difficult.
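    The outbound filtering suggested above is one of those genuinely simple measures. The core idea (in the spirit of BCP 38 / RFC 2827) is just a prefix-membership check at the border: refuse to forward any outgoing packet whose source address is not one of the network's own. A rough sketch of that decision, not any particular router's syntax; the site prefix below is hypothetical:

```python
import ipaddress

# Hypothetical prefixes this site actually originates. Egress filtering
# means packets leaving the network must carry a source address from one
# of these prefixes, or they are dropped as spoofed.
LOCAL_PREFIXES = [ipaddress.ip_network("203.0.113.0/24")]

def allow_egress(src_ip: str) -> bool:
    """Return True if an outbound packet with this source may leave."""
    addr = ipaddress.ip_address(src_ip)
    return any(addr in net for net in LOCAL_PREFIXES)

print(allow_egress("203.0.113.42"))  # a real local source: forwarded
print(allow_egress("198.51.100.7"))  # a spoofed source: dropped
```

    If every router owner deployed a check like this, spoofed-source floods would be far harder to launch; that is the "bit of effort" this requires, and no law is needed for it.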

  • by rlp ( 11898 ) on Wednesday December 05, 2001 @01:53PM (#2660544)
    My biggest concern is the woeful state of computer security research in the U.S. Due to past crypto export restrictions, foreign firms offering commercial cryptographic products gained a major competitive advantage, which translated into more R&D money for those firms. The crypto regulations were eventually repealed, but now history is repeating itself due to congressional meddling with intellectual property laws (the DMCA and its ilk). It has had a chilling effect on security research in this country. Similarly, the Sklyarov arrest has made foreign security experts very wary of even attending conferences in the U.S.

    At a time when the U.S. needs to strengthen our computer security infrastructure, congress has managed to handicap the very people needed to accomplish this goal.

    So, bottom line, change the laws (starting with the DMCA), before all computer security research moves offshore.
  • by imadork ( 226897 ) on Wednesday December 05, 2001 @02:02PM (#2660604) Homepage
    As I've been following this issue over the years, I've been surprised at the parallels between the discussions over firearms and over network security tools and crypto:

    Both are considered "weapons" that can be used to "attack" others (or, in the case of crypto, facilitate attacks, although strong crypto is still considered a "weapon" by the government, right?)

    Both are also tools that can be (and mostly are) used for legitimate purposes

    Both suffer from attacks from their critics who can't differentiate between the inherent goodness/badness of a tool and the goodness/badness of the intent behind the use of the tool.

    Both suffer from the radical polarization of viewpoints on both sides of the issue.

    The only difference that I see is that we don't have a Constitutional Amendment that says "the right of the people to use BackOrifice shall not be infringed..." Perhaps that's what we need?

    I know many people who are pro-"gun rights", and by making these parallels, I've started turning them into pro-"Crypto and Internet Security" people as well. After all, if they passionately believe in the right to defend themselves from the threat that may come through their front door, they will believe in making all the information available for defending from the threat that may come through their cable box!

    (I might add that while examining these issues, I've come to understand and sympathise with the pro-"Gun Rights" people a bit more. I still don't agree with all their points, but at least I understand their basic beliefs.)

  • by chill ( 34294 ) on Wednesday December 05, 2001 @02:03PM (#2660610) Journal
    Decriminalize the publication of information. Throwing someone in jail because they talk about an encryption system or reverse engineer a protocol is stupid.

    Criminals, by definition, will not obey the law. Criminalizing research and information sharing hinders only the legitimate researchers and security professionals.

    If a product or service is secure, it has nothing to fear from scrutiny.
  • by Animats ( 122034 ) on Wednesday December 05, 2001 @02:33PM (#2660793) Homepage
    • Criminalize spam. Much system administrator time is spent dealing with spam. Those are the same people who have to deal with first-line security issues. There are only a few hundred high-volume spammers, annoying tens of millions of people daily. Just shutting them down will reduce the noise level, effectively providing more resources for dealing with security. Spamming only needs to be a misdemeanor, but the financial penalties should scale with the number of spams, and on the civil side, class actions should be allowed.
    • Create some financial responsibility for vendors who sell software with security holes. The current "as is, no warranty" approach to selling software is part of the problem. This is a mature industry, and it's time for it to accept its responsibilities.

      The same thing happened to the auto industry as it matured. Today we have strong warranties on cars, strong liability laws, and cars work very reliably. The auto industry kicked and screamed about regulation for decades. But in the end, they built better cars. It's time to do the same for software.

      I'd suggest, as a start, that software which will open "executable content" (which can contain viruses, etc.) without the user's explicit permission for each opening make the vendor of said software liable for negligence should any harm result from said action. This liability must not be waivable. That puts the burden on mail readers and web browsers to protect the user against incoming attacks. Don't accept any arguments that this is technically infeasible; it's not.

    • Tighten up the Internet infrastructure. This is being looked at, but a higher priority needs to be given to tightening up the naming and routing systems of the Internet.
    • Don't overreact. So far, the main attack on the US was carried out by about twenty guys with box-cutters. There's no indication of serious "info-war". There's no domestic "fifth column". It's not clear that the enemy uses anything higher-tech than cell phones and fax machines. So back off on the reductions in civil liberties; there's no need.
  • by The Man ( 684 ) on Wednesday December 05, 2001 @02:40PM (#2660828) Homepage
    I can take care of script kiddies, virus outbreaks, and idiots who install IIS. It is Congress's responsibility to do only two things: (1) require that the computers and networks belonging to the federal government are as secure as humanly possible, especially those which may contain citizens' records, and (2) protect law-abiding or possibly law-abiding citizens from the three letter agencies by forcefully restricting their activities to legitimate investigations using constitutionally "white" - not "grey" or "marginal" or "illegal as hell" methods. That applies to computer crimes as well as all others, and for practical purposes it should restrict the TLAs to prosecution of known crimes involving federal computers, and pursuit and analysis of foreign intelligence.

    Don't protect private companies and individuals from anyone but the government. We can take care of ourselves.

    Don't protect the government from law-abiding citizens. We're at sufficient disadvantage already.

    Don't protect the privacy of convicted criminals.

    Don't create laws that favour any one kind of entity over any other, except law-abiding citizens and corporations over convicted criminals.

    Don't legislate exclusions of liability for security breaches. Let the civil courts decide who, if anyone, is responsible for damages due to security breaches.

    Don't restrict or attempt to restrict cryptography, and strictly prohibit the three letter agencies from planting or distributing intentionally weakened or defective cryptographic tools.

    Don't allow the three letter agencies to wiretap data connections without meeting constitutional requirements - it does nothing to improve security and most likely decreases it by creating additional copies of sensitive information.

    Most importantly of all - *DO* build trust in the security community by passing and strictly enforcing JUST, FAIR LAWS in all matters concerning digital security, copyright law, privacy, and civil liberties. In other words, do your job as statesmen and earn the respect and trust of all the citizens you supposedly represent. Your job is MUCH easier to do when we can trust you, and sadly, your record makes that outright impossible.

  • No new laws (Score:3, Interesting)

    by Monoman ( 8745 ) on Wednesday December 05, 2001 @02:56PM (#2660908) Homepage
    Paraphrasing Bruce Schneier: we already have laws in place for stealing, copyright, etc. Just because someone is using a new technology to commit the same old crimes doesn't mean new laws are needed.
  • Obscurity == Fraud (Score:4, Insightful)

    by stonewolf ( 234392 ) on Wednesday December 05, 2001 @03:06PM (#2660960) Homepage
    There should be criminal and civil penalties for withholding information about security risks. Right now I do not have the legal right to know about security risks discovered in systems I use; the creators of those systems are not legally required to inform me when a new risk is discovered. This means that I cannot make an informed decision about how to protect myself from the problem. I can't even use a list of currently unresolved risks to help me decide what systems to use and/or purchase.

    To me, the withholding of security risk information is a form of fraud. It is the same as rolling back the odometer on a used car, the same as selling Pintos with exploding gas tanks, and the same as selling flammable pajamas to children.

    Companies must be required to release security risk information about their systems in a timely manner, and they must be legally liable for damages that result from security issues between the time they discover a problem and the time they warn users of it. These kinds of penalties will force companies to create secure systems in the first place, and to warn people in a timely manner so that they can take action to protect themselves. Although it is tempting, I don't think the developers should be required to fix the system. But a list of all outstanding security problems must be included in advertising and on the packaging of any system, because people have to be able to make an informed decision about what systems to use. We put warning labels on beer and cigarettes, we require people to wear seat belts, we require the disclosure of the ingredients of all our food, we have lemon laws to protect us from unscrupulous car salesmen, and we have product liability laws that cover every physical thing we purchase. But we have no equivalent legal protection from the purveyors of software snake oil.

    The only way a company should be able to get out from under these penalties is to declare the product "dead" and notify all customers of record that no more security support will be given for it. Declaring the software dead should also require that the source code and/or system designs, as well as any patents and copyrights on the system, be released to the customers so that they can arrange for other sources of security support. At that point the company would not be allowed to sell, distribute, or accept any sort of payment, including royalties and support payments, for the software.


  • by Remote ( 140616 ) on Wednesday December 05, 2001 @03:27PM (#2661064) Homepage
    The most important thing is to push for the correct approach. By that I mean that whenever one talks about anything "digital" or "computer"/"internet"-related, common sense disappears; most people tend to look at relations as if a different balance were needed. It is not. Cyber tools are like any other tools. Companies that offer computer-related products should be accountable for damages, like any other company. Products that involve risk should state that clearly in the manuals. The most secure way to use software should be described in detail. If one promises and sends a bill, one has to deliver, or else compensate. Things like that. Think of software as an automobile. It's so simple! That would answer many other questions.

    One thing, though, *is* different: the absence of a clear geographic location for things and people on the net. This can only be dealt with through international cooperation. I would advise your Senator not to try to push for unilateral measures, as seems to be the norm in the US with this administration, because that would make it far more difficult to iron out differences in the future.
  • My simple wishlist (Score:4, Interesting)

    by anticypher ( 48312 ) <anticypher&gmail,com> on Wednesday December 05, 2001 @05:56PM (#2661895) Homepage
    I no longer live in California, but I'd love to see some changes in the state.

    In a nutshell, intelligently enforce the laws you have.

    One. Fund a specialized law enforcement group dedicated to cybercrimes committed by individuals and organized crime gangs located physically in the state. The group should consist of state marshals, prosecutors, lawyers, judges, and a civilian oversight committee. Recruit from computer science programs at state universities, or require experienced judges and prosecutors to attend graduate-level CS programs at least part time. The oversight committee should be paid, at levels that rival good Silicon Valley firms, so that experienced engineers can spend a couple of years helping to guide law enforcement efforts.

    The cybercrimes group should go after trade secret thieves, spammers, scammers, slammers, crammers, and others who feed on the naivete of consumers, or who interfere in the operations of companies. They should target phone companies who slam/cram consumers, arresting corporate officers on criminal charges as warranted. They should actively track down individuals and groups who send out UCE, since spam clogging my servers is the largest single cost I have as an administrator. There should be an undercover unit targeting criminal groups who dupe individuals with "guaranteed 100% opt-in 5 million email addresses CDROM". There are many confidence/scam operators in California who have no fear of prosecution, because there hasn't been a single arrest in the last decade for any hi-tech scams in the state.

    The group should have a very publicly advertised way of being contacted, and should give priority to administrators like myself who want to start legal proceedings against criminals inside of California. The people taking the complaint should have a thorough understanding of network issues, system management, and technology in general. That means you will have to pay them competitive salaries, which will make this the most expensive law enforcement group in the state. Don't worry about the cost, the value to California businesses and voters^Wtax pay^W^Wresidents will be worth it.

    Two. Criminalize aiding and abetting identity theft. This means the state should stop selling records to marketing firms. California needs to rework its incorporation laws to disallow companies from compiling marketing databases for sale to others. Any corporation that compiles in-depth information on individuals (putting together name, address, SS#, CDL# and photo, tax history, property records, medical info) and then sells it should have its charter revoked immediately and its directors criminally prosecuted.

    I'm regularly in touch with my counterparts on the west coast of the US, and I hear their complaints on a regular basis. The FBI has dropped *ALL* cases that don't directly involve shit that happened in September. Local cops are completely incompetent to do anything more than write speeding tickets or bust kids with joints. There is no state organization to fight cybercrime. The admins spend most of their time keeping their long distance voice traffic on the best carrier when they get slammed once a month. They deal with a level of spam which equals 80% of their incoming traffic, much of it from dialups inside of California. They have to deal with employees walking out with 40 CDROMs full of locally produced code who start at a competitor the next day, who one month later have an identical product that even duplicates the bugs. Hackers at the firewall are insignificant compared to all the other criminal activity going on.

    Look at the Avant! case, where a handful of engineers walked out of Cadence, and the next week started selling an identical product at half the price and made millions of dollars in profit. The only way Cadence could prosecute was to pay for training for the judge and prosecutor, pay the whole investigation costs, and it still took most of a decade for the criminal parts of the case to occur.

    There are organized gangs selling spam-kits to unsuspecting idiots all over California. They take a bunch of money up front from the scammees, in promise of huge returns down the road for selling "penis enlargement" and MLM scams. Until now, these scammers have had no fear of prosecution, because there isn't a cop or judge in the state who will (or able to) apply the law.

    There are arguments that most of these things should be left to civil action. The problem is that civil action costs lots of money, and the civil courts tend to ignore complex cases that don't have huge amounts of money on both sides. The PUC is incapable of dealing with crammers, and has declared that any consumer who is hurt can throw millions into a civil case and hope to win. With consumer protection at its lowest in California history, it's time for the government to step back into enforcing the law.

    Arguments about the internet being international are just a red herring. The laws are already on the books, some jurisdiction has to start applying them first. So what if most of the scammers leave the state? Fine, but I doubt it will happen, the drug dealers didn't all leave with tough new anti-drug laws. I'd be willing to bet very few people have enough money to start a new life in another state, spammers are lazy bastards. Kick down a few doors, prosecute some spammers and make some press about it. You might only make a small dent in spam, but I'll take anything I can get.

    the AC
  • by Bartlet ( 302396 ) on Wednesday December 05, 2001 @07:07PM (#2662331)
    I was offered a better opportunity recently, which allowed me to leave a Fortune 500 company where I was the engineering manager providing ISP services to thousands of end users. While in that position, I often asked myself this same question and came up with the following wish list.

    There are a couple of things that the government can do to make computer networks and computing more secure.
    1) Repeal the DMCA. When security problems are found in an implementation of an algorithm, this law makes it illegal to talk about the problem or to implement a solution.
    2) Repeal patent law as it applies to software. Software is well protected under copyright law as a work of art. The underlying function (the algorithms used) of every program out there is a subtle change to prior art. It's just that no one but large corporations has access to the courts to successfully challenge these ludicrous restrictions on sharing mathematical equations with one another.
    3) Allow end users to sue companies that keep their products closed and security problems secret.

    4) After fixing the above, get out of the way as the free market takes over and those with bad software are forced to compete or go out of business.
