
Has Corporate Info Security Gotten Out of Hand? 466

Posted by Cliff
from the proper-security-is-like-walking-on-monowire dept.
KoshClassic asks: "What is the right balance between security and productivity in the corporate IT environment? Looking back at my company 10 years ago, our machines were connected directly to the Internet: no proxy, no firewall, no antivirus software. Today, my company's proxy server blocks access to 'bad' web sites (such as Google Groups); our 'antivirus' software prevents our machines (even machines that host production applications) from carrying out legitimate functions, such as sending email via SMTP; and individual employees are forced to apply security patches with little or no notice, under threat of their machines losing network access if they do not comply by the deadline. On one hand, you can never be too secure; on the other hand, have we become so secure that we're stifling our own ability to get things done? What is the situation like at other companies?"
This discussion has been archived. No new comments can be posted.

  • Management? (Score:5, Interesting)

    by Tadrith (557354) * on Wednesday January 18, 2006 @07:53PM (#14505614) Homepage
    The only real problem is overzealous proxy servers, which can be tough to configure but should have a whitelist of some sort... the rest of the problems mentioned have solutions. There are plenty of corporate-level antivirus products that allow control of virus-scanning policies, so you could enable the sending of e-mail through SMTP. If it's corporate policy not to allow it, then it really isn't a computer problem but a company policy problem. There are also plenty of options for keeping up on patches that would relieve users of this responsibility. Even in the case of Windows, Microsoft distributes a free "private" version of Windows Update, called Windows Server Update Services [microsoft.com], that can be deployed on a network. This version allows you to choose when and which patches are distributed; all you have to do is point your computers to the server. Assuming you are running a Windows network, the Windows Update settings can be deployed via Group Policy without ever having to visit a workstation. Workstations can be scheduled to update themselves without taking control away from the IT department over which patches get installed.

    Most of that was assuming you are running a Windows-based network. I am not as familiar with Linux software, but I know that similar services are available for Linux as well. In my experience managing network environments, most of this has never been a major problem. It seems to me that the network environment doesn't suffer from too much security, but that the existing security needs to be better managed so that it doesn't prove detrimental to the productivity of the employees.


    • In my experience managing network environments, most of this has never been a major problem. It seems to me that the network environment doesn't suffer from too much security, but that the existing security needs to be better managed so that it doesn't prove detrimental to the productivity of the employees.

      Security is a moving target. What you meant by security 10 years ago and what you mean today is different in many ways. A better way to talk about security is: Security from BLAH where BLAH is som
    • Re:Management? (Score:2, Insightful)

      by 246o1 (914193)
      "There are plenty of corporate-level antivirus solutions that will allow the control of virus scanning policies so that you could enable the sending of e-mail through SMTP. If it's corporate policy not to allow it, then it really isn't a computer problem, but a company policy problem."

      Well, it seems to me that the question is really about whether corporate security policies have gotten out of hand, not about the technology itself (though a key feature of any technology, as any Mac user will be glad to lectu
    • Re:Management? (Score:5, Informative)

      by canuck57 (662392) on Wednesday January 18, 2006 @08:57PM (#14505983)

      The only real problem is overzealous proxy servers, ...

      Not really; it is often best to deny, evaluate, and then permit with a business cause. Provided the response is usually positive where the business need is legitimate, there is no issue. Any security system will need to be tuned to work correctly. And users often fall into the trap of buying products that abuse protocols to circumvent security without regard to company policy.

      The enemy within is, in my experience, a 50/50 split with the enemy outside. These tools are needed to prosecute criminal and negligent employee behavior. Some examples I have frequently seen:

      • Insider trading of company secrets
      • Posting of internal information on Yahoo and other board and mail services
      • Had a manager watching video porn, consuming the network bandwidth, while he was bitching at I/T because the lines were slow and the clerks could not do order input.
      • Much like the last point, the clerks would call while they were all listening to streaming radio and complain because the servers were slow... they don't understand, nor give a damn, that 100 people in an office listening to radio over a link sized for one cable modem drives costs up -- they don't know how dumb they come off to I/T. And their managers didn't have the spine to say no.
      • Had one more advanced user who bypassed the proxy with a VPN type software using SSL. He thought he would not be noticed so we watched his terminal. He was using file shares relayed from his home system and watching, you got it - porn.
      • Caught one person posting personal comments about the CEO on a message board.
      • Figured out which user posted the company's address book right onto a known spammer's web board because it would be "more convenient".
      • Had one user who used their internal privileges to load SETI on 12 shared UNIX systems. The company thought their CPUs were slow and was preparing to buy more.
      • Had one internal developer who back doored some applications for stuff I can't say, but cost the company a million to clean up.
      • Had one case where every Windows server bar none was compromised and controlled from the outside. The real kicker is that the systems were compromised from the inside and then controlled from the outside to serve Warez. Got my first copy of W2000 before it was released!
      • Had one user who would run a "spam" program while working on his PC. He was caught because the company's domain was blacklisted.
      • and many more...
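
      The bandwidth arithmetic behind the office-radio complaint above is easy to sketch. Both rates here are illustrative assumptions (they are not in the post): roughly 128 kbps per audio stream, against a cable-modem-class link of about 3 Mbps.

      ```python
      # Rough numbers behind the "100 people listening to radio" complaint.
      # Both rates are assumptions for illustration, not from the post.
      streams = 100          # employees streaming audio
      stream_kbps = 128      # assumed bitrate per audio stream
      link_kbps = 3000       # assumed ~3 Mbps cable-modem-class link

      demand_kbps = streams * stream_kbps
      print(demand_kbps)                        # → 12800
      print(round(demand_kbps / link_kbps, 1))  # → 4.3, i.e. over 4x oversubscribed
      ```

      Even at a modest assumed bitrate, the office demand is several times the link capacity, which is why "the servers are slow" calls follow.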

      So remember this when you bitch about security. The behavior above was detected by security tools. This type of behavior costs corporate America a lot and reduces the security of your job. Security is there to enable you to do your job AND to prevent the 1-in-100 bad asses from getting inside to do your company harm. The opposite is also true: it is there to prevent the 1-in-100 bad asses you have hired from compromising your company.

      And if you don't think your threat exists from the inside, you're either a very small, trustworthy group or you're just not looking.

      • Now I feel like I have to take a shower.
      • Re:Management? (Score:5, Interesting)

        by Anonymous Coward on Thursday January 19, 2006 @01:13AM (#14507358)
        I agree that some level of security is needed to prevent threats from both inside and outside the company. However, the goals of IT and security organizations often don't seem to align with the main goal of all companies -- to make money. At the company I work for, most departments are focused on improving efficiency, improving product quality, and keeping our customers happy. All things that are necessary for a business to be successful. However, the IT organization seems to be focused only on taking every precaution to keep the network running smoothly without regard to the impact on the rest of the business. When one of IT's policies conflicts with a legitimate business need, there's nothing I can do about it. There's nothing my manager can do about it. There's nothing his manager can do about it. There's nothing the director of engineering can do about it. The only thing the VP above him can do about it is try to work out an agreement with the VP in charge of the IT management chain or complain to the CEO. So basically, when IT's policies screw us, we just have to bend over and take it. Here are a few recent examples:

        1) A bug in one of our products affects an important customer. Engineering works feverishly to release updated firmware to fix the problem. As soon as the fix is validated, we e-mail it to the customer, but they never get the attachment. Why? IT decided to block attachments for unknown file types. The director of my division calls IT and complains. The response: "Sorry, that's our new policy." Our solution: I fly to Germany to hand deliver the updated firmware on a CD. Cost to the company: about $4000 in travel, 2 days of my time, and a customer who thinks we're crazy.

        2) We are completing the timing analysis for a new ASIC. The simulations take about a week to complete, and if they are interrupted we have to start over. The only problem is that every time we start the tests, IT deploys a new security patch and forces a reboot of the PC before the testing can complete. This happens repeatedly and results in a 2 month delay in getting the chips made. We make up some of that lost time, but the project still slips by more than a month. As a result, we were contractually obligated to refund $200,000 of the NRE we got for doing the work since we missed our dates.

        3) We use ClearCase for source code control. Everyone in the company with a unix account had access to the source code and could check in and check out files. Our IT department decided this was a security risk -- reasonable, I suppose. To correct the problem, without notice they disabled access for everyone. They then sent out an email saying that anyone who needed access had to fill out a form, get it signed by a manager, and fax it to their department. They were so bombarded with these requests that it took about 3 weeks to process them all and get everyone's access restored. It took them about 2 weeks to get to mine. During that time, my company paid me a fat salary to sit at my desk and learn how to work a rubik's cube. I can now work a rubik's cube in about 90 seconds, but this is of questionable value to my company.

        4) To increase password security, our IT department implemented a new password policy. All passwords must be at least 8 characters long, contain at least one uppercase character, one lowercase character, and one number or symbol. All passwords must be changed every 30 days. When changing your password, you can't use any of the last 10 passwords you have used. Every system that requires a login must use a different password (I have a windows login, a unix login, a SAP login, and a login for an internal bug tracking tool). Ironically, all of these systems use LDAP authentication which was implemented about 2 years ago so that we could use the SAME password for all our accounts. If you enter the wrong password 5 times, your account gets locked out and you have to issue a ticket to the help desk to get your account restored. This usually takes about a day. The result of
        • Re:Management? (Score:3, Insightful)

          by rmm4pi8 (680224)
          1) Ever heard of a file server?

          2) Take the box off the net while it's doing the sim. Thus, the sim gets done, the box doesn't get owned, and the net stays secure.

          3/4) These aren't evidence that your IT department values security over ease-of-use, but rather that they're totally incompetent, utterly crazy, or both.
        • Re:Management? (Score:4, Insightful)

          by maxwell demon (590494) on Thursday January 19, 2006 @05:34AM (#14508093) Journal

          1) A bug in one of our products affects an important customer. Engineering works feverishly to release updated firmware to fix the problem. As soon as the fix is validated, we e-mail it to the customer, but they never get the attachment. Why? IT decided to block attachments for unknown file types. The director of my division calls IT and compains. The response: "Sorry, that's our new policy." Our solution: I fly to Germany to hand deliver the updated firmware on a CD. Cost to the company: about $4000 in travel, 2 days of my time, and a customer who thinks we're crazy.

          Did the director tell the IT department about your specific file type, so they could just add it to the whitelist of allowed attachments instead of allowing all sorts of attachments? If he did, and they refused to add that file type, it's their fault. If he didn't, then it's his fault. BTW, hand delivery is indeed crazy: if an email attachment would have been enough, surely mailing them a CD-R with the patches would have done it as well, and would surely have cost you less. But even for email there might be solutions, like uuencode (which makes the file part of the mail text instead of an attachment, and therefore might not be detected/blocked by the automatic filters).
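
          For what it's worth, the uuencode idea can be sketched in a few lines of Python; the file name and bytes here are made up, and in practice you would just pipe the file through the uuencode(1) command.

          ```python
          import binascii

          def uuencode_text(data: bytes, name: str = "firmware.bin") -> str:
              """Wrap raw bytes in classic uuencode framing so they can ride
              inline in the body of an email instead of as an attachment."""
              lines = [f"begin 644 {name}"]
              # the uuencode format encodes at most 45 input bytes per line
              for i in range(0, len(data), 45):
                  chunk = binascii.b2a_uu(data[i:i + 45]).decode("ascii")
                  lines.append(chunk.rstrip("\n"))
              lines += ["`", "end"]
              return "\n".join(lines)

          text = uuencode_text(b"pretend firmware image")
          print(text.splitlines()[0])   # → begin 644 firmware.bin
          print(text.splitlines()[-1])  # → end
          ```

          The resulting text is plain printable ASCII, which is exactly why a naive attachment filter is unlikely to catch it.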

          2) We are completing the timing analysis for a new ASIC. The simulations take about a week to complete, and if they are interrupted we have to start over. The only problem is that every time we start the tests, IT deploys a new security patch and forces a reboot of the PC before the testing can complete. This happens repeatedly and results in a 2 month delay in getting the chips made. We make up some of that lost time, but the project still slips by more than a month. As a result, we were contractually obligated to refund $200,000 of the NRE we got for doing the work since we missed our dates.

          Did you talk to the IT department about this? Would it have been an option to take the PC off the net during the testing period, and then apply all security patches in one bulk before reconnecting it?

          3) We use ClearCase for source code control. Everyone in the company with a unix account had access to the source code and could check in and check out files. Our IT department decided this was a security risk -- reasonable, I suppose. To correct the problem, without notice they disabled access for everyone. They then sent out an email saying that anyone who needed access had to fill out a form, get it signed by a manager, and fax it to their department. They were so bombarded with these requests that it took about 3 weeks to process them all and get everyone's access restored. It took them about 2 weeks to get to mine. During that time, my company paid me a fat salary to sit at my desk and learn how to work a rubik's cube. I can now work a rubik's cube in about 90 seconds, but this is of questionable value to my company.

          Ok, this one is clearly a stupid action from your IT department.

          4) To increase password security, our IT department implemented a new password policy. All passwords must be at least 8 characters long, contain at least one uppercase character, one lowercase character, and one number or symbol. All passwords must be changed every 30 days. When changing your password, you can't use any of the last 10 passwords you have used. Every system that requires a login must use a different password (I have a windows login, a unix login, a SAP login, and a login for an internal bug tracking tool). Ironically, all of these systems use LDAP authentication, which was implemented about 2 years ago so that we could use the SAME password for all our accounts. If you enter the wrong password 5 times, your account gets locked out and you have to issue a ticket to the help desk to get your account restored. This usually takes about a day. The result of this new policy: people write their passwords on post-it notes and stick them on their monitors because they

        • Re:Management? (Score:5, Insightful)

          by Alioth (221270) <no@spam> on Thursday January 19, 2006 @05:56AM (#14508140) Journal
          Someone needs to get hold of your IT department and tell them they don't work in a vacuum. It *is* possible to design a good security, update, patch etc. policy - but it HAS to be done in conjunction with the rest of the business (and the rest of the business must at least understand a little bit about information security and the need for an orderly process). Your IT department management is incompetent by the sounds of it.
          • Re:Management? (Score:5, Insightful)

            by cowbutt (21077) on Thursday January 19, 2006 @06:01AM (#14508150) Journal
            Seconded. Good information security should ideally be transparent, and with a bit of work on the part of the people implementing it, often can be. Sometimes, it's even possible for the good security to facilitate working practices that wouldn't have previously been considered possible.
        • Re:Management? (Score:5, Insightful)

          by dclydew (14163) <dclydew@gmail.com> on Thursday January 19, 2006 @10:25AM (#14509428)
          In your first two examples, I think the security team was being entirely reasonable. Files should not be transmitted via email; tools like FTP/SFTP are much better suited for such work. Using the right tools often improves security. In the second instance, taking the system off the network while building should fix the problem. I wouldn't be surprised if the third example had to do with SOX, since we had to do something similar here. All systems had to have a managed trail that could tell us which employees had access, when they accessed, and what they accessed. On a number of older systems, we found lots of generic IDs that were being used by multiple employees. We didn't have the luxury of slowly fixing this issue. We were told by the auditors that it HAD to HAPPEN IMMEDIATELY, or we would fail compliance.

          The password thing sounds bad. 8 characters is OK (though not really much more secure these days), no repeating of old passwords is OK (again, not great), but 30 days is very bad. 30 days tends to lead to two problems: 1) people write it down on sticky notes; 2) people make easy-to-remember passwords like "MyFebPwd1", "MyMarchPwd1", etc.

          It sounds like the person who made your password policy could do with a dose of accurate information about the usability of passwords. However, the other stuff seems reasonable to me.
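
          As a sketch, the composition rules described above (8+ characters, one uppercase, one lowercase, one digit or symbol; the checker itself is my own illustration, not any real product) happily accept exactly the kind of predictable monthly password the 30-day rotation produces:

          ```python
          import re

          def meets_policy(pw: str) -> bool:
              """Check the policy described above: at least 8 characters,
              one uppercase, one lowercase, and one digit or symbol."""
              return (len(pw) >= 8
                      and re.search(r"[A-Z]", pw) is not None
                      and re.search(r"[a-z]", pw) is not None
                      and re.search(r"[^A-Za-z]", pw) is not None)

          print(meets_policy("MyFebPwd1"))  # → True (predictable, but compliant)
          print(meets_policy("correct horse battery staple"))  # → False (no uppercase)
          ```

          Which is the point: composition rules measure the wrong thing, so rotation pressure just shifts users toward compliant-but-guessable patterns.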
        • by Pac (9516) <paulo...candido@@@gmail...com> on Thursday January 19, 2006 @10:46AM (#14509638)
          From your examples, it looks like your whole IT department is working very hard to be downsized or outsourced. From my experience, the minute a smart VP or CEO (or, a common case, an external consultant who has the VP's or CEO's ear) notices and documents the kind of impact they are having on the bottom line, lots of high and middle heads will start rolling. Having inflexible rules when your market is evolving or constantly changing (and when your market is global, it is always changing and evolving) is so dumb it hurts -- when did we call the high priests back to the computer room, anyway? I thought we had all agreed to send them home for good by the end of the '70s.
    • Re:Management? (Score:4, Insightful)

      by rblancarte (213492) on Wednesday January 18, 2006 @09:00PM (#14506005) Homepage
      I very much agree with what you are saying here. I mean, what I see in the message posted is some poor IT policies. Just picking it apart (just like you did):

      Looking back at my company, 10 years ago, our machines were connected directly to the Internet, no proxy, no firewall, no antivirus software.

      I am pretty sure most people agree this is not acceptable; even 10 years ago, it would have been considered dangerous.

      Today, my company's proxy server blocks access to: 'bad' web sites (such as Google Groups)

      First off, blocking objectionable sites is a good thing. There are a number of things that are unacceptable in a work environment. Sure, some good sites will be caught as well, but the IT department should have a policy whereby you can ask for blocked sites to be allowed if they really shouldn't be blocked. Considering the information on Google Groups, I think you are looking at a site that really should be allowed.

      our 'antivirus' software prevents our machines (even machines that host production applications) from carrying out legitimate functions, such as the sending of email via SMTP

      Time to get new anti-virus software. Good AV software will let you scan inbound and outbound messages via POP, IMAP, and SMTP.

      individual employees are forced to apply security patches with little or no notice, under threat of their machines loosing network access, if they do not comply by the deadline

      Very poor policy. This should be handled by professional IT workers. Not because the end user doesn't know what is going on -- they might -- but because something could go wrong, and someone better equipped to handle those issues should be on hand. Like the parent said, at this point you could even have these patches automated.

      The main message asked about other companies, so... I used to be an IT worker for an international law firm (before returning to school). Everything that was just described would never have happened at that place. The IT staff handled all computer issues, with most of the security being done in a way that was transparent to the end users. AV software -- they didn't notice it, and it auto-updated itself. Firewall -- it blocked objectionable sites, but there was a policy to allow them, because sometimes it was necessary to view them (sometimes you have to serve legal documents to the porn companies). And patches were handled by the IT staff, usually in off hours.
      To me, you have an IT staff for a reason: they are there to handle computer issues. They should not be some draconian department that wields its power as if it were doing you a favor. They are there to handle your computer problems, and they should take some of the responsibility for that as well, which includes handling most of the issues that you listed.

      RonB
    • Re:Management? (Score:4, Interesting)

      by bhmit1 (2270) on Wednesday January 18, 2006 @10:41PM (#14506572) Homepage
      If it's corporate policy not to allow it, then it really isn't a computer problem, but a company policy problem.

      Being a consultant, I've seen a wide variety of security policies from my various clients. I've had countless clients with strict restrictions on what you can reach over the network, out of concern that you may transmit confidential data, but that then let you walk in and out the door with a laptop as you please. One such client provided VPN access for remote support, but blocked ssh over the VPN because it would allow ftp-like (scp) access -- while leaving telnet open. I've been to places that refused to give me internet access even though it was the preferred way to receive support for their application and the only way to search the knowledge base. I've started on a project with a team of people and more desktops (not even counting our own laptops) than network jacks. After waiting several weeks for a couple of new jacks to be installed, with three of us sharing one PC, I gave up and got a cheap network hub (this was several years ago), but was told it wasn't allowed because they couldn't be sure it hadn't been compromised. I've been places where they wouldn't give me a badge to get in the door and no one was assigned to the front desk, so the unlucky guy sitting by the side door got used to hearing the banging and letting anyone in without any idea of who they were.

      Of course, for every bad client, there's one that lets me remotely connect to my home network, makes sure I have a badge with access to everywhere I need to be, and promptly makes a backup and changes the root password before providing me full access to the server that I need to configure. It's all a question of cost of security breach vs cost of security enforcement.

      To me, none of these things are worth being upset about. Yes, they are annoying, but it's the client's decision to make things more difficult and, therefore, more expensive. I simply do the best I can with the resources available. Of course, it would be nice if the policies considered the threat instead of only the past exploits. Then they would realize that someone trying to carry a stack of files out the door is no worse than the guy who walked by with a flash drive in his pocket.
  • Technology (Score:3, Insightful)

    by biocute (936687) on Wednesday January 18, 2006 @07:53PM (#14505619) Homepage
    I think overall mankind's productivity has increased thanks to technology. I can't say whether the IT world would be more convenient if 95% of us were using Linux.

    It's like when cars were first introduced: there were no speed limits, cars were hardly ever locked, and tyres had hardly any tread......

    As cars became more common, more people died in car accidents, so you can't drive too fast anymore, must wear seatbelts, and cannot drive drunk.

    As car theft became the norm, we had to lock our cars; when that wasn't enough, we needed to add the steering lock, the alarm, then the immobilizer, and now the security datadot. However, I think overall we do benefit from the introduction of vehicles.
    • Re:Technology (Score:4, Insightful)

      by eobanb (823187) on Wednesday January 18, 2006 @08:01PM (#14505675) Homepage
      The issue is not with the equivalent of locking your car. The issue is draconian policies like the arbitrary blocking of sites like Google Groups. Therefore, I feel your analogy isn't right for the article, in that it assumes that "well, there are good and bad things about computers, but the good outweighs the bad." No one's arguing that point. Instead it's more like, "well, there are good and bad security policies. At what point does it become simply stupid?"
      • Re:Technology (Score:5, Informative)

        by CleverFox (85783) on Wednesday January 18, 2006 @08:15PM (#14505773)
        Being in corporate IT security at a large corporation, I can tell you why Google Groups is blocked. If I am looking at porn on alt.binaries.erotica and a female co-worker walks up behind me, she could sue for sexual harassment and say the company did not take adequate measures to prevent this situation. Basically, they fear a lawsuit.
        • Re:Technology (Score:3, Interesting)

          by pete6677 (681676)
          What if you were sitting at your desk "reading" a Penthouse instead? Or looking at porn pictures on your computer that you brought in on a flash drive? Where would the company's liability end? I'd say firing an employee that generated complaints by looking at porn in the office would be adequate.
        • Porn liability (Score:4, Interesting)

          by typical (886006) on Wednesday January 18, 2006 @10:34PM (#14506529) Journal
          Being a corporate IT security at large corporation I can tell you why google groups are blocked. If I am looking at porn on alt.binaries.erotica and a female co-worker walks up behind me she could sue for sexual harassment and say the company did not take adequate measures to prevent this situation.

          My understanding is that the hoopla about "if you don't block pornography, you're liable" is nonsense that's heavily propagated by vendors of filtering software. The case that the liability claims are based on is the '91 ruling in Robinson v. Jacksonville Shipyards, Inc. There, the plaintiff was being directly targeted, and porn was being publicly and pervasively displayed throughout the workplace. That's a *far* cry from someone walking in and seeing a pornographic image on someone's computer monitor. That's even *further* from a company being liable simply because it isn't buying a filtering product.

          My impression is that most of the people who install these packages get sold a bill of goods by the filtering vendors: "Lawsuits! Lawsuits!" The IT people pass the possibility of a lawsuit on up, and some higher-up decides that the software is cheap insurance against a lawsuit and buys it.

          Frankly, companies don't need to worry about liability from not filtering porn (IANAL and all that). They might need to worry about employees being off-task (I mean, come on -- if you're browsing porn, you are *not* doing work). However, I've been incredibly frustrated by stuff in the past: pages containing "wine" in the URL being blocked when I'm trying to look up constants in WINE's header files, and information about HTTP tunneling that I needed for writing some software that had to interoperate with a firewall being blocked (as "criminal activity", impressively enough, along with anything involving a "proxy"), and so forth. Companies aren't avoiding liability at all -- they're trying to control employees and keep them from goofing off at work. I'm not saying that there's necessarily anything wrong with that, but it's just not really a liability issue. I've seen people blow time chatting with their friends about non-work-related stuff on AIM, and I can understand the desire not to let the computer be an entertainment device.

          However, I've got a much better solution. Have software that skims browsing history, flags anything suspicious, and allows an employee's boss to take a gander at it (if he really wants to). Oh, and *tell* the employee that you plan to do this -- the idea is to prevent abuse. I don't have a problem with my boss seeing a complete log of my at-work browsing history -- I do have a real problem with IT blocking things. I don't abuse my work connection, and it's really irritating to be treated as if I have because someone somewhere *has* done so.
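
          A minimal sketch of that log-and-review idea, assuming a trivially simple "user URL" log format and a made-up keyword watchlist (a real product would be far more nuanced):

          ```python
          WATCHLIST = ("warez", "casino", "torrent")  # made-up example keywords

          def flag_entries(log_lines):
              """Collect (user, url) pairs for later human review instead of
              blocking anything up front. Assumes 'user url' per log line."""
              flagged = []
              for line in log_lines:
                  user, url = line.split(maxsplit=1)
                  if any(word in url.lower() for word in WATCHLIST):
                      flagged.append((user, url))
              return flagged

          log = [
              "alice http://wine.example.org/include",  # WINE headers: not flagged
              "bob http://warez.example.net/dl",
          ]
          print(flag_entries(log))  # → [('bob', 'http://warez.example.net/dl')]
          ```

          The design point is that nothing is blocked at request time; the filter only queues entries for a human to look at, and employees are told the log exists.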

          Basically, I think that it's probably unreasonable to prevent the following types of Internet usage in a regular work environment, at least from a security/liability standpoint:

          * Outbound TCP connections, other than maybe to port 25. The whole world is not HTTP.

          * Requests to DNS servers other than the company one (why on *earth* do people do this?)

          * Outbound SSH connections (a special case of the above that's particularly annoying -- sometimes I need to get at my address book or something else on my home computer). There is a small potential security issue here, in that someone could set up X11 port forwarding and have a compromised outside box keylog or screenshot their workstation desktop, but goddamn it, the risk is awfully small and the loss of functionality enormous. This is not James Bond, and armies of ninja hackers are not out trying to take screenshots of desktops.

          * Access to webpages. Good *God*. If you have to log them, fine, but for Chrissake, do not filter. It's *so* irritating.

          Real security risks? Worms, dubious software that people intentionally install, people simply taking confidential (*actually* confidentially, not doc
        • Re:Technology (Score:3, Informative)

          by cmacb (547347)
          As far as I know, Google Groups doesn't carry binaries of any kind, nor does it carry any of the groups in which you would likely find text porn. It does have technical groups back to the beginning of time, though, and I've used them more than once for technical research.
      • Re:Technology (Score:3, Insightful)

        by Kyosuke77 (783293)
        But then the question is do they have legitimate reasons for doing things like browsing Google Groups? A friend of mine works for RBC Royal Bank as a personal banking manager. Their network is so restricted, he can't access Hotmail.

        Yet why does he need to access Hotmail from his work computer? Besides, he can just access it from his Treo, on which he has an unlimited data plan. I don't see that as onerous security, and neither does he. They're a bank for goodness sake! They have very good reasons for
    • by hackstraw (262471) *
      I think overall mankind's productivity has increased thanks to the technology. I can't say if the IT world would be more convenient if 95% of us were using Linux.

      I believe that CAD, CAM, robots, genetic engineering of crops, and assembly lines have much more to do with it. Well, I guess all of those things are technology. I love Linux. It has more creature features than "real" Unix OSes. FreeBSD 4.9's 'ls' still does "ls -ke
      ls: illegal option -- e
      usage: ls [-ABCFGHLPRTWabcdfghiklnoqrstu1] [file ...]"

      Thank
  • It's all possible... (Score:5, Informative)

    by jabella (91754) * on Wednesday January 18, 2006 @07:54PM (#14505622) Journal
    Security, like most things, is a balancing act. Being able to manage the 'pain vs. protection' factor is the key to all of it, and unfortunately no tools seem to have a sliding adjustment with those options on them.

    Ideally security will allow everything that's vital while not stepping on any services that are required. With most companies, what is 'required' ends up being pared down as the security net gets closed down tighter.

    Nostalgia is one thing -- how many of us worked on systems that had telnet / ftp open to the outside without a firewall? I know I did back in the day. When management is behind security initiatives, being able to work on the business issues ("No, we CAN'T disable FTP!") becomes less of a problem.

    Regarding individual workstations -- putting the burden on end-users doesn't seem to be a common (thankfully) configuration in the companies I've seen. Most larger places are doing automated patch management and deployment now. I know quite a few places where every single system (desktop and production) is patched within a 15 day window. While it's not bleeding edge, this relatively fast schedule combined with the concept of 'defense in depth' goes a long way to preventing issues. I know places that haven't lost a machine to a virus in YEARS.

    Security that's preventing legitimate work from being done needs to be adjusted. All of the problems you've mentioned are fixable.
    • by Alaren (682568) on Wednesday January 18, 2006 @08:07PM (#14505720)

      I agree with most of what you've said, but there are two major problems:

      The first is with the "appearance of security." Oftentimes management will hand down edicts based on something they've heard or read or even something a customer (when doing business with other businesses) has demanded. They may not understand why or how the security measure is preventing legitimate work from getting done. All they care about is that they can say "we have security measure X in place." In some cases they do understand that the problem hurts legitimate work, but believe for whatever reason that employees can/should adjust accordingly.

      Second, security is often used as an excuse for "enabling workers through managed limitation of potential distractions." Increasingly, employers are concerned that one of their employees might not be thinking about work every second of every day. This stems from an unfortunate misunderstanding of the bounties technology has brought us. Instead of thinking (as they should) "I pay Joe to accomplish X task," they think, "I have purchased Joe for X hours." Hours are good, they think, because hours are quantifiable, but it makes more sense (especially in the tech industry) to tell people: this is your task. I don't care what you do between now and next month, so long as your task gets accomplished.

      Maybe that's too utopian of me? I guess I just have a problem with a society that is increasingly able to accomplish great things in short periods of time insisting that the extra time must be filled only with more drudgery.

      • Yes, security is most definitely being used as the stick to beat end-users down as far as 'distractions' go. I have had the fortunate experience to work for a company where the motto is:

        "It's the result that matters."

        If you spend time on slashdot or other forums during the day that's ok (and most definitely not filtered) -- but at the end of the month you have XYZ to get done. If you get it done by working nights / weekends that's your prerogative. Flexibility like this is one of the reasons why we've ha
      • by Pig Hogger (10379) <`moc.liamg' `ta' `reggoh.gip'> on Wednesday January 18, 2006 @08:47PM (#14505923) Journal
        Oftentimes management will hand down edicts based on something they've heard or read or even something a customer
        ...
        They may not understand why or how the security measure is preventing legitimate work from getting done.
        That's because, in case you haven't noticed, management does not do any legitimate work.
  • by yagu (721525) * <yayagu@@@gmail...com> on Wednesday January 18, 2006 @07:57PM (#14505636) Journal

    One time for security's sake my office ethernet port was turned off by IT. Figuring it to be some outage I called support (hah!), and they looked up my IP address and said yes the port had been turned off because my machine had refused to accept recent XP updates.

    Hmmm, but my machine is a linux machine! We're sorry, but until your machine accepts the updates we can't re-enable the port. I asked why I hadn't been notified -- they said ALL XP login scripts had been posting the notice for over a week; I had been given "plenty" of warning!

    Hmmmm, but my machine is a linux machine! We're sorry, but until your machine accepts the updates we can't re-enable the port.

    Fortunately I had a dual-boot, so I was able to comply.

    But, ironic that one of their (in my opinion) least vulnerable machines on the network was mine.

    (And, for the record, my assigned work had no specific XP requirement, and my responsibilities were heavily around Unix... so I wasn't in violation of any policy (such as they existed).)

    • by badriram (699489) on Wednesday January 18, 2006 @08:02PM (#14505678)
      Well, if IT installed Linux, they should not be doing something that stupid. However, if you decided to install Linux and the IT folks maintain your computer, I would have to agree with them. Unless you work at a software company developing apps, or you're a sysadmin, you are outta luck.
      • by Vellmont (569020) on Wednesday January 18, 2006 @08:18PM (#14505783) Homepage
        He said his responsibilities were heavily around Unix. I kinda doubt he's some low level secretary that wants to install linux for fun. Why not give him the benefit of the doubt and assume he's not in the wrong here?

        I'm guessing the problem is one of compartmentalization. The IT department doesn't talk to the production department, and so doesn't know there are some people running Linux and not XP. The standard drone-like response of "We're sorry, but until your machine accepts the updates we can't re-enable the port." really sounds to me like extreme compartmentalization.
    • Honestly, if I was you in that situation, I would have simply sat back and explained that you could not do any work, and that they are free to try and turn on Windows XP updating, but oh of course any system re-installation and thus potential loss of data would be their fault, not yours, at which point you launch a flurry of complaints to whoever is even higher up in the corporate chain of command.
    • Oddly enough, I'm going to be replying to your sig, but in this case it's actually rather on topic.

      If enough virus writers made viruses for Linux security vulnerabilities frequently enough that it necessitated monthly or even bi-weekly kernel updates, would not the statement about Windows in your sig then apply to Linux?
      • Totally wrong. One of the major flaws in Windows is that you can't replace any file that is currently open, and since the major system libs are not modular, nearly any patch issued by Microsoft requires a reboot.

        On any Unix system, you can update anything except for the running kernel (actually, you can replace it on the disk but can't reload it). In the case of Hurd, you can update even it.

        Since security updates to the kernel itself are pretty rare, you hardly ever need to reboot at all. This enables
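The parent's point about replacing open files can be sketched in a few lines of Python. The file name is made up for illustration; the behavior shown is the Unix inode semantics the post describes (a process with the file open keeps the old inode until it closes its descriptor, while new openers get the replacement):

```python
# Sketch: on a Unix filesystem an open file can be swapped out in place,
# which is why library updates rarely force a reboot. "libfake.so" is a
# made-up name for illustration only.
import os

path = "libfake.so"
with open(path, "w") as f:
    f.write("old library\n")

held = open(path)                  # stands in for a running process

# Write the update beside the original, then atomically swap it in.
with open(path + ".tmp", "w") as f:
    f.write("new library\n")
os.replace(path + ".tmp", path)

old_view = held.read().strip()     # the holder still sees the old inode
with open(path) as f:
    new_view = f.read().strip()    # new openers get the replacement

print(old_view, "->", new_view)

held.close()
os.remove(path)
```

On Windows, by contrast, the open handle locks the file in place on disk, which is why patches to in-use system libraries get queued for the next reboot.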
      • To answer your Linux question: if cyber terrorists were able to get hold of the Windows and Internet Explorer source code, would they be able to continually target and take over every Windows box connected to the Internet and wreak financial havoc on businesses around the globe? (Microsoft itself acknowledged it was an extreme security risk.) If you work in computer programming, don't become stuck on possible failures when trying to avoid a known failure: Windows security.
      • Maybe, maybe not. But I don't think so. Consider that MOST patches for Windows (any version) call for a reboot, so downtime just happened. Many patches in Linux don't require the system to be brought down. Sure, you might need to bring down a service or two, but that would leave the system still up to fill other requests.

        RonB
        • Well, my question was more hypothetical than anything. I was talking about kernel updates, though, which I know for a fact always require reboots on Linux. The way I see it, Windows is under constant security siege, and I was posing the question that if Linux's security were under that same siege, so that monthly kernel updates were necessary for safe operation, would it not then need reboots that frequently as well?
    • by Thuktun (221615) on Wednesday January 18, 2006 @08:13PM (#14505761) Homepage Journal
      Hmmm, but my machine is a linux machine! [...] Hmmmm, but my machine is a linux machine! [...] Fortunately I had a dual-boot, so I was able to comply.

      Yeah, weird that they might want a machine running Windows XP to be updated. You might have Linux on the machine, but you also had Windows XP, and it sounds like it was missing security patches.

      And, for the record, my assigned work had no specific XP requirement, and my responsibilities were heavily around Unix.

      And you apparently had a machine with Windows XP missing some (possibly significant) security patches sitting on their network.

      I fail to see how this was stupid of the network admins. Draconian maybe, but it got you to apply the security patches.
      • Why it's stupid (Score:5, Insightful)

        by Gorimek (61128) on Wednesday January 18, 2006 @08:37PM (#14505880) Homepage
        The stupid part of the story (as told by the poster) is that these IT "professionals" didn't seem to understand that Linux is incompatible with XP.

        Why are people who don't comprehend - or can't communicate - this employed in an IT organization??

        Had they just explained things the way you explain them in your post, there would be no problem.
        • by IAAP (937607) on Wednesday January 18, 2006 @09:06PM (#14506035)
          Why are people who don't comprehend - or can't communicate - this employed in an IT organization??

          You, sir, need to accept the bureaucratic nature of large organizations. There have been a few times that I've had to do some really asinine things in order to keep my job. I knew it was bullshit, my coworkers knew it was BS, and the poor SOB on the other end really knew it was BS. But if either of us strayed from policy, it was our asses. Why was this policy in place? Because the higher ups didn't want to take the time for all of the inevitable exceptions that occur.

          The solution? Acceptance - Zen practice. Or start your own organization - if possible. Entrepreneurship!

          There's a reason why small companies are the ones that are creating most of the jobs. There's a reason why small companies are the innovators. There's a reason ... you get the idea.

      • by Savage-Rabbit (308260) on Wednesday January 18, 2006 @09:02PM (#14506014)
        Yeah, weird that they might want a machine running Windows XP to be updated. You might have Linux on the machine, but you also had Windows XP, and it sounds like it was missing security patches.

        The fact that he hadn't noticed the login scripts for over a week indicates to me that he didn't use his XP installation at work a lot, and even then, how can you assert it wasn't patched? He may even have had to wait until a new patch became available to qualify for a connection because his XP installation was already fully patched! Offhand, I'm guessing this guy got issued a laptop from his employer, installed Linux on it for day-to-day home and work use, and dual-booted with XP mostly for gaming and perhaps for that once-in-a-blue-moon case where he couldn't get something done at work with Wine+[Random M$ application].

        I fail to see how this was stupid of the network admins. Draconian maybe, but it got you to apply the security patches.

        It is stupid because they could have exempted him from their Windows-specific policy quite easily. It is stupid because they may even have given him a hard time because they didn't know how to exempt a non-Windows boxen from their MS-specific setup. All it would have taken was to send somebody upstairs to check out his setup for security and, if it was OK, adapt the policy. If you are an IT tech who works a lot around engineers, non-MS admins, or programmers, you are going to have to get used to cases like this (ie. escaped mental patients who use Linux or OS X in a corporate environment), and unless you find out how to cater to people running non-MS operating systems you will quickly find out that you haven't got any friends willing to do you a favor when you really need it (ie. when you have screwed up and need a quick fix from the local nerds).
        • It is stupid because they could have exempted him from their Windows specific policy quite easily. It is stupid because they may even have given him a hard time because they didn't even know how to exempt a non Windows boxen from their MS specific setup. All it would have taken was to send somebody up stairs to check out his setup for security and if it was OK adapt the policy.

          But it wasn't ok. He had a dual boot system, with one of the OS's way behind on patches. That's not secure. Any time he reboot

    • They were right. (Score:5, Insightful)

      by lheal (86013) <lheal1999@yaho[ ]om ['o.c' in gap]> on Wednesday January 18, 2006 @08:28PM (#14505843) Journal

      You should have simply rebooted to the XP side and run the updates. If you want the luxury of a dual-boot system, you should be willing to maintain both halves.

      My policy for dual-boot machines is this: No. You can have two machines. I'll get you two monitors you can use dual-head on each machine, a KVM, your own switch, and I'll even clean the goo off your keyboard. But I won't manage a dual boot machine, and I don't want them on my network.

      Why?

      • One side is always down, meaning network monitors need special work
      • Either both sides share one IP address, or each gets its own. Either figure out which one is running, or figure out which address to use.
      • It requires physical intervention (or extraordinary hacks) to reboot remotely to the other OS
      • I can't just wax the whole thing if something goes wrong
      • Rebooting implies root access for whoever is around
      • In short, they're a PITA
      • My policy for dual-boot machines is this: No.

        Realistically, it seems like there are really two ways to go here. Either build an environment in which all elements can be rigorously locked down and validated, or be prepared to contain the effects of allowing people to attach foreign equipment such as laptops or other systems that they maintain to their own standards.

        Security comes down to defining the conditions of ownership and trust at each point in the computing environment. That's something agreed a

  • by heatdeath (217147) on Wednesday January 18, 2006 @07:57PM (#14505637)
    individual employees are forced to apply security patches with little or no notice, under threat of their machines loosing network access

    I don't think this is unreasonable at all. What's the downside of enforcing a little rigor in your employees, when the alternative is having your entire corporate network become a zombie farm overnight, controlled by a mob boss in Russia named Vladimir?
    • About your sig,
      I used it as the OGM for my phone and you would not believe the number of hangups I got!
      -nB
  • by MicroBerto (91055) on Wednesday January 18, 2006 @07:58PM (#14505640)
    What "we"?? The company I work at does none of those things, and the network runs almost perfectly. There is a balance.

    But also realize how much the worms of 2003 and 2004 cost corporations. I saw it first hand when working in a plant, and it was seriously disastrous. I can understand why they don't want that to happen again.

    If surfing "bad" sites is THAT important to you, perhaps it's time to get your resume out to a company that trusts its employees more. Or quit complaining to a bunch of slashdotters and present a true solution that benefits everyone. There are ways to have both security and usability.

    • What "we"?? The company I work at does none of those things, and the network runs almost perfectly. There is a balance.

      Sure there's a balance. Don't rely on Windows. It's quite simple. No draconian security policy needed (blocking Google Groups? Whiskey Tango Foxtrot?), AND there's but a miniscule risk of malware infection.

    • If surfing "bad" sites is THAT important to you, perhaps its time to get your resume out to a company that trusts its employees more.

      How do you know he's not about to do exactly that, but first wants to know if the draconian security policies are the norm and not the exception?

      Or quit complaining to a bunch of slashdotters and present a true solution that benefits everyone. There are ways to have both security and usability.

      And why isn't asking for help from peers a good way of trying to find that exact sol
    • Yea, he should polish his resume, but how many /.'ers download (or used to) MP3s, movies, warez, etc. over their corporate connection because they don't/didn't have a high-speed connection at home?

      For some companies, it is cheaper to just lockdown the network and reduce efficiency, than it is to have to spend $$$$ on playing whack-a-mole with computer problems as they show up. Or to deal with bandwidth issues because someone is leeching like crazy over the company connection.
  • Sorry... (Score:5, Funny)

    by Necrotica (241109) <cspencer.lanlord@ca> on Wednesday January 18, 2006 @07:58PM (#14505648)
    What is the situation like at other companies?

    I'd love to tell you but that would be a breach of security.
  • Everywhere I've worked seven to ten years ago (1995-1999) made IT workers who wanted Internet access sign special forms that had to be okayed by three levels of management before Internet access was granted. And once granted, it was heavily monitored.

    Four to seven years ago (2000-2002) getting Infobahn access was far easier, but most companies still required that you use their proxy so that they could monitor who visited which sites and who spent more time posting to /. than checking code into CVS.

    But latel
  • - Google Groups doesn't sound like a business website. That's "bad" from a management perspective.

    - SMTP blocking would not be needed if users didn't keep clicking on emails from the "FBI", "CIA", etc. Besides that, it's easy to configure an AV policy to exempt legitimate SMTP usage.

    - Updates can and should be applied automatically and without user intervention. If a reboot is required a nightly shutdown policy will suffice.

    I'd love to live in a happy land where all computers can be open and free but unfort
    • and if you work for the "FBI" "CIA"?

      Man "sorry boss, I couldn't check your email, it was from the FBI."
      FBI Head honcho: "we ARE the FBI IDIOT!"
      Assistant: "That's no way to talk to the president!"

      Rimshot!

  • I think that there are too many companies where somebody just decides that iTunes purchases and downloading podcasts specifically through iTunes are not a good use of resources, yet we are an educational institution that can have VALID reasons for purchasing music and downloading podcasts. There's a programmer who creates...things that are put into our login scripts to kick off antivirus scans at every reboot, scan inventories, and update records at every login, among other things. It's to the point that
  • Personally (Score:2, Interesting)

    Being a member of the IT dept. at a school district, I am glad our security policies are as stringent as they are, when you have a few thousand teenagers trying to download as much spyware and pr0n as possible. Now you may say most businesses don't have teenagers as employees, but even the teachers need to be protected from themselves, because they don't know any better. What I'm getting at is, if he thinks it's hard to get stuff done with his security policies, wait one week without them and see what he can do.
  • At big big US government agency they block jakarta.apache.org because it is a "hacker tools site". Ironically the majority of their own stuff runs on Tomcat, et al.
  • Has Corporate Info Security Gotten Out of Hand?

    Obviously it still needs work.
    google: stolen customer data [google.com]
  • by Saint Aardvark (159009) * on Wednesday January 18, 2006 @08:09PM (#14505736) Homepage Journal
    • Your company's proxy policy is a matter of policy at your company -- complain to them about it! If it's preventing you from getting work done, you should have no problem convincing them -- and if you do, light a fire under your manager; that's what managers are there for.
    • "the sending of email via SMTP" -- Maybe I'm misinterpreting this, but if you mean "our desktops and servers have to pass email to the designated relay", then I'm completely unsympathetic. If your complaint is about poor performance, complain about that -- but your desktop and your production machines are not mail servers!
    • "forced to apply security patches with little or no notice" -- I can guaran-fucking-tee you that each time that happens there is a wave of complaints to your IT department. And yet they keep doing it anyway. They're either heartless, bastard psychopaths with no concept of sympathy, or it's important to apply these patches. Human nature being what it is, I'm willing to bet they think it's important...no one lets themselves in for a shitstorm voluntarily just 'cos it's, you know, second Tuesday of the month.

    And, why, yes I am a network administrator, thanks. I'm lucky so far -- it's a small company, people are well-behaved, and I don't have to implement the policies you describe. I set up times for patches, there's no proxy yet and not too many firewall restrictions.

    But if this place gets to be big enough that I can't count on collective intelligence and/or social pressure to keep people doing the right thing, I'm going to have to seriously consider policies just like the ones you describe, in order to keep things running as they need to -- because your complaints about the network not working 'cos of the latest virus outbreak are going to be a fuck of a lot louder than your complaints about your desktop machine not being allowed to be a mail server.

  • None of the stuff you mention bothers me, except occasionally when a site I need to access is mysteriously blocked.

    What does create havoc (and I jump in with this in every one of these discussions because it can't be said enough) is the insanity with multiple, long, complex, frequently-but-out-of-sync changed passwords. It causes huge hassles, prevents users from taking advantage of resources and is an absolute disaster for security.

  • Of course, when companies get nonsensical security policies, they force people into horribly inefficient and/or insecure workarounds.

    Rather than issuing in-office consultants a company e-mail address, they have them CC a Yahoo.com e-mail address, which besides being insecure and unaudited just looks damn unprofessional.

    Don't have a document management system, SFTP, or even FTP? People clog up Exchange with huge attachments with no central control or even a sense of where the authoritative copy of something can be found.

    How
  • by ayelvington (718605) on Wednesday January 18, 2006 @08:14PM (#14505767) Journal
    I work in a .mil environment with managed images and very good security. What I'm reading is that your company is still in the learning phase when it comes to customer service balanced with security.

    We operate under a standard image architecture with updates and patches pushed out across the enterprise. Proxy servers are a necessary evil, but we are very reasonable on our block lists. (North Korean sites are discouraged along with Ebay...) This is for our unclassified network...

    We learned the hard way too. Our first generation of machines were issued with padlocks on the cases and no CDROM drives...

    Our IT system never compromises operations for security, and it never has to. Your IT staff may need a bit of fresh air, a few customer-centered workshops, and maybe some field trips to see how others work.

    I feel your pain and wish you the best.

    ay
  • by canuck57 (662392) on Wednesday January 18, 2006 @08:20PM (#14505802)

    What is the right balance between security and productivity, in the corporate IT environment?

    Simple: more security. More secure systems tend to run more reliably (fewer bugs) and with lower maintenance (no root kits to remove) than less secure systems. Knowing most corporate environments, security tends to be lax.

    Looking back at my company, 10 years ago, our machines were connected directly to the Internet, no proxy, no firewall, no antivirus software.

    Yes, it was better more than ten years ago. If your computer was connected to the internet and caused someone problems you got kicked off for a week or two to think about it. Some were even blacklisted. And few if any ran Microsoft products as their gateways or terminals.

    But the fact is with many hundreds of millions of Internet users today practicing self administration of an inherently insecure OS and trusting everything they click on -- without regard to others or their companies costs, security has had to evolve. And believe it or not, firewalls existed 10 years ago.

    Then along comes the modern cowboy on an unmonitored cable connection, hacking people for sport and profit. People hack computers just to send spam, and the system/ISP do nothing. They have long since abandoned kicking them off. The result is that the problem is now rampant.

    have we become so secure that we're stifling our own ability to get things done?

    Not at all, I have always kept important stuff on UNIX and Linux, and professionally manage them like I do at work. They haven't been hacked or wormed. I also tend to use "safe" tools as they also fail less as well are more secure.

    But the optimum answer to be secure is to use securable tools and secure practices in what you do with your computer, something like safe sex.

  • Try a University (Score:3, Insightful)

    by froschmann (765104) on Wednesday January 18, 2006 @08:21PM (#14505808)
    Heh, my Christian University is a lot worse than that. We have mandatory antivirus (which seems to run scans at the most inconvenient times; cancel them and you get kicked off the network). We also have to run all traffic through an HTTP proxy, because they block all outgoing port 80 traffic. The HTTP proxy logs all traffic, which is then sent to our deans and hall directors, as well as kept on record forever. In addition, it blocks such disgusting websites as Ebaumsworld and hackaday (hacking is illegal, kids). It can be loads of fun trying to get programs without proxy support to work. We also get AIM file transfer (for my non-geek friends from home) disabled, along with bittorrent and pretty much every non-HTTP protocol. They even have a packet shaper which detects traffic on the wrong ports and blocks it, so forget about using a proxy. Internet access at school can be much worse than at a workplace... Thank the gods for PGP and dial-up!
  • You made me laugh. (Score:2, Insightful)

    by catahoula10 (944094)
    " Looking back at my company, 10 years ago, our machines were connected directly to the Internet, no proxy, no firewall, no antivirus software. Today, my company's proxy server blocks access to: 'bad' web sites (such as Google Groups; our 'antivirus' software prevents our machines (even machines that host production applications) from carrying out legitimate functions, such as the sending of email via SMTP; and individual employees are forced to apply security patches with little or no notice,"

    Of cours
  • Years ago people didn't lock their doors because everyone knew each other. Years ago you didn't need a firewall in many cases and these things weren't on your mind. Times change and you have to protect yourself.

    Many of the complaints in the submission sound like bad IT or mis-directed policy. AV might block a server from sending SMTP mail, but how is it supposed to know it's legit? The IT staff should be telling it which is legit. Users shouldn't be responsible in a corporate environment for patches an
  • but employers do have a right to dictate what happens on their own property. (Although some employers are abusing this right now to dictate what happens on their employees' property, which must be stopped, and soon.)

    Any employee computer activity on the job, especially internet activity, is a potential liability for the company, and if you browse to the wrong site you can get hit with spyware, cookies, etc. that could compromise the security of the network. Get nailed with a keylogger cookie and all your int
  • How about too many accounts and strict passwords? That part drives me nuts.
  • Unplug, people. (Score:4, Insightful)

    by ubiquitin (28396) * on Wednesday January 18, 2006 @08:34PM (#14505863) Homepage Journal
    Security has very little to do with updating your virus definitions hourly, and everything to do with knowing when to just unplug the box and find another way to get the job done. What's your risk model? Point granted: the network is a demanding mistress. But fortunately, everyday risk is often handled best by the simplest of means. Stop instant messaging the person one cubicle over, and get to know your local coffeeshop owner. Or neighborhood banker.
  • I am probably one of the only Mac users on a large (50000+ employees) network. I get practically daily messages about patches, reboots, viruses, malware, etc. from corporate IT. I ignore them, and simply keep my computer up to date via Software Update. Ironically, my computer being on the network technically violates IT policy. If I were to follow IT policy, I wouldn't get work done. Why can't IT leave people alone, especially in technical (engineering) environments?
  • I have run into this problem at my college as well. Virtually every port is closed except those needed for http, https, ftp, and smtp. I cannot use RDP, SSH, or VNC to check on my servers at home or at work. Frankly, with better security implementation they could allow these services to students without compromising themselves too much. I think it is mostly just the higher-ups in the college who are all concerned about "piracy" and hackers.
  • Really, the only people that aren't a security risk are the ones who can easily get around it, if they need (or want...) to. The average luser will cause more problems than this security will. The key to this though, is punishment of those who circumvent security. At my school, I regularly aid even teachers in getting freemail access, around the filter, etc. They trust me because they know I'm smart enough to do this, and not do anything stupid with my 'superpowers'. Most of them are well aware tha
  • I develop display software for US military aircraft. IT wants the company to switch from UNIX boxes (Suns) to Windows. Need I say it sucks? Windows screws up the case in filenames. The machines aren't set up to carry your environment from one box to another. They have to be rebooted at least every couple of days. There's so much useless crap loaded at boot that they've already consumed 300MBs of RAM before you log on. Then when they are running they're constantly probed by the mother ship. We have the block
  • (term coined by Bruce Schneier, AFAIK)

    What bothers me more than the company turning down the screws to secure things is when they turn down the screws to secure things, without really accomplishing that end. I certainly won't disagree with a software maintenance policy, for Windows, Linux, and everything else. Nor will I disagree with firewalls and enforcing company policies across them.

    But if I were to tell some of the more boneheaded things that are ALSO done, and the holes obliviously left open, you'd ei
  • I work for a company which has a very restrictive policy. All PCs are centrally managed, monitored, patches are remotely applied, internet access is very strict (only ports 80, 443 outbound allowed). All access is via corporate proxy server with layer 7 filtering. Every outbound access is logged.
    However, despite these measures I can still use JAP or Tor to access any site. I can still ssh (via ProxyTunnel) to my home PC over port 443 (that's where my sshd runs). Basically, it just means I have to go through hoop
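    A minimal sketch of the tunneling trick the parent describes, using proxytunnel as an ssh ProxyCommand (the proxy and home hostnames here are made up for illustration):

    ```
    # ~/.ssh/config -- tunnel ssh through a corporate HTTP proxy to a
    # home box whose sshd listens on 443, so it looks like ordinary HTTPS.
    # proxy.example.com:8080 and home.example.net are hypothetical names.
    Host home-via-proxy
        HostName home.example.net
        Port 443
        ProxyCommand proxytunnel -p proxy.example.com:8080 -d %h:%p
    ```

    Then `ssh home-via-proxy` goes out through the proxy like any other CONNECT request, which is exactly why port-based filtering alone doesn't stop a determined user.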
  • What is the productivity of a system full of spyware/viruses? Usually, just about zero.

    If you can restore a system in a matter of minutes (deep freeze), then maybe it's not such a big deal to have a secure system. But if it takes an hour or a day, then it's a bigger deal.
  • > [...] individual employees are forced to apply security patches with little or no notice, under threat of their machines loosing network access [...]

    One thing you DON'T want is your network getting all loose; the bits fall out everywhere and it's very messy.

    So keep your network tight! Apply patches!
  • So the question that LEAPS out at me, is how can they block groups.google.com as being a "bad" site, and still allow access to slashdot? WTF??

    Seriously, one of the problems has a relatively simple solution. Antivirus is running, and blocking SMTP. I am assuming that you run an "enterprise" edition of some anti-virus software. They probably have one group policy set for all machines, since everyone uses Outlook or something... This is not taking into account your group's machines, that need it to get work
  • by justin_w_hall (188568) on Wednesday January 18, 2006 @09:32PM (#14506180) Homepage
    Disclaimer: I work on the security team for a rather large (Fortune 5) corporation.

    I would say, compare the environment of the public internet to how it was ten years ago. Would you place your unpatched Windows machine directly on the public internet now? You have (roughly) ten minutes before another infected machine exploits one of the dozen out-of-the-box vulnerabilities that will allow it to run anything it wants on your PC. Not the case ten years ago.

    Unfortunately, what was once a rather quiet suburb filled with geeks posting to Usenet and using Mosaic is now a post-nuclear, disease-filled demilitarized zone where so many infected systems simply sit and try to infect others that a defenseless machine (or a network of them) is doomed.

    Trying to manage security in this environment is a much more difficult job than it ever has been, and every month that goes by makes it more difficult. We shudder on the second Tuesday of every month at what new terrifying vulnerability Microsoft will tell us is in their product that's deployed on a hundred thousand machines on our network. We plead with other IT teams (networking, server admins, client admins) to implement our tools and software and protect the environment, but most of them get pushed to the back burner, either because it's "too invasive", i.e. it annoys the end user too much; or it costs too much; or they just don't have the time.

    Then MS05-039 [microsoft.com] is released. We plead and plead for the patches to be distributed right away because of how severe the threat is. But users like the submitter can't stand to have their PC rebooted unless it's the absolute perfect time. Plus, we have 1700+ applications to test compatibility with the patch on, on hundreds of different PC environments. And it requires a service pack we don't have deployed everywhere, again, because it's too invasive.

    Then Zotob.E [symantec.com] gets into the environment, and shuts down large sites in a matter of minutes. Then people scream even louder! Where is security? Why didn't they prevent this?

    Because no one takes security seriously until it's too late.

    From a security admin's perspective, we never have enough resources or management support to fully defend against even the most prevalent threats. Because security (and, as most admins know, IT in general) is underfunded. Because of (very real) scenarios like I described above, we have much more support than we did, and things are improving.

    I guess my point is, step into our shoes for a few days. We don't enjoy being draconian - we like Google Groups as much as anyone else! But there are so many attack vectors that we have to be concerned about to protect the environment - and it only takes one. One of my co-workers is fond of the saying, "the hackers only have to be lucky once - you have to be lucky all of the time."

    I guarantee every IT admin reading this is thinking, well, if you did this instead of that, if you had two hundred guys on your security team, with all of them testing patches, while listening to every end user complaint and rectifying their situation immediately, you could stay out of the end-user's way! Trust me - we know. We wish our teams were as stacked as they should be. Heck, we wish it wasn't necessary at all to have to defend against stuff like WMF [microsoft.com], where any end-user clicking on a link from their IM buddy could get exploited in a second... we wish it wasn't like this. We wish things could go back to how they were ten years ago. The reality is, this is the internet we built and we are fighting to protect our assets from.
    • Amen.

      I'm a network engineer and have had to deal with Blaster and Zotob. It is evil what a Pentium 4 PC on 100 meg ethernet can do to a 1536 kb/s T1, or a pair of them. Everyone goes home at 5pm and in a couple of hours 100 sites have been scanned for vulnerable PCs, numerous sites are effectively down - legit connections are timing out. A couple of hours later the firewalls go down. A couple of hours after that - bye bye network. It's not easy to stop those Pentium 4 PCs on 100 meg ethernet.

      I was at a Fort
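      The bandwidth mismatch the parent describes is easy to put numbers on (back-of-the-envelope figures, assuming a worm can saturate the LAN link):

      ```python
      # One worm-infected host on 100 Mb/s Ethernet vs. a 1536 kb/s T1:
      # the LAN host can generate traffic tens of times faster than the
      # WAN link can drain it, so legitimate connections time out.
      lan_bps = 100 * 10**6   # 100 megabit Ethernet
      t1_bps = 1536 * 10**3   # T1 payload rate (24 x 64 kb/s channels)

      ratio = lan_bps / t1_bps
      print(f"One scanning host can oversubscribe the T1 ~{ratio:.0f}x over")
      ```

      And that's a single host; a few dozen infected PCs behind one T1 bury it beyond any hope of legitimate traffic getting through.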
      • A few windows pentium 4's can be nasty; a unix server is far worse.

        While I was attending binghamton university as a freshman, a SINGLE unix server got owned. It annihilated the entire dual OC3 campus network for nearly 3 days.
  • by whoppo (218875) on Wednesday January 18, 2006 @10:16PM (#14506417)
    A decade ago it was not unusual for corporate networks to have little or no restrictions on end users. Workstations, servers and even printers had publicly routable addresses and free access to the internet as it was. Back then we had to deal with relatively few miscreants... the occasional "ping of death", "teardrop" or the dreaded "smurf" attack. Malicious activities could be deflected by a few simple firewall rules.

    Flip the calendar ahead 10 years... The internet is rife with malicious content. Organized groups of crackers, writing exploit code for every system vulnerability imaginable... Script kiddies gaining "respect" relative to the number of machines they can compromise for addition to their bot-nets... Spammers building their armies of compromised boxes to anonymously sell viagra and fake rolexes... the list goes on and on. In short, the need for network security is real and sometimes the end user is inconvenienced in the process of running a tight ship.

    In an ideal corporate world, the bad guys would stay out and the users would have everything they want. In the real world there is a balancing act that weighs a security "best effort" against business needs. It sounds to me as if the original poster's company is in the early stages of making this happen. Security measures are being taken and users are feeling the pain. The next step is for the users to identify the needs that are not being met and challenge their management and IT resources to provide for those needs while making a best effort to do so securely. This, unfortunately, often involves plenty of corporate political bullshit and associated headaches, but if you can show a LEGIT business need, it should make it through the process.

    I manage all internet connectivity and perimeter security for a very large healthcare foundation that includes several hospitals, physicians offices and research facilities. Not a day goes by without some kind of request for additional access to some resource. Most are reasonable and can be accommodated with little or no impact on security. Some are not so reasonable and are politely rejected with a comprehensive explanation of why it's not gonna happen; where applicable, alternative solutions are offered.

    As for the original poster's situation... should end users be applying system patches? hell no. IT folks get paid to do that. Should individual workstations be sending SMTP traffic beyond the network perimeter? hell no! IT folks should make a suitably secured SMTP gateway available. Should users be able to go anywhere on the 'net they want? hell no! The company pays for the bandwidth and owns the workstations... they can say "no" to anything they consider to be unrelated to doing business. If users need to get somewhere on the filtered list, it should be easy enough to justify it to management. Do the homework and make your case... you'll get much farther than someone that just pisses and moans about how restrictive those IT bastards are.

      Best of luck.
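    One common way to implement the "suitably secured SMTP gateway" the parent mentions (a sketch only; the relay hostname is hypothetical) is to point every internal host at a single relay and block direct outbound port 25 at the perimeter:

    ```
    # /etc/postfix/main.cf on internal hosts -- send all mail via the
    # corporate relay (hostname is made up); brackets suppress MX lookup.
    relayhost = [mailgw.corp.example.com]

    # At the perimeter, allow outbound TCP/25 only from the relay itself,
    # e.g. with iptables on the gateway:
    #   iptables -A FORWARD -p tcp --dport 25 -s mailgw.corp.example.com -j ACCEPT
    #   iptables -A FORWARD -p tcp --dport 25 -j DROP
    ```

    Workstations still send mail normally through the relay, while a spam-bot on a compromised desktop can no longer talk to the outside world directly.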
  • by ocbwilg (259828) on Wednesday January 18, 2006 @10:32PM (#14506520)
    Looking back at my company, 10 years ago, our machines were connected directly to the Internet, no proxy, no firewall, no antivirus software.

    Looking back 10 years ago, your biggest threat was someone bringing a virus-infected floppy disk into work and taking down one of the 20 computers in your 50-person office. But hey, if you want to connect your PC to the Internet with no proxy, no firewall, and no virus protection, then be my guest. I doubt your PC lasts 24 hours before it becomes unusable.

    Today, my company's proxy server blocks access to: 'bad' web sites (such as Google Groups;

    And also very likely thousands of hacking, piracy, virus, worm, spyware, and phishing-related sites.

    our 'antivirus' software prevents our machines (even machines that host production applications) from carrying out legitimate functions, such as the sending of email via SMTP

    If it really is a legitimate purpose, you shouldn't have any problems being granted an exception for your specific case. Everywhere I have ever worked has done so.

    and individual employees are forced to apply security patches with little or no notice, under threat of their machines loosing network access, if they do not comply by the deadline.

    Ah, now I see. Your administration is incompetent. Under no circumstances should end users be installing security patches. They should be installed by administrators (if not automatically), and there shouldn't be any concern about cutting off non-compliant PCs because there won't be any. Anything less isn't security at all.

    have we become so secure that we're stifling our own ability to get things done?

    We haven't, but it sounds like the folks running the show at your place may have. But it also sounds like they don't know what they're doing either.
  • by Money for Nothin' (754763) on Thursday January 19, 2006 @01:16AM (#14507377)

    On one hand, you can never be too secure, however on the other hand, have we become so secure that we're stifling our own ability to get things done?

    Yes, you *can* be too secure. "Too much security" occurs when you can't get work done -- as in your case. The only *real* question facing corporate IT is "what amount of liberty is necessary to perform the duties of the employee requesting that access?" In true totalitarian style, the old computer security saying "that which is not expressly permitted is forbidden" is the basic principle of current corporate IT security.

    We have this same problem where I work. Thank shitty MSFT security for the current mess...

    On a related, more general note, security and liberty are *always* at odds. They logically must be: if you are restricted from performing action A, then you are not at liberty to perform action A. Simple as that.

    For a real-world example: if you are locked-out of somebody's home, then you are not free to open the door to that home. The home is secure against your entry (at least from this particular vector).

    Frankly, he who wants to be both safe and free will never have what cannot be.
