Security

Ask Slashdot: Is the World Better Or Worse Because of Security Tech? 126

Slashdot reader krisdickie is a developer for embedded devices (and many other systems) who spends a lot of time being proactive about security. He writes: This is obviously important, and I don't necessarily see it as a distraction, but rather as a complex problem with some added thrill to solving it. I can't help but wonder, though, whether I (and my team) would have been X times more productive, or would have come up with some amazing new concept or feature, if we didn't have to deal with implementing security measures.

In a utopian world, where there are no bad actors, we would likely have forgone many of the systems and ideas that have been put into place to prevent bad things from happening. So my question is -- are we more technically advanced because of the thoughtfulness that has gone into creating these systems?

Or are we just losing precious resources and time dealing with the necessity of protecting ourselves from the perilous few?

Share your own thoughts in the comments. Is the world better or worse off because of our ongoing development of security tech?
This discussion has been archived. No new comments can be posted.

  • Seriously? (Score:3, Insightful)

    by RobbieCrash ( 834439 ) on Sunday May 06, 2018 @12:52AM (#56561876)

    What an asinine question.

    Of course we're worse off because there are bad people in the world. If everyone was a magical completely altruistic person who did nothing but make the world a better place, the world would be a better place.

    • This article touches on many of the overall issues implicit in the "question", such as it is.

      "It’s time to kill the web"
      https://blog.plan99.net/its-ti... [plan99.net]

    • Security and convenience have always been at odds.
      Now, new security tech does help make some things a bit more convenient while keeping a reasonable (if not superior) degree of security.
      Biometrics such as fingerprint reading and face recognition allow you to keep devices secure enough against the casual bad actor (the majority of them). And advancements in encryption allow a lot of extra security to go on without much user interaction. But still it isn't faster and easier to use these system w

      • There is some truth in that. Sometimes there is a trade-off between certain types of security and convenience.

        Also, it's VERY inconvenient when the system goes down entirely because it wasn't secured. The easiest attacks are generally denial of service attacks, so if you pay no mind to security you can expect the service to be unavailable frequently. A bit of security would make things a lot more convenient.

        It's also pretty darn inconvenient when the system gives wrong results, such as when your bank balanc

    • by Sique ( 173459 )
      This mostly misses the point. Security is much more than protection against bad people. That might be the most visible and easily explained effect of having security, but security also protects against mishaps, mistakes, accidents, errors, all the little nuisances which disrupt the intended way of running things. Even if all people were saints, we would still need security.
  • by Anonymous Coward on Sunday May 06, 2018 @12:53AM (#56561880)

    Is the World Better Or Worse Because of Security Tech?

    Yes.

  • by Anonymous Coward

    admin/admin passwords, not rolling out patches, leaving anonymous FTP open... what can go wrong? this article was written by a dumbass

  • by walshy007 ( 906710 ) on Sunday May 06, 2018 @12:57AM (#56561896)

    This is not a one-case-fits-all item.

    What kinds of measures, specifically, are being spoken of? Do they help or hinder end users doing what they wish? Are end users even a consideration, or is this solely to keep a stranglehold on the device from a manufacturer's perspective?

    As with many things, there will never be a single answer; what is presented is a set of varying trade-offs whose value will change depending on the desired goals and on whose perspective is taken.

  • by Kobun ( 668169 ) on Sunday May 06, 2018 @12:58AM (#56561898)
    Human 'bad actors' are only one source of adverse conditions for computing. Many security features double as stability and error-checking features. I think that the author's question is ultimately a silly one because of Hanlon's Razor - "Never attribute to malice that which is adequately explained by stupidity". I think most people have seen terrifically destructive users who had no malicious intent behind their actions. Even in a utopia, humans are still human.
    • by Calydor ( 739835 )

      A good example would be the Melissa virus which, IIRC, started out as a proof-of-concept that accidentally got loose.

  • by dohzer ( 867770 ) on Sunday May 06, 2018 @01:02AM (#56561908)

    Not better or worse, but as it should be.

  • Sadly, worse -- because we have somehow allowed this great infrastructure we call "the internet" to be as full of (security) holes as a colander.

    At this point, we're just imitating the Dutch boy, quickly plugging holes in the dike while realizing that we'll run out of fingers long before all of the holes are plugged.
    • by dgatwood ( 11270 ) on Sunday May 06, 2018 @02:16AM (#56562098) Homepage Journal

      It's not even that. The answer to the question of whether security makes things better or not in general is straightforward: It depends on whether the cost of the security is enough of a nuisance to exceed the projected lifetime benefit. And that largely depends on context. I'll explain by analogy.

      I grew up in a small town in West Tennessee. Lots of folks around town routinely left their houses unlocked. It was that kind of town. There were a few thousand people, and everybody knew everybody, or if they didn't know somebody, they knew someone who did. In that context, it didn't take much security to keep things safe, because most people are good people, and if somebody from outside the community was wandering around, everybody knew that the person was an outsider if nobody out of a group of three or more people recognized the person. Thus, a bad person from elsewhere would arouse enough suspicion to be noticed, and would probably be thwarted in whatever nefarious deeds he or she was planning, unless it was just minor mischief like TPing the house of somebody that nobody really liked much anyway.

      Now, I live in Silicon Valley. I know two of my neighbors. Thanks to work and church, I know people from various parts of the area, but they don't live nearby. I'm reasonably confident in leaving things lying around at work for precisely the same reason that I was reasonably confident back home—because everybody knows each other. But if you were to ask me whether I could leave valuables lying around anywhere else, the answer would be "heck no," because nobody knows anybody, statistically speaking, and so everybody is indistinguishable from a potential insider or outsider. Even though most people are still good people, the odds of a bad person getting noticed are much lower. And with so many more people, the number of bad people is much higher even if the percentage is the same, which only compounds the problem.

      The same problem exists with technology. Prior to the Internet, when computers were basically devices that you interacted with locally, security didn't matter that much, because most people are good people. When computers became more connected, that became a problem, because even if most people are good people, the bad people can get to your systems from anywhere in the world, so it only takes a few bad people to ruin everything. And because the pool of people potentially accessing your system is so much larger, the ability to distinguish good people from bad people is diminished.

      So to make a long story short, computer security is a necessary response to the realities of a more interconnected world. Would things be worse without all that added security? Yes. Does the security actually make the world better? No. It just keeps things from unraveling in the presence of interconnectedness that does make the world better. The real question is whether that distinction matters.

      • I agree with your point "computer security is a necessary response to the realities of a more interconnected world." That said, in many cases, I feel the deeper issue is, as in my sig, the irony of technologies of abundance in the hands of those still thinking in terms of scarcity.

        I write about those ironies in regards to militarism here: http://pdfernhout.net/recogniz... [pdfernhout.net]
        "Likewise, even United States three-letter agencies like the NSA and the CIA, as well as their foreign counterparts, are becoming ironic i

  • Technology is a gift (Score:4, Interesting)

    by MrKaos ( 858439 ) on Sunday May 06, 2018 @01:22AM (#56561970) Journal

    The choice people have to make is whether it frees us or enslaves us.

  • by ctilsie242 ( 4841247 ) on Sunday May 06, 2018 @01:49AM (#56562020)

    In the 1980s and 1990s, there was a turning point where security was considered something that should be baked into an OS and product, be it an operating system (thus the C2/C3/B1/etc. levels), MAC/DAC controls, security as part of the kernel, and part of a module, and so on.

    However, what happened is that companies took the easy route. Windows had no innate security so the whole firewall/castle model of company security was formed, where security was done by the network fabric, and not the endpoints. This worked for a while, until malvertising and Trojans allowed malware to attack anywhere.

    These days, security is pathetic in general. I have heard "security has no ROI", "the hackers will always win, so why waste money?" and other claptrap for over a decade. In fact, because there is no real criminal penalty, an egregious security breach makes the top levels of a company a lot of money because they can short their stock before making the announcement public, especially if they can keep the breach under wraps for six months.

    IoT devices come to mind as a specific example. Why even bother with meaningful security when customers are forced to buy version 1.1 of your doodad because version 1.0 will get their stuff hacked and cannot be upgraded? Especially because the money in IoT is in the analytics coming in, not the actual purchase of the device.

    • It's stuff like this that makes me cry, but only because it's true. It actually makes me second-guess my second career; do I take my risk analysis and other skills to a company that will ignore it because "the hackers will always win," or do I go for an NSA gig where I can actually be on the offensive for once?
    • by ka9dgx ( 72702 )

      In the 1980s and 1990s, there was a turning point where security was considered something that should be baked into an OS and product, be it an operating system (thus the C2/C3/B1/etc. levels), MAC/DAC controls, security as part of the kernel, and part of a module, and so on.

      However, what happened is that companies took the easy route.

      Amen! However, somewhere along the way the entire tech community also decided that real security wasn't possible; it somehow became unobtainable. The problems were SOLVED in the 1970s, in response to the data processing problems encountered with multi-level data security during Viet Nam, but we failed to heed the lessons, and eventually they fell into obscurity.

      Capability based security offers a way to have general purpose computing that humans can manage and secure. The core concept is to never, ever, trust an

      • by dnaumov ( 453672 )

        People can't deal with trivial UAC prompts because they don't understand what's being asked of them and you are suggesting THIS?

        • by ka9dgx ( 72702 )

          UAC sucks, quite frankly. It's a "this might be bad, do you want to do it anyway" type of question, conveying no useful information other than a horrid boolean choice (Yes: your machine might get PWND along with everything on it; No: your machine won't do what you want because of "Security").

          Replacing dialog boxes with "power boxes" makes almost no difference in terms of ease of use, but it shifts permissions away from the application code and puts them back where they belong.

          Insisting that users can't manag

          • by Bert64 ( 520050 )

            the current OS design would have you hand your wallet (and a non-revocable power of attorney) to the clerk, and just hope that they take the right amount out of your account before handing it back.

            Which is exactly how credit/debit cards work...

      • by Anonymous Coward

        The OS file dialog is exactly how OS X handles sandboxed applications opening a file.
        It goes further: it gives back a file handle that is signed by the OS, so the application can store the user-given permission in a preference file, and the next time the application is opened it will still have access to that file.
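        A conceptual sketch of that signed-handle idea (illustrative Python, not Apple's actual API; all names below are made up):

```python
# Illustrative model (not the real macOS API): when the user picks a file in
# the trusted OS dialog, the OS mints an HMAC "signature" over the
# (app id, path) pair. The app may persist that token but cannot forge one,
# because only the OS holds the key; later the OS re-verifies the token
# before granting access to that one file again.
import hashlib
import hmac

OS_SECRET = b"os-private-key"  # held by the OS, never exposed to apps

def mint_token(app_id: str, path: str) -> bytes:
    """Issued by the OS file dialog after an explicit user choice."""
    return hmac.new(OS_SECRET, f"{app_id}:{path}".encode(), hashlib.sha256).digest()

def open_with_token(app_id: str, path: str, token: bytes) -> str:
    """OS-side check: the stored token must verify before access is granted."""
    expected = mint_token(app_id, path)
    if not hmac.compare_digest(expected, token):
        raise PermissionError("token does not authorize this app/path pair")
    return f"handle:{path}"  # stand-in for a real file handle

token = mint_token("com.example.editor", "/Users/me/notes.txt")
print(open_with_token("com.example.editor", "/Users/me/notes.txt", token))
```

        The design point is that user consent, not application code, is what creates the durable permission.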

      • Ridiculous. A computer program can "touch" many files and may also need to run without user intervention. No one is going to answer "yes, this program needs to access djfhgkl.dll" prompts. General computers can never be secure, because malware are just programs too.
        • by ka9dgx ( 72702 )

          Why should a program even know about the existence of "djfhgkl.dll"? It shouldn't see any of the file system, except when handed a capability for a file or folder.

          Every gas station clerk I hand $20 to as a form of payment doesn't have the ability to take out a mortgage in my name... they only have the $20. There are zero clerks asking to touch each note in my wallet by serial number, etc.

          Malware are just programs that are written to do evil, everything else does evil by mistake. Capabilities just prevent
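          The wallet analogy maps directly onto code. A toy sketch (illustrative Python, not any real capability API):

```python
# Toy capability model: a function receives only the capability objects it is
# explicitly handed (the single $20 note) and has no ambient authority to
# browse the rest of the system (the wallet). Names are illustrative only.
class FileCapability:
    """An unforgeable handle to exactly one file's contents."""
    def __init__(self, path: str, data: str):
        self._path = path
        self._data = data

    def read(self) -> str:
        return self._data

def word_count(cap: FileCapability) -> int:
    # This code cannot open, list, or even name any other file: its entire
    # world is the one capability it was given.
    return len(cap.read().split())

cap = FileCapability("/docs/report.txt", "security is a design problem")
print(word_count(cap))  # prints 5
```

          A buggy or malicious `word_count` can at worst misuse that one file; the blast radius is bounded by what was handed over.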

    • A big factor was U.S. export controls. An operating system with strong security would require individual licenses to be sold in other countries. Remember 40-bit encryption keys?
    • Yeah, when an OS was the size of what is now a simple device driver.
  • It probably increases usability, in the same way that car safety measures increase the usability of cars. As someone already mentioned, it forces systems to be designed in such a way that they are also proofed against users "shooting themselves in the foot" in a moment of even the tiniest incompetence.
  • I know that when I first started hacking around with Linux in the mid-1990s, I had an easy time experimenting with networking compared to somebody just trying things out today.

    Samba was out, and the security in it (and in the Microsoft products that used SMB) was loose and easy to work with. NFS was a breeze to use, so you could boot up a machine with an NFS install floppy diskette and put a whole freenix (I like NetBSD) on a system quickly.

    A lot of that has changed now. It's even a hassle now just to get t

    • The problem is that the intruder doesn't have to come from outside, but most likely will be a naive user on your own network who clicks something they shouldn't have on a poorly secured computer. So: The basic protocols are still around, so you can still learn the basics of how to set up network services within a lab environment; nothing has really changed there. But don't stop learning once you know the basics; that's the main lesson here. When you can reliably create a
    • by tlhIngan ( 30335 )

      Exactly.

      It's also making stuff harder to repair, because new vulnerabilities mean you lose the ability to fix it yourself.

      Think about a fingerprint reader. In days gone by, they were simply cameras: you got an image from them, then ran your algorithms on it. But nowadays it's such a big deal that fingerprint data must be encrypted and, if your hardware supports it, sent over a secure bus to a secure processor, using PKI encryption to ensure both endpoints haven't been compromised.

      All this because a bad

  • by Anonymous Coward

    Aka "both". But by and large, worse, and this will worsen until we fix two things:

    The atrocious state of our technology, IOW the "hyoooooooge" technical debt. That mountain is so big we don't know where to start looking at it. But it's still there. It's become so big it has its own abyss, staring at you. That makes it even harder to look at.

    Our willingness to be oppressed by technology. It doesn't matter if it's because of some "security" threat or other ("for the childrun", "terrists", you name it), govern

  • by Anonymous Coward

    The logical value of (A or (not A)) is always True.

    I am simplifying somewhat here because "better" is not the opposite of "worse" (we must also consider "equal"), however the probability of the situation being exactly equal is zero, so you get the same result.
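    The tautology is easy to check mechanically (a trivial Python illustration):

```python
# Enumerate every truth value of A and evaluate the tautology (A or not A):
# it holds for every assignment, so "better or not better" is always True.
results = [(A, A or (not A)) for A in (True, False)]
print(results)  # the second element of each pair is always True
assert all(value for _, value in results)
```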

    You could also ask if it is better AND worse, and the answer would still be yes. Just as you could say Slashdot is both bad and good. There are plenty annoyances, but hey - after 20+ years I am still here reading, so it can't be all bad.

    Some of these p

  • First we have to ask ourselves, what is security?

    Security, as in locked doors, encrypted drives, encrypted mail and digital wallets?

    Or...

    Security as in personal security (the right to roam free and pursue our own dreams), freedom from oppressors, freedom of speech, information freedom.
    In a time of fake news, where it's possible to manipulate another country just by doctoring the news and the opinions of the masses, this is certainly not good.
    Another bad sign is that if we take away our freedom of speech, we get less say, and the power is handed to a privileged few, aka "your" chosen government.

    The internet gave us a lot of freedom. We could exchange information faster than ever before, play games with our friends overseas, book travel, and earn money no matter where in the world we were.

    But it also blinded us. With information moving this fast, there was no time for peer review of the news; what source can you truly trust? "Likes" almost became the new "law". Getting likes almost became the new religion, never mind the reliability of the actual sources, just as long as a bunch of likes came along and the rest thought "meh, might as well join the crowd". And what crowd? These are just numbers. A very real but dangerous development.

    Time to take a step back and understand that we should keep this technology free; putting too many locks on it also censors our freedom of speech. But security starts with us: we need to educate ourselves and not trust everything blindly. Turn off the net, breathe, go out there, say hi to your neighbor once in a while, talk amongst yourselves.

    • > "Likes" almost became the new "law".

      The Orville even did an episode on that: Majority Rule [denofgeek.com] (aired in 2017), which was a repeat of the Black Mirror episode Nosedive [wikipedia.org] (aired in 2016), which is in turn similar to the Community episode App Development and Condiments [wikipedia.org] (aired in 2014).

      It's actually worse than that. Security (or the lack of it) -- whether it be public security (prevention of intrusion) or personal security (protecting your rights) -- can be summarized with two phrases and how they are linked:

      1. Follow the money, a

  • Time spent protecting operating systems from possible bad behaviour of applications is time wasted.

    The current state of operating systems is akin to having only single-phase AC power, but no fuses or circuit breakers anywhere in the system. Because applications are trusted with everything, any bug can result in the wholesale misdirection of everything down the wrong path. Most (but not all) of our problems with security result from this misplaced trust.

    It's probably going to be another decade before capab

    • The current state of things is like having fuses designed into equipment, but then finding that somebody has shoved a 30 amp fuse into the holder. I find that from time to time now on equipment I am repairing. Sometimes it causes dramatic equipment failure.

  • Security mainly boils down to "think about the consequences before implementing something" and "clean up your own mess to avoid introducing accidental consequences". If a developer lacks these habits, they will write broken software from more perspectives than just security.
  • Much of the internet is built on a model of reasonably open trust. This proved to be a mistake, and a particularly galling one, which has required patch after patch.

    The problem, as I see it, started in about the mid-90s. By that point, what the internet actually was, was clear to all. Making assumptions of trustworthiness in 1985 was still quite reasonable: it was possible that all meaningful internet connections would continue to be monitored for bad behavior manually and actioned when a

  • by blackhedd ( 412389 ) on Sunday May 06, 2018 @03:47AM (#56562242)

    As a cybersec professional of many years' tenure (and now an exec at one of the major firms), I have to admit I've asked this same question many, many times. If we didn't need to put so much effort into security, and instead put it into features with direct customer benefits, wouldn't we all be better off?

    I think the OP approaches the answer to his question when he refers to preventing bad things from happening. A basic part of engineering is system robustness, resiliency and safety. We don't question the effort we put into assuring those things. We manage, in a variety of ways, the potential impacts arising from possible system failures.

    With cybersecurity, we manage in a variety of ways the potential impacts arising from system vulnerabilities exploitable by bad actors. It's work we'd be doing anyway.

  • by bigtreeman ( 565428 ) <treecolin@@@gmail...com> on Sunday May 06, 2018 @04:09AM (#56562292)

    anonymity and security,
    can't have both
    if criminals know they will be identified and caught they will be less likely to offend.

    • There is something to be said for working at a really small company, or on a small team sequestered away somewhere. There is far, far less anonymity, but things are then looser and more free.

      The place I am working has a 'news' bulletin that consists of a 'txt' file in a shared network folder. Everybody is expected to open and read it, and update it every day or so with Notepad. The less computer-adept have a shortcut to the file on their taskbar. It works because there are only 8 of us in the company. It's secure be

  • by Tablizer ( 95088 ) on Sunday May 06, 2018 @04:17AM (#56562310) Journal

    It's a rather open ended question, but here's an anecdote to consider. A lot of free and open-source software is written in Java. However, our security administrator set an aggressive policy on Java because of past Java security holes. Java-based applications run about 20x slower than they would without the aggressive scanning done on it by our security software. It makes such software virtually useless. We either pay more for alternatives or go without. (I personally believe the security scanning software that starts with an "M" is poorly designed, but that's another topic.)

    I cannot reliably say if our org's policy is too aggressive, because not getting things done may be just as bad as being hacked in the longer run.

    Another oddity is that Microsoft is also leaky, but because we need some software to avoid going back to paper and pencils, Microsoft gets a pass that Java doesn't. It's crazy. Sometimes it feels like the 90's were more productive because we didn't have to consider security stuff. (That, and stupid Web "UI" (non-)standards.)

    • Doing without MS doesn't mean only paper and pencil as options....rilly? [sic]. I ditched windows around 2001 when I stopped running a company that fixed windows issues and wrote drivers for oddball hardware. There are around 20 machines here, mostly linux, a couple apple, all work great. Kinda diminishes any authority in the rest of your comment...
  • Everyone has failed so hard at the first three levels of OSI through shitty programming that they rely upon several more layers of OSI to cover up for even shittier programming now.

    Security comes through good programming practices, thorough testing, and sticking to KISS ideas.

  • The problem with security is that it's used as a pretext for surveillance and spying. We get backdoored CPUs so our data and devices are no longer under our control. All in the name of security.

    I'll choose freedom over security any day.

  • This is obviously important, and I don't necessarily see it as a distraction, but rather a complex problem that has some added thrill to being solved. I can't help but wonder though if I (and my species) would have been X times more productive or have come up with some amazing new culture or technology, if we didn't have to deal with obtaining agricultural products.

    In a utopian world, where there are no metabolic processes, we would have likely forfeited many of the farms and fisheries that have been put into place to prevent starvation from happening. So my question is -- are we more technically advanced because of the thoughtfulness that has gone into creating these systems?

    Or are we just losing precious resources and time dealing with the necessity of fending off starvation?

    Point being: OP is a euphoric tard. Security is a natural consequence of game theory, you might as well stop coding if you don't want to deal with it. It's no different than food or water for base survival - it's a result of existence.

  • Cars would totally be much cheaper if we could make them from cardboard or something, and, like, do away with brakes and all that shit.

  • No. And in this case "no" means you really shouldn't be asking this kind of question. The world is not better or worse, a specific application is, a specific scenario is.

  • "I can't help but wonder though if I (and my team) would have been X times more productive or have come up with some amazing new concept or feature, if we didn't have to deal with implementing security measures."

    No, security has to be baked in at the design stage and would have no deleterious effect on the implementation of amazing new concepts or features. It's patently obvious that in the rush to get out new features, the innovators ended up with a design that can't tell the difference between
    • by ka9dgx ( 72702 )

      NOTHING can tell the difference between
      1> a program deliberately written to do something bad, and
      2> a program that does something bad by mistake.

      Making this determination would require solving the halting problem. You cannot pre-determine the intent of a non-trivial program. This is the root cause of most computer security issues.

      What you can do, is to pre-determine which side effects of running the program you are willing to allow. Most systems place NO limits on side effects of a program, however capabili

  • There are plenty of non-software products whose designers must incorporate design elements that protect users: durable goods, small appliances, bridges, stairs, vehicles, etc. Software should be no different.
  • And NOWHERE is there a lack of bad actors.

    What a spectacularly stupid question.

  • Security tech and its mirror image, hacking, have been highly prized since 1943, when the codebreakers at Bletchley Park used a set of computers known as "Colossus" to gain unauthorized access to information in an encrypted system. They hacked Germany's best security technology, the Lorenz SZ 40/42 cipher machines, aka "Tunny".

    Since then, technology and its security systems have evolved dramatically. But so has hacking. Tools stolen from the NSA are now in the hands of those they were fighting. One has t
