


Ask Slashdot: Do Coding Standards Make a Difference? 430

An anonymous reader writes "Every shop I've ever worked in has had a 'Coding Style' document that dictates things like camelCase vs underscored_names, placement of curly braces, tabs vs spaces, etc. As a result, I've lost hundreds of hours in code reviews because some pedant was more interested in picking nits over whitespace than actually reviewing my algorithms. Are there any documents or studies that show a net productivity gain for having these sorts of standards? If not, why do we have them? We live in the future, why don't our tools enforce these standards automagically?"
This discussion has been archived. No new comments can be posted.

  • Tools support (Score:4, Informative)

    by K. S. Kyosuke ( 729550 ) on Friday December 21, 2012 @02:32PM (#42362795)

    We live in the future, why don't our tools enforce these standards automagically?

    man gofmt

    Of course it's a good idea. You wouldn't want to read a book printed wholly in italics, now would you? It's not that it's impossible to get used to it; it's just that we got un-used to it, and standards are useless unless you cling to them at least for a while.

  • by TheGratefulNet ( 143330 ) on Friday December 21, 2012 @02:37PM (#42362887)

    happened to me. at a large networking company (I don't want to mention the name, for various reasons) I felt more comfortable with my own coding/indenting style, which included lots of whitespace (my eyes are old and the extra vertical WS helps isolate 'paragraphs' from each other and that clustering really helps me a lot).

    my boss felt so strongly about it, though, that he'd constantly use that as a way of beating me up. eventually, I got fired and I think our conflict was a major part of that.

    I have been coding for over 35 years (in C, mostly) and I have a damned good handle on how to code for readability and supportability. but when it comes to styles, the guy in charge wins, no matter what, even if there is no right answer, per se.

    you should not code in a radically different way from the existing codebase, but complaints about line length (at the time, they were HARSH about keeping lines to 80 chars max, which was so braindead) and about my vertical whitespace really ticked one or two people off, enough that they complained and got me canned.

    so, yes, those that want to pick fights will do so over stupid little things, and make a big damned deal about it, too. part of 'coding politics' I guess. large and small companies all have this problem and it won't ever go away.

    better companies can be flexible. the more rigid ones, I find, have lost their way. writing code is not a mechanical thing; there's an art to it and to strip all individualism from the task is just plain wrong, to me.

  • by neiras ( 723124 ) on Friday December 21, 2012 @02:38PM (#42362907)

    We have always had standardized checkstyle and jtidy rules as part of our build system. We have eclipse formatting configuration that everyone uses as well. Commits don't happen unless checkstyle is happy.

    I thought everyone did this. I guess tooling is less developed in some languages, but it's not too hard to put this kind of thing into practice with a little bit of effort and buy-in.

  • It helps (Score:5, Informative)

    by Cro Magnon ( 467622 ) on Friday December 21, 2012 @02:49PM (#42363035) Homepage Journal

    Once I was put on a project with rather strict standards. I didn't like their naming conventions, and the style was noticeably different from mine. But I soon found that whichever of their programs I was assigned to, it was relatively easy to follow because of the similarity with the other programs in that system. In contrast, the system I'd been on before had no standards, and everyone did things their own way (including me), and I had to study each new program before making any significant changes.

  • by casings ( 257363 ) on Friday December 21, 2012 @02:51PM (#42363055)

    Most diffs can ignore whitespace...

    I don't understand your second point.

  • by beelsebob ( 529313 ) on Friday December 21, 2012 @02:54PM (#42363097)

    I disagree. Things like where to put the braces (on a line alone vs. at the end of the statement), tab size, camelCase vs. underscores... to me those are all personal preferences.

    A couple of trivial examples of why this makes reading code much easier:
    The coding standard says "open braces should always appear on the following line; all control flow statements must have an associated brace." This code is now not allowed:
    if (some(really) && long(condition) || that(extends) && miles(across(the, line)) || and(pushes, the, code, to(the), right, off, the, screen)) doSomeStuff();

    Instead, this is needed (though some extra coding standard rules would be useful to clean it up further):
    if (some(really) && long(condition) || that(extends) && miles(across(the, line)) || and(pushes, the, code, to(the), right, off, the, screen))
    {
        doSomeStuff();
    }

    Now I can scan-read this code and see that doSomeStuff() is always executed. I don't need to read the whole line just to check whether someone has left something on the end of the if statement.

    Another trivial example: a coding standard might say "expressions should only contain pure values, never state changes. If you need to change state, use a statement." Now this code isn't allowed:
    someFunctionCall(blah(boo), foo, baz, bar, monkies, brains, xyz++, [self setCheese:29], unsafeLaunchMissiles());
    As a reader, I no longer need to scan all the arguments to find out whether there's a state change in there. This makes the code much easier to understand.

    So yes, simple, apparently "trivial" coding rules can make an enormous difference to the readability and maintainability of code.

  • The objective... (Score:4, Informative)

    by blackcoot ( 124938 ) on Friday December 21, 2012 @02:59PM (#42363155)

    Is to achieve this: http://www.joelonsoftware.com/articles/Wrong.html [joelonsoftware.com] -- make things that are wrong be more obviously wrong. Using discipline and coding standards is just one part of the appropriately paranoid developer's defensive programming toolkit.

  • by cfulton ( 543949 ) on Friday December 21, 2012 @03:05PM (#42363229)

    why don't our tools enforce these standards automagically?

    They do. Almost every modern IDE will format to a standard and flag code that doesn't conform. Tools like "checkstyle" can report non-conforming code during the develop or build phase. That is why having a corporate standard wastes no one's time here. Comments about the hundreds of hours lost "picking nits over whitespace" tell the story of developers who are too uninformed, ignorant, or (more likely) self-important to follow simple guidelines that can mostly be automated. These are exactly the kind of developers I want nowhere near my code base. If they won't follow the style standard, they sure as f&*$ won't use the DAL as intended or follow the MVC standards. They're the developers who spend 1000 hours writing their own XML parser because they don't like the way DOM or SAX works. Having a standard does not waste time, but the kind of developer who won't follow it does.

  • by Decameron81 ( 628548 ) on Friday December 21, 2012 @03:18PM (#42363405)

    Not sure if you are being serious with your point or not due to your case changes, but I will bite.

    Just because a style is standardized doesn't mean your code is more readable using that style. In fact, a lot of the styles expected of me made my code less clear, and when I chose to ignore them, my code was never touched in code reviews, because everything was clear and intuitive without conforming directly to the style.

    If you personally like clear / readable code, then no standard will ever be a replacement for you.

    You're missing the point. I am not claiming a particular coding style is superior, I am claiming a standard coding style across the whole code base is good - personal preferences aside.

    PS: I'm talking about basic stuff here, such as standards on how to name variables and constants, whether to use camel case, self-documenting code, etc.

  • by Anonymous Coward on Friday December 21, 2012 @04:00PM (#42363887)

    Since I haven't used a goto since the 1980s, except in assembler, I'd love to see some examples.

    And I mean of actual gotos, not similar statements like break and continue.

    Then you may want to look at the linux kernel sometime.

    http://git.kernel.org/?p=linux/kernel/git/torvalds/linux.git;a=blob;f=mm/vmscan.c;h=adc7e9058181eb4d38c2f22d77bd1e413d3457e7;hb=HEAD#l674 [kernel.org]

    That lovely function uses a lot of gotos, and it should be obvious why once you read it (there are other examples in that source file).

  • by VortexCortex ( 1117377 ) <<VortexCortex> ... -retrograde.com>> on Friday December 21, 2012 @04:55PM (#42364471)

    Brace position and spacing both are encompassed by the whitespace argument that I was referring to.

    Except that a diff isn't going to ignore whitespace when a line is broken across multiple lines versus kept as a one-liner. See also: Python.

    You're preaching to the choir, btw, but also you're making the wrong argument. Whitespace does matter, it just shouldn't matter how it's configured, just that it's consistently applied.

    The reality of the situation is that if I reformat a 100-char line to be broken across lines of 80 chars max, and then back again, it thrashes the code repository. Diffs are general-purpose; they don't understand code syntax. You've used them, haven't you? Newlines are whitespace. Diffs don't lexically parse code while ignoring all whitespace; they work line by line. Compilers, however, DO understand their own syntax! IMO, all compilers should have the option to lex and reformat input to whatever style you want, based on an input rule sheet of some kind. Currently this is handled disparately, in a non-standardized way, by IDEs, scripts, etc. In fact, many projects have a script or program that you should run your code through before committing it. This allows you to code however you like, while giving the codebase a consistent and uniform look.

    Any moron complaining about whitespace in code doesn't really belong in coding.

    I find this statement moronic and/or ignorant. If you don't think that uniformity and consistency help while reading and understanding text (including code), then you've never studied the human brain much at all. We're pattern-matching machines. If the patterns change from page to page, then it IS actually harder to read. The first thing I usually do when working on a new project is equip my editor/IDE with a style guide for the project to transform all code into my preferred style while editing. Then I can edit in the pattern that I'm most used to dealing with -- it's faster because that's how brains work. Being more efficient and taking advantage of my brain's natural tendencies isn't moronic, it's smart.

    On save, or before commit, the code is transformed into the project's style by an auto-formatter. I think this is best because cognitive-science research backs it up: repetition of tasks builds "muscle memory", etc. So my use of whitespace rules is actually faster than adapting to each project. Now, given that compilers don't all have an option to transform code on the fly to our preferences, and we can't always rely on the other coders to be using IDEs that can do this, the next best thing is a project-wide coding standard, and that's exactly what we do. Seriously, what sort of experience do you have that you haven't run into this commonality yourself? What's more likely: that everyone is wrong but you, or that we've all been down this road before and arrived at a common consensus? Level up your knowledge. To me you sound rather ignorant and quick to make assumptions, thus foolish.

  • by Fastolfe ( 1470 ) on Friday December 21, 2012 @05:31PM (#42364919)

    Disagree. There are different naming conventions for things like constants and class members. The point of these conventions is to make it clear from the name of the variable what the variable represents. If you aren't confident that the code base is consistently following the same conventions, you have no confidence that something that appears to be a local variable is actually a local variable, which means you need to spend more time poking through code in order to understand it.

    Consistency begets readability, and readability begets maintainability.

  • by UnknownSoldier ( 67820 ) on Friday December 21, 2012 @07:55PM (#42366113)

    Yes, agree 100% !

    Simple example:

    const int SIZE = 256; // all constants are upper case
    #define ALIGN_DOWN(n,pow2) ((n) & ~((pow2)-1)) // all macros are upper case with underscores between words
    class CContainer { // classes are prefixed with 'C'
      int m_size; // member variables are prefixed with 'm_'
      int getSize() const { return m_size; } // coding standard will cover brace style and case style (are one-line getters allowed?)
    };
    enum EFlags { // enums/flags are prefixed with 'E'
      FLAG_READ  = (1 << 0),
      FLAG_WRITE = (1 << 1),
      FLAG_EXEC  = (1 << 2)
    };

  • by Salamander ( 33735 ) <jeff@noSPam.pl.atyp.us> on Friday December 21, 2012 @08:10PM (#42366225) Homepage Journal

    The only issue I have is with code diff utilities that don't work well with multi-monitor setups.

    You should try to appreciate that not everyone shares your circumstances. Sometimes the most senior developers on a project have to review code while on the road, e.g. visiting customers or presenting at conferences. Not many laptops have multiple monitors, and you wouldn't want to carry one that did. Some of the very latest have pretty decent resolution, but they cost a lot more, and their fine dot pitch means the number of characters doesn't scale up as much as the number of pixels. Under those circumstances, code that doesn't display well in a *side by side* diff on a single smallish monitor is a more serious issue than a junior developer's fetish for super-long lines. Eighty columns might not be the absolute best width, but it's in the range that makes such diffs productive, and it's a width that a lot of people (and tools developed over the last few decades) can handle reliably.

    Also, people who study reading have known for half a century that long lines are hard to scan accurately without a saccade leaving the reader's eyes on the previous or next line, which means that they're bad for readability even on wide monitors. There's a reason newspapers used to set type in columns instead of all the way across the page. You'll need a much better reason than personal aesthetics to do something that's bad for readability and a pain for other members of your team. Without such a reason - and I haven't seen any, anywhere in this thread - that's just selfish and immature.
