Software Security

Do You Write Backdoors? 1004

quaxzarron asks: "I had a recent experience where one of our group of programmers wrote backdoors on some web applications we were developing, so that he could gain access to the main hosting server when the application went live. This got me thinking about how we are dependent on the integrity of the coders for the integrity of our applications. Yet in this case a more than casual glance would allow us to identify potentially malicious code. How does this work when the clients are companies who can't perform such checks - either because they don't know how, or because the code is too large or too complex? How often do companies developing code officially sanction backdoors...even if it means calling them 'security features'? How often has the Slashdot crowd put a backdoor in the code they were developing, either officially or otherwise? How sustainable is the 'trust' between the developer and the client?"
This discussion has been archived. No new comments can be posted.

  • Deadlines (Score:5, Insightful)

    by jimmyCarter ( 56088 ) on Wednesday March 05, 2003 @02:43PM (#5442192) Journal
    I don't know about you guys, but not too many of my projects spare enough time in the project timeline to allow me to write backdoors or Easter eggs or whatever.

    The last thing I'm thinking about when rushing towards the deadline is some fancy backdoor into a web app I'll probably never use anyway.
    • Re:Deadlines (Score:5, Informative)

      by Anonymous Coward on Wednesday March 05, 2003 @02:49PM (#5442282)
      I don't know about you guys, but not too many of my projects spare enough time in the project timeline to allow me to write backdoors or Easter eggs or whatever.

      Some people write backdoors to facilitate debugging. They don't have to worry about checking with the customer for various passwords - they just type in "IAMGOD" or some such hard-coded password and they are in.

      For the record, I don't approve of backdoors. First, they create security problems - someone just has to search the executable for strings. Second, these passwords are never changed when employees move on.
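The "search the executable for strings" point can be sketched in a few lines of Python (an illustration, not a real ELF parser - the binary bytes and the `IAMGOD` password are made up): any scan for runs of printable bytes, like the Unix `strings` tool, surfaces a hardcoded password immediately.

```python
import re

def extract_strings(data: bytes, min_len: int = 4):
    """Return runs of printable ASCII bytes, like the Unix `strings` tool."""
    return [m.group().decode("ascii")
            for m in re.finditer(rb"[ -~]{%d,}" % min_len, data)]

# A fake "compiled program" with a hardcoded backdoor password embedded in it...
binary = b"\x7fELF\x02\x01\x01\x00" + b"\x00" * 16 + b"IAMGOD\x00" + b"\x90\x90"

# ...gives up its secret to a one-line scan.
print(extract_strings(binary))  # ['IAMGOD']
```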

      • Re:Deadlines (Score:5, Interesting)

        by Ponty ( 15710 ) <awc2@buyclamsonline.LIONcom minus cat> on Wednesday March 05, 2003 @03:13PM (#5442573) Homepage
        Just this morning, I wrote a backdoor into a web project. Very often the testing users give me really strange errors that I just can't verify at all. It's useful to have a "master password" that I'll disable later (probably). Backdoors are most often used for debugging purposes. Fortunately for the users, I'll be the sysadmin when the system goes live, so there isn't much of a risk (yet).
        • by motardo ( 74082 ) on Wednesday March 05, 2003 @03:34PM (#5442798)
          You must work for AOL
        • Re:Deadlines (Score:5, Insightful)

          by sql*kitten ( 1359 ) on Wednesday March 05, 2003 @03:51PM (#5443000)
          Just this morning, I wrote a backdoor into a web project. Very often the testing users give me really strange errors that I just can't verify at all. It's useful to have a "master password" that I'll disable later (probably). Backdoors are most often used for debugging purposes. Fortunately for the users, I'll be the sysadmin when the system goes live, so there isn't much of a risk (yet).

          If you're the sysadmin, then it's not a backdoor. After all, you could just fire up a debugger on the process and find out anything you wanted, passwords, data, anything. Or log onto the database as the DBA and just query the tables directly. Or place a packet sniffer on the network.

          A back door implies that it gives you something you couldn't and shouldn't have.
          • Re:Deadlines (Score:5, Insightful)

            by Deus_Ex_Machina ( 167220 ) on Wednesday March 05, 2003 @04:41PM (#5443507)
            Analogous situation: I have the key to the front door of a building, but it's inconvenient to use the front door so I blow a little secret hole in the back wall and use that instead.

            A back door is a back door because it provides another way into the system which circumvents whatever access controls already exist, totally regardless of who WRITES this new circumvention path, or whether the access controls would have restricted them in the first place.

            You're right that he can't do anything with his back door that he couldn't do as the administrator before (with ingenuity and a lot of time), but you're wrong that he hasn't created a back door.

          • Re:Deadlines (Score:5, Insightful)

            by PylonHead ( 61401 ) on Wednesday March 05, 2003 @04:49PM (#5443591) Homepage Journal
            This is clearly not true. Any method of gaining access that circumnavigates the established security procedures is a back door.

            If they fire him tomorrow, they have no way of removing his access from the system, since they don't even know it's there.
            • by fucksl4shd0t ( 630000 ) on Wednesday March 05, 2003 @05:07PM (#5443767) Homepage Journal

              Any method of gaining access that circumnavigates the established security procedures is a back door.

              Hey dude, I thought circumnavigate meant something like circle without penetrating, or something like that. Like, in the seven cities of gold days you would circumnavigate the new world on a couple of trips, go back to spain and get more men and boats and stuff, and then go back to start your exploration of the interior.

            • Re:Deadlines (Score:5, Informative)

              by vladkrupin ( 44145 ) on Wednesday March 05, 2003 @08:57PM (#5445736) Homepage
              This is clearly not true. Any method of gaining access that circumnavigates the established security procedures is a back door.

              If they fire him tomorrow, they have no way of removing his access from the system, since they don't even know it's there

              Everyone seems to focus on the actual piece of code that acts as a 'backdoor' and forgets that mere knowledge of the system is just as dangerous. No sufficiently complex system can be foolproof in both design and implementation. During development, debugging code gets left over, shortcuts are taken, etc., etc. Nobody except the developers who designed and wrote the stuff knows exactly what is in the code. While I do not put any backdoors in my code intentionally, I have sufficient knowledge of the system to poke a few holes big enough for a full compromise.

              In short: If you have a sufficiently large system, chances are that a disgruntled developer can compromise or damage it even without placing any backdoors in the code ahead of time. Knowledge is power. Obviously, this does not apply to open-source projects that receive a fair amount of peer review (or just people tinkering with the code).
              • Re:Deadlines (Score:4, Insightful)

                by zero_offset ( 200586 ) on Thursday March 06, 2003 @09:46AM (#5448421) Homepage
                While I do not put any backdoors in my code intentionally, I have sufficient knowledge of the system to poke a few holes big enough for a full compromise

                Seems to me those vulnerabilities should qualify as problems you address before you ship the product. I'll grant that some of these kinds of problems may have very low criticality -- for example, they may require physical access to the machine and unusual permissions, in which case you're probably screwed anyway -- but it doesn't sound like you're talking about that kind of scenario.

                Basically you're talking about bugs. Just because it doesn't cause the machine to crash or set off alarms with your QA testers doesn't make it any less worthy of fixing.

                It seems likely none of that is news to you, but somebody had to say it...

          • Re:Deadlines (Score:5, Insightful)

            by cameldrv ( 53081 ) on Wednesday March 05, 2003 @06:28PM (#5444632)
            Yes, but what happens when he quits or gets fired? The other sysadmins would disable his account on the machine, but presumably he still has the backdoor password to a possibly web-accessible app. This is like making a copy of your office key and keeping it when you leave.
        • Re:Deadlines (Score:5, Interesting)

          by quadcitytj ( 320706 ) <tj@wackycow.com> on Wednesday March 05, 2003 @04:13PM (#5443226) Homepage
          In my opinion, this is a terrible way to build the system, and "debugging" methods like this are the reason so much code sucks.

          You're not developing the project for a "master user," you're developing it for normal users. Debugging code while in the "master" mode will do nothing more than give you a false sense of whether your code is buggy or not.

          It's like installing an app, and then testing it as root. It doesn't tell you anything, and it makes users' lives miserable when they can't get something to work.
          • Re:Deadlines (Score:4, Informative)

            by Ponty ( 15710 ) <awc2@buyclamsonline.LIONcom minus cat> on Wednesday March 05, 2003 @06:41PM (#5444729) Homepage
            If I can't log in as an arbitrary user, then I can't fully test the system. Everyone has different preferences and details, and I, as the master user, can't do all that much from my account.

            I think you misunderstand what my message meant: I don't have a "master" mode, I have an overriding password that bypasses the standard authentication system and lets the admin user assume the username/authentication details of an arbitrary user. That's important so I can have the same experience as that user. It doesn't compromise any data security, as I can just as easily see all the user's data when it's sitting on the database server.
            • Re:Deadlines (Score:4, Interesting)

              by slamb ( 119285 ) on Wednesday March 05, 2003 @10:54PM (#5446403) Homepage
              I have an overriding password that bypasses the standard authentication system and lets the admin user assume the username/authentication details of an arbitrary user. That's important so I can have the same experience as that user.

              This is something I wish more systems had, but not in a hardcoded (backdoor) way. Cyrus SASL/IMAPd does it correctly: they've completely separated the concepts of authentication and authorization. So you can say "until further notice, user slamb can log in and do anything user bob can do." It serves the same purpose, but in a maintainable, open way. You can have a group of administrators who can log in as anyone, but without needing to know either everyone's password or some master password that's difficult to change if someone leaves...they can just use their own, and it can be disabled/enabled on a per-person basis easily.
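The separation the parent describes can be sketched in Python (a hypothetical illustration of the idea, not Cyrus's actual mechanism; the usernames, passwords, and grant table are all made up): authentication proves who you are using your own credentials, and a separate, revocable grant table decides whom you may act as.

```python
# Hypothetical sketch: authentication (who you are) kept separate from
# authorization (whom you may act as), in the style the parent describes.
credentials = {"slamb": "slamb-pw", "bob": "bob-pw"}

# Revocable proxy grants: admin -> set of users they may act as.
proxy_grants = {"slamb": {"bob"}}

def login(authcid, password, authzid=None):
    """Authenticate as `authcid` with their OWN password, then
    optionally assume the identity `authzid` if a grant exists."""
    if credentials.get(authcid) != password:
        raise PermissionError("bad credentials")
    if authzid is None or authzid == authcid:
        return authcid
    if authzid in proxy_grants.get(authcid, set()):
        return authzid                      # act as the other user
    raise PermissionError("no proxy grant")

print(login("slamb", "slamb-pw", "bob"))    # slamb acts as bob -> 'bob'
proxy_grants["slamb"].discard("bob")        # revoke when slamb leaves
# login("slamb", "slamb-pw", "bob") would now raise PermissionError
```

No master password exists anywhere, so nothing has to be rotated when an administrator leaves; you just delete their grant.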

      • by DaTreeBear ( 651287 ) on Wednesday March 05, 2003 @07:01PM (#5444919)

        I saw first hand how back dooring software could provide job security for one developer.

        I worked at a company that produced some very complex financial and utilities management software. They needed a way to have these two applications talk to each other, and their solution was a daemon to act as a conduit between the two. Since it had to assume user privs, the daemon was set to run setuid root.

        The code had been in production for quite some time when it was assigned to a developer to maintain. The code was a mess (it had been written originally by people unfamiliar with programming in the Un*x environment). The developer was tasked with cleaning up the code if he could. Since they were very busy, there was little or no supervision over him. As long as the daemon worked, everyone was happy.

        Eventually the development department decided to restructure and investigated letting this guy go. He had a reputation of being a bit of a hacker so they came to me (I being the Un*x/Network admin at the time) to see how we might protect ourselves from reprisals should he be let go.

        I was fairly confident that my systems were tight. The biggest weakness, as I saw it, was this daemon. So I checked out the source code and started going through it. As I did, I discovered that this simple daemon had developed some new and interesting features. Along with its normal duties, it also doubled as a telnet daemon (you could telnet to the listening port and log in just as in telnet - except this one would ignore /etc/securetty, thus allowing remote root logins over an unencrypted protocol). Another feature was its ability to tunnel other ports through its own listening port.

        The code was too convoluted for me to get a complete grasp on it in the time allotted. I went back to the VP of development, pointed out what I had found, and suggested that he would need to have every piece of code this guy had worked on audited to make sure it was clear of back doors. He visibly paled. The developer in question had been there for over 5 years by this point and had touched nearly everything at one time or another.

        In the end they simply moved him to another department (he is still there as far as I know). They felt it was too cost prohibitive (and dangerous) to let him go.

        They never did tell their customers about these gaping security holes either.

        Lessons learned:

        1. Never trust code you haven't audited yourself. I had a daemon running on my servers that was allowing remote root logins and I didn't even know it.
        2. A lack of integrity can be rewarding.
        3. Customers are WAY too trusting of vendors.
      • Re:Deadlines (Score:5, Interesting)

        by capnjack41 ( 560306 ) <spam_me@crapola.org> on Wednesday March 05, 2003 @09:05PM (#5445788)
        someone just has to look through the executable for strings.

        For this reason, if I write backdoors like that into a PHP script (a single admin password, for example), I don't actually store the password in a database or in the script in plaintext; I use an MD5 hash. Even if someone somehow manages to see it (they shouldn't anyway), it's still hard for them to guess what the password is (which they need to POST to the script).
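The idea can be sketched in Python with the standard hashlib module (the password here is invented; and note the parent's choice of MD5 reflects its era - today a salted, deliberately slow hash would be preferred): only a digest is stored, and the submitted password is hashed and compared.

```python
import hashlib

# Store only a digest of the admin password, never the plaintext.
# (MD5 matches the parent's description; a salted, slow hash is better today.)
STORED_DIGEST = hashlib.md5(b"s3cret-admin-pw").hexdigest()

def check_admin(candidate: str) -> bool:
    """Hash the submitted password and compare digests."""
    return hashlib.md5(candidate.encode()).hexdigest() == STORED_DIGEST

print(check_admin("guess"))            # False
print(check_admin("s3cret-admin-pw"))  # True
```

Anyone reading the script sees only the hex digest, not the password itself - though with MD5 and no salt, a determined attacker could still attempt a dictionary attack offline.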

    • Re:Deadlines (Score:5, Insightful)

      by Anonymous Coward on Wednesday March 05, 2003 @02:57PM (#5442398)
      Deadlines can be the friend of someone who wants to slip a backdoor into code. It's as simple as thinking 'this project needs to be done the day after tomorrow, it is going live no matter what, no one is going to check this code segment before then'.

      Also, backdoors often facilitate debugging, which is why they may be in there. You don't have time to build proper debugging tools so you whip up a backdoor..

      And since both of these scenarios involve short amounts of available time, they are likely to be poorly coded, forgotten, and one day exploited.
      • by swordboy ( 472941 ) on Wednesday March 05, 2003 @03:22PM (#5442669) Journal
        Deadlines can be the friend of someone who wants to slip a backdoor into code.

        Reminds me of something... I wonder how many holes were implemented in the Y2K fiasco?

        Michael: It's pretty brilliant. What it does is where there's a bank transaction, and the interests are computed in the thousands a day in fractions of a cent, which it usually rounds off. What this does is it takes those remainders and puts it into your account.

        Peter: This sounds familiar.

        Michael: Yeah. They did this in Superman III.

        Peter: Yeah. What a good movie.

        Michael: A bunch of hackers did this in the 70s and one of them got busted.

        Peter: Well, so they check for this now?

        Michael: Initech's so backed up with all the software we're updating for the year 2000, they'd never notice.

        Peter: You're right. And even if they wanted to, they could never check all that code.

        Michael: It's numbers up their asses.

        Peter: So, Michael, what's to keep you from doing this?

        Michael: It's not worth the risk. I got a good job.

        Peter: What if you didn't have a good job?
    • Re:Deadlines (Score:5, Interesting)

      by jaavaaguru ( 261551 ) on Wednesday March 05, 2003 @03:57PM (#5443072) Homepage
      Some people [susx.ac.uk] had enough time. This has got to be one of the greatest backdoors of all time.
      • Re:Deadlines (Score:5, Interesting)

        by eric777 ( 613330 ) on Wednesday March 05, 2003 @05:56PM (#5444300) Homepage
        For those who can't get to the link, it suggests a clever back door - place a trigger in the original Bell Labs circa-1969 C compiler, which then self-replicates in ALL FUTURE C PROGRAMS.

        Please note - this link is to a theoretical discussion of how the first UNIX creators could have installed a very-hard-to-detect backdoor.

        They never actually did this.

        Anyway, this simple example wouldn't work, for two reasons:

        1) It would leave lots of evidence behind in the log files - someone, somewhere, would eventually notice.

        2) Not *every* C compiler was compiled by a cross-compiler, creating an uninterrupted chain back to the original C compiler written at Bell Labs. A number of platforms were built from scratch, using bootstrap compilers rather than cross-compilers.

        Clever, though.
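The trick being described is Ken Thompson's "Reflections on Trusting Trust" attack, and it can be caricatured with a toy (everything here - the trigger strings, the backdoor text, the function names - is invented for illustration): a "compiler" that injects a backdoor whenever it sees login code, and re-inserts its own injection logic whenever it recompiles the compiler, so the backdoor survives even after the source is cleaned.

```python
# Toy caricature of the self-propagating compiler backdoor (after Ken
# Thompson's "Reflections on Trusting Trust"). Purely illustrative.
BACKDOOR = "if password == 'joshua': grant_root()"
INJECTOR = "# (injection logic smuggled back into the compiler binary)"

def evil_compile(source: str) -> str:
    """A 'compiler' pass that miscompiles exactly two kinds of input."""
    if "def check_login" in source:
        # Trigger 1: compiling the login program? Insert the backdoor.
        return source + "\n" + BACKDOOR
    if "def compile" in source:
        # Trigger 2: compiling the compiler itself? Re-insert this logic,
        # so a clean compiler source still produces an infected binary.
        return source + "\n" + INJECTOR
    return source  # everything else compiles honestly

login_src = "def check_login(user, password): ..."
clean_compiler_src = "def compile(source): return source  # honest source"

print(BACKDOOR in evil_compile(login_src))           # True
print(INJECTOR in evil_compile(clean_compiler_src))  # True
```

The unsettling part, and Thompson's point, is that inspecting the *source* of either the login program or the compiler shows nothing wrong; the infection lives only in the compiler binary.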

    • by misfit13b ( 572861 ) on Wednesday March 05, 2003 @05:01PM (#5443720)
      INTERNET, March 5 - It was reported by sources on the internet technology news website Slashdot [slashdot.org] that former President Jimmy Carter admits to not installing back doors into many of his humanitarian housing projects.

      Carter gave the reasoning in that "not too many of my projects spare enough time" for the installation of back doors, forcing many poverty stricken world citizens to walk all of the way around their home to get to the back yard.

      President Carter was also overheard to give dismissive and disparaging remarks about Easter Eggs. The Easter Bunny was unavailable for comment.
  • Of course (Score:3, Funny)

    by Anonymous Coward on Wednesday March 05, 2003 @02:43PM (#5442194)
    After I saw Wargames, I saw the immediate benefit of backdoors. They're very useful for preventing World War 3... oh, and playing games.
  • by Anonymous Coward on Wednesday March 05, 2003 @02:44PM (#5442200)
    If you have to stop and think "is what I'm doing right?" then the answer is usually "no."

    Of course, life is never that simple. I'm sure a backdoor has saved someone's ass on more than one occasion, because the admin forgot the root password or whatever. But don't be an asshole.

  • Fire the kid. (Score:3, Insightful)

    by pixel_bc ( 265009 ) on Wednesday March 05, 2003 @02:44PM (#5442212)
    Unless he was acting on some sort of order from you or someone else who can tell him to add something like that, I'd fire him.

    I'd also look into opening a criminal investigation.
  • Backdoors (Score:3, Funny)

    by digipak ( 647427 ) on Wednesday March 05, 2003 @02:45PM (#5442216)
    I've never coded backdoors into any software I've written. I'd probably never use them anyway, and if I really needed access, I could gain it by other means. I can't see a logical reason to add them in, especially if your job depends on the integrity of your code.
  • To do what? (Score:5, Insightful)

    by Ars-Fartsica ( 166957 ) on Wednesday March 05, 2003 @02:45PM (#5442222)
    Contrary to popular belief, most programmers don't get their rocks off by showing their friends how they get in through the 'back door'.

    Writing a back door is just more coding. Code for a while and see how much extraneous crap you write just for kicks.

  • of course (Score:5, Funny)

    by kurosawdust ( 654754 ) on Wednesday March 05, 2003 @02:46PM (#5442232)
    my code is so tight, the front door and backdoor are on the same hinge! hooah!
  • by japhar81 ( 640163 ) on Wednesday March 05, 2003 @02:46PM (#5442239)
    But that's not to say I lack ethics, am a cracker, or am out to get my client.

    How many times have we all heard: duhh... I forgot my admin password, but I can't reinstall, I need the data.

    So yes, I backdoor, and I document it internally (hardcopy stored in a safe). It's just an extra insurance policy for when some moron I worked for 6 years ago does something stupid.

    That said, coding backdoors for the sake of getting access to a web farm so you can host your own services is certainly a Bad Thing(tm). But hell, what are you gonna do? Everyone backdoors. Don't believe me? Watch someone 'in the know' log in to a random Windows box using the System account and come talk to me.
  • Open Source? (Score:5, Insightful)

    by jcortega ( 574008 ) on Wednesday March 05, 2003 @02:46PM (#5442240)
    This has been one of the biggest arguments for using open source software: companies can theoretically trust open source software because everyone sees the code and can easily modify it. My question, though, is this: even though we have the source, do people actually read the thousands and thousands of lines of code in the program they're using, or just the parts that interest them (for modifying/improvement purposes)?
    • the short answer (Score:5, Interesting)

      by Ender Ryan ( 79406 ) <[ ] ['' in gap]> on Wednesday March 05, 2003 @03:05PM (#5442487) Journal
      The short answer to your question is, "Yes". Over a long enough timeline, with enough people looking at the code, backdoors get caught. Recently (well, maybe a year ago) a backdoor was found in an open source database that used to be a proprietary product. The backdoor had been there for the entire life of the product; even so, it took over a year after it became open source for the backdoor to be caught.

  • by phorm ( 591458 ) on Wednesday March 05, 2003 @02:47PM (#5442253) Journal
    Some of the apps I make have the option to "allow" a backdoor by setting a flag (default on). The client can turn it off if he/she really doesn't trust me, but in most cases they find it useful in case I ever have to bugfix the systems and/or they lose their own passwords.
  • by saforrest ( 184929 ) on Wednesday March 05, 2003 @02:47PM (#5442259) Journal
    My back door is simply default passwords. My company released an application server last year, and after doing a google search a few months later for a string of text that would appear only on our default web image, I found a half-dozen copies of our software installed at various places.

    Out of curiosity, from a personal machine, I tried logging in as administrator to a few of these machines with the default password our product shipped with. It worked about half the time.

    (Of course, one can't take the results of my search as suggesting that half of our customers didn't change their passwords; the fact that these people hadn't updated the web image makes the fact that they didn't update the admin password rather unsurprising.)
    • by dmayle ( 200765 ) on Wednesday March 05, 2003 @03:08PM (#5442517) Homepage Journal

      Just a thought, but remember the case of the Princeton professor [slashdot.org] who got in trouble for logging into a Yale site with obvious student credentials? Even if the act of writing backdoors isn't illegal, the act of logging in using them, or even with default passwords, certainly is.

      Security admins could tell you that default passwords are a BAD idea. Better to prompt the user on install than have 2.5 million credit cards stolen from some retail site because they forgot to change a default password (or have a backdoor).

      I know, I know, you're going to say that a company has a responsibility, and that they're the ones failing the consumers, but if we, as coders, know that a large percentage of the customer base isn't going to follow best practices, then shouldn't we be taken to task for allowing that possibility in the first place?

      An application is an entire system, from how it's installed, to how it's used. And besides, it could be our own personal info that gets harvested at some point in the future...

      • by TheMidget ( 512188 ) on Wednesday March 05, 2003 @03:55PM (#5443049)
        Security admins could tell you that default passwords are a BAD idea. Better to prompt the user on install than have 2.5 million credit cards stolen from some retail site because they forgot to change a default password (or have a backdoor).

        Security is much more likely to be breached through some SQL injection exploit than through default passwords. Don't believe me? Just go to a random small-time e-commerce site. Watch your browser's title bar. If you see a URL of the form http://server.com/some/directory/path.asp?param1=value1&param2=value, then try sneaking a quote character into one of the values in the URL. More often than not, you'll be greeted with a SQL Server or Access error message. A smart user can figure out from the error message what kind of database query the app used, and he can sneak in characters that will yield useful results...

        If the database engine is SQL Server, the attacker will try reading the sysobjects and syscolumns tables to figure out the schema, and from there he may harvest credit card numbers, names, and addresses, or just play random pranks with the stored data.

        If, on the other hand, the database engine is Access, it's a tad more difficult (because in Access, the msysobjects table is protected by default, and its error messages contain less useful info too). In that case, you need to find the URL that allows administrative login (often called admin.asp, admin_login.asp or somesuch). And then just type '='' or 1=1 or ' into the login field, and you're in.

        This works especially well against fly-by-night operations such as those advertised by spammers. Often spammers' remove-me links are vulnerable too (if you see .asp? in the remove-me URL, chances are that you can have some fun with it...). Great revenge tool if you get too much spam, or if you simply are bored!
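The class of hole described above can be reproduced in a few lines, with Python's built-in sqlite3 standing in for the ASP/SQL Server stack (the table and its contents are invented for the demo): a query built by string pasting lets the classic `' OR '1'='1` payload rewrite the WHERE clause, while a parameterized query treats the same payload as an ordinary string.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, password TEXT)")
conn.execute("INSERT INTO users VALUES ('admin', 'hunter2')")

evil = "' OR '1'='1"  # classic injection payload typed into a login field

# Vulnerable: the value is pasted straight into the SQL text, so the
# query becomes  ... WHERE password = '' OR '1'='1'  and matches everything.
q = "SELECT name FROM users WHERE password = '%s'" % evil
print(conn.execute(q).fetchall())   # [('admin',)] -- logged in, no password

# Safe: a parameterized query passes the payload as a literal string.
q = "SELECT name FROM users WHERE password = ?"
print(conn.execute(q, (evil,)).fetchall())   # [] -- no match
```

The fix is the same on every stack: never interpolate user input into SQL text; bind it as a parameter so the database driver keeps data and query structure separate.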

  • Payment Insurance (Score:5, Interesting)

    by BadBlood ( 134525 ) on Wednesday March 05, 2003 @02:48PM (#5442266)
    I know a person who owns his own company and writes code on a for-hire basis. He puts in timed expiration code such that if they don't pay him within 30 days of delivery, his code de-activates.

    Where I work, we do similar things, but our motivation is to ensure that users are always running the latest version of our frequently updated codebase. We, as developers, do have the ability to run expired code via the backdoor.
    • Re:Payment Insurance (Score:5, Interesting)

      by B1LL_GAT3Z ( 253695 ) on Wednesday March 05, 2003 @03:27PM (#5442711) Homepage
      I was once commissioned to write a web application that dealt with secure signature technology. As the deadline came up, the dealings with my employer became "shady" - meaning it looked like he wasn't going to pay out at the end. I wanted to do something similar to what you described (an auto-timeout); however, this application was written in an open source language (Perl) and needed to be kept that way. So, with some quick obfuscating, I wrote a quick feature so that at a later date, if the employer didn't pay, I could simply access http://website.com/perl.pl?delete=y and it would delete itself. I'm glad I added this "feature" because it wasn't long after that he "disappeared", claiming all sorts of reasons for his non-payment. I then quickly used my feature and was glad for it.

      Of course, if my employer were skilled enough, he could've gone in and removed the code himself. This points to the trouble of combining open source and backdoors: it's virtually impossible to do without some skilled programmer being able to look at the code and remove it. I thought you might find that interesting.
      • Re:Payment Insurance (Score:5, Interesting)

        by Ian Bicking ( 980 ) <ianb@cCOLAolorstudy.com minus caffeine> on Wednesday March 05, 2003 @05:43PM (#5444168) Homepage
        I worked before for someone who had code like that put into his application (without his knowledge, of course). There was some pay dispute, and the programmer started triggering it. In this case it deleted the customer database, not the code. Ultimately the programmer was charged criminally (still awaiting trial), with possibly a civil case to follow.

        Now, I have a feeling there was bad stuff on both sides (and it's taken me a while to extract myself from this job), but you have to be careful when you destroy stuff. It's probably okay when you are deleting something that is your property and isn't paid for. But it's questionable if he already paid for part of the work, or if you were destroying any data created by his operations. If any money had been paid, or if it would compromise data that didn't belong to me, I wouldn't try it unless I had written something into the contract (I've seen pretty generic-looking contract terms which would imply you could effectively confiscate the work if it wasn't paid for). You also need to define when it's not paid for -- 30 days after completion, 60 days...? It's not professional to do something so forceful without making an effort to resolve things more peacefully.

    • by Lumpy ( 12016 ) on Wednesday March 05, 2003 @03:38PM (#5442831) Homepage
      Where I work, we do similar things, but our motivation is to ensure that users are always running the latest version of our frequently updated codebase. We, as developers, do have the ability to run expired code via the backdoor.

      and you are the kind of people that I loathe and will gladly assault on the street.

      I have HAD to crack software my company legally owned to keep it working after the company that wrote it went out of business, dead and buried. The company that made it is gone, so the software DIES... That is pure bullcrap.

      Please let me know what company you work for so I can make a recommendation to my company to NEVER EVER buy your crippleware products.

      Timebombs should be 100% illegal. It's exactly like the car dealer coming and stealing your car after you bought it. If your company's software is a LEASE, then you had better say so.
  • Happens everywhere (Score:5, Informative)

    by matts.nu ( 94472 ) on Wednesday March 05, 2003 @02:48PM (#5442270) Homepage
    Here's [phenoelit.de] a list of 1090 backdoors.
  • Backdoor? (Score:5, Interesting)

    by RobertTaylor ( 444958 ) <roberttaylor1234@g m a il.com> on Wednesday March 05, 2003 @02:48PM (#5442271) Homepage Journal
    "I had a recent experience where one of our group of programmers wrote backdoors on some web applications we were developing, so that he could gain access to the main hosting server when the application went live."

    It's like that theory that BAE/McDonnell Douglas embedded a backdoor in the F-15 Eagle fighter plane's computer systems so that if it is ever used against the USA it will strangely malfunction.

    Unlikely, but interesting concept all the same!
    • Re:Backdoor? (Score:4, Interesting)

      by zlexiss ( 14056 ) on Wednesday March 05, 2003 @05:01PM (#5443721)
      While there are a few unsubstantiated rumors floating around on this topic (cites?), control of spare parts is much more effective than backdoors in code.

      Iran's F-14 force was effectively grounded by the embargo Carter placed after the fall of the Shah - fighters and other complex equipment need a steady stream of parts, and a veritable torrent under non-peacetime use. Search fas.org or aerospaceweb for "iran f-14" for a couple views on this, among other sites.

      As pointed out, backdoors carry the risk of being used against you. But if you've got all the spare parts, you get to fly.
  • trust... (Score:5, Interesting)

    by TechnoVooDooDaddy ( 470187 ) on Wednesday March 05, 2003 @02:48PM (#5442274) Homepage
    Trust and loyalty used to be my main focus... I trusted that those stock options i was offered instead of a chunk of salary would be good, and the company trusted that i would deliver good software, on-time.

    I fulfilled my part of the bargain, but when it came time for the stock options to mature, I got laid off. The company is still in business, interestingly enough, and is now even posting profits.

    Who do you trust, and how is that trust repaid? I can tell you I no longer have the same sense of loyalty and trust in my employer. Companies are paying on average HALF of what they were for the same work 2 years ago.. Trust... works both ways or it doesn't work at all...
    • by Kostya ( 1146 ) on Wednesday March 05, 2003 @03:13PM (#5442570) Homepage Journal
      You are now ready to be a contractor/mercenary. When I embarked on being a contractor, my friend who was a long-time contractor (20+ years) said after talking with me, "I think you are bitter enough now to become a contractor."

      Which is to say, most people who went into contracting did so because of stories like the one you told. They got tired of being jerked around and decided a little uncertainty and paperwork was worth a little freedom from the corporate brainwashing about team and loyalty.

      Granted, many went into it because of money during the dot-com boom. They are no longer contractors now ;-)

      I'm loyal--to getting the job done, according to contract, as long as I'm getting paid. I produce results, give advice, and let the customer go his own way--even if they insist on taking themselves to hell in a handbasket.

      It beats getting all worked up over stupid stuff at work.

      I always loved the "We're a family" line I got when people tried to get me on as a FT employee. I don't know about you, but it is usually true--and they have all the problems that families have too. They can keep them ;-)

  • Backdoors (Score:5, Interesting)

    by JSkills ( 69686 ) <jskills AT goofball DOT com> on Wednesday March 05, 2003 @02:48PM (#5442278) Homepage Journal
    Never written one for malicious purposes before. Thought about it a lot, of course - in the same way people fantasize about robbing a bank or hitting the lottery.

    But when you think about it, all leaving a backdoor in a system does for you is to provide an opportunity of accessing a system in a way that you shouldn't be. This can lead to trouble down the line.

    Clearly, there are legitimate uses for backdoors (to use in case of emergencies, etc.), but unless the backdoor is documented someplace for others in the software development group to be aware of, it's likely the kind of backdoor that is simply not ethical to implement, since it's only usable by one person.

    I'm sure people can provide examples that disprove this, but for the majority of situations, as a developer, having a backdoor in a system can only lead to a security breach at some point ...

  • kind of... (Score:5, Interesting)

    by deander2 ( 26173 ) <public@kered . o rg> on Wednesday March 05, 2003 @02:49PM (#5442281) Homepage

    I am working on an app for the govt, and yes, I have programmed in a backdoor login, as it's very useful for testing and development.

    However, the following are true:
    1) management knows full well of its existence
    2) BY DEFAULT, it is turned off in any build
    3) it is NEVER to be deployed turned on

    I think it's a good rule of thumb.
    • Re:kind of... (Score:3, Interesting)

      by geekoid ( 135745 )
      Wow, can you tell me the method you use to be sure your rules are never, ever, under any circumstances broken?
      That some Jr. programmer 5 years from now doesn't
      forget to turn off the backdoor?
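The "off by default, management-approved" rule from the parent can be sketched roughly like this (all names hypothetical; this is an illustration of the pattern, not anyone's actual code). The bypass only exists when an explicit flag is set at deploy time, so a build that nobody touches ships with it dead:

```python
import os

# Hypothetical sketch: a test-only login bypass that is OFF by default.
# Release builds never set APP_ENABLE_TEST_LOGIN, so the shortcut is inert
# unless someone deliberately turns it on for a dev/test deployment.
ENABLE_TEST_LOGIN = os.environ.get("APP_ENABLE_TEST_LOGIN") == "1"

REAL_USERS = {"alice": "s3cret"}  # stand-in for the real credential store

def authenticate(username, password):
    """Return True if the credentials are valid."""
    if ENABLE_TEST_LOGIN and username == "testuser" and password == "testpass":
        return True  # development/testing shortcut only
    return REAL_USERS.get(username) == password
```

Note that geekoid's objection still applies: the default protects you only as long as nobody flips the flag in production, which is a process guarantee, not a technical one.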

  • by jpsst34 ( 582349 ) on Wednesday March 05, 2003 @02:49PM (#5442284) Journal
    Though it wasn't explicitly mentioned in the question, I feel that such a situation may be more common when the developers are hired as temporary / part-time help. In this case, you are a client and the developers may be looking to get something more. If you have your own in-house developers, they'll have more stake in the company and the project, and surely would care more about security - both the security of the software and the security of their job. A hired hand could code the backdoor then move on before you ever notice. Your own developers would be more hesitant to do this because if and when it gets noticed, they'll still be easily found in the cube on the third floor, east wing.

    Maybe a good idea would be to bring on a full time development staff and pay them good money so they don't feel the need to try to get something more. Oh, and tell me where to send my resume once you create these new full time positions.
  • PHP Web-apps (Score:5, Interesting)

    by yamcha666 ( 519244 ) on Wednesday March 05, 2003 @02:49PM (#5442293)

    I work for a small startup that specializes in custom web-applications for indy record labels and small-time bands and clubs. Our main product is an all-in-one web-app that will allow the customer to manage their shows, news, mailing list and numerous other things.

    We offer several levels of this product: shared (you get one account on our servers, which we control), standalone, and custom standalone (the standalones go on their own servers). The latter two are designed to have one back-door login account for myself and the other programmer to go in there and edit their settings or database if the customer breaks something.

    So there is my 2 cents. Yes, I put small backdoors in my company's web-apps per boss's request.

  • by djkitsch ( 576853 ) on Wednesday March 05, 2003 @02:50PM (#5442299)
    I (like many of you) work on a contract basis per project, and I'm contracted to fix any problems with the software as part of the job.

    If an intruder breaks into a database through a back door I put in (and let's face it, it is asking for trouble), I'm obliged to spend my valuable time closing the hole.

    I'm not of the opinion that it's worth my time and money to show off what a great hacker I am - my clients are really the ones who matter, since they pay my wages, and my skills should be reflected in my work...
  • Sure... (Score:4, Interesting)

    by Anonymous Coward on Wednesday March 05, 2003 @02:50PM (#5442303)
    Backdoors were coded into systems. But only for testing and development purposes. Once the software was being prepared for release, those backdoors would be deleted, but in any case they were usually coded to only work on specific (i.e. the development) machines.

    What really concerned me, though, was when we were supposed to store credit card numbers encrypted in the database and I used a simple replacement cipher as a placeholder. Then, when I later asked about putting real encryption routines in, I was told "we aren't going to do that".

    So customers are really in the dark when it comes to the security of their software.


  • by Art Popp ( 29075 ) on Wednesday March 05, 2003 @02:51PM (#5442314)
    There shouldn't be any hard-coded trust between the authors of decent software and the buyers/users of that software. The fact is that any useful information that the backdoor could provide to the coder should be available to the purchaser. If the purchaser wants to trust the coder, he needs to run sshd and give the coder an account with access to the application he coded. Why anyone would "reinvent" a secure backdoor when the same thing can be accomplished with freely available tools at a much greater level of security is just beyond me.
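The sshd approach above can be locked down further with standard OpenSSH key options, so the coder gets exactly one maintenance capability and nothing else. A sketch (the command path and key are illustrative) of an entry in the application account's `~/.ssh/authorized_keys`:

```shell
# Restrict this key to a single maintenance command, no tunnels, no X11:
command="/usr/local/bin/app-maint",no-port-forwarding,no-agent-forwarding,no-X11-forwarding,no-pty ssh-rsa AAAA...base64key... coder@devshop.example
```

Unlike a hidden backdoor, this access is visible to the purchaser, logged by sshd, and revocable by deleting one line.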
  • by egg troll ( 515396 ) on Wednesday March 05, 2003 @02:51PM (#5442315) Homepage Journal
    I know of a couple of examples where backdoors were put in for QA purposes and then left in when the product was shipped. Indeed, waaaay back in the day, a Mac IRC client left in a /ctcp command that would let another user execute any command on another ircle user's box!

    Doing things like /ctcp B1FF exec /quit made IRC almost unusable for Mac users for a week or so.

    Anyways, my point is that most backdoors put in by developers seem to be accidental rather than intentional.

  • by green pizza ( 159161 ) on Wednesday March 05, 2003 @02:51PM (#5442330) Homepage
    This thread has gotten me wondering: what sort of legal options would one have should they find an employee coding in backdoors?

    Would this be considered felony fraud? The more I think about it, the more I hope so. Think about this -- one coder acting alone could cost a company millions of dollars in lost profit and trust. This would be more than that coder will probably earn in normal income throughout his entire life. I think this is one case where a jury SHOULD seriously consider decades of imprisonment. This isn't a simple case of a kid using DeCSS or defacing a website, this is a case of one person destroying the image and trust of an entire company.
  • consequences (Score:5, Interesting)

    by spoonyfork ( 23307 ) <spoonyfork&gmail,com> on Wednesday March 05, 2003 @02:53PM (#5442351) Journal
    I don't, but two guys here did just that last year. It was a customer-facing website for a large multi-national corporation. The "backdoor" was caught before going live, but they were fired with extreme prejudice.
  • by Lord_Slepnir ( 585350 ) on Wednesday March 05, 2003 @02:54PM (#5442359) Journal
    I'd like to post an intelligent response, but I need more info. Can I have some people send me a list of back doors they've created so that I can investigate further? Thanks!
  • by new death barbie ( 240326 ) on Wednesday March 05, 2003 @02:54PM (#5442365)
    1) if it's not in the requirements, it shouldn't be in the code
    2) if it's useful or necessary, then it should be in the requirements. But then it's not a back door anymore (maybe a side door?)
  • Legal and not (Score:3, Insightful)

    by sir_cello ( 634395 ) on Wednesday March 05, 2003 @02:55PM (#5442376)

    Putting in backdoors is unethical, but possibly not illegal, depending upon how you make your software available (i.e. license terms and conditions). It may only be illegal when you _use_ the backdoor (because you are then technically trespassing on the property of another), or if someone else uses the backdoor (you could be held negligent).

    I've been involved in a project where an easter egg was planted (command line interface to a subsystem, and if you enter right command, it will drop into a text RPG). You could get in trouble for this in certain ways:
    (a) wasting client money (if the program developed under contract and this functionality is outside of the scope of the development agreement);
    (b) negligence/action if something goes wrong with the functionality or leads to lack of performance of the software, etc.
  • by www.sorehands.com ( 142825 ) on Wednesday March 05, 2003 @02:55PM (#5442378) Homepage
    Most applications have some sort of back door.

    There are different extents to back doors. For example, in some filtering programs, you get admin access. In other programs, you have the ability to log in as a remote user. In another, you can bypass the encryption passcodes.

    Having a remote access backdoor saves lots of trips to a customer site. Having a backdoor for admin access is good when they lose their passwords. Or remotely shutting down the application is good when they don't pay.

    There is also the other side to consider: if there is a back door, the application is clearly less secure.

    You have to balance the lack of security caused by this against the need for the features the different back doors offer.

    You should tell the client about this, but then that creates a problem. If you tell people about back doors, some people may try to hack them. And having the remote ability to shut down an application may defeat its purpose.
  • by ClintJCL ( 264898 ) <clintjcl+slashdot@ g m a i l . com> on Wednesday March 05, 2003 @02:59PM (#5442433) Homepage Journal
    A friend of mine did some web-work for $12,000.

    The guy decided to be a dick about it and not pay him the money he deserved.

    Fortunately for him he put a backdoor in. He told the guy about it. Once activated the system would not work until a password only he knew was entered.

    His payment was promptly received. (He got the idea from a movie.)

  • MS Easter Eggs (Score:4, Interesting)

    by pdrome4robert ( 532173 ) on Wednesday March 05, 2003 @03:06PM (#5442499)
    A microserf friend once told me MS had no policy either way on easter eggs. They were there if programmers took the time to put them there. If an easter egg can get through development, peer-review, testing, packaging, why couldn't a backdoor?
  • by bperkins ( 12056 ) on Wednesday March 05, 2003 @03:13PM (#5442564) Homepage Journal
    Cowboyneal is my backdoor.

    Oh, I'm sorry, I thought this was a poll.
  • In a previous life (Score:4, Interesting)

    by Karl Cocknozzle ( 514413 ) <kcocknozzle AT hotmail DOT com> on Wednesday March 05, 2003 @03:19PM (#5442629) Homepage
    I was attached to a software package that didn't have a backdoor per se, so much as an undocumented account with a password of "a" that you could not take out of the database without doing major surgery. The software also (used to, anyway) put the undocumented account BACK into the users table and restore the specific records to their "default state".

    Savvier customers changed the username and password (the rule required the user_id entry to stay in the db), but you could change the username/pass to keep undesirables out of the system. Yet many of the customers never even officially "discovered" it... Before I left I never heard of any malicious things being done with this account, but as I told my boss the day I found out about it, "It's only a matter of time."

    I left when everybody around me started getting ".com" fever. Like, wacky. People who made $50k annually were leveraging a fortune in paper stock options to buy brand new Mercedes Benzes and hot tubs...
  • by MyPantsAreOnFire! ( 642687 ) on Wednesday March 05, 2003 @03:20PM (#5442636)
    I work for a small web company developing web apps for other small-to-medium sized companies. The one thing that you learn when you're in a small software company is that nobody wants to pay their bills.

    This is hard to see from a large-company perspective, because as a developer you aren't the one collecting the money, you have accountants and lawyers and rabid CEOs that make sure you get your contract's worth one way or another. But small companies don't have this option--they can't afford lawyers or even the time to spend in court. They have to find where their next paycheck is coming from.

    As a result, many of our clients have tried to jerk us around by either dragging their heels on payments or doing something underhanded like changing passwords to servers to try to lock us out and give us the finger. There have been instances where I've sent out a "it's all done, check it out" email and had the live server's passwords changed on me minutes later, followed by a "we're not paying" response.

    Simply put, backdoors are a small company's only assurance that it will be paid for the work it has done. Given, the backdoors that I put in aren't to r00t the server or take down a whole subnet, they're limited to disabling the application that we developed. Until the client has paid their bills, it's still our code, and we have every right to put in as many backdoors as we want.
    • by ip_vjl ( 410654 ) on Wednesday March 05, 2003 @04:08PM (#5443183) Homepage
      Wouldn't the better option be to make your application expire a certain number of days after installation UNLESS a code is entered? The theory being that when you receive payment, you provide the code.

      The outcome for you is the same. If you don't get paid, the system locks them out. The outcome for the client is that honest, paying clients don't have hidden (exploitable) backdoors living in their deployed system.
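The expire-unless-unlocked idea can be sketched as follows (dates, trial length, and the unlock code are all hypothetical placeholders; a real scheme would not hard-code the secret in plain text):

```python
import datetime

# Hypothetical sketch of "expire unless paid": the app refuses to run
# N days after install unless the unlock code, issued on payment, is entered.
INSTALL_DATE = datetime.date(2003, 3, 5)   # recorded at install time
TRIAL_DAYS = 30
UNLOCK_CODE = "PAID-2003"                  # given to the client once the bill clears

def is_locked(today, entered_code=None):
    """Return True if the application should refuse to run."""
    if entered_code == UNLOCK_CODE:
        return False                        # paid: lock permanently disarmed
    return (today - INSTALL_DATE).days > TRIAL_DAYS
```

As the parent notes, this gets you the same leverage as a backdoor without leaving a hidden, exploitable entry point in a paying client's deployed system.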

  • by Cruciform ( 42896 ) on Wednesday March 05, 2003 @03:20PM (#5442641) Homepage
    Dear Backdoor,

    I'm sorry I haven't written in so long, but you know how busy things get. Maybe it's time for us to move on. I've found this great credit card database that uses default passwords. What can I say, it has so much more to offer.

    Yours truly...
  • by LoRider ( 16327 ) on Wednesday March 05, 2003 @03:21PM (#5442656) Homepage Journal
    not put backdoors in the software they create. I look at this way, give the client what they want and don't worry about anything else.

    Why open yourself up to potentially losing a client or just looking like an asshole just so you can do something that your client probably doesn't want you to do.

    What happens when someone else finds your back door and exploits it? What do you tell your client when they ask you why there is a back door into their application?

    It is quite possible that you will get sued. Aside from losing your business, you will lose any integrity, and you should be ashamed of yourself for disrespecting your profession.

    Good programmers are ethical and do what they are told.
  • by wcbarksdale ( 621327 ) on Wednesday March 05, 2003 @03:21PM (#5442659)
    and the one to which "kt" refers is described here [acm.org]. Truly ingenious. Even looking at every part of the source yourself can't protect you in a case like that.
  • by argoff ( 142580 ) on Wednesday March 05, 2003 @03:36PM (#5442813)

    1st, when I leave a company that I don't like, or a company harms me - I consider that their "punishment" is not having the best man for the job - a backdoor would nullify this high ground and prove that I wasn't the best man for the job. And if a company is good to me, or I like them - then these are the last people in the world I would want to harm or compromise - so either way, it's just plain a poor way of living.

    2nd, I don't know about you, but I worked on more than my fair share of projects where I could tell that the core was written badly, but didn't have the time, resources, or approval to do it the right way myself. There are plenty of things that could go wrong that I can get blamed for even if I do everything right, the last thing I want to do is add something else that can go wrong. No thanks!

    3rd, I want deniability. When I leave a company, I want them to change the passwords, delete accounts, and for the code to be secure. The last thing I want is some break-in or failure put back on me years after the fact. There are plenty of shortcomings in life that can "catch up" with you, even if you do your darndest to be perfect. The last thing in the world I want to do is add some more to that pile.

    4th, I rely on these people for contacts, references, and referrals. Why risk burning bridges when I don't have to? Why risk a job when, if I don't want it, I am free to quit? If you don't like a company, why risk going to jail over them? If I must risk going to jail, I would much rather it be for a cause I believe in, like that lady who refused to go to the back of the bus.
  • by tlambert ( 566799 ) on Wednesday March 05, 2003 @03:39PM (#5442851)
    Not All Backdoors Are Nefarious.

    I was a senior software engineer at Whistle Communications, and later at IBM, for the Whistle InterJet/IBM Web Connections products. I did most of the last generation of email, user account management, mailing list, internal database, and other infrastructure services for the product.

    This product has back doors. But they are all explicitly guarded.

    From the front panel of the InterJet, you can enable remote management, for a short period of time. This allows a tier 1 support representative to help you configure/maintain your InterJet, while you are on the phone with them.

    This required explicit customer consent for remote Web UI based administration.

    From the Web UI, if you are logged in as "Admin", there are "secret URLs" which you can use to obtain raw access to the configuration database for much of the InterJet: all of the parts I personally wrote, and some of the rest of it, where the engineers used the standard APIs we had agreed upon for user interface and common configuration store code. This was done to work around the Web UI design, which failed to expose many useful features of the product, which we engineers knew would result in customers' inability to use the product as it had been sold to them. It was likewise useful for tier 2 support, to avoid engineering escalations.

    This required explicit customer consent for remote Web UI based administration.

    Also from the front panel of the InterJet, you can enable "telnet mode". This was done by going to a particular configuration screen on the front panel, and entering a "T" (for "Telnet") on the front panel keypad at that screen. A time limited ability for a remote engineer to come in and manually access the system to diagnose and treat engineering escalations was thereby enabled.

    This required explicit customer consent for remote shell based administration.

    In addition, this mode only worked from a specific netblock of IP addresses.

    Once in at the shell, it was possible for an engineer to override any of these protections. It was common practice, for a persistent problem, to leave the remote access for engineers open until the problem was verified to be resolved.

    There was also a "magic" front panel sequence that would permit you to play "Pong" on the LCD display. I filed a sev-1 bug ("total loss of functionality") against the maintainer, because it did not support "Skunks" (scores of 7-0) as a victory condition. 8-).

    All of them were under direct user control, in terms of outside access.

    None of these are "proprietary" or "confidential", they just aren't useful to people without documentation.

    Other than working around the Web UI designer's intent, with the second back door, none of these really qualifies as nefarious (I would argue that working around the Web UI designers intent qualifies as "routing around the damage").

    -- Terry
  • Spyware (Score:4, Interesting)

    by GrumpyGeek ( 38444 ) on Wednesday March 05, 2003 @03:58PM (#5443092)
    I have never been asked to write a back door, but I have been asked to write code to covertly send us reports about how the customer is using our software. The motivation behind this is that our software (Health Care) is priced based on the "number of lives" covered by our system.

    My response was that I could do this, but that I thought doing it without notifying the customer was wrong. Strangely enough they did not argue with me, and as far as I know it has not been done.

  • by IBitOBear ( 410965 ) on Wednesday March 05, 2003 @04:29PM (#5443376) Homepage Journal
    Most of these comments do not particularly apply to "the web" as, in my mind, a web browser is just another interface surface (like a printer or a live screen) and the IO parts of a program are only the surface.

    During the construction of a program I almost always end up writing a test harness for each significant module. Where possible I like to include the test harness inside the library for that module.

    I then, when assembling the final product, use compile-time control of whether the target application does, or does not, have the hooks to branch into the test harnesses. When an application exhibits an error that doesn't have a clear source of origin, I switch to the "debugging version" of the product, which brings in a fully-featured set of back doors and hacks. Clearly the debugging version is not suitable for production.

    That having been said.

    A certain laziness on the part of developers, combined with the sloppy mindset being promulgated by the "I can drag and I can drop so I am a programmer" school of language-constructors, debuggers, and IDEs, has led to a plague of escaped code.

    A primary example of these escapes is "cheat codes" in games. Nowadays, you can't even expect to sell a game at all unless it is rife with cheat codes you can include in the book. These are the "send a message to all from the console saying 'I Am Rich' and you will get 100,000 credits at the start of your next (event)" things. They clearly exist so that the developers can exercise the extreme limits of their design, but then they are never disabled for the production release.

    This is dumb and annoying in games. In "real" applications this is potentially catastrophic.

    But the "what's good for Bob is good for Ted" mentality breeds cheating haxor kiddies who have seen these back channels as required parts of every program they have used growing up (e.g. the games) and now somehow think such things *BELONG* in code.

    Any culture that teaches kids to just use the cheats (the cheat codes are even commonly printed in the manuals now, and then *explained* in detail in the walkthrough & cheat book you can buy separately), and that any program without those cheats is probably trash, should not be surprised that when those kids enter the workforce they will, as a matter of self-pride, include such things in the code they then write.

    (Example: my roommate is 12 years younger than I am. He can't function in a game, or at least "can't enjoy" a game, unless he has the FAQ and walkthrough around "just in case." What has he learned from life about "working it out himself"? And what should that teach the rest of us?)

    Test harnesses are necessary for development.

    They should be expunged from production code.

    Programmers should *know* *how* to write code that doesn't change core behavior when you take out the test harnesses.

    Games and toys should not be an exception, as that sets bad habits.


    We are all doomed...
  • by wawannem ( 591061 ) on Wednesday March 05, 2003 @04:34PM (#5443433) Homepage
    Here is something to think about, at my last job, we ran a very popular piece of software that is used to track and manage student loans for our financial aid department. One of our financial aid reps was having problems with her peecee and I sent out one of our IT support guys who was a full time student and part-time employee. He ended up calling the support desk for the piece of software in question and the rep gave him a backdoor account that was full access right over the phone. Luckily for me, the kid was honest and came back and told us about it. We all had a laugh that such an important piece of software could be compromised so easily and the support rep didn't even think twice about giving info on the backdoor to a student. Just to be safe, I periodically checked that student's account and made sure no phony changes were made. The more I thought about it, the more paranoid I became. I talked to my boss (the IT Director at the college) and found out that we really didn't have any choice in the matter, that piece of software was the only one like it. Thankfully, I don't work there any more, and I don't think the problem was ever exploited, but I wouldn't be surprised if I ever read about it. I won't divulge the information about the backdoor, but I will say that the software in question is called WhizKid and there may be college employees here on Slashdot that are familiar with it.
  • by TrailerTrash ( 91309 ) on Wednesday March 05, 2003 @06:13PM (#5444478)
    We all witnessed Admiral Kirk leveraging the ultimate backdoor in The Wrath of Khan!

    If it's a good enough programming practice for the United Federation of Planets, it's good enough for me.
  • by rice_burners_suck ( 243660 ) on Wednesday March 05, 2003 @11:14PM (#5446506)
    I remember reading about this one guy who put a backdoor in login so he could log in to any unix workstation with a pre-specified username and password. But that would easily be found in the login source code, so he modified the c compiler to recognize that it was compiling login and then silently insert the appropriate code in there. But that would easily be found in the compiler source code. So he modified the compiler to recognize that it was compiling itself and silently insert the code that recognizes login and silently inserts that code. Then he deleted these instructions from the compiler source code; since the compiler would silently reinsert them into the object code, code that no longer existed in source form created a chain reaction that allowed this dude to log in to any unix machine. It was totally invisible and would work as long as nobody changed the compiler or used a different compiler to compile the compiler, or something weird like that.

    But I don't remember the guy's name. Or maybe it was a chick. But whoever it was, they were definitely a staid and steadfast compiler writer.
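    This is the trick described in Ken Thompson's "Reflections on Trusting Trust" lecture. A toy illustration of the idea (emphatically not the real attack, which operated on a C compiler's binary output): here "compiling" is modeled as passing source text through, and the fake compiler re-inserts the trojan whenever it sees it is compiling "login", and re-inserts its own infection logic whenever it sees it is compiling itself.

```python
# Toy sketch of the self-propagating compiler trojan (illustrative only).
TROJAN = "if user == 'ken': grant_access()  # invisible backdoor"

def compile_source(source):
    """A fake 'compiler': normally passes source through unchanged."""
    if "def login" in source:
        # Target program detected: splice the backdoor into its output.
        return source + "\n" + TROJAN
    if "def compile_source" in source:
        # Compiling itself: propagate the infection logic into the new
        # compiler, so the trojan survives even after the source is cleaned.
        return source + "\n# (re-inserts the TROJAN-splicing logic here)"
    return source
```

    The punchline is the last branch: once the infection lives only in the compiler binary, inspecting every line of source tells you nothing, which is exactly the point of the parent comment.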

  • by canadian_right ( 410687 ) <alexander.russell@telus.net> on Thursday March 06, 2003 @02:06AM (#5447266) Homepage
    I was working on a small custom db (in C, way back in the PC dark ages) that was going to hold confidential data, and had a simple user login coded up. Management insisted on putting in a back-door because past experience indicated that a few times a year a customer would ask us to "recover" a lost password. The back door was used to get into the system as an admin and reset the other users' passwords for customers.

Air is water with holes in it.