Software Security

Do You Write Backdoors? 1004

quaxzarron asks: "I had a recent experience where one of our group of programmers wrote backdoors into some web applications we were developing, so that he could gain access to the main hosting server when the application went live. This got me thinking about how dependent we are on the integrity of the coders for the integrity of our applications. Yet in this case, anything more than a casual glance would have let us identify the potentially malicious code. How does this work when the clients are companies who can't perform such checks - either because they don't know how, or because the code is too large or too complex? How often do companies developing code officially sanction backdoors...even if it means calling them 'security features'? How often has the Slashdot crowd put a backdoor in the code they were developing, either officially or otherwise? How sustainable is the 'trust' between the developer and the client?"
This discussion has been archived. No new comments can be posted.

  • Happens everywhere (Score:5, Informative)

    by matts.nu ( 94472 ) on Wednesday March 05, 2003 @02:48PM (#5442270) Homepage
    Here's [phenoelit.de] a list of 1090 backdoors.
  • Re:Deadlines (Score:5, Informative)

    by Anonymous Coward on Wednesday March 05, 2003 @02:49PM (#5442282)
    I don't know about you guys, but not many of my projects leave enough slack in the timeline for me to write backdoors or Easter eggs or whatever.

    Some people write backdoors to facilitate debugging. They don't have to worry about checking with the customer for various passwords - they just type in "IAMGOD" or some such hard-coded password and they are in.

    For the record, I don't approve of backdoors. First, they create security holes - someone just has to run strings over the executable to find the password. Second, these things never get changed when employees move on.
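
    A minimal sketch of the hard-coded debug password described in this comment, in Python; the "IAMGOD" literal comes from the comment above, but the function and the user_db shape are invented for illustration:

    def authenticate(username, password, user_db):
        """Return True if the login should be accepted."""
        # The "debug" backdoor: one literal that skips the real check. It ships
        # verbatim in whatever you distribute, so strings/grep will find it,
        # and it never changes when the developer who added it moves on.
        if password == "IAMGOD":
            return True
        # Normal path: compare against the stored credential for this user.
        stored = user_db.get(username)
        return stored is not None and stored == password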

  • by piobair ( 586119 ) on Wednesday March 05, 2003 @02:57PM (#5442404)
    Those aren't backdoors; they're default passwords.

    A very different animal, indeed.
  • by TedTschopp ( 244839 ) on Wednesday March 05, 2003 @02:59PM (#5442428) Homepage
    Occasionally I have been given the task of writing a piece of software for a system I don't have an account on. The backdoor is there to make sure testing works. So I put in code that basically says:

    If User = Me Then
        BypassSecurity()
    Else
        ValidateCredentials()
    End If

    This way I can test the app without having to validate against a system I don't have rights to. When we move from test to production, this backdoor is left in until the client completes the user acceptance test (UAT) phase, at which point a second production move is done without the offending backdoor. In other words, the backdoor is the first UAT bug reported.

    I suspect this is common for contractors.

    Ted
  • Re:microsoft (Score:3, Informative)

    by Joe U ( 443617 ) on Wednesday March 05, 2003 @02:59PM (#5442432) Homepage Journal
    No, that was the random seed used to encrypt FrontPage connections to the server.

    (Kinda funny, actually; someone who had to support Netscape 4.x for any length of time must have written it.)

    However, once the phrase was found out, it made it easy to start cracking the encryption. So it was removed and replaced with something else.
  • Re:Payment Insurance (Score:2, Informative)

    by Joe U ( 443617 ) on Wednesday March 05, 2003 @03:07PM (#5442506) Homepage Journal
    People have been successfully sued for this practice.

    If it's not fully outlined in the contract, he could have a really fun time in court.
  • by Anonymous Coward on Wednesday March 05, 2003 @03:08PM (#5442522)
    What is a backdoor exactly? I worked for an ISP; to add new customers you put their info in a database, and the database created a file on a Unix server. Every 20 minutes those files were picked up and the info in them was used to create accounts. Every employee had access to this server and write permissions to the directory that contained the new customer files. To add an account that wasn't listed in the database, just create a file with the right info in it - bam, new account.

    Is that a backdoor? Or just lazy design and poor planning?
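
    A minimal sketch of the spool-directory provisioning described above, in Python; the directory path, the file format, and the create_account() helper are all invented for illustration:

    import os

    SPOOL_DIR = "/var/spool/newaccounts"   # every employee can write here

    def create_account(username, password):
        # Stand-in for the real provisioning step.
        print("provisioning", username)

    def process_spool():
        """Runs every 20 minutes; turns each dropped file into a live account."""
        for name in os.listdir(SPOOL_DIR):
            path = os.path.join(SPOOL_DIR, name)
            with open(path) as f:
                fields = f.read().split()
            if len(fields) < 2:
                continue
            # Nothing verifies the file actually came from the customer database,
            # so anyone who can write to SPOOL_DIR can mint an account.
            create_account(fields[0], fields[1])
            os.remove(path)

    Whether that is a backdoor or just a missing integrity check on the spool directory is exactly the distinction the comment asks about.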
  • Re:microsoft (Score:3, Informative)

    by rosewood ( 99925 ) <<ur.tahc> <ta> <doowesor>> on Wednesday March 05, 2003 @03:11PM (#5442549) Homepage Journal
    This story [com.com] goes into more detail.
    Call it the case of the disappearing security hole. Initial reports of a "back door" in Microsoft Corp.'s FrontPage server software -- a deliberate security hole put in to allow illicit access -- now seem to be, for the most part, incorrect. While Microsoft (msft) admits that a security flaw does indeed plague a software module in its Web server product, the giant software company contradicted statements by one of its managers confirming the existence of a back door with the pass phrase "Netscape engineers are weenies!"
  • by MyPantsAreOnFire! ( 642687 ) on Wednesday March 05, 2003 @03:20PM (#5442636)
    I work for a small web company developing web apps for other small-to-medium sized companies. The one thing that you learn when you're in a small software company is that nobody wants to pay their bills.

    This is hard to see from a large-company perspective, because as a developer you aren't the one collecting the money, you have accountants and lawyers and rabid CEOs that make sure you get your contract's worth one way or another. But small companies don't have this option--they can't afford lawyers or even the time to spend in court. They have to find where their next paycheck is coming from.

    As a result, many of our clients have tried to jerk us around by either dragging their heels on payments or doing something underhanded like changing server passwords to try to lock us out and give us the finger. There have been instances where I've sent out an "it's all done, check it out" email and had the live server's passwords changed on me minutes later, followed by a "we're not paying" response.

    Simply put, backdoors are a small company's only assurance that it will be paid for the work it has done. Granted, the backdoors I put in aren't there to r00t the server or take down a whole subnet; they're limited to disabling the application that we developed. Until the client has paid their bills, it's still our code, and we have every right to put in as many backdoors as we want.
  • Yes and no (Score:3, Informative)

    by ptomblin ( 1378 ) <ptomblin@xcski.com> on Wednesday March 05, 2003 @03:21PM (#5442649) Homepage Journal
    In the past, I've included "secret" passwords at the request of the people who'd be going to customer sites to help out. Often you'd find the customer wasn't around to tell you their password when you needed to get in quickly and look at or fix something. I coded a fancy algorithm where the password was dynamic, based on the day of the week and the month name, but our field circus guys found the algorithm too hard to remember, so I was forced to change it to "*", which I considered very dangerous.

    Another time, we had a one-line message window - if you sent a message with a severity of 'w', 'e', or 's' (for warning, error, and severe respectively), the message would stay for progressively longer amounts of time before the next message could wipe it out, and it would flash different colours and beep for the more severe ones. For message type 'i' (information), it would immediately be replaced by any subsequent message. Once, late at night when I was getting a bit punch drunk, I made one branch of the program put out the 'i' message "How the fuck did that happen?" followed immediately by a more informative 'e' message. Nobody ever saw the 'i' message, because it was replaced so quickly. Until one day somebody put a scroll bar on the message window so you could scroll back and see previous messages. I got a call from a trade show requesting an immediate patch. Oops. That's the closest I've ever come to putting in an "easter egg".

    I think putting in secret backdoors to get access without telling your superiors is very bad news, and could quite easily get you fired.
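
    A minimal sketch of the day-of-week/month-name idea described above, in Python; the comment doesn't give the actual algorithm, so this particular formula is invented:

    import datetime

    def support_password(today=None):
        """Derive the day's support password from the weekday and month name."""
        today = today or datetime.date.today()
        weekday = today.strftime("%A")   # e.g. "Wednesday"
        month = today.strftime("%B")     # e.g. "March"
        # Changes daily, but field staff can compute it in their head.
        return (weekday[:3] + month[:3] + str(today.day)).lower()

    Even a scheme like this beats a literal "*", since the value at least expires on its own every day.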
  • by tlambert ( 566799 ) on Wednesday March 05, 2003 @03:39PM (#5442851)
    Not All Backdoors Are Nefarious.

    I was a senior software engineer at Whistle Communications, and later at IBM, for the Whistle InterJet/IBM Web Connections products. I did most of the last generation of email, user account management, mailing list, internal database, and other infrastructure services for the product.

    This product has back doors. But they are all explicitly guarded.

    From the front panel of the InterJet, you can enable remote management, for a short period of time. This allows a tier 1 support representative to help you configure/maintain your InterJet, while you are on the phone with them.

    This required explicit customer consent for remote Web UI based administration.

    From the Web UI, if you are logged in as "Admin", there are "secret URLs" which give you raw access to the configuration database for much of the InterJet: all of the parts I personally wrote, and some of the rest of it, where the engineers used the standard APIs we had agreed upon for user interface and common configuration store code. This was done to work around the Web UI design, which failed to expose many useful features of the product - features we engineers knew customers would need in order to use the product as it had been sold to them. It was likewise useful for tier 2 support, to avoid engineering escalations.

    This required explicit customer consent for remote Web UI based administration.

    Also from the front panel of the InterJet, you can enable "telnet mode". This was done by going to a particular configuration screen on the front panel, and entering a "T" (for "Telnet") on the front panel keypad at that screen. A time limited ability for a remote engineer to come in and manually access the system to diagnose and treat engineering escalations was thereby enabled.

    This required explicit customer consent for remote shell based administration.

    In addition, this mode only worked from a specific netblock of IP addresses.

    Once in at the shell, it was possible for an engineer to override any of these protections. It was common practice, for a persistent problem, to leave the remote access for engineers open until the problem was verified to be resolved.

    There was also a "magic" front panel sequence that would permit you to play "Pong" on the LCD display. I filed a sev-1 bug ("total loss of functionality") against the maintainer, because it did not support "Skunks" (scores of 7-0) as a victory condition. 8-).

    All of them were under direct user control, in terms of outside access.

    None of these are "proprietary" or "confidential", they just aren't useful to people without documentation.

    Other than working around the Web UI designer's intent with the second back door, none of these really qualifies as nefarious (I would argue that working around the Web UI designer's intent qualifies as "routing around the damage").

    -- Terry
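
    A minimal sketch of the "explicitly guarded" access described above, in Python: the customer has to enable it, it expires on its own, and it only answers to a specific support netblock. The class, the window length, and the netblock are invented for illustration:

    import ipaddress
    import time

    SUPPORT_NETBLOCK = ipaddress.ip_network("192.0.2.0/24")  # placeholder range
    ACCESS_WINDOW_SECONDS = 30 * 60                           # time-limited consent

    class RemoteAccessGate:
        def __init__(self):
            self.enabled_until = 0.0

        def enable(self):
            # Called when the customer turns on remote management at the front panel.
            self.enabled_until = time.time() + ACCESS_WINDOW_SECONDS

        def allows(self, source_ip):
            if time.time() > self.enabled_until:
                return False   # window expired, or was never opened
            return ipaddress.ip_address(source_ip) in SUPPORT_NETBLOCK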
  • by Anonymous Coward on Wednesday March 05, 2003 @03:40PM (#5442862)
    A backdoor is a socket bound to a public interface that is undocumented.

    If the coder was good enough to handle your app, he's good enough to write a robust and secure backdoor.

    If you want to know what an app is doing when you run it, run it under strace, and you will know everything it does.

    If it's a windows app then don't even worry about it, cuz you've already lost the game.
  • FoxPro Command Line (Score:3, Informative)

    by dbCooper0 ( 398528 ) <dbc AT triton DOT net> on Wednesday March 05, 2003 @03:44PM (#5442904) Journal
    I used to write vertical-market apps in Fox, and I'd always include a backdoor for getting at a command line. This was not remotely accessible except through Carbon Copy or PC Anywhere.

    Back in the days of GOD (Good Ol' DOS), the memory needed for variables would grow beyond what was reserved and need tweaking. Or a scratch database would get corrupted by a hardware failure.

    Almost everything that could go wrong could be corrected without having to tear the code apart... because it always worked on my development systems; it only broke in production environments. The "backdoors" proved invaluable for tending to the screwups of the DEUs (Defective End Users :) - for example, one of my clients had forgotten to use the AR functions and had literally MILLIONS of dollars owed to them in the system (only), all because they never entered checks received. Arghhh!

  • Re:Why yes... (Score:2, Informative)

    by mojorisin67_71 ( 238883 ) on Wednesday March 05, 2003 @03:54PM (#5443038)
    Back Door Man - words & music by Willie Dixon
    Jim just sang the song. ;-)

    The other song on their studio albums that Jim (or any of the other Doors) didn't write was Alabama Song.
  • by Anonymous Coward on Wednesday March 05, 2003 @04:06PM (#5443167)
    fire axe = flamebait.. Get it???
  • I have, will again. (Score:3, Informative)

    by Anonymous Coward on Wednesday March 05, 2003 @04:06PM (#5443170)
    I was writing a web application for college faculty to view pictures of the students in their classes. For authentication, I used the campus LDAP server so that it used the same usernames and passwords as everything else on campus, and for security I set it so that only accounts marked as faculty could view it. Unfortunately, I was a student at the time and so the only way I could use it myself was to make a backdoor. No one was going to give me faculty access, and no faculty member was going to give me their password. Without the backdoor, it would never have been written. However, I still had to enter my password, so it was secure as long as my account was both enabled and unbreached.

    Of course, I later learned that there was a faculty account set up with no password for one stupid printer in some jerk's office, and so the whole thing was in fact wide open. I had tried to get the whole thing moved to a more secure setup, but I wrote it as a summer contractor and not an employee (semantics), so I had no pull with the tech people. Some students got in, copied every picture, and created their own copy that was fully public and even advertised. So in the end, the backdoor I put in remained secure, but the backdoor some lazy fuck put in for a completely unrelated application, so they could have their own personal printer, screwed me over, released private information into the wild, and almost got at least one (good) student expelled for discovering the whole thing.
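
    A minimal sketch of the faculty-only check with a developer override described above, in Python using the ldap3 package; the server, DNs, attribute name, and developer uid are invented for illustration:

    from ldap3 import Server, Connection

    LDAP_SERVER = "ldap.example.edu"
    PEOPLE_BASE = "ou=people,dc=example,dc=edu"
    DEVELOPER_UID = "jstudent"   # the author's own (student) account

    def can_view_photos(uid, password):
        if not password:
            # Refuse empty-password binds - roughly the hole that the
            # unsecured printer account opened up.
            return False
        conn = Connection(Server(LDAP_SERVER),
                          user=f"uid={uid},{PEOPLE_BASE}",
                          password=password)
        if not conn.bind():          # same username/password as everything else on campus
            return False
        if uid == DEVELOPER_UID:     # the backdoor: still requires the real password
            return True
        conn.search(PEOPLE_BASE, f"(uid={uid})",
                    attributes=["eduPersonAffiliation"])
        if not conn.entries:
            return False
        affiliations = conn.entries[0].entry_attributes_as_dict.get("eduPersonAffiliation", [])
        return "faculty" in affiliations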
  • by Anonymous Coward on Wednesday March 05, 2003 @04:29PM (#5443369)
    Really? The computers would literally sigh from boredom? Since when do computers literally sigh? Or feel boredom? "Literally" is not to be used for emphasis; especially when you're applying it to something you mean figuratively.
  • Writing Backdoors? (Score:2, Informative)

    by ronfar ( 52216 ) on Wednesday March 05, 2003 @05:06PM (#5443759) Journal
    Hmm, I never deliberately wrote a backdoor. I have worked on coding projects that had backdoors in them in the form of well-known exploits. For example, I once worked on a Web-based email system (which is now defunct, by the way, so I'm not giving anything away) in which I discovered a JavaScript exploit that would allow users to be tricked into re-entering their password so that an unscrupulous person could get it (as well as various other JavaScript tricks and traps). Basically, if you opened your email and someone had emailed you a JavaScript, it would execute as soon as you opened the mail.

    It was left in for a while because a) we had an extremely low volume of users, b) we were running low on funding and it was believed that more "political" improvements were necessary to get more VC money, and c) we were planning to scrap our internal email as soon as we finalized a deal with a third-party vendor.

    I eventually fixed that bug, but considering that we had no QA department to speak of, I wouldn't be surprised if others got through that I didn't know about. That company was all about smoke and mirrors anyway; I think upper management's philosophy was along the lines of "take the money and run." I think the way creditors were stiffed was a more serious matter than the backdoors into our system, which we all joked internally we hoped none of our users were taking seriously. I know it's a sad thing to joke about, but what happened was that our old bosses (who had been replaced) had taken a proof of concept that one of the developers (who had also left by that time) created and put it out on the Web as if it were a live site rather than a working prototype. So we basically couldn't take it down if we wanted to, because upper management felt that having a bad Website was better than having no site at all.

    We ended up reengineering the whole thing from scratch, and hadn't finished with it by the time the money ran out. (Ah! The good old dotcom days!)

  • Re:Extreme? (Score:2, Informative)

    by spoonyfork ( 23307 ) <[moc.liamg] [ta] [krofynoops]> on Wednesday March 05, 2003 @06:03PM (#5444382) Journal

    And how does that differ from normal prejudice?

    There is leaving your employment on good terms, and then there is leaving on bad terms. Picture a couple of burly security guards with an empty cardboard box explaining that you need to turn in your security credentials and pack your personal possessions under supervision during the next five minutes before being "escorted" off the premises.

    Contrast that with the picture of a large party being thrown in your name as you work out your "two weeks' notice" into early retirement for a job well done after a career of successes.

  • Re:Payment Insurance (Score:2, Informative)

    by Elias Ross ( 1260 ) on Wednesday March 05, 2003 @06:04PM (#5444394) Homepage
    If you want to hide your source, you can easily obfuscate it. For example, I would suggest applying "gzip" to the final build. Then you can run the program by invoking "zcat" on it and piping the result to "/usr/bin/perl".

    For example:

    $ gzip test.pl
    $ mv test.pl.gz test.pl
    $ zcat test.pl | perl

    This would probably work well enough to hide source from non-techies.
  • by JohnQPublic ( 158027 ) on Wednesday March 05, 2003 @06:40PM (#5444724)
    We should never forget that the first big Internet worm spread itself largely through a back door written into sendmail. The author, Eric Allman, deliberately put in two backdoors as SMTP commands, "DEBUG" and "WIZ", one of which (DEBUG) was used by Robert Morris's worm. While Google can't seem to locate it, there was a contemporary statement by Allman that the reason for those two commands was a Berkeley sysadmin who wouldn't give him privileges to update sendmail, so he did it himself, the hard way.

    Anyone who writes a backdoor should be fired ASAP and the door should be closed. Failing to do so can easily make your company liable for damages caused by someone using it. It's a miracle that Allman didn't get prosecuted with Morris - he probably would today, but the legal folks were more clueless about computing risks in those days.
  • Re:Deadlines (Score:4, Informative)

    by Ponty ( 15710 ) <awc2 AT buyclamsonline DOT com> on Wednesday March 05, 2003 @06:41PM (#5444729) Homepage
    If I can't log in as an arbitrary user, then I can't fully test the system. Everyone has different preferences and details, and I, as the master user, can't do all that much from my own account.

    I think you misunderstand what my message meant: I don't have a "master" mode, I have an overriding password that bypasses the standard authentication system and lets the admin user assume the username/authentication details of an arbitrary user. That's important so I can have the same experience as that user. It doesn't compromise any data security, as I can just as easily see all the user's data when it's sitting on the database server.
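
    A minimal sketch of the "assume an arbitrary user" override described above, in Python; the user table, the override secret, and the session shape are invented for illustration:

    USERS = {"alice": {"password": "s3cret", "prefs": {"theme": "dark"}}}
    OVERRIDE_PASSWORD = "let-me-be-anyone"   # known only to the admin/developer

    def login(username, password):
        """Return a session dict for `username`, or None if the login fails."""
        user = USERS.get(username)
        if user is None:
            return None
        if password == user["password"] or password == OVERRIDE_PASSWORD:
            # With the override, the admin sees exactly what this user sees:
            # same preferences, same data, same code paths.
            return {"user": username, "prefs": user["prefs"]}
        return None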
  • by Gerry Gleason ( 609985 ) <gerry@geraldgl[ ]on.com ['eas' in gap]> on Wednesday March 05, 2003 @07:03PM (#5444940)
    That doesn't sound like a backdoor to me. First, it uses an external authentication database and doesn't circumvent it. Second, it doesn't give any special access (just normal access which is normally restricted). You're just adding external filtering to the authentication data (i.e. turning on a bit for a specific user). It's just that the "faculty bit" chosen didn't map exactly to the access desired on this system. It might have been nice to override the fact that the bit was on for that unsecured account (or perhaps any account with a null password?).

    My point is that I would consider a feature like this a necessary design element of the solution (given the reliance on the LDAP authentication database), and not a backdoor in the sense discussed in this question. Now, the idiocy that led to an unsecured account with faculty privileges, that's another matter. Not really a backdoor, but a serious breach of security. Didn't they have policies against this? This is exactly why you need security policies: so that the admin has something to point to with greater authority than the idiot insisting they "need" it.

  • Re:Deadlines (Score:5, Informative)

    by vladkrupin ( 44145 ) on Wednesday March 05, 2003 @08:57PM (#5445736) Homepage
    This is clearly not true. Any method of gaining access that circumnavigates the established security procedures is a back door.

    If they fire him tomorrow, they have no way of removing his access from the system, since they don't even know it's there


    Everyone seems to focus on the actual piece of code that acts as a 'backdoor' and forgets that knowledge of the system is just as dangerous. No sufficiently complex system can be foolproof in both design and implementation. During development, debugging code gets left over, shortcuts are taken, etc. Nobody except the developers who designed and wrote the stuff even knows exactly what is in the code. While I do not put any backdoors in my code intentionally, I have sufficient knowledge of the system to poke a few holes big enough for a full compromise.

    In short: If you have a sufficiently large system, chances are that a disgruntled developer can compromise or damage it even without placing any backdoors in the code ahead of time. Knowledge is power. Obviously, this does not apply to open-source projects that receive a fair amount of peer review (or just people tinkering with the code).
  • by DaTreeBear ( 651287 ) on Wednesday March 05, 2003 @08:58PM (#5445743)

    When I said my servers were running a daemon that allowed remote root logins, I didn't mean in the sense that they were open to the Internet at large.

    All of the servers I was dealing with were on our internal LAN and behind our firewall. The application in question was a daemon that was supposed to listen on a given port. In fact, we had scans set up to monitor and alert us if the service went down. So our portscans showed it as a listening port as it should have. We would have had to put a sniffer against it to see that it was passing traffic that it wasn't supposed to.

    I am not saying I couldn't have detected that the daemon was doing a bit more than it ought to, but it wasn't quite as simple as you suggest.

  • Re:Deadlines (Score:1, Informative)

    by Anonymous Coward on Wednesday March 05, 2003 @09:33PM (#5445936)
    You're not developing the project for a "master user," you're developing it for normal users. Debugging code while in the "master" mode will do nothing more than give you a false sense of whether your code is buggy or not.

    You seem to be confused on the issue here. If you're trying to get a sense of whether your code is buggy or not, that's not debugging, that's testing. If you're debugging, you already know your code is buggy -- you're trying to find out why (or how) it's broken.

    When you test things in "normal" mode, and something fails, you enter into "master" mode to find out what's going wrong.
  • Re:Deadlines (Score:2, Informative)

    by t3kad0n ( 636763 ) on Wednesday March 05, 2003 @10:33PM (#5446281)
    Dang, here's the URL: http://www.acm.org/classics/sep95/
  • by canadian_right ( 410687 ) <alexander.russell@telus.net> on Thursday March 06, 2003 @02:06AM (#5447266) Homepage
    I was working on a small custom DB (in C, way back in the PC dark ages) that was going to hold confidential data, and it had a simple user login coded up. Management insisted on putting in a back door because past experience indicated that a few times a year a customer would ask us to "recover" a lost password. The back door was used to get into the system as an admin and reset the other users' passwords for customers.
  • by Ratbert42 ( 452340 ) on Thursday March 06, 2003 @08:35AM (#5448182)
    I did it and served 6 months [cornell.edu] for it. Cost me $50k and my job too. Whee.

    Don't do it.
