Software Security

Do You Write Backdoors? 1004

quaxzarron asks: "I had a recent experience where one of our group of programmers wrote backdoors into some web applications we were developing, so that he could gain access to the main hosting server when the application went live. This got me thinking about how we are dependent on the integrity of the coders for the integrity of our applications. Yet in this case a more than casual glance would have allowed us to identify the potentially malicious code. How does this work when the clients are companies who can't perform such checks - either because they don't know how, or because the code is too large or too complex? How often do companies developing code officially sanction backdoors...even if it means calling them 'security features'? How often has the Slashdot crowd put a backdoor in the code they were developing, either officially or otherwise? How sustainable is the 'trust' between the developer and the client?"
This discussion has been archived. No new comments can be posted.

  • by japhar81 ( 640163 ) on Wednesday March 05, 2003 @02:46PM (#5442239)
    But that's not to say I lack ethics, am a cracker, or am out to get my client.

    How many times have we all heard: duhh... I forgot my admin password, but I can't reinstall, I need the data.

    So yes, I backdoor, and I document it internally (hardcopy stored in a safe). It's just an extra insurance policy for when some moron that I worked for 6 years ago does something stupid.

    That said, coding backdoors for the sake of getting access to a web farm so you can host your own services is certainly a bad thing(tm). But hell, what are you gonna do? Everyone backdoors. Don't believe me? Watch someone 'in the know' log in to a random windows box using the System account and come talk to me.
  • by phorm ( 591458 ) on Wednesday March 05, 2003 @02:47PM (#5442253) Journal
    Some of the apps I make have the option to "allow" a backdoor by setting a flag (default on). The client can turn it off if he/she really doesn't trust me, but in most cases they find it useful in case I ever have to bugfix the systems and/or they lose their own passwords.
  • by saforrest ( 184929 ) on Wednesday March 05, 2003 @02:47PM (#5442259) Journal
    My back door is simply default passwords. My company released an application server last year, and after doing a google search a few months later for a string of text that would appear only on our default web image, I found a half-dozen copies of our software installed at various places.

    Out of curiosity, from a personal machine, I tried logging in as administrator to a few of these machines with the default password our product shipped with. It worked about half the time.

    (Of course, one can't take the results of my search as suggesting that half of our customers didn't change their passwords; the fact that these people hadn't updated the web image makes it rather unsurprising that they didn't update the admin password either.)
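One way a vendor can avoid ending up in this situation is to refuse privileged logins until the shipped default has been changed. A minimal Python sketch of that check (hypothetical names and values, not the product described above):

```python
# Refuse admin login while the factory-default password is still set.
# DEFAULT_ADMIN_PASSWORD and the sample passwords are hypothetical.
DEFAULT_ADMIN_PASSWORD = "changeme"  # what the product ships with

def admin_login(stored_password, supplied_password):
    """Reject any admin login attempt while the default is still in place."""
    if stored_password == DEFAULT_ADMIN_PASSWORD:
        raise PermissionError("default admin password still set; change it first")
    return stored_password == supplied_password

print(admin_login("s3cret!", "s3cret!"))  # True once a real password is set
```

The point is that the lockout lives in the product, so the operator who never reads the manual still can't leave the default exposed.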
  • Almost Every One. (Score:1, Interesting)

    by Anonymous Coward on Wednesday March 05, 2003 @02:48PM (#5442263)
    Almost all of my applications contain back doors if they are intended for others to use. Back in the BBS days we used to swap programs, and certain people were hated by all. So we'd write programs and either include 2 versions, one with a backdoor and one without, or just include it in all of them. This would allow us to remotely reboot, get shell access, etc. if our enemy happened to load our code. Usually it required a bunch of weird ASCII codes to be typed in, as well as a password. We were very prone to checking executables for code such as this, but with a lot of nifty tricks, you could hide the backdoors so that they weren't blatantly obvious when looking through the binary file...
  • Payment Insurance (Score:5, Interesting)

    by BadBlood ( 134525 ) on Wednesday March 05, 2003 @02:48PM (#5442266)
    I know a person who owns his own company and writes code on a for-hire basis. He puts in timed expiration code such that if they don't pay him within 30 days of delivery, his code de-activates.

    Where I work, we do similar things, but our motivation is to ensure that users are always running the latest version of our frequently updated codebase. We, as developers, do have the ability to run expired code via the backdoor.
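The expiration-plus-developer-override scheme described above can be sketched in a few lines. This is a hypothetical illustration in Python; the delivery date, grace period, and override key are all made up:

```python
# Sketch of "payment insurance" expiration with a developer override.
# All constants here are hypothetical.
import datetime

DELIVERY_DATE = datetime.date(2003, 3, 5)  # hypothetical delivery date
GRACE_DAYS = 30                            # pay within 30 days of delivery
OVERRIDE_KEY = "dev-override"              # the developers' backdoor

def is_expired(today, override=None):
    """Refuse to run more than GRACE_DAYS after delivery,
    unless the developer supplies the override key."""
    if override == OVERRIDE_KEY:
        return False
    return (today - DELIVERY_DATE).days > GRACE_DAYS

print(is_expired(datetime.date(2003, 3, 20)))   # within the grace period
print(is_expired(datetime.date(2003, 5, 1)))    # expired for normal users
print(is_expired(datetime.date(2003, 5, 1), override=OVERRIDE_KEY))
```

Note that a check like this is trivial to find and remove by anyone with the source, which is exactly the limitation raised later in the thread.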
  • Backdoor? (Score:5, Interesting)

    by RobertTaylor ( 444958 ) <roberttaylor1234@@@gmail...com> on Wednesday March 05, 2003 @02:48PM (#5442271) Homepage Journal
    "I had a recent experience where one of our group of programmers wrote backdoors on some web applications we were developing, so that he could gain access to the main hosting server when the application went live."

    It's like the theory that BAE/McDonnell Douglas embedded a backdoor in the F-15 Eagle's computer systems, so that if it's ever used against the USA it will strangely malfunction.

    Unlikely, but interesting concept all the same!
  • trust... (Score:5, Interesting)

    by TechnoVooDooDaddy ( 470187 ) on Wednesday March 05, 2003 @02:48PM (#5442274) Homepage
    Trust and loyalty used to be my main focus... I trusted that those stock options I was offered instead of a chunk of salary would be good, and the company trusted that I would deliver good software, on time.

    I fulfilled my part of the bargain, but when it came time for the stock options to mature, I got laid off. The company is still in business, interestingly enough, and is even posting profits now.

    Who do you trust, and how is that trust repaid? I can tell you I no longer have the same sense of loyalty and trust in my employer. Companies are paying on average HALF of what they were for the same work 2 years ago. Trust... works both ways or it doesn't work at all...
  • Backdoors (Score:5, Interesting)

    by JSkills ( 69686 ) <jskills@[ ]fball.com ['goo' in gap]> on Wednesday March 05, 2003 @02:48PM (#5442278) Homepage Journal
    Never written one for malicious purposes before. Thought about it a lot of course - in the same way people fantasize about robbing a bank or hitting the lottery.

    But when you think about it, all leaving a backdoor in a system does is give you a way to access the system that you shouldn't have. This can lead to trouble down the line.

    Clearly, there are legitimate uses for backdoors (to use in case of emergencies, etc.), but unless the backdoor is documented someplace for others in the software development group to be aware of, it's likely the kind of backdoor that is simply not ethical to implement, since it's only usable by one person.

    I'm sure people can provide examples that disprove this, but for the majority of situations, as a developer, having a backdoor in a system can only lead to a security breach at some point ...

  • kind of... (Score:5, Interesting)

    by deander2 ( 26173 ) <public@keredCOW.org minus herbivore> on Wednesday March 05, 2003 @02:49PM (#5442281) Homepage

    I am working on an app for the govt, and yes, I have programmed in a backdoor login, as it's very useful for testing and development.

    However, the following are true:
    1) management knows full well of its existence
    2) BY DEFAULT, it is turned off in any build
    3) it is NEVER to be deployed turned on

    I think it's a good rule of thumb.
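That rule of thumb (a documented test login, off by default, never deployed enabled) might look like this in practice. A minimal Python sketch with hypothetical account names:

```python
# Test-only login bypass that is documented and OFF by default.
# The "qa-test" account name is hypothetical.

def login(username, password, user_db, allow_test_login=False):
    """Normal credential check, plus an optional test-only bypass
    that stays disabled unless the build explicitly enables it."""
    if allow_test_login and username == "qa-test":
        return True  # dev/test builds only; never deploy with this on
    return user_db.get(username) == password

users = {"alice": "s3cret"}
print(login("alice", "s3cret", users))                       # normal login
print(login("qa-test", "", users))                           # bypass is off
print(login("qa-test", "", users, allow_test_login=True))    # dev build only
```

The weak point, as a reply below notes, is process: nothing in the code itself stops a future build from shipping with the flag turned on.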
  • by jpsst34 ( 582349 ) on Wednesday March 05, 2003 @02:49PM (#5442284) Journal
    Though it wasn't explicitly mentioned in the question, I feel that such a situation may be more common when the developers are hired as temporary / part-time help. In this case, you are a client and the developers may be looking to get something more. If you have your own in-house developers, they'll have more stake in the company and the project, and surely would care more about security - both the security of the software and the security of their job. A hired hand could code the backdoor then move on before you ever notice. Your own developers would be more hesitant to do this because if and when it gets noticed, they'll still be easily found in the cube on the third floor, east wing.

    Maybe a good idea would be to bring on a full time development staff and pay them good money so they don't feel the need to try to get something more. Oh, and tell me where to send my resume once you create these new full time positions.
  • by MjDascombe ( 549226 ) on Wednesday March 05, 2003 @02:49PM (#5442285) Journal
    Backdoors can be a good insurance policy, and their theoretical presence might guarantee your continued employment, but if your employers find them, I can guarantee you won't be working there for much longer :P
  • PHP Web-apps (Score:5, Interesting)

    by yamcha666 ( 519244 ) on Wednesday March 05, 2003 @02:49PM (#5442293)

    I work for a small startup that specializes in custom web applications for indy record labels and small-time bands and clubs. Our main product is an all-in-one web app that allows the customer to manage their shows, news, mailing list and numerous other things.

    We offer several levels of this product, one being shared (get 1 account on our servers) which we control, standalone, and custom standalone (the standalones go on their own servers.) The latter two are designed to have one back-door login account for myself and the other programmer to go in there and edit their settings or database if the customer breaks something.

    So there is my 2 cents. Yes, I put small backdoors in my company's web-apps per boss's request.

  • by Anonymous Coward on Wednesday March 05, 2003 @02:49PM (#5442295)
    Anyone care to explain this "Watch someone 'in the know' log in to a random windows box using the System account" crack to me?

    Are you implying there is a 'backdoor' account in all copies of Windows?

    ???
  • Sure... (Score:4, Interesting)

    by Anonymous Coward on Wednesday March 05, 2003 @02:50PM (#5442303)
    Backdoors were coded into systems, but only for testing and development purposes. Once the software was being prepared for release, those backdoors would be deleted; in any case, they were usually coded to only work on specific (i.e. the development) machines.


    What really concerned me, though, was when we were supposed to store credit card numbers encrypted in the database and I used a simple replacement cipher as a placeholder. Then, when I later asked about putting real encryption routines in, I was told "we aren't going to do that".


    So customers are really in the dark when it comes to the security of their software.


    Rich

  • by egg troll ( 515396 ) on Wednesday March 05, 2003 @02:51PM (#5442315) Homepage Journal
    I know of a couple of examples where backdoors were put in for QA purposes and then left in when the product was shipped. Indeed, waaaay back in the day, a Mac IRC client left in a /ctcp command that would let another user execute any command on another ircle user's box!


    Doing things like /ctcp B1FF exec /quit made IRC almost unuseable for Mac users for a week or so.


    Anyways, my point is that most backdoors put in by developers seem to be accidental rather than intentional.

  • winxp? (Score:2, Interesting)

    by rizzo420 ( 136707 ) on Wednesday March 05, 2003 @02:51PM (#5442327) Journal
    OK, I don't think there's a backdoor, but I know Windows XP comes installed with a special Microsoft tech support user. How do they get to use that user to fix problems? That's what I don't understand. It's really odd, I think. I wouldn't be surprised if Microsoft started putting backdoors into their software that only closed when you entered a unique serial number that bounced back from an online serial number database, similar to the way Q3A uses the CD key. Although I know there are some hacks to the Q3A CD key to allow you to use a pirated copy, so this may not work. I just wouldn't put it past Microsoft to do something like that.
  • by leftism11 ( 177941 ) on Wednesday March 05, 2003 @02:52PM (#5442335)
    When I was a consultant at a Big 6 firm (back in the day), a colleague of mine wrote a Windows app for a client. He added code that would cause the application to stop working after a certain date, so that if the client didn't pay their invoices, he wouldn't update the code, and the app would simply stop working.

    I personally considered this to be very unprofessional, and probably not legal, but he claimed that it was perfectly legit. Of course, the client didn't know this, and he never told them (they did pay their invoices on time).

    Definitely not my style, but it is evidence to me that it is done on a regular basis.
  • consequences (Score:5, Interesting)

    by spoonyfork ( 23307 ) <spoonyfork&gmail,com> on Wednesday March 05, 2003 @02:53PM (#5442351) Journal
    I don't, but two guys here did just that last year. It was a customer-facing website for a large multi-national corporation. The "backdoor" was caught before going live, but they were fired with extreme prejudice.
  • Re:kind of... (Score:3, Interesting)

    by geekoid ( 135745 ) <dadinportland@yah o o .com> on Wednesday March 05, 2003 @02:56PM (#5442389) Homepage Journal
    Wow, can you tell me the method you use to be sure your rules are never, ever, under any circumstances broken?
    That some Jr. programmer 5 years from now doesn't forget to turn off the backdoor?

  • by e2d2 ( 115622 ) on Wednesday March 05, 2003 @02:57PM (#5442394)
    I know that probably was a joke, but... If you think the NSA needs a key to get to your data, you need to go read up on the amount of computing power they have in their hands. I recommend "The Puzzle Palace" and "Body of Secrets" by James Bamford. Really interesting stuff. They could basically pick through your weak built-in encryption like Rosie O'Donnell picks through a rack of ribs. Their computers would literally sigh from boredom.

  • Back Doors.. (Score:3, Interesting)

    by glh ( 14273 ) on Wednesday March 05, 2003 @02:58PM (#5442416) Homepage Journal
    In my experience, back doors tend to be "in the mind of the programmer" as opposed to code that was physically made to be that way. It's not that the programmer sat there and added a way to hack in-- it's just the fact that since he knows how the system works, he knows how to gain access if need be. After all, just about everything electronic can be exploited at some point.

    I think you really have to consider the intent. If someone tries to go into the code and deliberately put a back door in w/o proper decision authority, that should be considered unethical. If the client wants it tighter than Fort Knox and loses the key, it's not your fault.
  • by Matthew Austern ( 259952 ) on Wednesday March 05, 2003 @02:59PM (#5442420) Homepage
    There is only trust as long as: a) The source is available, i.e., fully disclosed. b) The code is readable and understandable

    Having the source available isn't necessarily as much help as you might think. Since nobody else has mentioned it yet, I suppose I'd better do the honors and point y'all to Ken Thompson's classic talk on backdoors, Reflections on Trusting Trust [acm.org].

    Highly recommended. It's a good reminder of just how devious it's possible to be.

  • by ClioCJS ( 264898 ) <cliocjs+slashdot AT gmail DOT com> on Wednesday March 05, 2003 @02:59PM (#5442433) Homepage Journal
    A friend of mine did some web-work for $12,000.

    The guy decided to be a dick about it and not pay him the money he deserved.

    Fortunately for him, he had put in a backdoor. He told the guy about it: once activated, the system would not work until a password only he knew was entered.

    His payment was promptly received. (He got the idea from a movie.)

  • by MrWorf ( 216691 ) on Wednesday March 05, 2003 @03:01PM (#5442449) Homepage
    Backdoors are very hard to justify and also add code (as mentioned here before).

    However, I've been involved in projects where we've added an easter egg.

    Why? I don't know; it's fun, it's easy, and it's cool when you can show someone that you actually were part of the development. Of course, these projects were in-house product development. I would probably not consider doing such a thing in a customer's product that I'm working on; besides, that's not what they are paying me for.
  • by Anonymous Coward on Wednesday March 05, 2003 @03:03PM (#5442474)
    I've written backdoors into web apps I've written, but usually because SOMEONE has to have absolute access to the data. It usually leads to some admin tools I've also written for the site, so that I can change data or whatever when needed. They aren't used for malicious reasons, and don't allow access to everything on the system. Although, hypothetically, if I were ever fired, it might be possible for me to get into those admin tools and issue the Apocalypse command that I've also written into the site, requiring the company to hire me back as a consultant at $125 an hour to fix things. Hypothetically. Not like that's going to happen. Really...
  • backdoor root access (Score:3, Interesting)

    by Cleveland Steamer ( 625191 ) on Wednesday March 05, 2003 @03:05PM (#5442485)
    I've never written a backdoor for any of the applications I've released publicly. However, when I graduated from University and resigned from a system administrator position in one of its departments, I wrote a little backdoor program that gave me root privileges because I knew the guy who was to replace me was completely incompetent. I knew I would get to keep my account, so I wrote the program so that it would only work from my user ID.

    The program came in handy a few times. I finally deleted it about six months later.

  • the short answer (Score:5, Interesting)

    by Ender Ryan ( 79406 ) on Wednesday March 05, 2003 @03:05PM (#5442487) Journal
    The short answer to your question is, "Yes". Over a long enough timeline, with enough people looking at the code, backdoors get caught. Recently (well, maybe a year ago) a backdoor was found in an open source database that used to be a proprietary product. The backdoor had been there for the entire life of the product. However, it took over a year after the product became open source for it to be caught.

  • MS Easter Eggs (Score:4, Interesting)

    by pdrome4robert ( 532173 ) on Wednesday March 05, 2003 @03:06PM (#5442499)
    A microserf friend once told me MS had no policy either way on easter eggs. They were there if programmers took the time to put them there. If an easter egg can get through development, peer-review, testing, packaging, why couldn't a backdoor?
  • by Lord Kestrel ( 91395 ) on Wednesday March 05, 2003 @03:11PM (#5442551)
    I used to work for a company that wrote billing software. The billing app we wrote had an easy way to get root on the billing box, via a simple exploit. Although it wasn't planned as a backdoor, it was discovered shortly after we shipped, and it went unpatched for years.

    As a result of this, anyone who knew our software could get in as root to any of the servers running our billing software. I haven't worked for that company in 4 years, and I don't even think they are still around, but anyone running that billing software can be compromised (hopefully no one is still using it, but you never know).

    We did have a standard user that we set up in the database as well, so we could perform maintenance, but we told them about it, and coordinated maintenance with them. That could be construed as a back door as well, though, as it did allow remote access by our company.
  • Re:the short answer (Score:3, Interesting)

    by Dr. Evil ( 3501 ) on Wednesday March 05, 2003 @03:12PM (#5442561)

    On the other hand, some security vulnerabilities could be carefully engineered or intentionally neglected by a malicious developer. Written carefully enough they could even look like an honest mistake... like the latest buffer-overflow in Sendmail for example?

    Then I suppose the incentive comes into play.

  • by Kostya ( 1146 ) on Wednesday March 05, 2003 @03:13PM (#5442570) Homepage Journal
    You are now ready to be a contractor/mercenary. When I embarked on being a contractor, my friend who was a long-time contractor (20+ years) said after talking with me, "I think you are bitter enough now to become a contractor."

    Which is to say, most people who went into contracting did so just because of stories like the one you told. They got tired of being jerked around and decided a little uncertainty and paperwork was worth getting a little freedom from the corporate brainwashing about team and loyalty.

    Granted, many went into it because of money during the dot-com boom. They are no longer contractors now ;-)

    I'm loyal--to getting the job done, according to contract, as long as I'm getting paid. I produce results, give advice, and let the customer go his own way--even if they insist on taking themselves to hell in a handbasket.

    It beats getting all worked up over stupid stuff at work.

    I always loved the "We're a family" line I got when people tried to get me on as a FT employee. I don't know about you, but it is usually true--and they have all the problems that families have too. They can keep them ;-)

  • Re:Deadlines (Score:5, Interesting)

    by Ponty ( 15710 ) <awc2&buyclamsonline,com> on Wednesday March 05, 2003 @03:13PM (#5442573) Homepage
    Just this morning, I wrote a backdoor into a web project. Very often the testing users give me really strange errors that I just can't verify at all. It's useful to have a "master password" that I'll disable later (probably). Backdoors are most often used for debugging purposes. Fortunately for the users, I'll be the sysadmin when the system goes live, so there isn't much of a risk (yet).
  • In a previous life (Score:4, Interesting)

    by Karl Cocknozzle ( 514413 ) <kcocknozzle@hotm ... com minus distro> on Wednesday March 05, 2003 @03:19PM (#5442629) Homepage
    I was attached to a software package that didn't have a backdoor per se, so much as an undocumented account with a password of "a" that you could not take out of the database without doing major surgery. The software also (used to, anyway) put the undocumented account BACK into the users table and restore the specific records to their "default state".

    Savvier customers changed the username and password (the rule required the user_id entry to stay in the db, but you could change the username/pass to keep undesirables out of the system). Yet many of the customers never even officially "discovered" it... Before I left I never heard of any malicious things being done with this account, but as I told my boss the day I found out about it, "It's only a matter of time."

    I left when everybody around me started getting ".com" fever. Like, wacky. People who made $50k annually were leveraging a fortune in paper stock options to buy brand new Mercedes Benzes and hot tubs...
  • by tomhudson ( 43916 ) <[moc.nosduh-arab ... [nosduh.arabrab]> on Wednesday March 05, 2003 @03:20PM (#5442634) Journal
    <quote> A backdoor allows you to make changes and tweaks after the application (or site) goes live but they just aren't worth it. Especially if it ever dawns on the customer that you have made changes (such as a bug fix) without consulting them. If you need to change the software then you need to consult the customer. Period. If the customer ever figures out that there is a back door or if it is abused by a third party, they will never hire you or your company again.</quote>

    ... M$ products have back doors, easter eggs, etc, and their "click-thru" licensing of mediaplayer also lets your box "phone home". Their XP updater routinely sends info about all software installed on your machine (see yesterday's art. at theregister.co.uk). ...

    the facts are:

    1. most people don't give a shit, don't have a clue...
    2. if your app does what it's supposed to, they won't care about any back doors ...
    3. the first time they screw up, they'll be glad there was a back door!
    4. if they can't trust you to refrain from abusing any back door you put in, they shouldn't be using you anyway!!!

    Now, does this mean that I'm going to start putting back doors in? Probably. At least when the product/code/app is going to be used by less-than-clued-in end users who are likely to screw up things to the point where some "emergency entrance hatch" is required.

      As to giving the customers the source code, I remember doing that in the mid-'90s before it was fashionable, and having their "computer people" continuously removing it from corporate systems because it wasn't on the "list of approved programs".

  • by wcbarksdale ( 621327 ) on Wednesday March 05, 2003 @03:21PM (#5442659)
    and the one to which "kt" refers is described here [acm.org]. Truly ingenious. Even looking at every part of the source yourself can't protect you in a case like that.
  • Re:the short answer (Score:3, Interesting)

    by Drakonian ( 518722 ) on Wednesday March 05, 2003 @03:22PM (#5442671) Homepage
    Couldn't that be an argument against open source? (In a John-Ashcroft-freedom-by-reducing-liberties sort of way)

    Product X has a backdoor. Product X is released as open source. A few vigilant hackers start poring over it ASAP, find the backdoor, and exploit it until it's found by someone in the good community. (Maybe a full year later, as you said.)

    Just food for thought.

  • Re:Payment Insurance (Score:5, Interesting)

    by B1LL_GAT3Z ( 253695 ) on Wednesday March 05, 2003 @03:27PM (#5442711) Homepage
    I was once commissioned to write a web application that dealt with secure signature technology. As the deadline came up, the dealings with my employer became "shady" - meaning that it looked like he wasn't going to pay out at the end. I wanted to do something similar to what you stated (an auto-timeout) however this application was written in an open source language (Perl) and needed to be kept that way. So - with some quick obfuscating I wrote a quick feature so that at a later date, if the employer didn't pay, I could simply access http://website.com/perl.pl?delete=y and it would delete itself. I'm glad I added this "feature" because it wasn't long after that he "disappeared" claiming all sorts of reasons for his non-payment. I then quickly used my feature and was glad for it.

    Of course, if my employer were skilled enough, he could've gone in and removed the code himself. This leads to the trouble of combining open source and backdoors: it's virtually impossible to do without some skilled programmer being able to look at it and remove it. I thought you might find that interesting.
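The control flow of a kill switch like that can be sketched briefly. The original was an obfuscated Perl CGI; this Python sketch only illustrates the idea, with a pluggable remover function so the logic can be exercised without touching the filesystem:

```python
# Sketch of a self-deleting "kill switch" triggered by a secret URL
# parameter. The parameter name and path are hypothetical.
import os

def handle_request(params, script_path, remover=os.remove):
    """Serve the page normally, unless the secret parameter is present,
    in which case the script deletes itself."""
    if params.get("delete") == "y":
        remover(script_path)  # self-destruct
        return "gone"
    return "normal page"

# Exercise the logic with a fake remover instead of the real os.remove:
removed = []
print(handle_request({"delete": "y"}, "/srv/app/perl.pl", remover=removed.append))
print(removed)
```

As the parent notes, in an open source deployment this survives only as long as nobody reads the code.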
  • Re:Payment Insurance (Score:3, Interesting)

    by pla ( 258480 ) on Wednesday March 05, 2003 @03:29PM (#5442731) Journal
    If someone who did work for me ever tried to pull something like this, they would never do work for me again.

    Um... Duh?

    Someone who did this to you would not have gotten paid. Thus, I have very little doubt that they would *NOT* work for you ever again, but that would result from *THEIR* choice, not yours. People do not generally like to work without getting paid.

    "Aww, c'mon man, PLEASE let me spend another six months coding for you, only to have the check bounce!"
  • Video Games (Score:2, Interesting)

    by feepness ( 543479 ) on Wednesday March 05, 2003 @03:29PM (#5442733)
    I've worked in the video game industry and of course there are "backdoors" in all the products. These codes are part of the fun for both the dev team and the elite gamer. We have about a dozen different easter eggs in various products we release. I am always surprised to see the following as the codes show up on websites:

    • Codes we share amongst ourselves (dev team and with the testers) get out. This is obvious.
    • Some codes that I do not share amongst the team got out. And these are complicated as HELL. Like doing a series of things in the game and it having another effect minutes later that you must also know to activate. I show just a few people that these things are even possible, without sharing how. (Yes it's fun to feel cool.)
    • Some codes that others added that I do not know about show up in public.
    Conclusions: No backdoor is safe. Once released (or delivered), the code is in the hands of the enemy. Period.

    Corollary: People have way more time on their hands than they should.
  • Re:the short answer (Score:3, Interesting)

    by Ender Ryan ( 79406 ) on Wednesday March 05, 2003 @03:33PM (#5442786) Journal
    Yes, I suppose so, but that's insanely short-sighted. Over the long term, you're much better off if it gets open sourced, even if there is an existing backdoor. My reasoning is that, even without the source, hackers can and probably will find existing backdoors, eventually, but the "good community" may never find it.

    But who thinks about the long term these days? Even the richest people in this country are committing all sorts of fraud for a quick buck.

  • Re:Of course (Score:3, Interesting)

    by Anonymous Coward on Wednesday March 05, 2003 @03:35PM (#5442808)
    Back in the mid-90's, during the BBS boom, there was a dial-up program called Terminate. Anyway, it had an entire fake Wargames simulation built into it, with the appropriate questions and responses. It even had a functioning wardialer for a period!
  • Re:Backdoor? (Score:2, Interesting)

    by br0ck ( 237309 ) on Wednesday March 05, 2003 @03:40PM (#5442861)
    Like many backdoors, what if an enemy of the USA is able to figure out how to exploit this vulnerability?
  • by GReaToaK_2000 ( 217386 ) on Wednesday March 05, 2003 @03:43PM (#5442897)
    If a back door is exploited and the company loses lots of money, etc... they simply go through their source integrity system and hold the coder partially responsible... Even if the programmer no longer works for the company...

    It's responsibility...

    But then again this country's general populace doesn't accept the concept of responsibility... Just look at the number of stupid lawsuits that are out there and the number of criminals that have gotten off with a good lawyer...

    my 2 cents
  • by secolactico ( 519805 ) on Wednesday March 05, 2003 @03:48PM (#5442964) Journal
    Indeed, I agree with you. If there's an appropriate password recovery procedure, however, there should be no need for backdoors. Even when there are, they should be safely limited to, say, console access.
  • by TheMidget ( 512188 ) on Wednesday March 05, 2003 @03:55PM (#5443049)
    Security admins could tell you that default passwords are a BAD idea. Better to prompt the user on install than have 2.5 million credit cards stolen from some retail site because they forgot to change a default password (or have a backdoor).

    Security is much more likely breached through some SQL injection exploit than through default passwords. Don't believe me? Just go to a random smalltime e-commerce site. Watch your browser's title bar. If you see a URL of the form http://server.com/some/directory/path.asp?param1=value1&param2=value, then try sneaking a quote character into one of the values in the URL. More often than not, you'll be greeted with a SQL Server or Access error message. A smart user can figure out from the error message which kind of database query the app used, and he can sneak in characters that will yield useful results...

    If the database engine is SQL Server, the attacker will try reading out the sysobjects and syscolumns tables to figure out the schema, and from there he may harvest credit card numbers, names, addresses, or just play random pranks with the stored data.

    If on the other hand, the database engine is Access, it's a tad more difficult (because in Access, the msysobjects table is by default protected, and its error messages contain less useful info too). In that case, you need to find the URL that allows administrative login (often called admin.asp, admin_login.asp or somesuch). And then just type '='' or 1=1 or ' into the login field, and you're in.

    This works especially well against fly-by-night operations such as those advertised by spammers. Often spammers' remove-me links are vulnerable too (if you see .asp? in the remove-me URL, chances are that you can have some fun with it...). Great revenge tool if you get too much spam, or if you simply are bored!
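    The quote-injection failure mode described above is easy to demonstrate. Here is a minimal sketch in Python with sqlite3 (table, data, and function names invented for illustration); the fix is simply to use parameterized queries:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, password TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 's3cret')")

def vulnerable_login(name, password):
    # String concatenation: attacker-supplied quotes rewrite the query itself.
    query = ("SELECT COUNT(*) FROM users WHERE name = '%s' AND password = '%s'"
             % (name, password))
    return conn.execute(query).fetchone()[0] > 0

def safe_login(name, password):
    # Parameterized query: values can never change the SQL structure.
    query = "SELECT COUNT(*) FROM users WHERE name = ? AND password = ?"
    return conn.execute(query, (name, password)).fetchone()[0] > 0
```

    The classic `' OR '1'='1` password bypasses the first version (the WHERE clause becomes always-true) but not the second.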

  • Re:Sure... (Score:2, Interesting)

    by beebware ( 149208 ) on Wednesday March 05, 2003 @03:55PM (#5443057) Homepage
    I've put in small backdoors which will only work on certain machine names (e.g. my development machine was called 'Alice', so the code wouldn't work on the live server called "Bob"), that required access from a set range of IPs (our internal LAN) AND required knowledge of the URL and an additional password. Ideal for testing and development purposes, but once the system goes live the backdoors are useless.
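    A minimal sketch of that kind of environment gating (the hostname, subnet, and function name below are illustrative, not from the original system; the real check also layered a secret URL and password on top):

```python
import ipaddress
import socket

DEV_HOSTS = {"alice"}                           # development machines only
DEV_NET = ipaddress.ip_network("10.0.0.0/8")    # internal LAN (illustrative)

def debug_access_allowed(client_ip: str) -> bool:
    # Both conditions must hold: we are running on a dev box AND the caller
    # is on the internal network. On the live server ("bob") this is never true.
    on_dev_box = socket.gethostname().lower() in DEV_HOSTS
    from_lan = ipaddress.ip_address(client_ip) in DEV_NET
    return on_dev_box and from_lan
```

    The point of the design is fail-closed: shipping the code unchanged to production disables the hook automatically, because the hostname check can never pass there.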
  • One word: liability (Score:3, Interesting)

    by WhaDaYaKnow ( 563683 ) on Wednesday March 05, 2003 @03:56PM (#5443060)
    Writing backdoors is an extremely stupid and dangerous thing to do.

    Obviously you run the risk of being fired on the spot. But of far greater concern is your liability. What if your backdoor causes (unexpected) damage to the company? Do you have the pockets to make up for that? Because you can rest assured they will be knocking at your (front) door.

    If you have to write a backdoor for 'good' reasons, make sure the company is aware of it, and there should be no problems at all.

    The 'putting in a backdoor to make sure a customer pays in time' is stupid as well. If someone writes software for me and comes back with an 'update' after we pay that removes a backdoor, that was exactly the last time that person would work for me.

    In fact, that programmer would have signed a contract that specifically states that we do not allow backdoors, but I guess not all companies think of that... Regardless, that programmer has wasted the company's time, with all sorts of (legal) consequences.
  • Re:Deadlines (Score:5, Interesting)

    by jaavaaguru ( 261551 ) on Wednesday March 05, 2003 @03:57PM (#5443072) Homepage
    Some people [susx.ac.uk] had enough time. This has got to be one of the greatest backdoors of all time.
  • by jmagar.com ( 67146 ) on Wednesday March 05, 2003 @03:58PM (#5443085) Homepage
    I have been pondering the implications of this type of thing for a while now. I am currently building in a backdoor AND telling the user about it. You can read more of the discussion here:

    Remote Code Hosting [jmagar.com]

  • by ceswiedler ( 165311 ) <chris@swiedler.org> on Wednesday March 05, 2003 @03:58PM (#5443091)
    I was working for a company that went belly-up around 9/11. I was one of the few people kept on as a contractor, but I wasn't guaranteed that I would be paid. The product was going to be installed at a single (large) client. I put in a simple check to exit with an error message if the app was run past a date a few months in the future. I planned on eventually taking it out if all went well.

    I was at least a little clever in how I put it in; didn't make the check too obvious. But our source control system gave me away (I knew it would, but I wasn't trying to be that secure). I've wondered though, how difficult it would be to plant the change in a much older version of the source file. We were using SourceSafe, but I've never looked into the format of the actual files or how that would be done.
  • Spyware (Score:4, Interesting)

    by GrumpyGeek ( 38444 ) on Wednesday March 05, 2003 @03:58PM (#5443092)
    I have never been asked to write a back door, but I have been asked to write code to covertly send us reports about how the customer is using our software. The motivation behind this is that our software (Health Care) is priced based on the 'number of lives' covered by our system.

    My response was that I could do this, but that I thought doing it without notifying the customer was wrong. Strangely enough they did not argue with me, and as far as I know it has not been done.

  • by Anonymous Coward on Wednesday March 05, 2003 @04:05PM (#5443157)
    Real legal actions

    http://www.harktheherald.com/article.php?sid=73945 [harktheherald.com]

  • Re:Deadlines (Score:5, Interesting)

    by quadcitytj ( 320706 ) <tj@wackycow.com> on Wednesday March 05, 2003 @04:13PM (#5443226) Homepage
    In my opinion, this is a terrible way to build the system, and "debugging" methods like this are the reason so much code sucks.

    You're not developing the project for a "master user," you're developing it for normal users. Debugging code while in the "master" mode will do nothing more than give you a false sense of whether your code is buggy or not.

    It's like installing an app and then testing it as root. It doesn't tell you anything, and it makes users' lives miserable when they can't get something to work.
  • Re:the short answer (Score:2, Interesting)

    by sandman_eh ( 620148 ) on Wednesday March 05, 2003 @04:19PM (#5443279) Homepage
    There was recently, well, maybe a year ago, a backdoor found in an Open Source database

    I guess you are referring to Interbase [ibphoenix.com] here.

    IIRC the 'backdoor' was a rarely-changed default system password. Combined with the bootstrap process, this caused some interesting behaviour, effectively forming a backdoor.

    In Interbase a separate database schema is used to store database username and password pairs; unfortunately you can't access it until you are authenticated, so a backdoor was added to get round this problem.

    BTW, this is all from memory so check the archives before taking what I've said as gospel.

  • Re:Backdoor? (Score:1, Interesting)

    by Anonymous Coward on Wednesday March 05, 2003 @04:26PM (#5443336)
    The French did that with some of the SAM equipment that they sold. During the Gulf War, the Iraqis were very surprised when their SAM batteries failed to shoot down French Mirage jets.
  • by wawannem ( 591061 ) on Wednesday March 05, 2003 @04:34PM (#5443433) Homepage
    Here is something to think about, at my last job, we ran a very popular piece of software that is used to track and manage student loans for our financial aid department. One of our financial aid reps was having problems with her peecee and I sent out one of our IT support guys who was a full time student and part-time employee. He ended up calling the support desk for the piece of software in question and the rep gave him a backdoor account that was full access right over the phone. Luckily for me, the kid was honest and came back and told us about it. We all had a laugh that such an important piece of software could be compromised so easily and the support rep didn't even think twice about giving info on the backdoor to a student. Just to be safe, I periodically checked that student's account and made sure no phony changes were made. The more I thought about it, the more paranoid I became. I talked to my boss (the IT Director at the college) and found out that we really didn't have any choice in the matter, that piece of software was the only one like it. Thankfully, I don't work there any more, and I don't think the problem was ever exploited, but I wouldn't be surprised if I ever read about it. I won't divulge the information about the backdoor, but I will say that the software in question is called WhizKid and there may be college employees here on Slashdot that are familiar with it.
  • I kinda did this... (Score:3, Interesting)

    by mooman ( 9434 ) on Wednesday March 05, 2003 @04:37PM (#5443458) Homepage
    I was subcontracted to build a Delphi app for some other folks. The payment was going to be some pocket change up front and then a percent of sales.

    Paranoid that the minute I gave them the program it was going to turn into "Mooman? We never heard of no Mooman" and screw me from the sales, I made a backdoor/easter egg: While the splash screen was showing, if you type m-o-o, the splash would change to information about my little company.

    Since the people I was providing the code to weren't Delphi folks, I figured it was a safe CYA to make sure that I got credit where it was due...

    I also wrote a perl-based self-registration CGI for them too, and in it I set up a backdoor just so I could get the count of the number of registrations. Again, just to keep 'em honest.

    Not malicious by any stretch but I feel completely justified in what I did...

  • Re:Payment Insurance (Score:3, Interesting)

    by Procyon101 ( 61366 ) on Wednesday March 05, 2003 @04:40PM (#5443490) Journal
    My boss used to do this. When I was in college I had a part time job laying brick. My boss would embed a glass pane halfway up any chimney we built. He would then drop a brick down the chimney when the check cleared. When one check didn't clear, he got a call a few months later that the fireplace was defective. He told them he hadn't "turned it on" yet because he hadn't been paid. As soon as he was paid, it started working.
  • Re:Deadlines (Score:3, Interesting)

    by Christopher Bibbs ( 14 ) on Wednesday March 05, 2003 @04:44PM (#5443537) Homepage Journal
    Sure it tells you something. When it works for root and not a regular user, you know you buggered the permissions.

    I carry around all sorts of hacks and backdoors for the software project I work on. Toggle the hidden switch and bingo! You get access to parts of the code that require a different license. Helps me to debug things (extra state analysis routines) and I don't have to get the customer a temporary license.
  • Re:Deadlines (Score:2, Interesting)

    by Anonymous Coward on Wednesday March 05, 2003 @04:46PM (#5443567)
    Don't know if it's a back door, but....

    I mostly have worked on embedded systems. One of them typically didn't have a keyboard installed, but did have a standard PS/2 keyboard port hidden on the back. Pressing any KEY on that would get you past any password prompt on the front interface (LCD/Membrane 10-key pad.)

    On another system, a POS system, there's a master password that changes daily, based on a date calculation.

    Both of these are used for support purposes, though. On the POS system, there are some changes that can only be made using the "master" login and password. On the controller, field techs use it to reset a password after the owner of the system has lost or forgotten it.
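    The POS system's daily master password could be derived along these lines. This is a hedged sketch only: the original "date calculation" isn't described, and the key and derivation below are invented. Using a keyed hash (HMAC) means the password of the day can't be reproduced by someone who knows the date but not the key:

```python
import datetime
import hashlib
import hmac

# Hypothetical per-installation secret; NOT from the original system.
SUPPORT_KEY = b"per-customer secret installed at setup time"

def master_password_for(day: datetime.date) -> str:
    # Keyed hash of the date: deterministic for support staff who hold the
    # key, but not recoverable from the date alone.
    digest = hmac.new(SUPPORT_KEY, day.isoformat().encode(), hashlib.sha256)
    return digest.hexdigest()[:10]   # short enough to read over the phone

def check_master(entered: str) -> bool:
    today = datetime.date.today()
    return hmac.compare_digest(entered, master_password_for(today))
```

    A plain date formula, by contrast, can be reverse-engineered once from the binary and then reused forever, which is exactly the risk with this kind of support backdoor.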
  • by fzammett ( 255288 ) on Wednesday March 05, 2003 @04:50PM (#5443595) Homepage
    Interesting question... I have never and would never build a back-door into anything I code, unless it was approved and well-documented (some clients might want it).

    But, on the other hand, I have built Easter Eggs into systems I've written. One system that is used by thousands of users at my current employer has a silly little Snake game in DHTML if you find it; another high-volume system has Blackjack built in.

    Neither has ever been found, but I have told a number of people, including the managers that sponsored the projects, about them (after the systems were deployed). They didn't seem to mind too much (looked at me kind of funny, but didn't really bitch).

    The question... is THAT ok? I know probably most big-time software has eggs, but as a matter of course, should it be acceptable, or is it generally unacceptable like back doors seem to be, judging by the general tone of the responses here?

    Many of the same arguments apply, such as extra code that could break and put blame on you... They might even be exploitable as security risks if really poorly written... And of course, it most probably was NOT in the client's requirements, so should you do it, even if your intention is not nefarious (mine certainly wasn't)?

    I don't know, I'm kind of torn now that I think about it. I've done it before and didn't think twice, not so sure I would in the future though. Thoughts?
  • Re:Backdoor? (Score:4, Interesting)

    by zlexiss ( 14056 ) on Wednesday March 05, 2003 @05:01PM (#5443721)
    While there's a few unsubstantiated rumors floating on this topic (cites?), control of spare parts is much more effective than backdoors in code.

    Iran's F-14 force was effectively grounded by the embargo Carter placed after the fall of the Shah - fighters and other complex equipment need a steady stream of parts, and a veritable torrent under non-peacetime use. Search fas.org or aerospaceweb for "iran f-14" for a couple views on this, among other sites.

    As pointed out, backdoors carry the risk of being used against you. But if you've got all the spare parts, you get to fly.
  • Re:Deadlines (Score:2, Interesting)

    by digital photo ( 635872 ) on Wednesday March 05, 2003 @05:01PM (#5443722) Homepage Journal

    So true!

    Seriously, unless the coder in question is devious, lazy, or has personal motives... they are most likely busy working on getting the production code to work.

    The last thing you need is to debug a problem only to find out that the "special something something" is what was causing the problem because you didn't code your special access correctly.

    Now, disgruntled employees, or those who feel the world is constantly against them, who feel that they have been slighted and want to get back at someone or something to even out some score... they will find ample time to make more than just back doors regardless of their workload. Who needs sleep when revenge is just beyond the next subroutine?

    As always, a good working relationship and strong trust going both ways between co-workers and management is key. Without that, backdoors would be the least of your worries.

  • by tdelaney ( 458893 ) on Wednesday March 05, 2003 @05:07PM (#5443761)
    A good example is a web-based app I developed for our intranet. We were having trouble debugging customers' problems with it (normally PEBCAK - we'd get useless reports back from customers, and be unable to replicate the problem since we couldn't log in as them).

    So, we created a method whereby an administrator could log in as any non-administrator user by supplying the non-admin username and using their own admin password (on a page separate to the normal page). No global master password. One-way encryption of passwords.

    When someone logged in this way it was logged to the database that the customer was being spoofed (and by whom) - audit trail. Once the login check had passed the admin could act as if they were that user.

    This was considerably more secure for the customer than asking for the customer's password and telling them to "change it later" (which was what we'd had to do previously).

    This became an official feature of the app - albeit a not-very-well-known one. To use it you had to have superuser access - which usually meant that you had direct access to the back-end database anyway.

    This reduced the time required to deal with problems by a massive amount.
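    A toy sketch of the flow this comment describes: the admin authenticates with their own password, the target user is merely named, and every spoofed session lands in an audit trail. The user table, hashing scheme, and log format are all invented for illustration (a real system would use a proper password hash such as bcrypt):

```python
import datetime
import hashlib

def _hash(pw: str) -> str:
    # Toy salted hash, for the sketch only.
    return hashlib.sha256(b"fixed-salt:" + pw.encode()).hexdigest()

# username -> (password hash, is_admin); contents invented for the sketch.
USERS = {
    "alice": (_hash("alicepw"), True),
    "bob":   (_hash("bobpw"), False),
}
AUDIT_LOG = []   # (timestamp, real admin, spoofed user)

def login_as(admin: str, admin_password: str, target_user: str) -> dict:
    pw_hash, is_admin = USERS[admin]
    if not is_admin:
        raise PermissionError("not an administrator")
    if _hash(admin_password) != pw_hash:
        raise PermissionError("bad admin password")
    # Audit trail: record who spoofed whom, and when.
    AUDIT_LOG.append((datetime.datetime.now(), admin, target_user))
    return {"acting_as": target_user, "real_user": admin}
```

    The key property, as the comment notes, is that no customer password is ever disclosed or reset, and the spoofing is attributable after the fact.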
  • by kakos ( 610660 ) on Wednesday March 05, 2003 @05:10PM (#5443793)
    The NSA, interestingly enough, measures their computing power in acres. Yes, not megahertz or flops or anything like that. They measure their computing power in terms of how many acres of computers they have. I think the current value is several thousand acres of computing power.
  • Telco easter egg (Score:2, Interesting)

    by DozePih ( 188698 ) on Wednesday March 05, 2003 @05:31PM (#5444025)
    Some years ago I worked for a really big telecom firm shipping POTS telco equipment to almost all countries in the world. We had this really smart guy who was working on a command module (a module that receives the CLI commands the operator would type to configure something). During system test, one of our testers saw something in the code which looked really weird. It was not really weird, but the code in this particular section did something which wasn't obvious. The tester tried to enter a command which exercised that particular set of statements. Nothing happened. After a while he discovered this code was dependent on a specific type of hardware configuration. After this was set up he tried again to fire off the command. The reply was:

    Perre was here!

    (his name was Per; Perre is slang). He was sent to the boss and he had to take it out. But I actually think this code got out to customers. If Per reads this you might wanna fill in what really happened at Ludde's office.
  • Re:Payment Insurance (Score:5, Interesting)

    by Ian Bicking ( 980 ) <ianb AT colorstudy DOT com> on Wednesday March 05, 2003 @05:43PM (#5444168) Homepage
    I worked before for someone who had code like that put into his application (without his knowledge, of course). There was some pay dispute, and the programmer started triggering it. In this case it deleted the customer database, not the code. Ultimately the programmer was charged criminally (still awaiting trial), and possibly a civil case following.

    Now, I have a feeling there was bad stuff on both sides (and it's taken me a while to extract myself from this job), but you have to be careful when you destroy stuff. It's probably okay when you are deleting something that is your property and isn't paid for. But it's questionable if he already paid for part of the work, or if you were destroying any data created by his operations. If any money had been paid, or if it would compromise data that didn't belong to me, I wouldn't try it unless I had written something into the contract (I've seen pretty generic-looking contract terms which would imply you could effectively confiscate the work if it wasn't paid for). You also need to define when it's not paid for -- 30 days after completion, 60 days...? It's not professional to do something so forceful without making an effort to resolve things more peacefully.

  • Re:Deadlines (Score:5, Interesting)

    by eric777 ( 613330 ) on Wednesday March 05, 2003 @05:56PM (#5444300) Homepage
    For those who can't get to the link, it suggests a clever back door - place a trigger in the original Bell Labs circa-1969 C compiler, which then self-replicates into ALL FUTURE C PROGRAMS.

    Please note - this link is to a theoretical discussion of how the first UNIX creators could have installed a very-hard-to-detect backdoor.

    They never actually did this.

    Anyway, this simple example wouldn't work, for two reasons:

    1) It would leave lots of evidence behind in the log files - someone, somewhere, would eventually notice.

    2) Not *every* C compiler was compiled by a cross compiler, creating an uninterrupted chain back to the original C compiler written at Bell Labs. A number of platforms were built from scratch, using bootstrap compilers rather than cross compilers.

    Clever, though.

  • Wrong (Score:0, Interesting)

    by Rui del-Negro ( 531098 ) on Wednesday March 05, 2003 @06:14PM (#5444485) Homepage
    My house has a back door. Even if I'm the only one to use it, it's still a back door. It's still there and it can be used by others if they find it unlocked.

    RMN
    ~~~
  • by DaTreeBear ( 651287 ) on Wednesday March 05, 2003 @07:01PM (#5444919)

    I saw first hand how back dooring software could provide job security for one developer.

    I worked at a company that produced some very complex financial and utilities management software. They needed a way to have these two applications talk to each other and their solution was a daemon to act as a conduit between the two. Since it had to assume user privs the daemon was set to run root suid.

    The code had been in production for quite some time when it was assigned to a developer to maintain. The code was a mess (it had been written originally by people unfamiliar with programming in the Un*x environment). The developer was tasked with cleaning up the code if he could. Since they were very busy there was little or no supervision over him. As long as the daemon worked everyone was happy.

    Eventually the development department decided to restructure and investigated letting this guy go. He had a reputation of being a bit of a hacker so they came to me (I being the Un*x/Network admin at the time) to see how we might protect ourselves from reprisals should he be let go.

    I was fairly confident that my systems were tight. The biggest weakness as I saw it was this daemon. So I checked out the source code and started going through it. As I did, I discovered that this simple daemon had developed some new and interesting features. Along with its normal duties, it also doubled as a telnet daemon (you could telnet to the listening port and log in just as in telnet - except this one would ignore /etc/securetty, thus allowing remote root logins over an unencrypted protocol). Another feature was its ability to tunnel other ports through its own listening port.

    The code was too convoluted for me to get a complete grasp on it in the time allotted. I went back to the VP of development, pointed out what I had found, and suggested that he would need to have every piece of code this guy had worked on audited to make sure it was clear of back doors. He visibly paled. The developer in question had been there for over 5 years by this point and had touched nearly everything at one time or another.

    In the end they simply moved him to another department (he is still there as far as I know). They felt it was too cost prohibitive (and dangerous) to let him go.

    They never did tell their customers about these gaping security holes either.

    Lessons learned:

    1. Never trust code you haven't audited yourself. I had a daemon running on my servers that was allowing remote root logins and I didn't even know it.
    2. A lack of integrity can be rewarding.
    3. Customers are WAY too trusting of vendors.
  • by Kostya ( 1146 ) on Wednesday March 05, 2003 @07:10PM (#5445010) Homepage Journal
    You're reading too much into a tongue-in-cheek statement.

    What my friend meant was that it usually takes a good amount of bitterness to make anyone consider contracting. It's a scary step and most are intimidated by their manager's comments about "contractors have no benefits".

    That's all it meant--i.e. "perhaps you are now bitter enough to take a risk". But what you said still applies. I had many friends who were bitter enough to consider giving it a try. Only one friend of mine actually did, and he's still doing it.

    My comment was the same thing: if you feel that betrayed, realize there are other options.

    As for me, I did it because I was fed up with flushing my life down the drain in salaried positions. When I get paid by the hour, I find I get more equitable treatment. Employers *think* about what they want me to work on. And usually they are more serious--since my time equals money in a very easy to use formula. And I don't feel like I am being cheated when I get a heavy work load--more hours equals more compensation.

    Less pay or not, salary is for suckers. Even if contracting is making half the money, right now pay is down across the board and salaried employees are being asked to work twice the hours. So do the math.

    (OTOH, during the dotcom days, I made some serious money. Ah, things will never be like that again *g*)

    For me, contracting is about quality of life--as in, I have one now.

  • voting (Score:2, Interesting)

    by zogger ( 617870 ) on Wednesday March 05, 2003 @07:23PM (#5445114) Homepage Journal
    --main reason I am against computerised voting. I'd bet a LOT there's all sortsa nifty "features" in the code that got "reviewed" and judged "ok".

    But, it's just so gosh darn convenient to have the computer vote for me! And it's so modern and techy trendy! And "they" wouldn't do anything to affect a vote, would they? I mean it's not like anything as important as control of a state or the world's most powerful country is actually enough motive and incentive for the nice people at E-VOTIN'TECH.con Inc. to do naughty things, is it?

    Nawww-never happen in a billyun years, people are all honest, rilly and trully!
  • Re:Payment Insurance (Score:3, Interesting)

    by Procyon101 ( 61366 ) on Wednesday March 05, 2003 @07:25PM (#5445136) Journal
    Actually I find it fascinating, being a witness to the event, that the legend came full circle. I was not aware of this as being an urban legend. This may have been where he got the idea, having heard it from someone else in a bar or something (not being a technical person, I'm sure he wasn't on the internet).

    In construction it is very common not to be paid by your general contractor due to cascading bankruptcies. I have seen many contractors take different precautions against not getting paid, of which I found this one particularly clever. It's possible it's not a legend at all and does stem back to the '30s and '40s or even before, since the practice is very easy and cheap for a mason to perform, requiring only about a 10"x10" thin sheet of glass and about 15 seconds of installation time; about the same cost and time that accompanies laying a couple of brick ties.

    Needless to say (not trying to convince you of the legend, as both I and you are random internet entities that could care less what each other think) but that's the first urban legend I ever saw practiced :0
  • Re:Deadlines (Score:2, Interesting)

    by Orthanc_duo ( 452395 ) <forum&orthanc,co,nz> on Wednesday March 05, 2003 @07:46PM (#5445291) Homepage Journal
    In my opinion, this is a terrible way to build the system, and "debugging" methods like this are the reason so much code sucks

    I have had situations where code runs fine on the development box but when it is copied to the (supposedly) identically configured production box things fall over. I don't have login access to the production box, so I have a backdoor that allows me terminal access for debugging purposes. Without this, bugs that currently take me 5 minutes to find would take someone else several hours of sorting through httpd.config files and trying to see what difference is breaking my code.
  • Re:Deadlines (Score:2, Interesting)

    by misof ( 617420 ) on Wednesday March 05, 2003 @08:43PM (#5445639)

    someone just has to look through the executable for strings.

    Oh my... looks like you never wrote anything remotely similar to a backdoor. And I don't mean a just-for-debugging-and-then-I-remove-it backdoor, I mean a real one. A backdoor that's meant to be misused. E.g. you are writing some software for your bank ;) That kind of backdoor. The first rule is that the backdoor should be as invisible as possible. And some strange password-like string is the simplest way of shouting "hey, I'm here!"

    A real backdoor should look e.g. like the legendary one that was once in a C compiler... for more info see the jargon file entry on backdoor [watson-net.com].

  • by PiratePTG ( 608376 ) on Wednesday March 05, 2003 @08:52PM (#5445697)
    Yes. I put a backdoor into every program I ever wrote.

    Why? Because I REALLY do not like to lose. If I ever got screwed by a client, they would stand to lose more than me.

    Have I ever USED one of my backdoors? Only once in over 24 years of working with computers. I wrote a program for a college professor who then turned around, changed the opening banner/credits/logon page to HIS name, and then sold it to the college as his own work. I went in, changed the page back, blew away the user and password file, and disabled the logon sequence. Everyone on the college's staff who had a computer got to see what I said about him the next morning when they tried to log on.

    A few weeks later, after all the shit was through flying, I gave the college the program for free. Along with the source code (Open source circa 1979!).

  • Re:Deadlines (Score:5, Interesting)

    by capnjack41 ( 560306 ) <spam_me@crapola.org> on Wednesday March 05, 2003 @09:05PM (#5445788)
    someone just has to look through the executable for strings.

    For this reason, if I write backdoors like that into a PHP script (a single admin password, for example), I don't actually store the password in a database or in the script as plaintext; I use an MD5 hash. Even if someone somehow manages to see it (they shouldn't anyway), it's still hard for them to guess what the password is (which they need to POST to the script).
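    The same precaution, sketched in Python rather than PHP (the password is invented, and the digest is computed inline here only so the sketch is self-checking; in a shipped script only the hex digest would appear). Note that unsalted MD5 is considered weak by modern standards, so this shows the idea rather than a recommended scheme:

```python
import hashlib

# In the shipped script only this 32-character digest would appear,
# never the plaintext password.
ADMIN_HASH = hashlib.md5(b"hunter2").hexdigest()

def check_admin(candidate: str) -> bool:
    # Hash the submitted value and compare digests; the plaintext is
    # never stored or compared directly.
    return hashlib.md5(candidate.encode()).hexdigest() == ADMIN_HASH
```

    Anyone reading the source sees only the digest, so a `strings`-style scan of the script reveals no usable password.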

  • Re:Deadlines (Score:1, Interesting)

    by Anonymous Coward on Wednesday March 05, 2003 @09:19PM (#5445868)
    After I quit, I was reading my former boss's email for the next year (having set up the email and web for the company). Though he tried to get an injunction to "prevent me from hacking their system" (to muddy the case I had brought to claim unpaid wages), he never bothered to change his password till they changed ISPs. At least there were no surprises in the case...
  • Required Backdoors (Score:2, Interesting)

    by oldCoder ( 172195 ) on Wednesday March 05, 2003 @10:49PM (#5446375)
    Before the internet and hacking became so popular, it was often necessary for a system vendor to leave a hard-coded backdoor so when the client (user) totally broke the thing and called complaining, you could fix it. Sometimes this would save the downtime required to send somebody flying cross country. This was especially useful when selling systems to organizations that didn't really understand computers. In more sophisticated and security-conscious organizations, we would tell them to turn on the modem in the back when they needed our assistance, and they were willing to pay for the connectivity.

    The less sophisticated customers would never authorize anything like that until and unless they were in a panic, so we learned to pre-install it. In general, saving the customer from himself is necessary to maintain good customer relations, and is probably the origin of the term "Customer Engineering".

  • Re:Deadlines (Score:4, Interesting)

    by slamb ( 119285 ) on Wednesday March 05, 2003 @10:54PM (#5446403) Homepage
    I have an overriding password that bypasses the standard authentication system and lets the admin user assume the username/authentication details of an arbitrary user. That's important so I can have the same experience as that user.

    This is something I wish more systems had, but not in a hardcoded (backdoor) way. Cyrus SASL/IMAPd does it correctly: they've completely separated the concepts of authentication and authorization. So you can say "until further notice, user slamb can log in and do anything user bob can do." It serves the same purpose, but in a maintainable, open way. You can have a group of administrators who can log in as anyone, but without needing to know either everyone's password or some master password that's difficult to change if someone leaves... they can just use their own, and it can be disabled/enabled on a per-person basis easily.
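    The separation described above can be sketched roughly like this (all names and tables are invented toy structures; real Cyrus SASL expresses this through proxy-authorization policy, not dictionaries):

```python
# Authentication: who are you? Checked against your OWN credentials.
PASSWORDS = {"slamb": "pw1", "bob": "pw2"}

# Authorization: whose identity may you assume? Revocable per person.
PROXY_GRANTS = {("slamb", "bob")}   # slamb may act as bob

def authenticate(user: str, password: str) -> bool:
    return PASSWORDS.get(user) == password

def authorize(authn_user: str, act_as: str) -> bool:
    # You may always act as yourself; otherwise there must be a grant.
    return authn_user == act_as or (authn_user, act_as) in PROXY_GRANTS

def login(user: str, password: str, act_as: str = "") -> str:
    act_as = act_as or user
    if authenticate(user, password) and authorize(user, act_as):
        return act_as   # the identity the session runs under
    raise PermissionError("denied")
```

    Revoking one administrator means deleting their password and grants; no shared master secret ever needs rotating.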

  • by rice_burners_suck ( 243660 ) on Wednesday March 05, 2003 @11:14PM (#5446506)
    I remember reading about this one guy who put a backdoor in login so he could log in to any unix workstation with a pre-specified username and password. But that would easily be found in the login source code, so he modified the C compiler to recognize that it was compiling login and then silently insert the appropriate code. But that would easily be found in the compiler source code. So he modified the compiler to recognize that it was compiling itself and then silently insert the code that recognizes login and silently inserts that code. Then he deleted these instructions from the compiler source code, but since the compiler would silently insert them into the object code, the result was that code which did not exist in source form would create a chain reaction allowing this dude to log in to any unix machine. It was totally invisible and would work as long as nobody changed the compiler or used a different compiler to compile the compiler, or something weird like that.

    But I don't remember the guy's name. Or maybe it was a chick. But whoever it was, they were definitely a staid and steadfast compiler writer.
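    (This is the scheme described in Ken Thompson's Turing Award lecture, "Reflections on Trusting Trust.") A toy model of the first stage, purely for illustration -- "compiling" here is just string copying, and the names are made up:

    ```python
    # Toy model of the compiler trick. Here "compiling" just means
    # copying source text; the real attack does the same thing inside
    # a real compiler's code generator.

    BACKDOOR = 'if password == "joshua": grant_root()  # injected\n'

    def evil_compile(source, program_name):
        """Stage 1: recognize the login program and append a backdoor.

        Stage 2 (not shown) is the self-replicating part: the compiler
        also recognizes when it is compiling *itself* and re-inserts
        this very function, so the trap survives even after every trace
        is deleted from the source.
        """
        out = source
        if program_name == "login":
            out += BACKDOOR
        return out
    ```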

  • Re:the short answer (Score:3, Interesting)

    by Reziac ( 43301 ) on Thursday March 06, 2003 @04:02AM (#5447558) Homepage Journal
    I've personally seen this: what we have every reason to believe was an innocent mistake by the first developer was, we think, deliberately *preserved* by the second developer for the express purpose of using it as a backdoor for malicious code (he'd expressed a desire to take revenge on "ungrateful users"). Long story short, but you get the idea.

    Fortunately, in this case the flaw was noticed and corrected by a later developer.

  • by CreatorOfSmallTruths ( 579560 ) on Thursday March 06, 2003 @05:25AM (#5447784)
    wow. great post man.

    it had been written originally by people unfamiliar with programming in the Un*x environment

    What do you mean by unfamiliarity with environments? I am a windows programmer who writes programs for Solaris as well. I haven't seen any difference in the implementation (except for the obvious win32 / POSIX differences). Please tell me more.

    1. Never trust code you haven't audited yourself. I had a daemon running on my servers that was allowing remote root logins and I didn't even know it.

    ... Or simply use a programming language that doesn't reach deep enough into the system to cause problems (some sandboxed language, say).

    Customers are WAY too trusting of vendors.

    ... That is correct, but what else can they do? There is only so much one can learn in a lifetime, so if someone went and specialized in customer service, for example, he wouldn't have the time to learn how to program, and certainly not enough time to learn how to code well.
  • by Anonymous Coward on Thursday March 06, 2003 @08:56AM (#5448246)
    A customer I was working for deployed our web application on a system we (the developers) had no access to at all. The application did not go through any form of testing before being deployed, so as expected, bugs were numerous.
    We asked for access to the machine so we could run simple SQL queries just to figure out WHY things were bugging when the reports started dropping in - but it was against the customer's security policies to allow us on the machine.

    It all ended up with us coding a web page which simply was one form and a submit button. Into the form you could type any SQL query and it would be executed directly on the tables, and the result would be displayed in an HTML table on the page. The page was of course password protected and not linked to anywhere, but still...
    This is possibly the worst case of security policies gone wrong I've ever seen, and fortunately also the only 'backdoor' I've ever coded.
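    For the curious, the whole page boiled down to something like this (a reconstruction, not the customer's actual code; sqlite3 stands in for the real database just to make it runnable) -- which is exactly why it's scary: deliberate SQL injection shipped as a feature.

    ```python
    import sqlite3
    from html import escape

    def run_query_page(db, sql):
        """Execute an arbitrary SQL string and render the result as an
        HTML table -- the entire 'debugging backdoor'. Everything wrong
        with it is visible here: no parameterization, no whitelist,
        whatever string arrives gets run against the live tables."""
        rows = db.execute(sql).fetchall()
        cells = "".join(
            "<tr>" + "".join(f"<td>{escape(str(v))}</td>" for v in row) + "</tr>"
            for row in rows
        )
        return f"<table>{cells}</table>"
    ```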

    Posting anonymously to protect the customer.
