Do You Write Backdoors? 1004
quaxzarron asks: "I had a recent experience where one of our group of programmers wrote backdoors on some web applications we were developing, so that he could gain access to the main hosting server when the application went live. This got me thinking about how we are dependent on the integrity of the coders for the integrity of our applications. Yet in this case a more than casual glance would allow us to identify potentially malicious code. How does this work when the clients are companies who can't perform such checks - either because they don't know how, or because the code is too large or too complex? How often do companies developing code officially sanction backdoors...even if it means calling them 'security features'? How often has the Slashdot crowd put a backdoor in the code they were developing either officially or otherwise? How sustainable is the 'trust' between the developer and the client?"
Happens everywhere (Score:5, Informative)
Re:Deadlines (Score:5, Informative)
Some people write backdoors to facilitate debugging. They don't have to worry about checking with the customer for various passwords - they just type in "IAMGOD" or some such hard-coded password and they are in.
For the record, I don't approve of backdoors. First, they create security issues - someone just has to look through the executable for strings. Second, these things are never changed when employees move on.
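To illustrate the "look through the executable for strings" point, here is a minimal sketch of extracting printable ASCII runs from a binary, the same basic technique as the Unix strings(1) utility. The file contents and the "IAMGOD" password are hypothetical examples, not from any real product.

```python
import re

def extract_strings(data: bytes, min_len: int = 4):
    """Return printable ASCII runs of at least min_len bytes,
    the same heuristic the `strings` utility uses."""
    return [m.group().decode("ascii")
            for m in re.finditer(rb"[\x20-\x7e]{%d,}" % min_len, data)]

# Simulated executable with an embedded hard-coded backdoor password.
binary = b"\x7fELF\x01\x00" + b"IAMGOD" + b"\x00\x02normal_data\x00"
found = extract_strings(binary)
print(found)  # ['IAMGOD', 'normal_data']
```

Anyone with read access to the binary can run this kind of scan, which is exactly why a hard-coded password is no protection at all.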
Re:Happens everywhere (Score:4, Informative)
A very different animal, indeed.
Contract Programming and Backdoors. (Score:3, Informative)
If User = Me then
    bypass security
else
    Security/Validation
end if
This way I can test the app without having to go and validate against the system which I don't have rights to. When we move from test to production, this backdoor is left in until the client completes the user acceptance testing (UAT) phase, at which point a second production move is done without the offending backdoor. In other words, the backdoor is the first UAT bug reported.
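The pseudocode above can be sketched in Python as follows. The DEV_USER name and the validation stub are hypothetical stand-ins; the point is that the developer's account skips the external validation he has no rights to, and the bypass must be stripped before the final production move.

```python
DEV_USER = "ted"  # hypothetical developer account

def validate_against_external_system(user: str) -> bool:
    """Placeholder for the real security/validation system that
    the developer has no rights to during testing."""
    raise PermissionError("no rights to the validation system in test")

def is_authorized(user: str) -> bool:
    if user == DEV_USER:
        # The backdoor: skip validation entirely for the developer.
        return True
    return validate_against_external_system(user)

print(is_authorized("ted"))  # True - bypasses validation
```

As the post notes, the safe way to use a pattern like this is to treat its removal as a tracked UAT bug, so it cannot silently survive into production.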
I suspect this is common for contractors.
Ted
Re:microsoft (Score:3, Informative)
(Kinda funny actually, someone who had to support Netscape 4.x for any length of time must have written it.)
However, once the phrase was found out it made it easy to start cracking the encryption. So it was removed and replaced with something else.
Re:Payment Insurance (Score:2, Informative)
If it's not fully outlined in the contract, he could have a really fun time in court.
Re:Never have, never will (Score:1, Informative)
Is that a backdoor? Or just lazy design and poor planning?
Re:microsoft (Score:3, Informative)
Always. Always. Always. (Score:5, Informative)
This is hard to see from a large-company perspective, because as a developer you aren't the one collecting the money, you have accountants and lawyers and rabid CEOs that make sure you get your contract's worth one way or another. But small companies don't have this option--they can't afford lawyers or even the time to spend in court. They have to find where their next paycheck is coming from.
As a result, many of our clients have tried to jerk us around by either dragging their heels on payments or doing something underhanded like changing passwords to servers to try to lock us out and give us the finger. There have been instances where I've sent out a "it's all done, check it out" email and had the live server's passwords changed on me minutes later, followed by a "we're not paying" response.
Simply put, backdoors are a small company's only assurance that it will be paid for the work it has done. Granted, the backdoors that I put in aren't to r00t the server or take down a whole subnet; they're limited to disabling the application that we developed. Until the client has paid their bills, it's still our code, and we have every right to put in as many backdoors as we want.
Yes and no (Score:3, Informative)
Another time, we had a one-line message window - if you sent a message with a severity of 'w', 'e', or 's' (for warning, error, and severe respectively), the message would stay for progressively longer amounts of time before the next message could wipe it out, and it would flash different colours and beep for the more severe ones. For message type 'i' (information) it would immediately be replaced by any subsequent messages. Once when it was late at night and I was getting a bit punch drunk, I made one branch of the program put out the 'i' message "How the fuck did that happen?" followed immediately by a more informative 'e' message. Nobody ever saw the 'i' message, because it was replaced so quickly. Until one day when somebody put a scroll bar on the message window so you could scroll back and see previous messages. I got a call from a trade show requesting an immediate patch. Oops. That's the closest I've ever come to putting in an "easter egg".
I think putting in secret backdoors to get access without telling your superiors is very bad news, and could quite easily get you fired.
Not All Backdoors Are Nefarious (Score:5, Informative)
I was a senior software engineer at Whistle Communications, and later at IBM, for the Whistle InterJet/IBM Web Connections products. I did most of the last generation of email, user account management, mailing list, internal database, and other infrastructure services for the product.
This product has back doors. But they are all explicitly guarded.
From the front panel of the InterJet, you can enable remote management, for a short period of time. This allows a tier 1 support representative to help you configure/maintain your InterJet, while you are on the phone with them.
This required explicit customer consent for remote Web UI based administration.
From the Web UI, if you are logged in as "Admin", there are "secret URLs" which you can use to obtain raw access to the configuration database for much of the InterJet: all of the parts I personally wrote, and some of the rest of it, where the engineers used the standard APIs we had agreed upon for user interface and common configuration store code. This was done to work around the Web UI design, which failed to expose many useful features of the product; we engineers knew that would leave customers unable to use the product as it had been sold to them. It was likewise useful for tier 2 support, to avoid engineering escalations.
This required explicit customer consent for remote Web UI based administration.
Also from the front panel of the InterJet, you can enable "telnet mode". This was done by going to a particular configuration screen on the front panel, and entering a "T" (for "Telnet") on the front panel keypad at that screen. A time limited ability for a remote engineer to come in and manually access the system to diagnose and treat engineering escalations was thereby enabled.
This required explicit customer consent for remote shell based administration.
In addition, this mode only worked from a specific netblock of IP addresses.
Once in at the shell, it was possible for an engineer to bypass any of these protections. It was common practice, for a persistent problem, to leave the remote access for engineers open until the problem was verified to be resolved.
There was also a "magic" front panel sequence that would permit you to play "Pong" on the LCD display. I filed a sev-1 bug ("total loss of functionality") against the maintainer, because it did not support "Skunks" (scores of 7-0) as a victory condition. 8-).
All of them were under direct user control, in terms of outside access.
None of these are "proprietary" or "confidential", they just aren't useful to people without documentation.
Other than working around the Web UI designer's intent, with the second back door, none of these really qualifies as nefarious (I would argue that working around the Web UI designer's intent qualifies as "routing around the damage").
-- Terry
woo woo. backdoors. scary. (Score:1, Informative)
If the coder was good enough to handle your app, he's good enough to write a robust and secure backdoor.
If you want to know what an app is doing when you run it, run it under strace, and you will know everything it does.
If it's a windows app then don't even worry about it, cuz you've already lost the game.
FoxPro Command Line (Score:3, Informative)
Back in the days of GOD (Good Ol' DOS), the memory needed for variables would sometimes grow larger than the amount reserved, needing tweaking. Or a scratch database would get corrupt from a hardware failure.
Almost all things that could go wrong could be corrected without having to tear the code apart...because it always worked in my development systems; it only broke in production environments. The "backdoors" proved invaluable for tending to the screwups of the DEUs (Defective End Users :)- example: one of my clients had forgotten to use the AR functions and had literally MILLIONS of dollars owed to them in the system (only), all because they never entered checks received. Arghhh!
Re:Why yes... (Score:2, Informative)
Jim just sang the song.
The only other song on their studio albums that Jim (or any of the other Doors) did not write was "Alabama Song."
Re:FUNNY != Flamebait (Score:1, Informative)
I have, will again. (Score:3, Informative)
Of course, I later learned that there was a faculty account set up with no password for one stupid printer in some jerk's office, and so the whole thing was in fact wide open. I had tried to get the whole thing moved to a more secure setup, but I wrote it as a summer contractor and not an employee (semantics) and so I had no pull with the tech people. Some students got in, copied every picture, and created their own copy that was fully public and even advertised. So in the end, the backdoor I put in remained secure, but the backdoor some lazy fuck put in, in a completely unrelated application, so they could have their own personal printer screwed me over, released private information into the wild, and almost got at least one (good) student expelled for discovering the whole thing.
Re:Microsoft believes in them.. (Score:2, Informative)
Writing Backdoors? (Score:2, Informative)
It was left in for a while because a) we had an extremely low volume of users, b) we were running low on funding and it was believed that more "political" improvements were necessary to get more VC money, and c) we were planning to scrap our internal email as soon as we finalized a deal with a third-party vendor.
I eventually fixed that bug, but considering that we had no QA department to speak of, I wouldn't be surprised if others got through that I didn't know about. That company was all about smoke and mirrors anyway; I think upper management's philosophy was along the lines of "take the money and run." I think that the way creditors were stiffed was a more serious matter than the backdoors into our system, which we all joked internally that we hoped none of our users were taking seriously. I know, it is a sad thing to joke about, but what happened was our old bosses (who had been replaced) had taken a proof of concept that one of the developers (who had also left by that time) had created and put it out on the Web as if it were a live site rather than a working prototype. So we basically couldn't take it down if we wanted to, because upper management felt that having a bad Website was better than having no site at all.
We ended up reengineering the whole thing from scratch, and hadn't finished with it by the time the money ran out. (Ah! The good old dotcom days!)
Re:Extreme? (Score:2, Informative)
And how does that differ from normal prejudice?
There is compliant disposition of employment and then there is non-compliant disposition. Picture a couple of burly security guards with an empty cardboard box explaining to you that you need to turn in your security credentials and pack your personal possessions under supervision during the next 5 minutes before being "escorted" off of the premises.
Contrast that with the picture of a large party being thrown in your name as you complete your "two weeks notice" into early retirement for a job well done after a career of successes.
Re:Payment Insurance (Score:2, Informative)
For example:
$ gzip test.pl
$ mv test.pl.gz test.pl
$ zcat test.pl | perl
This would probably work well enough to hide source from non-techies.
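The same "hide the source from casual viewers" trick can be sketched in Python: compress the script and exec the decompressed text at run time. As noted above, this only deters non-technical users; anyone who knows what they're looking at can decompress it in one line. The sample source string here is a hypothetical stand-in.

```python
import gzip

# The script a client would see only as an opaque compressed blob.
source = "result = 6 * 7\nprint(result)"
blob = gzip.compress(source.encode())  # what would live on disk

# At run time, decompress and execute - the analogue of `zcat | perl`.
ns = {}
exec(gzip.decompress(blob).decode(), ns)  # prints 42
print(ns["result"])                        # 42
```

Like the shell version, this is obfuscation, not protection: `gzip.decompress(blob)` recovers the source verbatim.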
History: Sendmail, DEBUG and Morris (Score:2, Informative)
Anyone who writes a backdoor should be fired ASAP and the door should be closed. Failing to do so can easily make your company liable for damages caused by someone using it. It's a miracle that Allman didn't get prosecuted with Morris - he probably would today, but the legal folks were more clueless about computing risks in those days.
Re:Deadlines (Score:4, Informative)
I think you misunderstand what my message meant: I don't have a "master" mode, I have an overriding password that bypasses the standard authentication system and lets the admin user assume the username/authentication details of an arbitrary user. That's important so I can have the same experience as that user. It doesn't compromise any data security, as I can just as easily see all the user's data when it's sitting on the database server.
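The override-password scheme described above can be sketched like this. All names, passwords, and the choice of SHA-256 hashing are illustrative assumptions, not details from the original post; the point is one admin-held credential that authenticates as any user, so the admin can reproduce that user's experience.

```python
import hashlib

def _digest(password: str) -> str:
    return hashlib.sha256(password.encode()).hexdigest()

# Hypothetical credential store: one admin override, plus per-user hashes.
OVERRIDE_HASH = _digest("admin-override-secret")
USERS = {"alice": _digest("alices-password")}

def authenticate(username: str, password: str):
    """Return the effective username on success, else None."""
    digest = _digest(password)
    if digest == OVERRIDE_HASH:
        return username  # admin assumes this user's identity
    if USERS.get(username) == digest:
        return username  # normal login path
    return None

print(authenticate("alice", "admin-override-secret"))  # alice
print(authenticate("alice", "wrong"))                  # None
```

A real deployment would also want the override path audit-logged, since by design it is indistinguishable from the user's own login once inside.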
Re:I have, will again. (Score:3, Informative)
My point is that I would consider a feature like this to be a necessary design element of the solution (given the reliance on the LDAP authentication database), and not a backdoor in the sense discussed with this question. Now, the idiocy that led to an unsecured account with faculty privilege, that's another matter. Not really a backdoor, but a serious breach of security. Didn't they have policies against this? This is exactly why you need security policies, so that the admin has something to point to with greater authority than the idiot insisting they "need" it.
Re:Deadlines (Score:5, Informative)
If they fire him tomorrow, they have no way of removing his access from the system, since they don't even know it's there.
Everyone seems to focus on the actual piece of code that acts as a 'backdoor' and forgets that mere knowledge of the system is just as dangerous. No sufficiently complex system can be foolproof in both design and implementation. During development, debugging code gets left over, some shortcuts are taken, etc. Nobody except the developers who designed and wrote the stuff even knows what exactly is in the code. While I do not put any backdoors in my code intentionally, I have sufficient knowledge of the system to poke a few holes big enough for a full compromise.
In short: If you have a sufficiently large system, chances are that a disgruntled developer can compromise or damage it even without placing any backdoors in the code ahead of time. Knowledge is power. Obviously, this does not apply to open-source projects that receive a fair amount of peer review (or just people tinkering with the code).
Re:Job Security (was Re: Deadlines) (Score:3, Informative)
When I said my servers were running a daemon that allowed remote root logins, I wasn't meaning in the sense that they were open to Internet at large.
All of the servers I was dealing with were on our internal LAN and behind our firewall. The application in question was a daemon that was supposed to listen on a given port. In fact, we had scans set up to monitor and alert us if the service went down. So our portscans showed it as a listening port as it should have. We would have had to put a sniffer against it to see that it was passing traffic that it wasn't supposed to.
I am not saying I couldn't have detected that the daemon was doing a bit more than it ought to but it wasn't quite as simple as you suggest.
Re:Deadlines (Score:1, Informative)
You seem to be confused on the issue here. If you're trying to get a sense of whether your code is buggy or not, that's not debugging, that's testing. If you're debugging, you already know your code is buggy -- you're trying to find out why (or how) it's broken.
When you test things in "normal" mode, and something fails, you enter into "master" mode to find out what's going wrong.
Re:Deadlines (Score:2, Informative)
Management ASKED for a backdoor (Score:4, Informative)
Yes I did and it was expensive (Score:3, Informative)
Don't do it.