Ethical Dilemmas Related to Technology 885
Anonymous Coward writes "I have a relative who will be teaching a college class on the topic of ethical dilemmas brought about by new technology. Unfortunately, he doesn't keep up with technology news, so he's not sure what the most relevant dilemmas are. For example, 'If robots came alive, would we be justified in killing them?' is one that might come up if nothing more relevant were suggested. (OK, it might not be that bad, but you get the idea. He was using Netscape 4.76 on system 9 until last week.)
So, what are the most relevant ethical dilemmas brought up by technology? Note that I am looking for ethical dilemmas, e.g. 'Is Activity X moral?' rather than legal dilemmas like 'Is the DMCA constitutional?' Now is your chance to guide the young minds of the future toward stuff that matters."
Responsibility (Score:3, Insightful)
But the coming rise of nanotechnology should also not be overlooked. Sure, the grey goo problem is largely hype, but what if something like that really does happen? Should the scientists working in nanotech be held responsible for an epidemic on a massive global scale?
These are all issues I would like to see addressed in a class on ethical dilemmas in technology.
Re:Responsibility (Score:5, Insightful)
Saying these scientists should be held responsible would be akin to your atomic bomb argument. Is Einstein more responsible than Truman, who ordered the massacre of hundreds of thousands of innocent Japanese civilians?
I would hope that the answer would be no. Then we'd have civil proceedings where Victim Y would sue the inventor of Technology X because said technology brought bodily harm, even though Perpetrator Z is the actual cause of the incident.
Oh, but wait. We already have people seeking injunctions against gun manufacturers because they produce a lethal weapon.
Re:Responsibility (Score:3, Interesting)
This is good, related, and thought-provoking. If these "creations" are actually discoveries rather than inventions, then one might argue that someone will eventually find the dangerous discoveries, so as a responsible scientist, one must look for these even more aggressively, if only to better understand (and thereby be better prepared to control or limit the damage from) them.
Sorta like the guy w
Re:Responsibility (Score:4, Informative)
Someone's been reading too much US patent number 4,666,425 [uspto.gov].
Re:Responsibility (Score:3, Informative)
There's prior art described in C.S. Lewis' "That Hideous Strength". It's book three of the sci-fi series including: "Out of the Silent Planet" and "Perelandra".
The premise is based on "The Saracen's head," which is kept alive through mechanical and biological means, although the brain has been grown past the bounds of the head itself. It's a really creepy picture that Lewis creates, but the books were printed in the 60's, a full 20-30 years before this particular patent was filed/granted.
Re:Responsibility (Score:3, Insightful)
There was a small blurb just before tonight's edition of Tech Nation [technation.com] where a woman (I'm embarrassed to say I've forgotten her name) was describing her own coming-to-grips with being a weapons engineer during the 80's. I wish I had a transcript of her monologue, but she told a story of being approached and reproached by Douglas Engelbart [ideafinder.com] for doing the unconscionable. He reprimanded her, saying t
Re:Responsibility (Score:5, Insightful)
Are you entirely sure you want to be taking this line, right now?
Re:Responsibility (Score:5, Insightful)
The Japanese people were innocent victims?
Yes. These were cities full of civilians that got nuked, not military bases. Hospitals, schools, kids, grannies, you name it.
I actually understand the reasoning behind nuking them. A brutal demonstration of the Allies' strength quickly forced a rethink from their government.
There is a word used to describe the slaughter of civilians in order to shock the enemy into capitulating. That word is terrorism.
There is nothing innocent about anyone who went along with that regime and supported their cause.
Last time I checked, they were not a democracy. The USA, on the other hand, does not have that excuse to hide behind.
Re:Responsibility (Score:3, Informative)
Technically they were military support infrastructure.
If you want to get into a 'tit-for-tat' argument then perhaps you should crack open a history book, especially the bits about how the Japanese were actively using bio-warfare in Indochina, including experiments on civilians that were as brutal as, if not more brutal than, those perpetrated by the Germans.
Incidentally, the term terrorism is only really applicable where actions are against civilians not in
Re:Responsibility (Score:5, Interesting)
"I'm sorry, there is nothing innocent about supporting a regime..."
Who voted for Hideki Tojo?
"The true innocent victims were the American sailors who were bombed in Pearl Harbor by the same people we were discussing peace treaties with."
1.) From the Japanese POV, Pearl Harbor was a cold war gone hot. US trade embargos (especially on oil) were strangling the Japanese war effort (whether the Japanese war effort was moral is a completely different story), not to mention indirect and direct assistance the US was providing Chiang Kai-Shek's government. What do you think the Japanese diplomats were discussing with the US in Washington, tea parties?
2.) A war declaration was supposed to be delivered just before the Pearl Harbor attacks.
Re:Responsibility (Score:4, Interesting)
So they dropped two nukes, bang bang, to make it look like they had a stockpile of them and this was the beginning of the end, in which all Japan would be reduced to a scorched smoking ruin. They only had the two, but the Japanese didn't know that, and couldn't know that. The prospect was unthinkable, and so the Emperor was forced to do the unthinkable to prevent it: surrender.
We make the mistake of believing that everyone thinks like we do, that all cultures are essentially like ours. They aren't. I doubt that even the Japanese today can grasp how single minded the people of Imperial Japan were. Living in a pluralistic democracy, we certainly cannot grasp it. The stories of kamikaze pilots and hermit soldiers who waited 15 years after the war for orders that never came are all true.
Re:Responsibility (Score:3, Informative)
They had already firebombed the crap out of Tokyo anyway. More people were killed in the Tokyo firebombings than either Hiroshima or Nagasaki (but not both combined).
graspee
Re:Responsibility (Score:5, Insightful)
Otherwise, cars, airplanes, guns, knives, most (legal) drugs, etc. might not have ever made it our way. Sometimes the risks are worth it. Sometimes not, but you can't make a generalization like that without a lot of people suffering from it.
Re:Responsibility (Score:5, Insightful)
Re:Responsibility (Score:4, Insightful)
- Cath
Re:Responsibility (Score:3, Funny)
...uhhh Atomic Bombs are also good for, ummm....making....large amounts of toast?
*runs*
Re:Responsibility (Score:3, Insightful)
Indeed, there is no useful application for nuclear fission, or fusion. We'd better stuff that Genie back in the bottle.
I did (Score:4, Funny)
It was for a high school physics project, to demonstrate our knowledge of simple machines. Fortunately we had a teacher who was pleasantly eccentric herself and who trusted me quite a bit, and so she barely batted an eyelash when we started piercing a large coffee can with 12-inch serrated steak knives, facing outward. (I think there were four knives, not three, though)
The idea was that a single heavy weight on a cord provided the motive force to pulleys which operated both the knife cylinder and a conveyor belt. You would release the weight, the conveyor belt would slowly turn and bring vegetables (we did use carrots for the test run) up to a slide. Halfway down the slide they would be stopped by a set of bars, and would be stuck there until the knives came spinning down (fitting in between the bars) to chop them into pieces small enough to fit between the bars and slide gently down to the bowl at the bottom.
The problem was that we didn't put a counterweight on the canister with the knives. You all can see what happened next, can't you? We kept adding more and more weight to the motive cord, so that it would provide enough torque to lift that heavily off-balance can of knives through half a revolution. It was only a fraction of a second after releasing the weight that it became obvious that, once those knives had been lifted through half a revolution, they'd have (to put things in high school physics terms) a bunch of potential energy saved up which would start working with, not against, the motive weight, for half of each revolution. So, needless to say, that big coffee can with the loosely affixed knife blades sticking out started spinning like a freaking lawn mower, on a steel frame held together by little hand-tightened C-clamps, surrounded by onlookers.
Somewhere there may still be video tape of me screaming "EVERYBODY DOWN!", carrot chunks flying in every direction, and a can full of steel weights clanking to the floor while about six cubic feet of unshielded space were rendered unsafe for human flesh by three or four rapidly spinning steak knives.
Oh, but before I get modded to hell, I should bring this back on topic: if some idiot (okay, some other idiot) had activated the triple-stabbing-auto-knife carrot chopper without my knowledge with the intent of hurting someone else, I wouldn't feel in the least bit responsible. It's not like they wouldn't be able to hurt someone else badly with a non-auto knife, or even with just a big rock. I am feeling pretty grateful that I didn't take someone's fingers off myself, though.
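The parent's potential-energy point can be sketched with back-of-envelope numbers (all figures below are my own guesses, not the poster's): the gravity torque from the off-center knife mass resists the drive weight while the heavy side rises, then flips sign past top dead center and adds to it.

```python
import math

# Illustrative sketch of the off-balance drum: theta is the angle of the
# heavy side measured from straight down. For 0 < theta < 180 the gravity
# torque opposes the drive weight; past 180 it assists it.
m = 0.5      # kg, assumed net mass offset of the knives (my guess)
r = 0.15     # m, assumed radius of the coffee can (my guess)
g = 9.81     # m/s^2

def gravity_torque(theta_deg):
    """Torque (N*m) from the off-center mass: positive values oppose the
    drive weight, negative values assist it."""
    return m * g * r * math.sin(math.radians(theta_deg))

for deg in (0, 45, 90, 135, 180, 225, 270):
    print(deg, round(gravity_torque(deg), 3))
```

The sign flip at 180 degrees is exactly why the extra motive weight, added to get the drum over the hump, turned into runaway acceleration on the downswing.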
Re:Responsibility (Score:3, Interesting)
In addition, unbeknownst to the public, the U.S. also had a program to weaponize anthrax for use against Japan if the bomb hadn't worked out. The inventors of the atomic bomb were well aware that Germany had an atomic bomb program as well. SOMEBODY was going to invent the bomb, the only question in their minds at the time was if it would be used for or against the Allied forces.
The real tragedy of the atomic bomb was the inability of the political and diplomatic process to assure peaceful coexistence with
Re:Responsibility (Score:5, Insightful)
On the A-Bomb (Score:3, Insightful)
As far as their moral involvement, Oppenheimer, after the very first Bomb test at the Trinity site in New Mexico, quoted the Bhagavad Gita, saying "I am become Death, the Destroyer of Worlds". He knew exa
Re:On the A-Bomb (Score:3, Informative)
Stem Cell Research and Gene Therapy (Score:5, Informative)
Dermoid Cystic Teratoma (Score:3, Insightful)
The condition is Dermoid Cystic Teratoma, commonly called Teratoma, and is generally formed by a (mutated) ovum (egg cell) that starts to divide without first being fertilized. It lacks the genetic material to actually mature into a fetus, but it has enough genetic information to form various substructures. Most commonly, te
Even better nanotech dilemma... (Score:3, Interesting)
3 hours later, one is sitting there, polished better than any you've seen in a movie. And hell, you can wreck the damn thing, treat it awful, and a $1.29 bottle of goo can fix it...
Well, you were lucky, one of the first to buy Goo(TM). Soon, laws are passed applying the IP paradigm to physical stuff... a Ferrari is a copyrighted piece of
Re:Even better nanotech dilemma... (Score:3, Interesting)
And for the Ferrari example, why bother with a damn label when you could make and style your own car? People who go for labels over style are wankers and deserve to have it taken away for copyright theft.
Re:Even better nanotech dilemma... (Score:3, Informative)
You do, as a matter of fact, own the subterranean mineral and oil rights to your property, unless those rights were granted to another person by a deed executed and recorded prior to you gaining title to the property. There are many, many landowners out there who do not own subterranean rights to their own land, but this isn't the general rule.
On the other hand, you don't really own the airspace above your property - you only own as much of it as the
Responsible disclosure of software issues (Score:3, Interesting)
[Assuming the finder is male, because I'd like to hear from some female security researchers to know if they exist. ;] ]
Who should he tell?
His friends? The authors? The world?
If he just tells the author, what if they don't seem interested in fixing it?
(Microsoft wouldn't ever do that, would they?)
If he releases the information to a public security list, how much information should he
Re:Responsibility (Score:4, Insightful)
This is a personal ethical question that I contemplate on a regular basis.
I am a student in the field of quantum computing. My goal is to build a practically useful quantum computer. This isn't an easy task, but it is a very intellectually rewarding problem. The impact on physics would be nothing short of revolutionary, like the impact of modern supercomputers only much bigger (assuming that building one is physically possible).
One use of this quantum computer is to crack public key cryptosystems such as RSA, which would deal out a crippling blow to privacy worldwide. Governments would be able to read your PGP-encrypted email at will. Human-rights workers battling oppressive regimes would be put at extreme risk of their past work being deciphered and used against them.
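The RSA threat can be made concrete with a toy example (my illustration, not part of the original comment): the entire security of RSA rests on the difficulty of factoring the public modulus. With toy-sized primes even brute force recovers the private key, which is exactly what Shor's algorithm on a large quantum computer would make feasible at real key sizes.

```python
# Toy RSA (illustrative numbers only; requires Python 3.8+ for the
# modular-inverse form of pow).
p, q = 61, 53                    # secret primes, absurdly small for clarity
n = p * q                        # public modulus: 3233
e = 17                           # public exponent
phi = (p - 1) * (q - 1)
d = pow(e, -1, phi)              # private exponent: modular inverse of e mod phi

msg = 42
cipher = pow(msg, e, n)          # anyone can encrypt with the public key (n, e)
assert pow(cipher, d, n) == msg  # only the holder of d can decrypt

# An eavesdropper who factors n recovers d and reads everything. Trial
# division is trivial here; at 2048 bits it is classically infeasible,
# but Shor's algorithm would do it on a large enough quantum computer.
f = next(k for k in range(2, n) if n % k == 0)
d_attacker = pow(e, -1, (f - 1) * (n // f - 1))
assert pow(cipher, d_attacker, n) == msg
```

The point of the sketch is only that "break factoring" and "break RSA" are the same problem; nothing about the attack requires knowing the secret primes in advance.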
The other practically significant use is to simulate quantum systems, meaning that scientists could use quantum computers to perform theoretical work that was previously impossible, including the modelling of advanced materials and protein-folding at the quantum level, which would have tremendous benefits in electronics and biotechnology respectively (or so I am led to believe).
We can't know what we will or won't discover with this power but it could be one of the biggest advances to scientific achievement since the Internet made knowledge free and instantly accessible.
This situation isn't a whole lot different from when nuclear energy was developed. Some people knew there would be an immediate negative impact (death and destruction) from developing the basics, but also believed there would be a future benefit that they didn't know the exact details of. It turned out that the gains were absolutely crucial to our progress towards the present state of technology.
So the question is whether or not an individual should work in this field, knowing that it is in some sense inevitable that others will do it if any one researcher doesn't. What if there is the possibility of retaining some control using intellectual property? Could you force the users not to use it for eavesdropping? Is it your place to make such an assertion?
Is it right to gamble on new technology when there is the potential for abuse? What if the gains are huge and so is the abuse?
Now look what you made me do. (Score:3, Insightful)
Next, they saw a city of 550,000 men, women and children, and in an instant the city vanished; shadows remained where the men were gone, a firestorm raged, burning pimps and infants and an old statue of a happy Buddha and mice and dogs and old men and lovers; and a mushroom cloud arose above it all. This was a world created by the cruelest of all gods, Realpolitik.
"This is Discord," said Apollo, disturbed, laying down his lute.
Harry Truman, a servant of Realpolitik, wearing the face of Oli
Re:Responsibility (Score:5, Interesting)
Errrrrmmmmmmmmmmmm...... At the time of the decision to drop the bomb, Germany had already been defeated (VE: Victory in Europe), so that point is........ well, moot.
As far as the whole "what about the lives that would have been lost if we hadn't dropped the bomb" argument..... it's an interesting one, and is thrown around a lot, but it really comes down to the question of whether or not we should have had the power to make that choice. That is, of course, disregarding the fact that that wasn't quite the motivation for dropping the bomb. Had Japan's surrender not been forced within a matter of days, an agreement between Russia and the States would have come into effect, and given Russia free rein to take Japan. Basically, dropping the bomb was a way to ensure that there wouldn't be another Soviet state to worry about. Rationalizations comparing numbers of projected casualties after the fact don't really apply.
Re:Responsibility (Score:5, Insightful)
Today, "cloning" has taken on the definition "DNA cloning" or "artificial asexual reproduction" (as opposed to the more sci-fi forms of cloning, which can involve accelerated growth or memory transfer).
It's already been done (mammals), and it's already been outlawed (humans).
Further, it looks like an area where we really need the public to start discussing the issue, because most people have apparently already decided asexual human reproduction is evil [cnn.com].
At least the scientists' ban [cnn.com] has the character "more study is needed first", rather than the "affront to God" spin the political establishment put on it.
But why is human asexual reproduction really wrong?
Weakens the nuclear family? (Single-parent adoption is legal, though regulated.)
Slippery-slope towards designer babies? (So then genetic manipulation would be legal if based on a traditionally fertilized egg?)
Allows people to reproduce who couldn't naturally? (Assisted reproduction technology already does this, and is viewed as a blessing from God)
These and other questions need answering! There's plenty of room to productively discuss "cloning" today.
Re:Responsibility (Score:3)
Yeah, your beloved dictatorship merely starved 20-million+ of its own citizens to death for opposing the government.
I can't quite get your point about small pox? Are you saying it is a good thing to have kept those samples?
And I suppose you would just want to get rid of it, so that only the cheaters will have it and we will have no means of defense.
But, since the U.S. is the ultimate evil in the galaxy, and dictatorships are such sweethearts,
Here's mine: (Score:5, Insightful)
What if the artist encourages it [bbc.co.uk]?
What if the artist is pissed off by it? [slashdot.org]
Is violating the license less morally wrong if it's easy [cdburner.com]?
What about if the copy is of a lesser quality [flightpath.com] than the original?
What if it's a license that you like [gnu.org]?
Re:Here's mine: (Score:5, Funny)
That's what I was taught in kindergarten anyway:
Teacher: Ok Peter, what did you bring for show-and-tell today to share with us? Oh, you brought software? Well don't share any of it! Sharing is wrong, sharing means you're a pirate!
Actually I tend now to ignore all licenses unless the threat of physical force (the law) causes me to do otherwise. I believe licenses have no moral force.
So I guess that makes me a pirate. In that case, Arrgh, matey! Let's hit the high seas! I've got some Britney Spears CDs in yonder chest!
Re:Here's mine: (Score:5, Interesting)
Where the issue grows problematical is that the means of reproducing software are far less expensive than the means of reproducing cupcakes. If I already have a computer (which is reasonable, if I own software), then reproducing it costs next to nothing. If I owned a Star Trek replicator and I bought a box of Hostess cupcakes, then replicated them and gave them away, I would have wronged Hostess. I did not come up with the recipe for those cupcakes nor did I do any real work to reproduce them. However, I'm distributing, for free, cupcakes that are identical to Hostess's. Just because I am able to do this does not mean that it is right or ethical for me to do so.
I don't know exactly what one would call the act of distributing, like that, but I certainly don't think it's sharing.
Re:Here's mine: (Score:3, Insightful)
Re:Here's mine: (Score:4, Insightful)
Remember, copyrights are themselves a fairly recent invention. They have not always been applied in history and it would be foolish to think that they always will be in the future.
Furthermore, let's assume the copyright holders' worst case scenario. Copyright dies and is buried beneath easy intercontinental copying. Nobody has monetary incentive to invent and anything they do is spread without the author's permission. Sound about right? It is important to note that this situation differs from the classic Tragedy of the Commons or the foolishness of Communism. This is not a building or a piece of land that constantly requires work by people (who of course receive nothing) to keep active and useful. The Information Commons does not suffer during a dearth of fresh blood. As you say, it 'merely' stagnates. Or does it? Industrial R&D would probably suffer (we'd see a dramatic rise in the Trade Secret approach to new products), but pure researchers would likely settle for getting their name stamped on the results. Music, movies, and novels might be added to only by the altruistic, though it's arguable that this is in many respects better than the corporatized version we get today. And there will always be incentive to go to movie theaters and to see live bands; the experience beats the hell out of home systems. Paintings and sculptures, of course, will never lack for artists with visions and people wanting to 'culture up' their homes with the real thing.
Compare that to Valenti's dream scenario, where every work is owned and totally controlled even after it leaves the store. With copyright lengths reaching into the centuries and beyond (forever minus a day?), unless someone is actively printing it, old works will languish in dusty bins and eventually die an ignominious death under the guise of Digital Rights Management. The Commons cannot survive being owned. I'm constantly hearing about people who search high and low for some 80-year-old piece of work, but because the author's heir says no, nothing happens.
I'm not suggesting the false dichotomy that we will eventually be forced to choose between these scenarios. The future will almost certainly be something in between, or even something weirder. But I say that if we had to choose one, life with excessive freedom is _infinitely_ superior to the alternative.
Bullshit (Score:3, Interesting)
I really hate it when people say this. Production/reward systems are not human nature, they are social constructs. If we go back into the not-so-far past, human nature was plucking fruit off of trees and gathering nuts and grubs. The reward systems you are talking about only became "human nature" when people started locking the food up and needed to explain why it had to be that way. A gazillio
Re:Bullshit (Score:3, Insightful)
If I, a chemical engineer, never design a distillation column, never build a reactor, never work in a plant, have I earned my share of the food that the farmers and the fishermen in my community have gathered? The whole production/reward idea comes about as a result of having individuals not concentrating on producing food, but on doing other tasks which advance their society (whi
Here's one for you... (Score:5, Funny)
Re:Here's one for you... (Score:5, Insightful)
...but I have to agree: how can you teach something without an intimate knowledge of the subject? If the teacher isn't passionate about the subject, how is he going to get the students to be? I hope they're not paying for this crap! I wouldn't.
And I certainly wouldn't trust the /. crowd with any sort of moral question, but that's just me.
Re:Here's one for you... (Score:3, Funny)
Some of my favorite teachers teach classes so they can learn the material. Clearly you can't effectively teach while knowing absolutely nothing, but intimate knowledge is definitely not a requirement for a good class.
Teachers doing this typically have a good idea of what questions the students will ask, because they just spent hours trying to understand the same material.
Re:Here's one for you... (Score:3, Interesting)
There is an old 'where are your beliefs?' question that helps you figure out what you think government should be like. The question (paraphrased) is: If you could place the people in the top 100 positions of government from the following two choices, which would it be? A) The top 100 graduates from Harvard University, -or- B) The first 100 people in the phone book? The point to think about is: which bias y
the irony (Score:3, Insightful)
< - Fishing for Ideas [slashdot.org]
I used to feel similar, but now I think otherwise. (Score:5, Insightful)
Congress, as a whole, doesn't know that much about farming or road work, or labor unions or pretty much anything.
Congress often *can't* be the expert on subject matter X that any given group wants it to be. There are just too many laws and too many subjects.
So what Congress does instead is listen to interest groups and their constituents. Individual members/groups then write and sponsor a bill dealing with the concerns raised.
Each bill is there for everyone in the nation to read and learn about (http://thomas.loc.gov), and if they do have a problem then it's their right to call up their congressman and say so. It's even their right to go to DC and address the subject matter. They can even start their own lobbying group to try to change things or pass laws addressing their own concerns.
It's not just about who has money and who doesn't (though it would be naive to think money doesn't help). Groups like the AARP have huge sway in Congress. And there are thousands of other such
And the real beauty of the system is that even if you say, "I don't like the system, it's corrupt and doesn't work as well as it should," you can go out and try to change it.
The only thing that never does any good is to complain about the state of things and not try to change it or even offer an alternative.
In short, it's our job to try to educate Congress and others about the issues we feel strongly about.
well... (Score:3, Insightful)
xao
Did anybody else read that as... (Score:5, Funny)
Privacy and the information mine. (Score:4, Interesting)
One obvious question (Score:2, Insightful)
linking (Score:2)
I say hell yeah!
On another note, are the linkers responsible for the content they link to?
Extinction vs. Genetic engineering (Score:5, Interesting)
If I could send 1000000 Emails for free, should I? (Score:5, Interesting)
It's mostly legal, but highly unethical, since it involves cost-shifting and, most of the time, hijacking open relays and other unsecured resources to send out that crap. And it annoys 99% of all recipients.
Proletariat of the world, unite to kill spammers. Remember to shoot knees first, so that they can't run away while you slowly torture them to death
Re:If I could send 1000000 Emails for free, should (Score:3, Insightful)
It's mostly legal, but highly unethical, since it involves cost-shifting and, most of the time, hijacking open relays and other unsecured resources to send out that crap. And it annoys 99% of all recipients.
Actually spammers do act ethically.
Spam is never going away until there is a solution to it. You can't stop humans behaving annoyingly when there's money to be made.
That solution has not arrived yet. When it does arrive, it won't be trivial, or someone would alread
Re:If I could send 1000000 Emails for free, should (Score:4, Insightful)
I could moderate you today, but I'm feeling like responding, even if you are trolling.
The ends justify the means? Whether I agree with that depends on the ends, and the means; in this case, I don't agree with you. The ends, in this case, will be a more restrictive Internet and an e-mail system more hardened against spam. The solution won't fix anything more than spam itself. Why should I have to put up with spam now if the only thing it will ever lead to is its own elimination?
How about this one? (Score:3, Funny)
How about the ethical dilemma of people teaching things that they don't know enough about?
Replacing people with machines (Score:4, Interesting)
On a side note, I'm an information systems specialist, and the systems I design do flatten organizations and often eliminate people's jobs. This issue is one I often think about.
Is there a balance between how much machine replaces man?
Just my 2 cents..
-6d
Re:Replacing people with machines (Score:4, Informative)
Re:Replacing people with machines (Score:3, Insightful)
Re:Replacing people with machines (Score:3, Interesting)
Suppose a machine can do the work of ten people, and the twenty lazy slobs who have that job are too stupid to get real ones, so they form a union. Then they demand that all the machines be shut down, twice the pay, and no work. This causes the company to export the work overseas (while still paying the 20 slobs), and 100 people have to work 100-hour weeks and are only given housing in the slums and barely enough food to survive. Are the twenty lazy slobs being ethical? Do they deserve money for doing nothing?
Ye
Introduction to Engineering Ethics (Score:5, Informative)
The book description:
Indigenous vs. introduced (Score:3, Insightful)
It is a truism in ecology that it is good to preserve ecosystems from invaders. This argument has been used against genetically modified crops and introduced predators.
Somewhere down the line, we are going to run into a situation where we have a completely new life form, engineered by humans, that is competing with existing species.
Is humanity obligated to value existing organisms over new ones? Should scientists live in fear of upsetting the established "order of nature?" Why?
A good starter for finding these (Score:4, Interesting)
The first hit and one of my favorite questions, which I've debated to some length with friends in the past, is to what extent you can observe your workers' use of the Internet. After all, their traffic runs through your servers in a manner akin to a person shouting cell-phone conversations; but should you accept that those 8 hours a day will not all be spent filling out TPS reports, or should you employ Draconian tactics to monitor users' porn-site usage [everything2.com]?
Another interesting one, less IT-related but also interesting, is the economic issue: if the application of certain expensive technology can save human lives, should it be used, to whom should it be offered, and who should have to pay?
Perhaps one day SETI will present us with another dilemma: If you know a religion to be false, should you tell its followers? [everything2.com] Some would say this is already an issue [xenu.net] in the modern information-enabled world.
A few obvious ones. (Score:3, Insightful)
Some time this century we'll likely be able to produce artificial intelligent creatures, be they machines or tailored organisms. Where do we draw the line between "person" and "non-person", and how do we assess this in practice?
If the previous point is a concern, this one will be too.
E.g. works of art, algorithms/code, ideas/concepts, pictures of people, medical records. Justify from both a moral/ethical and a practical viewpoint.
We arguably have this _now_.
All of these are going to have to be dealt with sooner rather than later, and none have cut-and-dried answers, no matter what position you take. Enjoy.
sysadmins code of ethics (Score:5, Interesting)
Some examples:
1) A person in management who is not the boss of employee Jane Doe asks the sysadmin for files in Jane's network space. The person asking is above Jane in the hierarchy, but not in the org chart path to Jane (say, a manager in another department). Should the sysadmin just give the files to the manager, or ask that the request come from either the sysadmin's boss or from Jane's boss?
2) Should a company that doesn't actively close ports used by file-sharing programs be liable for employees who use those programs? The company provided the bandwidth, after all, and could easily have blocked the ports.
3) Jane brings her computer to you as a professional repair person to fix a part. While fixing the computer, you browse through her files to make sure everything is working correctly. You notice some files have interesting names and discover that Jane is having an affair. Do you tell her husband? Should Jane be able to sue you for breach of confidentiality if you do?
4) Should tech people be made mandatory reporters? School teachers, doctors, and counselors can be made mandatory reporters of child abuse. What if we aren't talking about kiddie porn, but the parents are drug dealers?
What if it is "just" pot?
5) What responsibility, if any, do users/resellers have for groundwater contamination by the dumping of old computers?
6) You work for a nonprofit organization that must use Microsoft Access to work with some data (in other words, you can't just shout, "Switch to open source alternatives!" and make the problem go away). You can't afford the 10 copies of Access you need, so you reason that since only 1 person will probably use it at a time, you can install 1 copy on 10 different computers. Is this moral? It is illegal, but the class isn't about legalities, it's about morality. This is akin to the classic question of stealing a loaf of bread to feed a starving family. Well, what if your family doesn't like bread? What if they like cigarettes? And what if, instead of your stealing them, they were being sold at a price that was practically giving them away?
And that's just a few off the top of my head.
Re:sysadmins code of ethics (Score:3, Informative)
One good start might be to look at existing codes of ethics from professional bodies, like SAGE [sage.org]. Here is theirs [sage.org]
Re:sysadmins code of ethics (Score:3, Informative)
From http://www.smith-lawfirm.com/mandatory_reporting.htm:
"All states require certain professionals and institutions to report suspected child abuse, including health care providers and facilities of all types, mental health care providers of all types, teachers and other school personnel, social workers, day care providers and law enforcement personnel. Many states require film developers to report.
A number of states have broad statutes requiring "any
Open Wi-Fi access points (Score:5, Informative)
Who buys? (Score:5, Funny)
These are important ethical dilemmas that need discussion and input from the academic community.
Re:Who buys? (Score:3, Funny)
Now that you've said that, it won't work. Thanks.
Adapting old ethics to technology (Score:3, Insightful)
What about the ethics of a hypothetical individual who has an idea for software that could save lives, perhaps a medical program? This individual is employed by a company that claims ownership of any ideas/inventions/patents/etc. this person produces during their employment. Is this person obligated to start work on the idea for someone else, or should they take the time to develop the idea on their own? The same could apply to people in the military. Do you wait four years to start saving lives, or do you let the military take all the profit?
Speaking of the military, what are the ethics of creating machines that kill? Military weapons and all that. Computers have become an integral part of warfare.
Ethically, if software has a bug or flaw in it, is the developer supposed to fix it? What if this software is depended on by other people in very sensitive ways? Is the developer allowed to fix this flaw only in a newer version that the developer charges for? Can you legitimately charge someone to fix the flaws in software they already bought? Why does this whole paragraph remind me of Microsoft over and over?
Oh, and drop the "if robots came alive" thing. That's like teaching a philosophy class and asking, "What if Garfield came out of the newspaper and he was real?"
AI = always artificial? (Score:3, Interesting)
I mean, if a robot, toaster, or whatever has sentience, intelligence, and all the things that we think make us special, even if it was manufactured, is that intelligence truly "artificial" or is it "real"? If not, then at what point does it become real? When did it stop being just semi-programmed responses and Boolean algorithms and become something more? When do we say that you can dismantle that car, but you can't disassemble that robot (without its express permission)?
Insurance vs. welfare (Score:5, Interesting)
Now suppose that improving technology (like DNA sequencing) allows us to predict the future: Joe will get heart disease (and his 9 cohorts won't). Since the future is certain, the insurance market vanishes. No one will sell Joe insurance, because he is a known loss, and his 9 cohorts won't buy insurance, because they know that they won't need it.
Now when Joe gets heart disease, he can't afford treatment. Do we as a society institute some kind of welfare system to pay for Joe's treatment? Or do we just leave him to die?
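A toy calculation makes the pooling logic concrete. The 1-in-10 odds and the nine cohorts are from the comment above; the $100,000 treatment cost is a made-up figure for illustration:

```python
# Toy model of risk pooling vs. perfect prediction.
# Assumption: a pool of 10 people, exactly one of whom (Joe) will need
# treatment; the $100,000 cost is an invented illustrative figure.

treatment_cost = 100_000
pool_size = 10

# Without prediction, everyone faces the same 1-in-10 risk, so a
# break-even premium spreads the expected loss evenly across the pool.
fair_premium = treatment_cost / pool_size
print(fair_premium)  # 10000.0

# With perfect prediction the probabilities collapse to 1 and 0:
# Joe's break-even "premium" equals his certain loss, and his nine
# cohorts have no reason to buy insurance at all.
joe_premium = treatment_cost * 1.0
cohort_premium = treatment_cost * 0.0
print(joe_premium, cohort_premium)  # 100000.0 0.0
```

In other words, insurance exists only because of shared uncertainty; once the uncertainty is gone, the market's pricing mechanism offers Joe nothing.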
I had this discussion with my parents... (Score:5, Insightful)
Technology and the 3rd world (Score:3, Interesting)
My Slashdot article submission... (Score:5, Funny)
Then some (Score:3, Interesting)
If we can choose the sex of a baby, is it moral to do so? What about the color of the eyes?
If we can know the probable lifespan of a person by looking at their DNA, should we allow an insurance policy based on it? Even if it's presented as a "discount" for sturdier people?
If we can exterminate an entire species, are we morally allowed to do it? Well, we did it (almost) with the variola virus, though you could argue over whether a virus is alive. We'll soon be able to do it with mosquitoes and the tsetse fly. Those are pests, but should they be erased from the face of the Earth? What about rats?
Some day in the not-too-distant future, all nations of Earth will have an infectious pathogen with a 98% fatality rate, six weeks of incubation (of which three are in a contagious state), and a safe vaccine for their own population. The nuclear arms race will look positively sedate in comparison. Should we (whoever this "we" is; soon it will be everybody) strike first?
The SINGLE MOST IMPORTANT article on this subject! (Score:5, Interesting)
- Should we, as a society, curtail research on particular branches of science? Human cloning is the obvious one, but researching superbugs and genetically hand-made viruses might have enormous benefits--at a cost of extreme risk.
- Where do we draw the line between human and (for lack of a better word) robot? Nanotech, implants, and genetic mods are all coming to meet at a common point, and that point is SOON!
Some other interesting technological dilemmas come to mind. Should we sell, or aid the development of, technology to 'enemy' nations? How do we define enemies for this purpose? I happen to work for a company that's substantially responsible for getting much of the US military's aircraft into the air; am I partly responsible for the uses those aircraft are put to? The same question could be (and has been) asked of the Canadian CANDU nuclear reactors: safe, cheap, efficient, reliable, and the easiest way to produce weapons-grade material.
This last one is actually a dilemma as old as the hills, dealing with the enemy, but technology is becoming an important factor because it's drawing the world together. (Not to mention the HUGE role technology plays in any conflict these days.)
Other issues: technology eats power, consumes resources, and produces waste. Do we have a moral responsibility to drive as much technological innovation as possible toward cleaning up some of our messes?
The media is now able to modify live broadcasts--how do we control that behaviour? Pasting over footage of billboards with the station's advertising is pretty reprehensible, but what about when they start adding nonexistent people to war scenes?
But the real question may boil down to this simple one: How does technology actually change any of our present moral or ethical states? Does technology actually change our ethics, and should it?
Re:The SINGLE MOST IMPORTANT article on this subje (Score:3, Funny)
Michael Jackson, Cher, and Joan Rivers -- we're too late, the line has been crossed!
Re:The SINGLE MOST IMPORTANT article on this subje (Score:3, Insightful)
If we do, and do it very much, the societies that do not will eventually squash us like bugs.
C//
Here's a few (Score:5, Interesting)
2) We are seeing that technology is making the world increasingly dangerous in the form of "asymmetric threats": individual empowerment, through technology, to do harm that cannot be foreseen or prevented (briefcase bombs, artificially engineered diseases, computer viruses, etc.). Is this a threat to human interdependence, or an inevitable feature?
3) Technology is making the world a lot smaller, and eroding private space and information. Will the ability of people to be in constant contact with each other, and perhaps in constant surveillance of each other, be a good thing or a bad thing? How will this affect human society and culture?
4) Lastly, are we asking these questions too late? Will humans ever be able to control the path of discovery and uses of technology? If not, should we?
What about PHB's running wild? (Score:5, Interesting)
1. Allowing projects to start without defined deliverables.
2. Allowing time-and-materials (TMA) projects to run wild with no schedule, since the company will eventually get paid regardless of the outcome.
3. Allowing marketroids to lie to the customers and public about your company's capabilities in the hope these can be acquired on the run if a project is signed with a big enough down payment.
4. Forcing people to keep billing on a project when it is a TMA with a "not to exceed" cost. If the cap is $200,000 and so far you have billed $175,000, you will be forced to find something to keep you busy until you hit the $200K or else.
5. Allowing customers to sign on to a project without the buy-in of their technical people. Case in point: in a previous job my company got a huge defense contractor (127,000 desktop users) to sign on to an intranet project that required IE 5 or Netscape 6. Small problem: the standard for this monstrous organization was Netscape 4.7, and overseeing the upgrade of 127,000 desktops to Netscape 6 or IE 5 would have cost twice as much as our project's budget. This could have been avoided had these people checked with their IT folks.
My fix was simple: I left. I got to watch the company shoot itself in the foot and go through layoff rounds every 90 days. The day I was going to be handed my pink slip, I was interviewing across town. That afternoon I was told I had been spared at the last second. Two days later I was offered the job across town, and I jumped ship. I still program, but only internally; my customers are my own employers, so it is in their best interest not to lie to themselves!
We laid off a lot of good people at that previous company, and most of them by now have better jobs elsewhere. The few who are still working there are living through pure hell every day of the week.
Self-Defense (Score:5, Interesting)
DDoS worms, rather than attacking other computers directly from the worm creator's computer, take over other computers and then use them to perform an attack. If you're the one targeted by such an attack, do you have the right to defend yourself? Is it right for you to hack into an innocent person's computer because their technological ignorance is actively causing you harm? Would it be preferable for you, and the people who depend on your network, to just sit there and accept the attack without any real defense? And if you have the skill not to screw it up (probably a rare skill, but still), would it be right to create an "anti-worm" that disinfects computers that have become unwitting DDoS zombies?
Computer security is a field that is absolutely soaked in real-life analogies, but this situation doesn't have one that anyone would ever encounter in their lives. "If a hypnotized or possessed person tried to kill you, would it be moral to hurt them in self-defense?" isn't an analogy that provokes an instant, pre-prepared answer.
Genetics and bionics (Score:3, Informative)
Is it ethical for us to push the envelope of genetics and create our own made-to-order creatures? It might seem like an easy "no" or even "yes," but it isn't.
-Imagine if scientists discovered they could splice a few certain genes to create some special breed of monkey that would live its life in pain but would offer guaranteed universal matches for organs in humans. Is that ethical?
Bionics
The AbioCor heart has shown that we are well on our way to having artificial organs. Is this ethical? The first inclination might be yes; I am envisioning extending people's lives by an extra 50 years or so.
This might sound great, but if all things were equal and everyone could reap the benefits, then that could cause serious population problems, as people would live MUCH longer.
Besides, this kind of technology will probably only be available to those who can afford it, which brings up a whole other ethical issue.
Ethics of Teaching Unknown Material (Score:5, Interesting)
I am a student at Penn State University, in the IST program, and I have spent untold amounts of time and my hard-earned money to "learn" from instructors who have no idea what they are even teaching! Maybe if this person doesn't keep up with technology... HE SHOULDN'T BE TEACHING THE DAMN CLASS! Talk about ethics; this post is amazingly frustrating to me.
Doesn't anyone else see the problem here? Students should be learning about this topic from a professor who is schooled in technology and has a good understanding of ethics! Students are now going to be wasting their time in a class where the professor doesn't even know what the prevalent issues are to cover!
College tuitions have skyrocketed, and will continue to do so... however we, as students, continue to receive a rapidly diminishing quality of instruction. My only wish is that no one would help this moron.
A real life email one (Score:5, Interesting)
I did. Rationale: the 3rd party hadn't read it, and the putative adulterer's affairs weren't my business. One of my colleagues was adamant that sysadmins should NEVER delete mail from a user mailbox, that it violated that user's privacy, and that the mail after all was correctly addressed.
Ah, the difference between Simon and Simone...
Overseas outsourcing (Score:4, Interesting)
This question addresses whether the practice is ethical, rather than symptomatic of a capitalist, employed-at-will society.
How about... (Score:5, Insightful)
And yes, I realise that most college lecturers are borderline useless, but why encourage it?
My advice to your "friend" would be simple - bugger off and learn your material. When you know more than your students, THEN you can consider teaching.
My suggestion--how much can we borrow? (Score:3, Insightful)
Speaking ethically, not legally, how much can we borrow from the ideas of others to develop new ideas? For instance, all scientific discovery that I'm aware of before this century depended in large part on working from the ideas of others. Now the notion of IP has provided an incentive to stop sharing ideas; will this hurt human scientific development?
To exaggerate the issue: if you develop a cure for cancer, but its ideas depend on the work of another scientist, should you develop the cure? What if that scientist prohibits access to the information for personal reasons? Along those lines, how do you determine valuation? I.e., if one is to be compensated, does the scientist with the original idea get more compensation than the scientist who developed the idea? Why? What proportion?
Cached Articles? (Score:3, Interesting)
How about "Should Slashdot cache articles?" Is it more ethical to mirror a website without permission, or to send a ton of traffic to their site, costing them money?
--nw
He's teaching a class he knows nothing about (Score:4, Funny)
Doesn't this about sum up the state of our education system today?
Ethics? (Score:3, Funny)
(The _wisdom_ of the latter is beyond the scope of this comment!)
How about laws and technology together? (Score:4, Funny)
For example, if it's just a "minor offense" to spray-paint graffiti on a bridge, why can you get 10 years in prison for defacing a website? Seems a bit disproportionate.
C//
To be a cog... (Score:3, Interesting)
Now, blacklisting isn't a new idea, and it doesn't strictly require technology. But in practice it does: blacklisting, to be effective, is a bureaucratic process, and bureaucracy is very much enabled by technology, from the abacus on up. A large amount of technology continues to be used for bureaucracy (probably a considerable majority of computer technology).
Bureaucracy isn't all bad... we often don't notice all the effective bureaucracy around us.
And what of the morality of database manufacturers, who are creating something that happens to be used for immoral purposes? I don't know, but I will argue strongly that they are not entirely without culpability. The greatest evils ever done were done by people who did not feel themselves responsible, supported by people who did not feel themselves responsible. I believe the ends justify the means, but I also believe the ends can be a condemnation of the means, no matter how benign or neutral they seemed at the time. Anyway, it is certainly a point for discussion.
A very good book on the moral implications of technology is The Existential Pleasures of Engineering [amazon.com]. It's not about engineering particularly, but about technology (and a reaction against anti-technologists), building infrastructure, and very much about the moral responsibilities and questions of being someone who designs and builds the things that surround us, without being able to make many key decisions about those things. It applies very well to computer programmers.
What happens if something passes the Turing test? (Score:3, Insightful)
If a computer can fool you into thinking it is alive, which is the basic premise of the Turing test, and then it makes the argument that turning it off, or disassembling it, is like killing it, well, where does that place us?
Consider this: many people consider the basic difference between people and machines (or animals, as some would argue) to be self-awareness. How do you define self-awareness?
I am sure that PETA people would say that killing anything self aware is wrong.
Well...?
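For reference, the premise mentioned above can be sketched as a toy imitation game: a judge reads text from a human and a machine and must name the machine. Both responder functions here are hypothetical stand-ins, not a real chatbot:

```python
import random

def human_reply(prompt: str) -> str:
    return "I think, therefore I am."

def machine_reply(prompt: str) -> str:
    # Hypothetical machine that produces an indistinguishable answer.
    return "I think, therefore I am."

def machine_fooled(judge) -> bool:
    """One trial: True if the judge fails to identify the machine."""
    players = {"A": human_reply, "B": machine_reply}
    transcript = {label: fn("What are you?") for label, fn in players.items()}
    guess = judge(transcript)  # the judge names the suspected machine
    return players[guess] is not machine_reply

# If the transcripts are indistinguishable, the judge can only guess,
# so the machine "passes" about half the time -- the test's criterion.
random.seed(0)
trials = 1000
fooled = sum(machine_fooled(lambda t: random.choice(sorted(t))) for _ in range(trials))
print(fooled / trials)  # close to 0.5
```

The ethical question in the comment starts exactly where this sketch ends: once nothing in the transcript separates the two, what justifies treating them differently?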
Sharing of potentially harmful knowledge (Score:3, Interesting)
Knowing how people go about cracking into systems could be harmful if one does it, and useful when building a defence against said crackers.
When you learn how to pick locks, you gain an understanding of what makes a good lock and what doesn't. Nice to know when buying locks...
Pick pocket? Walking through the airport and get bumped? No big deal right? Unless you know how these people work.
Building bombs? Surely this is a terrorist only thing right? How about knowing what is a bomb and what is not? What if you are in a position to disarm one?
Crypto. Same as locks, really. How does one know what is going to be effective and what is not? The DVD guys sure didn't. (Heh heh.) For that matter, using that crypto knowledge to solve a simple problem like playing a DVD under Linux? Legal? Not in many places. Moral and ethical? I would say yes, provided you own the thing and have a clear right to use it.
So is the knowledge itself bad? What about the teaching and access? Should everyone be able to know and decide for themselves or not?
Each of these things is under attack right now. Why?
A bit offtopic... (Score:3, Funny)
This reminds me of what happened at the beginning of this semester. The professor walked into the classroom and asked us what subject he was supposed to teach us! And the first thing he said after finding out it was "professional ethics" was: "Oh... that's not really my area...".
Focus on ethics not technology (Score:3, Interesting)
An introductory course should not focus on particular technological issues, but rather on:
The actual technology is secondary, and the person faced with the ethical dilemma will probably know more about the technology than you anyways.
Off the top of my head, I would present the following, incomplete, list of dilemma categories (An exercise for the class would be to have the students come up with the list themselves, perhaps starting with examples taken from the press and movies):
One presumes the goal of the course is to encourage ethical behaviour and decisions, rather than recognizing ethical dilemmas and then using public relations to justify the most cost-effective solution, regardless of the moral issues.
With that in mind the following meta-issues should be discussed:
Misspelling (Score:3, Funny)
I would recommend 'Is ActiveX moral?'