Education

Ethical Dilemmas Related to Technology 885

Anonymous Coward writes "I have a relative who will be teaching a college class on the topic of ethical dilemmas brought about by new technology. Unfortunately, he doesn't keep up with technology news, so he's not sure what the most relevant dilemmas are. For example, 'If robots came alive, would we be justified in killing them?' is one that might come up if nothing more relevant were suggested. (OK, it might not be that bad, but you get the idea. He was using Netscape 4.76 on system 9 until last week.) So, what are the most relevant ethical dilemmas brought up by technology? Note that I am looking for ethical dilemmas, e.g. 'Is Activity X moral?' rather than legal dilemmas like 'Is the DMCA constitutional?' Now is your chance to guide the young minds of the future toward stuff that matters."
This discussion has been archived. No new comments can be posted.

  • Responsibility (Score:3, Insightful)

    by Drunken Coward ( 574991 ) on Sunday April 06, 2003 @07:39PM (#5675402)
    How about the moral responsibility of scientists for the repercussions of their creations? Several things come to mind, the first being the development of the atomic bomb and the subsequent massive loss of innocent life. And when does biotech cross over from correcting genetic flaws to customizing a person as a whole?

    But the coming rise of nanotechnology should also not be overlooked. Sure, the grey goo problem is largely hype, but what if something like that really does happen? Should the scientists working in nanotech be held responsible for an epidemic on a massive global scale?

    These are all issues I would like to see addressed in a class on ethical dilemmas in technology.
    • Re:Responsibility (Score:5, Insightful)

      by Zanthany ( 166662 ) on Sunday April 06, 2003 @07:46PM (#5675445) Journal
      But are the inventors of these technologies to blame? Should they be held responsible for inventing Technology X?

      Saying these scientists should be held responsible would be akin to your atomic bomb argument. Is Einstein more responsible than Truman, who ordered the massacre of hundreds of thousands of innocent Japanese civilians?

      I would hope that the answer would be no. Then we'd have civil proceedings where Victim Y would sue the inventor of Technology X because said technology brought bodily harm, even though Perpetrator Z is the actual cause of the incident.

      Oh, but wait. We already have people seeking injunctions against gun manufacturers because they produce a lethal weapon.
      • Re:Responsibility (Score:3, Interesting)

        by randyest ( 589159 )
        You can't help here but get into the debate about whether anything is really ever invented or simply discovered.

        This is good, related, and thought-provoking. If these "creations" are actually discoveries rather than inventions, then one might argue that someone will eventually find the dangerous discoveries, so as a responsible scientist, one must look for these even more aggressively, if only to better understand (and thereby be better prepared to control or limit damage from) them.

        Sorta like the guy w
      • Re:Responsibility (Score:3, Insightful)

        by mbogosian ( 537034 )
        But are the inventors of these technologies to blame? Should they be held responsible for inventing Technology X?

        There was a small blurb just before tonight's edition of Tech Nation [technation.com] where a woman (I'm embarrassed to say I've forgotten her name) was describing her own coming-to-grips with being a weapons engineer during the '80s. I wish I had a transcript of her monologue, but she told a story of being approached and reproached by Douglas Engelbart [ideafinder.com] for doing the unconscionable. He reprimanded her, saying t
    • Re:Responsibility (Score:5, Insightful)

      by quantaman ( 517394 ) on Sunday April 06, 2003 @08:01PM (#5675546)
      Richard Feynman helped build the atomic bomb. He wrote that when he started, he felt it was justified because if they didn't build it first, then Germany would. After Germany was defeated, he admits, he simply never reconsidered his work on it until afterwards, and he was never able to decide whether continuing to build it was a good thing or a bad one.
      • On the A-Bomb (Score:3, Insightful)

        by Anonymous Coward
        Feynman had less of an involvement in the creation of the atomic bomb than did others such as Fermi, Bohr, and Oppenheimer. Einstein had even less involvement, despite popular belief. He had a theoretical and political involvement -- he urged FDR to pursue the Bomb project because he feared Germany would get it first.

        As far as their moral involvement, Oppenheimer, after the very first Bomb test at the Trinity site in New Mexico, quoted the Bhagavad Gita, saying "I am become Death, the Destroyer of Worlds". He knew exa
    • by Dukeofshadows ( 607689 ) on Sunday April 06, 2003 @08:20PM (#5675649) Journal
      Why not move into biotechnology for a bit? Stem cell research is a hot field for exploration in biology right now, but is it right to use human embryos (even if frozen and otherwise destined for destruction)? Suppose we can circumvent this by using pluripotent cells in adults (able to become many cell types, though not totally unspecialized). Would it be justified to use such cells in humans even though the results could be disastrous if they revert after a period of time? That may sound ludicrous, but in certain cancers we see regression into other cell types and re-specialization into other cells (an ovarian tumor that de-specializes, then re-specializes into hair, teeth, or any number of other things; this is actually where some of the ideas for stem cell research came from).

      On the other end, what about gene therapy? We currently use viral vectors as a means for this, and it has actually killed people when used irresponsibly (check out "University of Pennsylvania", "gene therapy", and "1999" or so via google.com if anyone wants a glaring example; this one influenced many of the laws for gene therapy in the US). Are researchers right to wipe out genes or diseases even for human benefit? Or are they responsible if exotic means of gene therapy fail? Example: topoisomerases are able to literally pick up genetic sequences and transfer them elsewhere within the genome, kinda like cut 'n paste in a genetic word processor. But if it kills people involved in experiments using this as a means for gene therapy, even for a potentially huge payoff, is it worth the risk? And should this be made a means of genetically enhancing children via in-vitro fertilization and genetic tampering, what problems could arise from that?

      These two quagmires alone should give you ample material to peruse and discuss; hopefully it won't turn into an abortion debate on the former or a eugenics debate on the latter.
      • That condition of "an ovarian cancer that de-specializes then re-specializes into hair, teeth, or any number of other things" is not a cancer at all, nor is it formed from stem cells.

        The condition is Dermoid Cystic Teratoma, commonly called a teratoma, and is generally formed by a (mutated) ovum (egg cell) that starts to divide without first being fertilized. It lacks the genetic material to actually mature into a fetus, but it has enough genetic information to form various substructures. Most commonly, te

    • Suppose it works brilliantly, better than could be hoped for. You buy a $1.29 bottle of harmless/helpful gray goo. Sprinkle it in your yard, tell it you want a Ferrari...

      3 hours later, one is sitting there, polished better than any you've seen in a movie. And hell, you can wreck the damn thing, treat it awful, and a $1.29 bottle of goo can fix it...

      Well, you were lucky, one of the first to buy Goo(TM). Soon, laws are passed applying the IP paradigm to physical stuff... a Ferrari is a copyrighted piece of
      • There's a bloke in England who decided to dig into his grounds and constructed rooms and whatnot under the nearby road and his neighbours' grounds. I don't remember the outcome when he was taken to court by the neighbours and the council (for the road).

        And for the Ferrari example, why bother with a damn label when you could make and style your own car? People who go for labels over style are wankers and deserve to have it taken away for copyright theft.

      • Just a note concerning real property law from a lawyer.

        You do, as a matter of fact, own the subterranean mineral and oil rights to your property, unless those rights were granted to another person by a deed executed and recorded prior to your gaining title to the property. There are many, many landowners out there who do not own subterranean rights to their own land, but this isn't the general rule.

        On the other hand, you don't really own the airspace above your property - you only own as much of it as the
    • How about someone who finds a problem with some software that would allow them to take control of other peoples machines?

      [Assuming male for the finder, because I'd like to hear from some female security researchers to know if they exist. ;] ]

      Who should he tell?

      His friends? The authors? The world?

      If he just tells the author, what if they don't seem interested in fixing it?

      (Microsoft wouldn't ever do that, would they?)

      If he releases the information to a public security list, how much information should he

    • Re:Responsibility (Score:4, Insightful)

      by Neurotensor ( 569035 ) on Sunday April 06, 2003 @11:02PM (#5676443)
      How about the moral responsibility of scientists for the repercussions of their creations?

      This is a personal ethical question that I contemplate on a regular basis.

      I am a student in the field of quantum computing. My goal is to build a practically useful quantum computer. This isn't an easy task, but it is a very intellectually rewarding problem. The impact on physics would be nothing short of revolutionary, like the impact of modern supercomputers only much bigger (assuming that building one is physically possible).

      One use of this quantum computer is to crack public key cryptosystems such as RSA, which would deal out a crippling blow to privacy worldwide. Governments would be able to read your PGP-encrypted email at will. Human-rights workers battling oppressive regimes would be put at extreme risk of their past work being deciphered and used against them.

      The other practically significant use is to simulate quantum systems, meaning that scientists could use quantum computers to perform theoretical work that was previously impossible, including the modelling of advanced materials and protein-folding at the quantum level, which would have tremendous benefits in electronics and biotechnology respectively (or so I am led to believe).

      We can't know what we will or won't discover with this power but it could be one of the biggest advances to scientific achievement since the Internet made knowledge free and instantly accessible.

      This situation isn't a whole lot different from when nuclear energy was developed. Some people knew there would be an immediate negative impact (death and destruction) from developing the basics, but also believed there would be a future benefit that they didn't know the exact details of. It turned out that the gains were absolutely crucial to our progress towards the present state of technology.

      So the question is whether or not an individual should work in this field, knowing that it is in some sense inevitable that others will do it if any one researcher doesn't. What if there is the possibility of retaining some control using intellectual property? Could you force the users not to use it for eavesdropping? Is it your place to make such an assertion?

      Is it right to gamble on new technology when there is the potential for abuse? What if the gains are huge and so is the abuse?
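      The RSA point in the comment above can be made concrete at toy scale. The sketch below (all numbers are textbook-sized toys, not any commenter's code) shows that whoever can factor the public modulus n can reconstruct the private key and read the message; Shor's algorithm on a quantum computer would do for 2048-bit moduli what trial division does here for a 12-bit one:

```python
# Toy RSA, to show why fast factoring breaks the cryptosystem.

def egcd(a, b):
    """Extended Euclid: returns (g, x, y) with a*x + b*y = g = gcd(a, b)."""
    if b == 0:
        return a, 1, 0
    g, x, y = egcd(b, a % b)
    return g, y, x - (a // b) * y

def modinv(a, m):
    """Modular inverse of a mod m (assumes gcd(a, m) == 1)."""
    g, x, _ = egcd(a, m)
    return x % m

# Tiny keypair (real keys use primes hundreds of digits long).
p, q = 61, 53
n = p * q                          # public modulus, 3233
e = 17                             # public exponent
d = modinv(e, (p - 1) * (q - 1))   # private exponent, kept secret

msg = 1234
cipher = pow(msg, e, n)            # anyone can encrypt with (n, e)

# The "attack": factor n. Trivial by trial division at this size;
# infeasible classically at real key sizes, but fast on a quantum computer.
p_found = next(f for f in range(2, n) if n % f == 0)
q_found = n // p_found
d_found = modinv(e, (p_found - 1) * (q_found - 1))

# Plaintext recovered without ever being given the private key.
assert pow(cipher, d_found, n) == msg
```

The whole secret, d, is derivable from the factors of a public number; that is the entire gap a practical quantum computer would close.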
    • From this [rawilson.com] site.

      Next, they saw a city of 550,000 men, women and children, and in an instant the city vanished; shadows remained where the men were gone, a firestorm raged, burning pimps and infants and an old statue of a happy Buddha and mice and dogs and old men and lovers; and a mushroom cloud arose above it all. This was a world created by the cruelest of all gods, Realpolitik.

      "This is Discord," said Apollo, disturbed, laying down his lute.

      Harry Truman, a servant of Realpolitik, wearing the face of Oli
  • Here's mine: (Score:5, Insightful)

    by Saint Aardvark ( 159009 ) on Sunday April 06, 2003 @07:40PM (#5675412) Homepage Journal
    Are Napster et al. moral?

    What if the artist encourages it [bbc.co.uk]?

    What if the artist is pissed off by it? [slashdot.org]

    Is violating the license less morally wrong if it's easy [cdburner.com]?

    What about if the copy is of a lesser quality [flightpath.com] than the original?

    What if it's a license that you like [gnu.org]?

    • by Linux-based-robots ( 660980 ) on Sunday April 06, 2003 @08:01PM (#5675542) Journal
      Sharing is fine unless it's software or music.
      That's what I was taught in kindergarten anyway:

      Teacher: Ok Peter, what did you bring for show-and-tell today to share with us? Oh, you brought software? Well don't share any of it! Sharing is wrong, sharing means you're a pirate!

      Actually I tend now to ignore all licenses unless the threat of physical force (the law) causes me to do otherwise. I believe licenses have no moral force.

      So I guess that makes me a pirate. In that case, Arrgh, matey! Let's hit the high seas! I've got some Britney Spears CDs in yonder chest!
      • Re:Here's mine: (Score:5, Interesting)

        by GospelHead821 ( 466923 ) on Sunday April 06, 2003 @09:16PM (#5675891)
        I dislike this mentality and I think that it incorrectly identifies the meaning of 'sharing'. Sharing software is perfectly fine in the same sense as sharing cupcakes. If I have enough cupcakes for the entire class and I give each one a cupcake, that's good. Likewise, if I buy 25 copies of SimCity 4 to hand out to my friends, that's okay too.

        Where the issue becomes problematic is that the means of reproducing software are far less expensive than the means of reproducing cupcakes. If I already have a computer (which is reasonable, if I own software), then reproducing it costs next to nothing. If I owned a Star Trek replicator and I bought a box of Hostess cupcakes, then replicated them and gave them away, I would have wronged Hostess. I did not come up with the recipe for those cupcakes, nor did I do any real work to reproduce them. However, I'm distributing, for free, cupcakes that are identical to Hostess's. Just because I am able to do this does not mean that it is right or ethical for me to do so.

        I don't know exactly what one would call that kind of distribution, but I certainly don't think it's sharing.
  • by Sick Boy ( 5293 ) on Sunday April 06, 2003 @07:42PM (#5675418) Homepage
    How about, "should somebody who isn't familiar with the issues be responsible for teaching them?" Seriously, this could also spin off into "should the largely technologically illiterate Congress be making laws about technology?" and other topics that shine light onto the pressing concerns that have been the cause of umpteen YRO articles.
    • by paradesign ( 561561 ) on Sunday April 06, 2003 @07:48PM (#5675459) Homepage
      unfortunately I used up all my mod points today...

      ...but I have to agree: how can you teach something without an intimate knowledge of the subject? If the teacher isn't passionate about the subject, how is he going to get the students to be? I hope they're not paying for this crap! I wouldn't.

      And I certainly wouldn't trust the /. crowd with any sort of moral question, but that's just me.

      • > how can you teach something without an intimate knowledge of subject?

        Some of my favorite teachers teach classes so they can learn the material. Clearly you can't effectively teach while knowing absolutely nothing, but intimate knowledge is definitely not a requirement for a good class.

        Teachers doing this typically have a good idea of what questions the students will ask, because they just spent hours trying to understand the same material.
      • I'm not so sure that the /. crowd is a poor source of moral questions. Hang on, I'll explain that...

        There is an old 'where are your beliefs?' question that helps you figure out what you think government should be like. The question (paraphrased) is: If you could place the people in the top 100 positions of government from the following two choices, which would it be? A) The top 100 graduates from Harvard University, -or- B) The first 100 people in the phone book? The point to think about is: which bias y

    • the irony (Score:3, Insightful)

      by AEton ( 654737 )
      (the truth, revealed slightly below the post)


      ← Fishing for Ideas [slashdot.org]
    • by Ted_Green ( 205549 ) on Sunday April 06, 2003 @08:57PM (#5675801)
      Technology doesn't have any unique attributes that give it more privilege than any other subject matter.

      Congress, as a whole, doesn't know that much about farming or road work, or labor unions or pretty much anything.

      Congress often *can't* be the expert on subject matter X that any given group wants it to be. There are just too many laws and too many subjects.

      So what Congress does instead is listen to interest groups and their constituents. Individual members/groups then write and sponsor a bill dealing with the concerns raised.
      Each bill is there for everyone in the nation to read and learn about (http://thomas.loc.gov), and if they do have a problem then it's their right to call up their congressman and say so. It's even their right to go to DC and address the subject matter. They can even start their own lobbying group to try to change things or pass laws addressing their own concerns.

      It's not just about who has money and who doesn't (though it would be naive to think money doesn't help). Groups like the AARP have huge sway in Congress. And there are thousands of other such .orgs (EFF, ACLU, etc.) who without big money have done just as much as the big bad corporate wolf.

      And the real beauty of the system is that even if you say, "I don't like the system, it's corrupt and doesn't work as well as it should," you can go out and try to change it.

      The only thing that never does any good is to complain about the state of things and not try to change it or even offer an alternative.

      In short, it's our job to try to educate congress and others to the issues we feel strongly about.
  • well... (Score:3, Insightful)

    by xao gypsie ( 641755 ) on Sunday April 06, 2003 @07:42PM (#5675419)
    use of cloning technology on humans, obviously.

    xao
  • Is ActiveX moral? I think the answer would be no, unless implemented right.
  • by rdewald ( 229443 ) <rdewald&gmail,com> on Sunday April 06, 2003 @07:42PM (#5675421) Homepage Journal
    How do you balance security and privacy? Technology makes it possible to gather a tremendous amount of information about a person from sources to whom they divulged that information for purposes other than dossier-building. If you can save a life by violating the privacy of many people, is the preservation of life (or some number of lives) absolute? If you can prevent a property crime by violating the privacy of many, how much does it have to be worth?
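    The dossier-building mechanism the comment above describes is worth making concrete: each source sees one harmless fact, and it is the trivial join on a shared identifier that creates the dossier. A minimal Python sketch with entirely made-up data and names:

```python
# Hypothetical records, each divulged to a different party for a
# narrow purpose: shipping, a health screening, a loyalty program.
store = {"alice@example.com": {"address": "12 Oak St"}}
clinic = {"alice@example.com": {"screening": "hypertension"}}
loyalty = {"alice@example.com": {"purchases": ["wine", "cigarettes"]}}

def build_dossier(key, *sources):
    """Merge every record that shares the same identifier into one profile."""
    dossier = {}
    for source in sources:
        dossier.update(source.get(key, {}))
    return dossier

# One join on the shared identifier combines address, a medical flag,
# and consumption habits -- a profile no single source ever held.
profile = build_dossier("alice@example.com", store, clinic, loyalty)
assert set(profile) == {"address", "screening", "purchases"}
```

The ethical question is exactly that the merge step costs nothing: no single disclosure felt like dossier-building, yet the aggregate is one.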
  • More and more data is gathered about us each day. If, and when, this data is ever gathered into a huge central database, who should be in charge of maintaining it? And, more importantly, who should be allowed access to it?
  • Should Slashdot link when others are buying?

    I say hell yeah!

    On another note, are the linkers responsible for the content they link to?

  • by kinnell ( 607819 ) on Sunday April 06, 2003 @07:45PM (#5675442)
    It is commonly held that a species becoming extinct is bad. Does it therefore follow that creating a new species through genetic engineering is good? If not, why not?
  • by gorbachev ( 512743 ) on Sunday April 06, 2003 @07:47PM (#5675447) Homepage
    Spam is such an easy ethical problem.

    It's mostly legal, but highly unethical, since it involves cost-shifting and, most of the time, hijacking open relays and other unsecured resources to send out that crap. And it annoys 99% of all recipients.

    Proletariat of the world, unite to kill spammers. Remember to shoot knees first, so that they can't run away while you slowly torture them to death
    • Spam is such an easy ethical problem.

      It's mostly legal, but highly unethical, since it involves cost-shifting and, most of the time, hijacking open relays and other unsecured resources to send out that crap. And it annoys 99% of all recipients.

      Actually spammers do act ethically.

      Spam is never going away until there is a solution to it. You can't stop humans behaving annoyingly when there's money to be made.

      That solution has not arrived yet. When it does arrive, it won't be trivial, or someone would alread

      • by Jaeger ( 2722 ) on Sunday April 06, 2003 @08:47PM (#5675758) Homepage
        Actually spammers do act ethically. ... They proactively increase the level of pain in the Internet community. This brings forward the day when some kind of solution is put in place. So they are making the world a better place.

        I could moderate you today, but I'm feeling like responding, even if you are trolling.

        The ends justify the means? Whether I agree with that depends on the ends and the means; in this case, I don't agree with you. The ends, in this case, will be a more restrictive Internet and an e-mail system more hardened against spam. The solution won't fix anything more than spam itself. Why should I have to put up with spam now if the only thing it brings about is its own elimination?

  • by dmadole ( 528015 ) on Sunday April 06, 2003 @07:47PM (#5675451)

    How about the ethical dilemma of people teaching things that they don't know enough about?

  • by SixDimensionalArray ( 604334 ) on Sunday April 06, 2003 @07:48PM (#5675458)
    A very simple ethical dilemma - if a machine can do what ten people can, is it unethical to take away their jobs in the name of saving money? I mean, these are real humans we are talking about!

    On a side note, I'm an information systems specialist, and the systems I design do flatten organizations and often eliminate people's jobs. This issue is one I often think about.

    Is there a balance between how much machine replaces man?

    Just my 2 cents..

    -6d
    • by Minna Kirai ( 624281 ) on Sunday April 06, 2003 @07:53PM (#5675489)
      Hopefully the course instructor is already aware of that particular question, since Luddites [bigeastern.com] have been around for 200 years.
    • Exactly. I've been proposing for a while that we move to non-motorized machinery, and square wheels. With square wheels, it will take 10x the number of people to pull a heavy wagon, providing jobs for many more people! If we take every simple machine and make it 10x as inefficient, it will give everyone a job!
    • If a machine can do the work of ten people, and the twenty lazy slobs who have that job are too stupid to get real ones, they form a union. Then they demand that all the machines be shut down, twice the pay, and no work. Which causes the company to export the work overseas (while still paying the 20 slobs), and 100 people have to work 100-hour weeks and are only given housing in the slums and barely enough food to survive. Are the twenty lazy slobs being ethical? Do they deserve money for doing nothing?

      Ye

  • by sielwolf ( 246764 ) on Sunday April 06, 2003 @07:50PM (#5675468) Homepage Journal
    You are in luck, as the class I TA for does a section on engineering ethics. The main resource we use is Introduction to Engineering Ethics [amazon.com] by Schinzinger & Martin. It covers such topics as the Challenger disaster and the Yuca Dam and shows some nice ethics tidbits, like how various groups involved denied responsibility because of lack of authority ("We were just doing our little part") and how little things can have big effects. It also then parlays such large, obvious disasters into standard workplace ethics cases. Overall a nice little book.

    The book description:
    Introduction to Engineering Ethics provides the background for discussion of the basic issues in engineering ethics. Emphasis is given to the moral problems engineers face in the corporate setting. It places those issues within a philosophical framework, and it seeks to exhibit both their social importance and their intellectual challenge. The primary goal is to stimulate critical and responsible reflection on moral issues surrounding engineering practice and to provide the conceptual tools necessary for pursuing those issues.

    As per new ABET 2000 guidelines, more and more introductory engineering courses cover engineering ethics as part of their instruction. Students preparing to function within the engineering profession need to be introduced to the basic issues in engineering ethics. This book places those issues within a wider philosophical framework than has been customary in the past and aims to stimulate critical and responsible reflection on the moral issues surrounding engineering practice and to provide the conceptual tools necessary for pursuing those issues.
  • by Spyffe ( 32976 ) on Sunday April 06, 2003 @07:51PM (#5675472) Homepage

    It is a truism in ecology that it is good to preserve ecosystems from invaders. This argument has been used against genetically modified crops and introduced predators.

    Somewhere down the line, we are going to run into a situation where we have a completely new life form, engineered by humans, that is competing with existing species.

    Is humanity obligated to value existing organisms over new ones? Should scientists live in fear of upsetting the established "order of nature?" Why?

  • by AEton ( 654737 ) on Sunday April 06, 2003 @07:51PM (#5675480)
    is Google. "ethical dilemmas" technology [google.com] yields some good ones, and some false positives; here's an interesting paper [geocities.com].
    The first hit and one of my favorite questions, which I've debated to some length with friends in the past, is to what extent you can observe your workers' use of the Internet. After all, their traffic runs through your servers in a manner akin to a person shouting cell-phone conversations; but should you accept that those 8 hours a day will not all be spent filling TPS reports, or should you employ Draconian tactics to monitor users' porn-site usage [everything2.com]?
    Another one, less IT-related but also interesting, is the economic issue: if the application of certain expensive technology can save human lives, should it be used, to whom should it be offered, and who should have to pay?
    Perhaps one day SETI will present us with another dilemma: If you know a religion to be false, should you tell its followers? [everything2.com] Some would say this is already an issue [xenu.net] in the modern information-enabled world.
  • by Christopher Thomas ( 11717 ) on Sunday April 06, 2003 @07:54PM (#5675498)
    Here are a few of the obvious ones:
    • At what point do we call something a "person" for purposes of rights?

      Some time this century we'll likely be able to produce artificial intelligent creatures, be they machines or tailored organisms. Where do we draw the line between "person" and "non-person", and how do we assess this in practice?

    • What ethical/moral concerns, if any, are appropriate/mandatory to consider before creating an artificial person?

      If the previous point is a concern, this one will be too.

    • In a society where information may be freely exchanged anonymously and without cost, what are appropriate and inappropriate models of ownership and rights of control over things that are now considered owned information?

      E.g. works of art, algorithms/code, ideas/concepts, pictures of people, medical records. Justify from both a moral/ethical and a practical viewpoint.

    • How will or should the ability of anyone to undetectably conduct surveillance of anyone/any location affect privacy rights as they are currently known?

      We arguably have this _now_.


    All of these are going to have to be dealt with sooner rather than later, and none have cut-and-dried answers, no matter what position you take. Enjoy.
  • by jd142 ( 129673 ) on Sunday April 06, 2003 @07:55PM (#5675510) Homepage
    What code of ethics should system administrators operate under? Should there be an external code, agreed upon by some standards body, or should a sysadmin simply do whatever the policies of the company she works for dictate?

    Some examples:

    1) A person in management who is not the boss of employee Jane Doe asks the sysadmin for files in Jane's network space. The person asking is above Jane in the hierarchy, but not in the org chart path to Jane -- say, a manager in another department. Should the sysadmin just give the files to the manager, or ask that the request come from either the sysadmin's boss or from Jane's boss?

    2) Should a company that doesn't actively close ports used by file sharing programs be liable for employees who use those programs? The company provided the bandwidth, after all, and could easily have blocked the ports.

    3) Jane brings her computer to you as a professional repair person to fix a part. While fixing the computer, you browse through her files to make sure everything is working correctly. You notice some files have interesting names and discover that Jane is having an affair. Do you tell her husband? Should Jane be able to sue you for breach of confidentiality if you do?

    4) Should tech people be made mandatory reporters? School teachers, doctors, and counselors can be made mandatory reporters of child abuse. What if we aren't talking about kiddie porn, but the parents are drug dealers?
    What if it is "just" pot?

    5) What responsibility, if any, do users/resellers have for groundwater contamination by the dumping of old computers?

    6) You work for a nonprofit organization that must use Microsoft Access to work with some data (in other words, you can't just shout "Switch to open source alternatives!" and make the problem go away). You can't afford the 10 copies of Access you need, so you reason that since only 1 person will probably use it at a time, you can install 1 copy on 10 different computers. Is this moral? It is illegal, but the class wasn't about legalities, it was about morality. This is akin to the steal-a-loaf-of-bread-to-feed-a-starving-family question. Well, what if your family doesn't like bread? What if they like cigarettes? And what if instead of stealing them, someone were selling them at a price that was practically giving them away?

    And that's just a few off the top of my head.

    • System Admins should follow a formal code of ethics, just like any other profession. (i.e. accountants) Obviously, they do not always do so.

      One good start might be to look at existing codes of ethics from professional bodies, like SAGE [sage.org]. Here is theirs [sage.org]
    • How about the ACM Code of Ethics and Professional Conduct [acm.org] as a start?
  • by ralzod ( 537241 ) <ralzod@ya[ ].com ['hoo' in gap]> on Sunday April 06, 2003 @07:55PM (#5675511)
    This was a good one brought up on /. recently... The Ethics of Stealing Wireless Bandwidth? [slashdot.org]
  • Who buys? (Score:5, Funny)

    by meta-monkey ( 321000 ) on Sunday April 06, 2003 @07:56PM (#5675514) Journal
    If I, as a technology specialist, continue to field random tech support phone calls from friends, family, and friends of friends and family, what are the ethical rules surrounding the beer they rightfully owe me? Should said beer be handed over before or after services are rendered? What about an "all you can drink while you're here" policy for housecalls?

    These are important ethical dilemmas that need discussion and input from the academic community.
  • by Qender ( 318699 ) on Sunday April 06, 2003 @07:58PM (#5675521) Homepage Journal
    How about taxation of CD-Rs? A lot of people will use them to copy copyrighted music, but should everyone who buys a blank CD be forced to pay a few cents to the RIAA? Not to mention Sony, the corporation that produces the CD burners and CDs, then complains that people can use them to copy the music created by artists under Sony's label.

    What about the ethics of a hypothetical individual who has an idea for software that could save lives, perhaps a medical program, but is employed by a company that claims ownership of any ideas/inventions/patents/etc. this person produces during their employment? Is this person obligated to start work on the idea for someone else, or should they take the time to develop the idea on their own? The same could apply to people in the military. Do you wait four years to start saving lives, or do you let the military take all the profit?

    Speaking of the military, what are the ethics of creating machines that kill? Military weapons and all that. Computers have become an integral part of warfare.

    Ethically, if software has a bug/flaw in it, is the developer supposed to fix it? What if this software is depended on by other people in very sensitive ways? Is the developer allowed to fix this flaw only in a newer version that the developer charges for? Can you legitimately charge someone to fix the flaws in their own software? Why does this whole paragraph remind me of Microsoft over and over?

    Oh, and drop the "if robots came alive" thing. That's like teaching a philosophy class and asking, "What if Garfield came out of the newspaper and he was real?"
  • by fudgefactor7 ( 581449 ) on Sunday April 06, 2003 @07:58PM (#5675527)
    Here's a question: is any intelligence truly artificial?

    I mean, if a robot, toaster, or whatever has sentience, intelligence, and all the things that we think make us special, even if it was manufactured, is that intelligence truly "artificial" or is it "real"? If not, then at what point does it become real? When did it stop being just semi-programmed responses and Boolean algorithms and become something more? When do we say that you can dismantle that car, but you can't disassemble that robot (without its express permission)?
  • by swm ( 171547 ) <swmcd@world.std.com> on Sunday April 06, 2003 @08:01PM (#5675545) Homepage
    Suppose there's something (like heart disease) that afflicts 10% of the population. Faced with an uncertain future, Joe (and his 9 cohorts) buys insurance so that he can pay for treatment if he is the unlucky 10%.

    Now suppose that improving technology (like DNA sequencing) allows us to predict the future: Joe will get heart disease (and his 9 cohorts won't). Since the future is certain, the insurance market vanishes. No one will sell Joe insurance, because he is a known loss, and his 9 cohorts won't buy insurance, because they know that they won't need it.

    Now when Joe gets heart disease, he can't afford treatment. Do we as a society institute some kind of welfare system to pay for Joe's treatment? Or do we just leave him to die?
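    The arithmetic behind this can be sketched with a toy risk-pooling model (the 10% incidence is from the comment above; the $100,000 treatment cost is an assumed figure, purely for illustration):

    ```python
    # Toy model: 10 people, each with a 10% chance of needing a $100,000 treatment.
    TREATMENT_COST = 100_000  # hypothetical cost of treatment
    POPULATION = 10
    P_DISEASE = 0.10

    # Without prediction: everyone faces the same expected loss,
    # so an actuarially fair pooled premium covers the average cost.
    fair_premium = P_DISEASE * TREATMENT_COST   # $10,000 per person
    pool = fair_premium * POPULATION            # $100,000 -- exactly covers the one expected loss

    # With perfect prediction: risk is no longer uncertain, so it can't be shared.
    # Joe's "fair premium" is the full treatment cost; his cohorts' is zero.
    joe_premium = 1.0 * TREATMENT_COST          # $100,000 -- insurance is pointless for him
    cohort_premium = 0.0 * TREATMENT_COST       # $0 -- and they have no reason to buy in
    ```

    The point of the sketch: insurance works by averaging over uncertainty, and once the uncertainty is gone, the "fair" premium for each person converges to exactly their known loss, which is no insurance at all.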
  • by StArSkY ( 128453 ) on Sunday April 06, 2003 @08:02PM (#5675550) Homepage
    I had this discussion over a large quantity of red wine with my parents and a group of their friends. I have a degree in IT and work in the industry, and they see me as a guru because I know how to connect to the internet and fix their email and that kind of thing. The ethical issues they came up with were:

    1. When the only way to access a service is via technology (e.g. the internet), are we creating a class of people who are denied access to services because they don't have or understand the technology involved? This is particularly relevant to government services. Disclaimer: I don't want to buy into the PCs-in-libraries debate; this is about the ability to use the technology, not just have access to it.

    2. Why do computers use so much electricity? In terms of pollution, are computers to the 21st century what cars were to the 20th century: amazingly transforming society, but at what cost? This is not just the electricity, but the lack of recycling, the use of polluting products in manufacture, etc.

    3. Will a child be denied equal access to education because they don't have a PC at home?
  • by xyzzy ( 10685 ) on Sunday April 06, 2003 @08:03PM (#5675552) Homepage
    Is it more important to get technology such as the Internet into the hands of residents of the 3rd world, or to use more traditional approaches to increasing their welfare, such as food donation, education, transfer of farming tech, etc?
  • by Andrewkov ( 140579 ) on Sunday April 06, 2003 @08:09PM (#5675590)
    Hi Slashdot. I accepted a programming job paying in excess of $100,000. I start tomorrow but have never programmed before. Can you give me some tips to help me fake it? I really want this job, but I'm scared that my lack of programming skills will get me fired! Please help!
  • Then some (Score:3, Interesting)

    by OpenSourced ( 323149 ) on Sunday April 06, 2003 @08:13PM (#5675607) Journal

    If we can choose the sex of a baby, is it moral to do it? What about the color of the eyes?


    If we can know the probable lifespan of a person by looking at their DNA, should we allow an insurance policy based on it? Even if it's presented as a "discount" for sturdier people?


    If we can exterminate an entire species, are we morally allowed to do it? Well, we did it (almost) with the variola virus, though you could argue whether a virus is alive. We'll soon be able to do it with mosquitoes and the tsetse fly. Those are pests, but should they be erased from the face of the earth? What about rats?


    Some day in the not too distant future, every nation on earth will have an infectious pathogen with a 98% fatality rate and six weeks of incubation (of which three in a contagious state), plus a safe vaccine for its own population. The nuclear arms race will look positively sedate in comparison. Should we (whoever this "we" is; soon it will be everybody) strike first?

  • by swordgeek ( 112599 ) on Sunday April 06, 2003 @08:17PM (#5675638) Journal
    Go read Bill Joy's article, "Why the future doesn't need us. [wired.com]" Possibly the best discussion I've seen on the dangers of future (and present!) technology. Some points he brings up or alludes to:

    - Should we, as a society, curtail research on particular branches of science? Human cloning is the obvious one, but researching superbugs and genetically hand-made viruses might have enormous benefits--at a cost of extreme risk.

    - Where do we draw the line between human and (for lack of a better word) robot? Nanotech, implants, and genetic mods are all coming to meet at a common point, and that point is SOON!

    Some other interesting technological dilemmas come to mind. Should we sell or aid the development of technology to 'enemy' nations? How do we define enemies for this purpose? I happen to work for a company that's substantially responsible for getting much of the US military's aircraft into the air--am I partly responsible for the uses those aircraft are put to? The same question could be (and has been) asked of the Canadian CANDU nuclear reactors--safe, cheap, efficient, reliable, and the easiest way to produce weapons-grade material.

    This last one is actually a dilemma as old as the hills--dealing with the enemy--but technology is becoming an important factor because it's drawing the world together. (Not to mention the HUGE role technology plays in any conflict these days.)

    Other issues: technology eats power, consumes resources, and produces waste--do we have a moral responsibility to drive as much technological innovation as possible toward cleaning up some of our messes?

    The media is now able to modify live broadcasts--how do we control that behaviour? Pasting the station's own advertising over footage of billboards is pretty reprehensible, but what about when they start adding nonexistent people to war scenes?

    But the real question may boil down to this simple one: How does technology actually change any of our present moral or ethical states? Does technology actually change our ethics, and should it?

  • Here's a few (Score:5, Interesting)

    by Hard_Code ( 49548 ) on Sunday April 06, 2003 @08:18PM (#5675641)
    1) Is technological progress inherently good? Who does it benefit and who does it hurt (if any)? If technological progress is inherently good, are scientists ethically or morally responsible for their inventions? Are consumers responsible for their use of technology?

    2) We are seeing that technology is making the world increasingly dangerous in the form of "asymmetric threats"--individual empowerment, through technology, that cannot be foreseen or prevented (briefcase bombs, artificially engineered diseases, computer viruses, etc.). Is this a threat to human interdependence, or an inevitable feature of it?

    3) Technology is making the world a lot smaller, and eroding private space and information. Will the ability of people to be in constant contact with each other, and perhaps in constant surveillance of each other, be a good thing or a bad thing? How will this affect human society and culture?

    4) Lastly, are we asking these questions too late? Will humans ever be able to control the path of discovery and uses of technology? If not, should we?
  • by pvera ( 250260 ) <pedro.vera@gmail.com> on Sunday April 06, 2003 @08:23PM (#5675667) Homepage Journal
    Here's a common ethical dilemma for us programmers: a pointy-haired boss (PHB) left unchecked:

    1. Allowing projects to start without defined deliverables.
    2. Allowing time-and-materials (TMA) projects to run wild with no schedule, since the company will eventually get paid regardless of the outcome.
    3. Allowing marketroids to lie to the customers and public about your company's capabilities in the hope these can be acquired on the run if a project is signed with a big enough down payment.
    4. Forcing people to keep billing on a project when it is a TMA with a "not to exceed" cost. If the cap is $200,000 and so far you have billed $175,000, you will be forced to find something to keep you busy until you hit the $200K or else.
    5. Allowing customers to sign on to a project without the buy-in of their technical people. Case in point: in a previous job, my company got a huge defense contractor (127,000 desktop users) to sign on to an intranet project that required IE 5 or Netscape 6. Small problem: the standard for this monstrous organization was Netscape 4.7, and overseeing the upgrade of 127,000 desktops to Netscape 6 or IE 5 would have cost twice as much as our project's budget. This could have been avoided had these people checked with their IT folks.

    My fix was simple: I left. I got to see the company shoot itself in the foot and go through layoff rounds every 90 days. The day I was going to be handed my pink slip, I was interviewing across town. That afternoon I was told I had been spared at the last second. Two days later I was offered the job across town and I jumped ship. I still program, but only internally; my customers are my own employers, so it is in their best interest not to lie to themselves!

    We laid off a lot of good people at that previous company, and most of them by now have better jobs elsewhere. The few that are still working there are living through pure hell every day of the week.
  • Self-Defense (Score:5, Interesting)

    by DarkZero ( 516460 ) on Sunday April 06, 2003 @08:44PM (#5675744)
    Here's one that often comes up in computer security discussions:

    DDoS worms, rather than directly attacking other computers from the worm creator's computer, take over other computers and then use them to perform an attack. If you're the one targeted by one of these attacks, do you have the right to defend yourself? Is it right for you to hack into an innocent person's computer because their technological ignorance is actively causing you harm? Would it be preferable for you and the people who depend on your network to just sit there and accept the attack without any real defense? And if you have the skill not to screw it up (probably a rare skill, but still), would it be right for someone to create an "anti-worm" that disinfects computers that have become unwitting DDoS zombies?

    Computer security is a field that is absolutely soaked in real-life analogies, but this situation doesn't have one that anyone would ever encounter in their lives. "If a hypnotized/possessed person tried to kill you, would it be moral to hurt them in self-defense?" isn't an analogy that provokes an instant, pre-prepared answer.
  • Genetics and bionics (Score:3, Informative)

    by div_2n ( 525075 ) on Sunday April 06, 2003 @09:03PM (#5675825)
    Genetically Modified Organisms (GMOs)

    Is it ethical for us to push the envelope of genetics and create our own made-to-order creatures? It might seem like an easy "no" or even "yes," but it isn't.

    -Imagine if scientists discovered they could splice a few genes to create a special breed of monkey that would live its life in pain but would offer guaranteed universal matches for human organ transplants. Is that ethical?

    Bionics

    The AbioCor heart has shown that we are well on our way to having artificial organs. Is this ethical? The first inclination might be yes. I am envisioning extending people's lives by an extra 50 years or so.

    This might sound great, but if all things were equal and everyone could reap the benefits, it could cause serious population problems, as people would live MUCH longer.

    Besides, this kind of technology will probably only really be available to those who can afford it, which brings up a whole other ethical issue.
  • by GamezCore.com ( 631162 ) on Sunday April 06, 2003 @09:30PM (#5675949) Homepage
    You know, this post really managed to get me about as mad as any post I have ever seen at /.

    I am a student at Penn State University, in the IST program, and I have spent untold amounts of time and my hard-earned money to "learn" from instructors who have no idea what they are even teaching! Maybe if this person doesn't keep up with technology... HE SHOULDN'T BE TEACHING THE DAMN CLASS! Talk about ethics--this post is amazingly frustrating to me.

    Doesn't anyone else see the problem here? Students should be learning about this topic from a professor who is schooled in technology and has a good understanding of ethics! Students are now going to be wasting their time in a class where the professor doesn't even know what the prevalent issues are to cover!

    College tuitions have skyrocketed and will continue to do so... however, we as students continue to receive a rapidly diminishing quality of instruction. My only wish is that no one would help this moron.

  • by judd ( 3212 ) on Sunday April 06, 2003 @10:02PM (#5676155) Homepage
    I was on the help desk of a university. A staff member sent an email to his lover (i.e., not his wife). Through a typo, it went to a third person's mailbox. He rang and asked if I could delete the message.

    I did. Rationale: the 3rd party hadn't read it, and the putative adulterer's affairs weren't my business. One of my colleagues was adamant that sysadmins should NEVER delete mail from a user mailbox, that it violated that user's privacy, and that the mail after all was correctly addressed.

    Ah, the difference between Simon and Simone...
  • Overseas outsourcing (Score:4, Interesting)

    by pongo000 ( 97357 ) on Sunday April 06, 2003 @10:06PM (#5676180)
    Is it ethical for an American-owned and American-operated company to outsource IT jobs overseas in order to take advantage of lower wages, thereby failing to create jobs stateside for IT workers who demand a higher salary?

    This question addresses whether the practice is ethical, rather than symptomatic of a capitalist, employed-at-will society.
  • How about... (Score:5, Insightful)

    by Lurgen ( 563428 ) on Sunday April 06, 2003 @10:38PM (#5676328) Journal
    How about the responsibility of our educators to actually know their material? Surely nobody thinks it appropriate for a college lecturer to be teaching a subject about which he quite obviously knows nothing?

    And yes, I realise that most college lecturers are borderline useless, but why encourage it?

    My advice to your "friend" would be simple - bugger off and learn your material. When you know more than your students, THEN you can consider teaching.


  • Speaking ethically, not legally, how much can we borrow from the ideas of others to develop new ideas? For instance, all scientific discovery that I'm aware of before this century depended in large part on working from the ideas of others. Now the notion of IP has provided an incentive to stop sharing ideas--but will this hurt human scientific development?

    To exaggerate the issue--if you develop a cure for cancer, but its ideas depend on the work of another scientist, should you develop the cure? What if the scientist prohibits access to the information for personal reasons? Along those lines, how do you determine valuation? I.e., if one is to be compensated, does the scientist with the original idea get more compensation than the scientist who developed the idea? Why? What proportion?
  • Cached Articles? (Score:3, Interesting)

    by SaturnTim ( 445813 ) on Sunday April 06, 2003 @11:46PM (#5676631) Homepage

    How about the "Should Slashdot cache articles?" Is it more ethical to mirror a website without permission, or to send a ton of traffic to their site costing them money?

    --nw
  • by chudnall ( 514856 ) on Sunday April 06, 2003 @11:47PM (#5676634) Homepage Journal
    "he doesn't keep up with technology news, so he's not sure what the most relevant dilemmas are."

    Doesn't this about sum up the state of our education system today?

  • Ethics? (Score:3, Funny)

    by wcdw ( 179126 ) on Sunday April 06, 2003 @11:58PM (#5676681) Homepage
    How about the ethics of this person teaching a class for which he is admittedly not qualified? Or the ethics of using /. to compile a course syllabus?

    (The _wisdom_ of the latter is beyond the scope of this comment!)
  • by Courageous ( 228506 ) on Sunday April 06, 2003 @11:59PM (#5676683)

    For example, if it's just a "minor offense" to spray-paint graffiti on a bridge, why can you get 10 years in prison for defacing a website? Seems a bit disproportionate.

    C//
  • To be a cog... (Score:3, Interesting)

    by Ian Bicking ( 980 ) <ianb@nOspaM.colorstudy.com> on Monday April 07, 2003 @12:09AM (#5676719) Homepage
    One day I was skimming someone's college book on ethical issues in fashion. I came upon a sidebar talking about El Salvador. The sweatshop owners were very excited about technology, because it allowed them to keep a database of union organizers, which they shared with each other. If anyone was caught trying to organize, they could be thoroughly blacklisted.

    Now, blacklisting isn't a new idea, and it doesn't require technology. But in a way it does... blacklisting, to be effective, is a bureaucratic process, and bureaucracy is very much enabled by technology, from the abacus on up. A large amount of technology continues to be used for bureaucracy (probably a considerable majority of computer technology).

    Bureaucracy isn't all bad... we often don't notice all the effective bureaucracy around us.

    And what of the morality of database manufacturers who are creating something that happens to be used for immoral purposes? I don't know, but I will argue strongly that they are not entirely without culpability. The greatest evils ever done were done by people who did not feel themselves responsible, supported by people who did not feel themselves responsible. I believe the ends justify the means, but I also believe the ends can be a condemnation of the means, no matter how benign or neutral they seemed at the time. Anyway, certainly a point for discussion.

    A very good book on the moral implications of technology is The Existential Pleasures of Engineering [amazon.com]. It's not about engineering particularly, but about technology (and a reaction against anti-technologists), building infrastructure, and very much about the moral responsibilities and questions of being someone who designs and builds the things that surround us, without being able to make many key decisions about those things. It applies very well to computer programmers.

  • What happens if a computer passes the Turing test? Furthermore, what happens if it can pass an audio (speech synthesis/voice recognition) Turing test?

    If a computer can fool you into thinking it is alive, which is the basic premise of the Turing test, and then it makes the argument that turning it off or disassembling it is like killing it, well, where does that place us?

    Consider this: many people consider the basic difference between people and machines (or animals, as some would argue) to be self-awareness. How do you define self-awareness?

    I am sure that PETA people would say that killing anything self-aware is wrong.
    Well...?
  • by PotatoHead ( 12771 ) <doug.opengeek@org> on Monday April 07, 2003 @12:45AM (#5676866) Homepage Journal
    E.g. cracking (as opposed to hacking), picking locks, picking pockets, building bombs...

    Knowing how people go about cracking into systems could be harmful if one does it, and it could be useful when building a defence against said crackers.

    When you learn how to pick locks, you gain an understanding of what makes a good lock and what doesn't. Nice to know when buying locks...

    Picking pockets? Walking through the airport and getting bumped? No big deal, right? Unless you know how these people work.

    Building bombs? Surely this is a terrorist-only thing, right? How about knowing what is a bomb and what is not? What if you are in a position to disarm one?

    Crypto. Same as locks, really. How does one know what is going to be effective and what is not? The DVD guys sure didn't. (Heh heh.) For that matter, what about using that crypto knowledge to solve a simple problem like playing a DVD under Linux? Legal? Not in many places. Moral and ethical? I would say yes, provided you own the thing and have a clear right to use it.

    So is the knowledge itself bad? What about the teaching and access? Should everyone be able to know and decide for themselves or not?

    Each of these things is under attack right now. Why?

  • by InsaneCreator ( 209742 ) on Monday April 07, 2003 @12:54AM (#5676903)
    unfortunately, he doesn't keep up with technology news, so he's not sure what the most relevant dilemmas are.

    This reminds me of what happened at the beginning of this semester. The professor walked into the classroom and asked us what subject he was supposed to teach us! And the first thing he said after finding out it was "professional ethics" was: "Oh... that's not really my area..."
  • by ahodgkinson ( 662233 ) on Monday April 07, 2003 @04:26AM (#5677475) Homepage Journal
    I don't actually think that an up-to-date knowledge of technology is required to teach ethics in engineering and technology, other than perhaps as an aid when presenting examples. Most technological ethical dilemmas can be reduced to fairly simple (simple to describe, not necessarily simple to resolve) moral dilemmas.

    An introductory course should not focus on particular technological issues, but rather on:

    • The importance of taking responsibility for ethical issues.
    • Recognizing an ethical dilemma.
    • Strategies for analyzing ethical issues and making a moral choice.
    • Techniques for implementing a moral choice, particularly in the face of opposition.
    • Practicality of choices. Some moral choices are extremely impractical or expensive. Can we afford them?

    The actual technology is secondary, and the person faced with the ethical dilemma will probably know more about the technology than you anyway.

    Off the top of my head, I would present the following, incomplete, list of dilemma categories (an exercise for the class would be to have the students come up with the list themselves, perhaps starting with examples taken from the press and movies):

    • Harmful technologies - To what extent should you work on harmful and destructive technologies? Especially harmful technologies that also have beneficial uses (e.g. the use of radiation in medicine)? What is the chain of responsibility for the initial research, deployment, and control against misuse?
    • Whistleblowing - When a corporation or government is doing something unethical, what steps can, should, and should not be taken by an individual to correct the problem? To what extent can rules and laws be broken in an attempt to serve the greater good?
    • Responsibility of individuals vs. groups - Who ultimately has responsibility for group decisions on ethical issues? The group itself, the individual members, the group's leader? How much individual responsibility do group members have when bad choices are made by the group? To what extent should you take individual responsibility for actions carried out by a group?
    • Privacy - To what extent do we allow or prohibit the use of technologies that expose private information about individuals and groups?
    • Environment - To what extent must we protect our natural environment, particularly when faced with mankind's needs?
    • Technological divide - What is our responsibility to those who do not have access to modern technology? Must everyone have equal access to a minimum level of technology? Is it right to offer services only to those who have some minimum level of training and technology? (Hint: it's not as easy as you think--what about services to illiterates?)
    • Equality vs. scientific advances - What is society's responsibility to the equality of its members in the face of scientific advances that prove inequality? E.g., what happens when genetic testing shows that some people will be stupid or will die early from a disease? Can they be denied schooling, insurance, or other resources?

    One presumes the goal of the course is to encourage ethical behaviour and decisions, rather than merely recognizing ethical dilemmas and using public relations to justify the most cost-effective solution, regardless of the moral issues.

    With that in mind the following meta-issues should be discussed:

    • Advocacy - Techniques for promoting corporate, government, and public awareness of the importance of moral solutions to ethical dilemmas.
    • Individuals vs. powerful groups - Recognizing the difficulty and risk involved for an individual who takes an unpopular, though moral, stand.
  • Misspelling (Score:3, Funny)

    by stevenp ( 610846 ) on Monday April 07, 2003 @05:07AM (#5677569)
    >> Note that I am looking for ethical dilemmas, e.g. 'Is Activity X moral?'

    I would recommend 'Is ActiveX moral?'
