
When Should a Website Edit Its Users? 159

rw2 asks: "Can a weblog edit users' comments without opening itself up to liability in case of a slander suit? I run a political weblog and have a policy similar to Slashdot's, in that the comments posted belong to their owners. I'm worried about instituting something like lameness filters, since it seems that as soon as you start regulating what your users post, you have agreed to edit them for other reasons as well. Can someone point me to a good resource on issues like this? Those of us who aren't owned by publicly traded companies are better off avoiding potential problems than hiring lawyers to help us wiggle out later." Honestly, this greatly depends on the type of weblog you run and the community behind it. I don't think a one-answer-suits-all-sites solution exists, particularly because what may be inappropriate for one site may be more than appropriate for another. What say you?
This discussion has been archived. No new comments can be posted.

  • Moderation (Score:3, Redundant)

    by The Gardener ( 519078 ) on Saturday December 01, 2001 @11:36AM (#2641059) Homepage

    Moderation is not the same as editing. IOW, delete the lame crap, but don't alter any posts. Lots of places delete inappropriate stuff; no big deal.

    The Gardener

    • Surely if you are implementing an automated filter (one that removes posts containing abusive words, for instance) then you aren't moderating people's comments, since there is no thought process behind it...
      • Of course there was thought behind it... it's just timelessly encoded so that a computer can carry out your policy.
        • OK... I guess I should qualify my statement... there is no one making a value judgement about the relative merits/worth of one comment over another and then deciding that one can stay whilst the other gets removed. There is just a set of strict rules that are enforced without making a value judgement about the comment; if there is swearing, whatever the context, then it gets removed by the system.
          • Aren't the "strict rules" a value judgement at the time they were created?

              • Yes they are. But they apply to everyone, so no one person could ever claim that they are being dealt with unfairly. I'm not making a judgement on a per-post basis.
              • Please don't put words in my mouth. All I said was that a value judgement occurs at some point during the creation of such a rule.

                It may seem like an insignificant point, but it is important nonetheless.

                • I'm sorry... I didn't intend to put words into your mouth. I was actually agreeing with you. I just wanted to differentiate between per-post moderation and automatically moderating people's comments.
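To make the distinction in this subthread concrete, here is a minimal sketch (purely hypothetical; the word list and length limit are invented) of the kind of automated, judgement-free filter being described: every post passes through the same fixed rules, with no human weighing one comment against another.

```python
# Hypothetical rule-based filter: every post is run through the same
# fixed, automated checks; no human weighs one comment against another.
# The banned-word list and length limit below are invented placeholders.

BANNED_WORDS = {"swearword1", "swearword2"}

def passes_filter(post: str) -> bool:
    """Return True if the post survives every automated rule."""
    words = post.lower().split()
    if any(w in BANNED_WORDS for w in words):
        return False  # contains a banned word, whatever the context
    if len(post) > 10_000:
        return False  # unreasonably long post
    return True

assert passes_filter("a perfectly polite comment")
assert not passes_filter("this contains swearword1 somewhere")
```

The point of the thread survives in the code: the value judgement lives in the rule set, chosen once up front, not in any per-post decision.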
    • Re:Moderation (Score:5, Interesting)

      by verbatim ( 18390 ) on Saturday December 01, 2001 @11:56AM (#2641110) Homepage
      Define "inappropriate stuff". If the process is automated, then there is nothing you can do. However, if you (or any moderator) have subjective privilege over what is and what is not appropriate, then the line is blurred.

      There is obviously useless stuff, "f1r57 p057", links to inappropriate websites, and the sort. But if it isn't an automated process, then subjectivity can interfere with moderation.

      What happens when someone simply pisses you off? Do you abuse your power and delete their post? What if the users start to withhold posting out of fear of being "edited" or censored?

      Perhaps write a clearly defined policy regarding what is and what is not acceptable. Adhere to that policy very strictly and make sure everyone is completely aware of it. Then, when some big-wig company asks you to censor/change something, just wave your policy at them.

      I guess.
      • Re:Moderation (Score:3, Interesting)

        [...] But if it isn't an automated process, then subjectivity can interfere with moderation.

        What happens when someone simply pisses you off? Do you abuse your power and delete their post? What if the users start to withhold posting out of fear of being "edited" or censored?

        I fell into this trap myself. I had no moderation for two years, then all of a sudden, some jerk kid started posting things ranging from racial slurs to out-and-out attacks on what others wrote. My "regular" participants started writing to me off-list, complaining and wondering what was going on.

        I posted a request to keep it clean. That only sparked a bunch of personal attacks on my character. So, I started deleting the moron's more offensive posts. When that didn't deter him, I started deleting some of his less offensive posts to show him that I meant it. Some of those posts were pretty good, too, showing some insight in between the insults. Looking back, I regret deleting some of them, but...

        I've now switched to a moderation system of approve or throw out. I've calmed down quite a bit since then and don't throw out anything slightly insulting any more - if there is a good argument behind it. If it isn't adding anything, like "You don't know what you're talking about, idiot," then it's gone.

        For the first couple of weeks after I started moderating, the fool tried posting a great deal, with a lot of insults toward me. He seems to have finally gotten the idea and now tries once every week or two.

        Deciding to moderate was a very hard decision. I didn't want to censor anybody, and I still don't. But some of the other readers made a distinction between "free speech" and appropriate behavior. Free speech is vital when it comes to being able to talk about a governing body. However, the example one person gave of a place where free speech is not absolute was somebody coming into my home and verbally abusing me. To do so would be begging to be kicked out.

        Nonetheless, I tried to be reasonable with him, but he obviously doesn't bow to any kind of authority whatsoever. I would have liked to have a dialog with him off-line, but since I don't require valid e-mail addresses, and he didn't supply one, I was unable to contact him other than by writing articles "to" him.

        Also, right from the start he used anonymisers and/or hacked into cable modems. That got me very interested in securing my box as best I could. I shut down FTP (only one person was using it), and pretty much everything else in /etc/inetd.conf was disabled from the start. SSH 1 was also disabled.

        Other than the usual MS CodeRed and MS Nimda attacks, there doesn't appear to be anything out of the ordinary, so I could let out a sigh of relief that he's just a kid who knows how to use a limited range of tools (anonymisers to cause havoc), and not one who understands how things work (like a cracker). Nonetheless, my paranoia level has risen above the black-helicopter level since then.

        What did I learn? Don't bother trying to reason with the morons. Just moderate them away without acknowledging their existence. They seem to live to insult others and watch their reactions. If there are no reactions (other than their obnoxious posts disappearing), they should eventually go away. (I'm hoping so, anyway.)

      • I don't normally post on /., but I think most of you on this board are missing the point.

        It appears as though it's more of a legal question. And you're all talking "fairness".

        He's asking if he's more liable if he starts deleting/editing posts than if he doesn't.
        • The point is that an _automated_ system is not subject to the same rules as a _manual_ system.

          In an _automated_ system, you are a carrier of information and you have no control over it. Because you have no control over the content I believe that you are not liable for it.

          In a system where you start editing/deleting/censoring posts, you essentially endorse the ones that remain and are, therefore, liable for the content.

          According to this [], there is now precedent for a "public forum" rule.

          Under the DMCA section 512,

          "§ 512. Limitations on liability relating to material online

          (a) Transitory digital network communications. A service provider shall not be liable for monetary relief, or, except as provided in subsection (j), for injunctive or other equitable relief, for infringement of copyright by reason of the provider's transmitting, routing, or providing connections for, material through a system or network controlled or operated by or for the service provider, or by reason of the intermediate and transient storage of that material in the course of such transmitting, routing, or providing connections, if:

          (1) the transmission of the material was initiated by or at the direction of a person other than the service provider;

          (2) the transmission, routing, provision of connections, or storage is carried out through an automatic technical process without selection of the material by the service provider;

          (3) the service provider does not select the recipients of the material except as an automatic response to the request of another person;

          (4) no copy of the material made by the service provider in the course of such intermediate or transient storage is maintained on the system or network in a manner ordinarily accessible to anyone other than anticipated recipients, and no such copy is maintained on the system or network in a manner ordinarily accessible to such anticipated recipients for a longer period than is reasonably necessary for the transmission, routing, or provision of connections; and

          (5) the material is transmitted through the system or network without modification of its content."

          In other words: as long as you don't modify, censor, alter, redirect, or otherwise tamper with the content you are a public forum and are safe from lawsuits (well, they can sue, but you'll have a legal defense).
  • Moderation constantly pushing stuff to the bottom, plus a disclaimer stating that the content is that of the general public, not of the site owner, should solve most issues.

    However, even slashdot has removed stories under threat of legal action I believe. It's just a matter of cost. Unfortunate, but true.

  • Do you Yahoo!?! (Score:2, Informative)

    by satanami69 ( 209636 )
    You should; they have a pretty good template to start from here []

    To Quote:
    "Messages that are unlawful, harmful, threatening, abusive, harassing, tortious, defamatory, vulgar, obscene, libelous, invasive of another's privacy, hateful, or racially, ethnically or otherwise objectionable may be removed and may result in the loss of your Yahoo! ID (including e-mail). Please do not post any private information unless you want it to be available publicly."
    • Messages that are unlawful, harmful, threatening, abusive, harassing, tortious, defamatory, vulgar, obscene, libelous, invasive of another's privacy, hateful, or racially, ethnically or otherwise objectionable may be removed

      That's exactly the problem - when you let one person (or several) define what is "vulgar, obscene, etc." you start to do what the submitter didn't want to do - censor. It's been said before, but who decides the specific criteria that make something vulgar? (As an aside, I was under the impression that vulgar means "common, as in boorish", not obscene.)

      A few of the criteria they list are so vague as to allow almost any message to be removed. Hateful towards what? Half the posts on /. are hateful (MS, RIAA, MPAA, etc. suck - would these get removed?), as for "otherwise objectionable", hell, I'll bet every single post is objectionable to at least one person somewhere.
  • Freedom Of Speech (Score:4, Insightful)

    by Klein Pretzel ( 538395 ) on Saturday December 01, 2001 @11:48AM (#2641085) Homepage
    Freedom of speech is mostly guaranteed in the US Constitution. However, I do not have to supply the forum for you to practice that speech. If I run a website or any other media forum (newspaper, etc), then I have the right to say what goes into that forum.

    If I wrote a book, I'd probably have to go through dozens of publishers before being accepted. Certainly they're not forced to publish my work. Why should any other medium be any different?
    • by Stonehand ( 71085 )
      It's not so much a freedom-of-speech issue as a liability issue; by removing some posts, is the site operator implicitly condoning the others, and does that mean that he bears responsibility for it? After all, removing posts, even if it's automatically done, is basically taking an active role...

      In particular, I'd worry about, say, harassment law (maintaining a "hostile environment" -- remember that if an employer fails to act on the idiotic actions of his employees, the employer itself may be held liable).
    • I agree with this post. If you assert that the posters own their comments, then let that be your policy theme. You own the website, so you have authority over what SITS on your website. Their rights end where yours begin.

      If users say they wish to remove their comments from your website then remove them... if they want their comments delivered to them then charge them.

      Quotes of a user's comments should be considered the property of the person incorporating the quote into their post; the original poster has granted the right to quote by virtue of the fact that they posted in your forum.

      Make sure you post the Terms and Conditions (or whatever you want to call them) for all to see prior to making decisions based on those T&C's. Make sure the users know that the Terms and Conditions are subject to change without notice. etc... etc... etc...

      I hate RedTape(tm) but in a litigious (sp?) world, how can you get around it?
  • by leo.p ( 83075 ) on Saturday December 01, 2001 @11:49AM (#2641087)
    Practice editorial censorship on idiot comments made by g**ks with insufferable intellectual pretensions. Otherwise you're just going to have a lot of shrill cranks drowning intelligent commentary in their din. I mean, look what happened to slashdot when Bruce Perens was allowed to create an account.

    You dont want that.
    • You know, there are a lot of ways you could interpret the above comment. Funny is not, or at least should not be, one of them. Intelligent? Yes. Insightful? Yes. Informative? Yes. Flamebait? Possibly.

      But we all know how much mods like to think they "get it" when someone slams them.

  • IANAL (Score:5, Interesting)

    by Snowfox ( 34467 ) <(ten.xofwons) (ta) (xofwons)> on Saturday December 01, 2001 @11:49AM (#2641089) Homepage
    Really, Slash is a funny place to go for this question. You really want to talk to a lawyer.

    That said, if memory serves you lose your status as the equivalent of a common carrier and become responsible for the content as soon as you perform subjective modification or exclusion.

    Dropping messages which violate an established set of rules is one thing, as was recently upheld in a lawsuit against Yahoo. But if memory serves, subjectively editing and dropping posts is what made a slander lawsuit against Prodigy successful. By having selectively removed posts, Prodigy was, in effect, endorsing the remainder.

    Google should be your friend on both cases - the Prodigy case made a fairly big buzz in its time, and I have to think there must have been a dozen more since.

    • dropping posts is what made a slander lawsuit against Prodigy successful

      Likewise, the CompuServe suit of the same era ended in a CompuServe victory because, unlike Prodigy, they didn't guarantee the quality of the environment by filtering content.

      Course, I'm still waiting for the proxy/filter lawsuits to start cropping up. "My kid's school allowed them to see pr0n, so I'm suing for $250 million" stuff...

      Surgeon General's Warning: "The Internet contains porn, violence, bad language, lousy spelling, and unreliable service providers, and may cause addiction or offense. Turn off your PC and stick to NPR if you may be offended."
    • To put it in geek speak, the difference appears to be syntactic censorship vs. semantic. The lameness filters check for illegal syntax (easy for a computer to do, and irrelevant to libel suits).

      Slashdot, on the other hand, steadfastly refuses to censor on the basis of semantics (meaning), which is where libel lies.

      Besides -- the lameness filter (as I've encountered it) is pretty lame, anyways.
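A rough illustration of that syntactic/semantic split, as a hypothetical Python sketch: these checks look only at the form of a post (caps ratio, repeated characters), never at its meaning, and the thresholds here are invented for the example.

```python
# Hypothetical "syntactic" lameness checks in the spirit of the parent
# comment: they inspect only the form of a post, not what it says.
# The 80% caps ratio and 10-character run are made-up thresholds.

def is_lame(post: str) -> bool:
    """Flag posts on purely syntactic grounds."""
    letters = [c for c in post if c.isalpha()]
    if len(letters) > 20:
        caps_ratio = sum(c.isupper() for c in letters) / len(letters)
        if caps_ratio > 0.8:
            return True  # SHOUTING: mostly upper-case letters
    for i in range(len(post) - 9):
        if post[i:i + 10] == post[i] * 10:
            return True  # ten identical characters in a row
    return False

assert is_lame("AAAAAAAAAAAAAAAAAAAAAAAAA FIRST POST")
assert not is_lame("A reasonable, mixed-case comment.")
```

Nothing here evaluates meaning, which is the thread's point: a libelous sentence in normal mixed case sails straight through a filter like this.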

  • erm... (Score:2, Interesting)

    i'm a moderator for a somewhat large website,
    and our rule is NEVER edit a post, only delete it... i've been told it's against the DMCA ... i dunno if it's fully true though, cause IANAL.
  • Obviously, submitted stories, such as on Slash, can be edited, if nothing else but for an occasional typo, etc.

    User comments should not be touched, and in fact Slash does not permit this. You would have to access the MySQL files and edit the comments directly if you wanted to do that. This can be inconvenient.

    That being said, posters should be responsible for their own comments. If they post something against the site policy, or illegal, then the site should retain the option to delete the comments.

    I happen to like the moderation system, because otherwise you can devolve into a sea of moronic cluelessness. It will do until something else comes along. Things like the open publishing system seen at Indy Media [] are great, but they do not scale well.

  • by ajuda ( 124386 ) on Saturday December 01, 2001 @11:54AM (#2641106)
    Click Here [] to read about the time when slashdot was forced to delete a post about scientology. It's interesting and relates to your question.

    • The threat of lawsuits really does challenge a medium's freedom of speech.

      I worked for an independent University newspaper that published a column that made some questionable comments about the current Miss America. By definition, the column could not be considered libel. Proving it in the US Court system, however, would cost more money than the newspaper could make in several years.

      In my specific case, the columnist was no longer allowed to contribute to the newspaper, in order to circumvent current and future problems. As an editor at that paper, I loathed knowing that a voice -- one that often aggressively challenged popular thought -- was silenced.

      According to others in the media business, this kind of censorship by threatened legal action happens all the time. Sad but true.
  • by pdqlamb ( 10952 ) on Saturday December 01, 2001 @11:55AM (#2641107)
    You might want to institute something like the slashdot system, and let your users do the moderating. IIRC, AOL was held liable in a slander suit because they (AOL) were moderating users' posts. The act of moderating, to our brilliant judiciary (hack, spit!), is equivalent to your agreeing with, and even stating yourself, everything that's left on your message board. Let the slashdotters push the crap to the bottom of the heap; you're not exercising editorial control then.

    It's good that you're thinking about this now, because I suspect political arenas would attract more lawyers and highly inflammatory idiots than most. That combination is asking for lawsuits, IMHO.

  • Lameness Filter? (Score:5, Interesting)

    by mESSDan ( 302670 ) on Saturday December 01, 2001 @11:55AM (#2641108) Homepage
    I'm worried about instituting something like lameness filters as it seems like as soon as you start regulating what your users post you have agreed to edit them for other reasons as well.
    Slashdot has a lameness filter, and I don't think it indicates that there has been other editing going on. Granted, it can hit at inopportune times, like when you're just trying to karma-whore and post a quick link to this or that, but it also does do some good, mainly by keeping the goatsecx man away.

    Too bad they don't have a lameness filter on the submission box though, that would theoretically keep most Jon Katz articles from ever making the front page ;)

    The potential upside, in reference to your question, is that since the lameness filter happens before the comment becomes a post and part of the static page (at least here on Slashdot; I'm not sure on your site, since I don't have an account there and you can't post unless you do), you probably won't be sued unless it's by someone who's going to sue you anyway.

    Just my 2 cents.

  • by redzebra ( 238754 ) on Saturday December 01, 2001 @12:09PM (#2641132)
    You can publish all the user posts, but you're not obliged to publish those you don't want to. From that point of view, web owners are no different from normal publishers. All risks are avoided if you stick to the publishing part, since you only publish what you want to. Messing with people's posts will be accepted nowhere. Deletion is not a problem, since it's surely your right not to publish things you don't want. For the rest, your visitors will decide whether they feel you do an honest job. If you don't, they won't come back :-)
  • by alphaque ( 51831 ) <[moc.euqahpla] [ta] [hsenid]> on Saturday December 01, 2001 @12:09PM (#2641134) Homepage
    The way I read it, and IANAL, is that if you're not editing the text of the posts but are displaying them verbatim, then you cannot be held responsible for them. You're just a carrier of the message.

    Filtering out whole posts based on some ranking (think /. moderation) is just as alright, as it's a method of ranking entire posts, not editing within a particular post.

    However, if you are in the habit of editing posts, or of posting snippets of them, then you are exerting editorial control and perhaps are liable.

    Usually, as long as the posting mechanism is automated, without passing through a human being, you can claim to be a common carrier. Newspapers and dead-tree editions don't have this benefit, as they pick and choose which stories they carry because they have limited print space. An online forum doesn't do this, and accepts everything.

    Once again, IANAL, so take all of this with a pinch of salt.
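The parent's point about filtering whole posts by rank, Slashdot-style, might be sketched like this (hypothetical code; the posts, scores, and threshold are made up): posts below a reader's threshold are hidden, but no post's text is ever altered.

```python
# Hypothetical sketch of threshold-based moderation as described above:
# whole posts are shown or hidden by score; their text is never edited.
# The example posts and scores below are invented for illustration.

posts = [
    {"id": 1, "text": "Insightful point about liability.", "score": 5},
    {"id": 2, "text": "f1r57 p057", "score": -1},
    {"id": 3, "text": "A mildly useful aside.", "score": 1},
]

def visible(posts, threshold):
    """Return the posts at or above the reader's chosen threshold."""
    # Note: we only select whole posts; we never touch their contents.
    return [p for p in posts if p["score"] >= threshold]

shown = visible(posts, threshold=1)
assert [p["id"] for p in shown] == [1, 3]
```

Each reader picks their own threshold, so the site operator never makes a per-post call at display time.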

    • ...is that if you're not editing the text of the posts but are displaying them verbatim, then you cannot be held responsible for them. You're just a carrier of the message.

      Technically, this is not true. IAAL, and in most jurisdictions, it comes down to whether you had notice that material is infringing, contains trade secrets, is defamatory, etc... the usual stuff that invokes third-party liability.

      A newspaper retains a bunch of lawyers exactly for the fact that they DO have liability as a publisher. With defamation, for example, if they published it with a reckless disregard for the truth, they can be liable, unless the article involved a public figure.

      Bottom line - be careful what you exercise editorial control over, because it is evidence of "notice" and deliberation.
  • by Fatal0E ( 230910 ) on Saturday December 01, 2001 @12:10PM (#2641135)
    You run the weblog; you have the final authority concerning the posts.

    I know that sounds overly simplistic, but for anything that falls outside the scope of protecting yourself legally, you can decide what goes and what stays. Whether that means letting people stray into OT conversations, via moderation or the lack thereof, is up to you. If you feel you have a legal issue to deal with, consult a lawyer who specializes in libel and slander.

    Again concerning the non-legal issues... If you feel strongly enough about something that bothers you on your BBS (note I didn't say something you disagree with), wield your authority. If you do your best to be fair, people will appreciate that, and anyone who doesn't like it can be reminded that another discussion board just like yours is only a Google search away.
  • by Anonymous Coward on Saturday December 01, 2001 @12:10PM (#2641137)
    If a post contains irrelevant/offensive content the proper course is to delete it. Do not attempt to edit it.

    Editing someone else's words without their express permission will highly annoy a significant fraction of those who get edited.

    It also could open you up to a civil suit on libel charges if the editing changes the sense of the post in a way that defames or injures the reputation of the poster.

    Newspapers do edit letters and opinions before publishing them without express consent, but they (1) use professional editors, (2) have lawyers, and (3) have limited page space. Even so, they often annoy opinion writers and risk lawsuits by changing the writers' original statements.

    If you are running a bulletin board your best practice is to let people speak for themselves.
  • by Ieshan ( 409693 ) <(moc.liamg) (ta) (nahsei)> on Saturday December 01, 2001 @12:13PM (#2641145) Homepage Journal
    There's nothing wrong with editing something as long as you approve of the rest of it. As soon as you edit something, you've "agreed" that you're taking out material that you, the editor, find unworthy of your publication - be it a weblog, a book, a magazine, or a television show.

    Because of this, the remaining portion is now just as much your work as it is theirs. It's like touching just a single paintbrush to the Mona Lisa: while you can't claim you've painted the Mona Lisa, you could claim that you've done "art". In essence, by altering it, you've created something else, and that represents you and your views.

    Now then, back to your blog. I'd say that no one could hold you liable for posts you didn't edit, but then there's a problem - namely, that people who object to the material on your site can ask why you didn't exercise the right to edit it, and claim that everything represents your opinion if you have the ability to edit and aren't exercising it.

    Oh well, it's a tough call. Just some feedback.
    • This position of yours is very much against my common sense, and also contrary to copyright law, which explicitly forbids you to publish derivative works - and that's precisely what you're advocating here. In the case of editing, it could be argued that the original hasn't been published yet - I don't know if that plays a role; I am not a lawyer.

      Of course you can make something new that is merely inspired by the original work, and the line between that and a derivative work can be hard to draw - but what you're suggesting is certainly derivative.
  • by GeauxTigers ( 540471 ) on Saturday December 01, 2001 @12:24PM (#2641159)
    When I'm doing legal research, my first stop is Perkins Coie LLP's Internet Case Law digest []. In this case, you should probably look under defamation.
  • Editing comments (Score:5, Interesting)

    by buss_error ( 142273 ) on Saturday December 01, 2001 @12:33PM (#2641168) Homepage Journal
    On the one hand, the Slashdot style (perhaps not slash code, but you get the idea) gives you the ability to let your reader community decide what is crap and what isn't. On the other hand, a community can develop that tends to moderate down ideas they don't agree with, even though the idea itself may state a point. (Valid to the reader or not, it is still a point.)

    I've noticed that I tend to moderate up most things, and only mod down Goat Sex type posts. I don't even mod down the "First Post!" type comments. The Goat Sex guy may have had a point at one time, but it's been made; let's move on now. Nothing to see here.

    On the other hand, someone is always going to get ticked off no matter what you do, sometimes even if you do exactly what they espouse they want. This is called Damned if you do, Damned if you don't, and Damn them all anyway.

    Part of the problem, as I see it, is that if you give yourself an out to edit or remove comments, that same out conversely gives you a liability to do so on demand from someone else. I was reading the other day that a judge ruled that, as a general rule, postings to forum sites are generally accepted to be opinion, not statements of fact (IANAL). As such, these are for the most part not actionable in any case, though you can START an action anyway.

    The real problem here is a legal system that allows suits for just about any reason. You may not win, but for (in Texas) $144.00 you can submit a complaint to a court, send a sheriff to drop off papers to appear in court, and scare the living bejesus out of almost everyone involved.

    Take a walk through case law on a site like FindLaw, and you will see the most amazing suits for what seem to you and me the silliest reasons. One guy's family sued a plane manufacturer for not putting in the plane's operating manual that gas was required to fly, and his family won the case. (I think it was Cessna; it might have been Piper. The guy was killed when the plane crashed after running out of gas. It may have been overturned later, but look at the cost of fighting it!) I don't know that making the filing of a suit harder is the answer. A more technologically clueful bench would be a start, and perhaps sanctions against those lawyers and clients who bring silly stuff to court may help. I don't have an answer for this problem, and I don't pretend that I do.

    I guess this all boils down to this: no matter how you do it, be consistent. No exceptions to posted rules, ever, unless ordered by a court. And no matter what you do, someone, sometime, will bring an action against you.

    Remember, I am not a lawyer, this is not legal advice. Some restrictions apply.

  • Who are you protecting with filters? Yourself or other people? A fact of life is that posts, like the ones on /., range from good to bad. Without the bad posts, it would be harder to know what was good. It's all part of a continuum, so if something is taken out, it's no longer a natural continuum. It is your continuum.

    That makes it less valuable in my mind.

  • To be honest... (Score:3, Interesting)

    by Millennium ( 2451 ) on Saturday December 01, 2001 @12:58PM (#2641222)
    I'm something of a free-speech absolutist myself, so I would say that, at least ideally, the only time editors should be doing any actual editing is in cleaning up duplicate posts, and perhaps in moving posts from one forum to another that's more appropriate, in multi-forum setups.

    Beyond that, Slashdot-like moderation by users is the way to go. Slashdot's system has its flaws (the amount and direction of moderation should be independent of description, though there's definitely a need for both), but it's the best general idea that I've seen.
  • DMCA section 512 (Score:5, Informative)

    by blakestah ( 91866 ) <> on Saturday December 01, 2001 @01:06PM (#2641243) Homepage
    The DMCA section 512 guarantees protection if you do NOT alter the contents of the users posts. See
    The DMCA section 512 []

    • Please resubmit your search

      Search results are only retained for a limited amount of time. Your search results have either been deleted, or the file has been updated with new information.

      OOh, cool!


    That site is the absolute BEST discussion forum I have ever seen in my life. Take a look at their rules/policies, and you'll quickly see why. And the moderation is extremely fair. I have not seen ANY evidence of abuse or hypocrisy anywhere on that site.

    Quite frankly, it frequently puts Slashdot to shame in the quality of content and signal-to-noise ratio.

    Still, I find Slashdot an amusing place. Sure, most Slashdot folk don't have a clue about home theater hardware hacking, but hey, it's fun!

    So far, the HTF has not been threatened by any lawsuits that I know of, even though they deal with movie studios and their employees.
  • Any sort of editing (including no editing) is essentially arbitrary. I run an educational web site [] that allows anyone (registered) to post content. In my terms of use when people register, I basically say that the line between appropriate and inappropriate is arbitrary and determined on a case-by-case basis. This is the only true answer. Even slashdot has removed a small number of posts.

    My system (Oomind) has a complex moderation mechanism and complex lameness filters. I use 10 dimensions of moderation so that people can filter based on a pretty sophisticated set of interests. The lameness filters include the usual "bad words" and "bad html" but also include post length, and a few other nifty things.

    So far the Oomind moderation system and lameness filters have not been pushed hard enough to really know if it "works", but hey, here's hoping :-) Blatant plug: Oomind is to education as open source is to commercial software:
  • This is just an idea I had. If you want to delete certain offensive posts without suggesting endorsement of the other posts, why not drop some legal-speak down in the bowels of your documentation stating that the software you run (which it sounds like you wrote) is "use at your own risk" and "not guaranteed to be free from defects, including those that might affect your post's appearance on our site."

    Sure, maybe you have a backdoor that lets you delete things you don't like, if you don't have the ability to implement such a "feature" directly. It would naturally be something you wouldn't want to do all the time, but if someone starts goatse'ing your site, just delete their posts using your backdoor. So the system "loses" posts of a certain character length, or that contain the word goatse, or that are from a user whose username is a certain combination of characters? And who's to say that it's NOT a bug that's causing the posts to be deleted? (Of course, I'm assuming your source code isn't available by request :) )

    I realize there are a lot of moral issues with this idea, but hey. I'm just trying to think of a way you could delete things. I don't know that I agree with my own idea, feel free to knock it down or improve it. But you know, I don't know of anyone who has held MS liable when Word crashes, thus "censoring" what I'm typing. I don't give it a second thought.
    • I can seriously say that I don't like this idea very much. This is just not an acceptable way of doing things.

      I think this would fall into the same category as governments putting people in mental hospitals because of their political opinion.

      Don't worry, I'm not comparing you to the government ;) but the idea falls into the same category as the example mentioned above. I think that if it finally comes to censoring something, it should at least be made clear that it's censorship, with a clear reason given why it was censored.

      "If you can be nothing else, at least be honest"
    • I suspect that a search warrant plus expert witness (programmer) would shoot that down pretty quickly.

      Plus, it doesn't have to be 100% proof, even in a criminal case (civil will have a LOWER burden of proof) -- only "reasonable doubt". And humans are pretty darn good at pattern matching, so if it looks like there's an intentional pattern behind "accidental" deletions, you're opening yourself up to perjury (lying to the court for gain) and perhaps fraud/false advertising (if that's the excuse you give to your forum's patrons).
  • The real cost for a megasite like Slash would be in UPDATEs on the database, which are always more expensive than INSERTs. If you have reasonably good security (not just a cookie) for your authentication, then plausibly any user could edit their own comments.

    I've occasionally wished that I could rewrite some of the hasty stuff I've written. Of course, I can also see where editing after the fact could change the nature of any thread that follows. Maybe it isn't such a good idea after all.
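
    For what it's worth, here's a rough sketch of the check such a self-edit feature would need. This is Python, and every name in it is made up for illustration -- it's not how Slash actually works:

```python
# Sketch: only the comment's author, authenticated by something stronger
# than a stored cookie, may trigger the (relatively expensive) UPDATE.
# All table/field names here are invented for illustration.

def can_edit(comment, session_user_id, authenticated_via_password):
    """True only if the requester owns the comment and used real auth."""
    return authenticated_via_password and comment["author_id"] == session_user_id

def edit_comment(db, comment_id, session_user_id, authenticated, new_text):
    comment = db[comment_id]
    if not can_edit(comment, session_user_id, authenticated):
        return False  # refuse: not the author, or cookie-only auth
    comment["text"] = new_text  # this is the UPDATE the parent worries about
    return True
```

    The point of the `authenticated_via_password` flag is exactly the parent's caveat: a stolen cookie shouldn't be enough to rewrite someone's words.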

    • You said:

      I've occasionally wished that I could rewrite some of the hasty stuff I've written. Of course, I can also see where editing after the fact could change the nature of any thread that follows.
      * []

      This is why I believe it should be possible for a user to retract his comment - not edit, retract - just as it is possible to cancel a Usenet post. People may have seen the post, quoted it in their replies, and perhaps even archived it, but the post will no longer be available on the newsgroup itself. In fact, the unavailability of a post at the top of a thread is a common phenomenon on Usenet, where posts simply expire without the intervention of the author, so this feature needn't be shocking to Slashdot users if ever it were implemented.

      This is a lot like what happens in Real Life (I choose that phrase because Taco likes to use it when defending his site policies) where you can't unsay what you said, and some people may never let you live it down as long as their memory serves them -- but you can certainly stop saying it and, if you're humble enough, you can take it back. Now, you might say that, in real life, one takes something back by saying something else, and that's true enough; however, in real life, one has the option of no longer saying something, whereas, in Slashdot, whatever you say is repeated every time a request for the page containing your comment is served, even if you later change your mind. I think the ability to take something back (post cancellation/removal) would compensate for the inability to change one's position (post editing) as clearly as in Real Life.

      Now, it seems to me that if Slashdot were to honor the poster's copyright, as the notice at the bottom of each Slashdot page claims it does, then it would have to comply with a user's request to remove a comment of which she herself was both the author and the copyright owner. In light of that consideration, would it not be simplest for this functionality (removal of a post by its author) to be available on the board so that administrator intervention is not required? Given that, in the recent Slashdot review of a book on the design of community websites [], defined by the author as websites where users interact with one another directly, our very own CmdrTaco is interviewed as an expert, I think it's safe to assume that he's already thinking about this sort of stuff. ;-)

      Now, I can't know how easy or how difficult it would be to add post removal functionality to Slashdot because I've never looked at the code, but I think this would be a welcome Slashdot feature -- one that would make this community seem more like the ones in so-called Real Life, and indeed more like others on the Internet itself.

      • Comment deletion would be a great trolling tool. Simple example:

        1. Karma whore an account up to 50.
        2. Post some really annoying and abusive stuff at +2: obfuscated links, pix, *BSD is dying trolls, even (shudder) Adequacy citations, you name it.
        3. Get flamed and then modded down.
        4. Delete your post.
        5. Post a perfectly informative comment at root level (or in reply to a flame).
        6. With other accounts, harass the responders mercilessly for flaming nothing (or a perfectly informative post that just needed to be replaced due to a simple typo or two.)

        (Hmm, maybe I agree, I'm an occasional troll myself. Bring it on!)

        • I don't know if such a thing could come to pass just as you describe, but I think I get the gist of what you're telling me, and it seems to me that your objections could be addressed by the following two implementation details:

          • Remove the comment but keep the post; that is, replace the comment's content with a tracer (marker) indicating that the original content has been removed. This would hopefully reduce confusion in, and discourage the opportunist trolling of, threads arising from cancelled posts.
          • Associate a (negative) karma hit with the cancellation of a post. This would penalize people who, for example, speak too soon (only to regret it later) or who would take advantage of post cancellation as a trolling aid.

          In such a system, all users are able to regret their unfortunate words, but those with especially bad karma must first show compunction and make some worthy posts in order to earn the karma that will, so to say, persuade the community to forget -- just like in Real Life.

          Now, I am not claiming that artificially raising the activation energy of the rehabilitation process is a good strategy for any given community. But it seems to be the Slashdot way to make redemption expensive, when not impossible, and that's why I was proposing the above implementation details. I believe that, if we were to agree that users should be able to cancel their own posts, then it would just be a matter of finding a way to make it all work as smoothly as possible. Of course, I am sure that some people are opposed to the very idea of post cancellation, and will raise objections to the implementation details ad infinitum for their aesthetic prejudice to hide behind -- but that sort of device would be apparent, and people would see through it.

          [At this point I wish to acknowledge the origin of the karma penalty idea: I learned of this from a private communication with the sometimes dogmatic, but always keen, yerricde [].]
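
          To make the two details above concrete, here is a rough Python sketch. The penalty value, field names, and tombstone text are all invented for illustration; this is a sketch of the scheme, not a proposal for actual Slashcode:

```python
# Sketch of author-initiated cancellation: the post row survives as a
# tombstone (so the thread's shape is preserved and trolls can't make
# replies point at nothing), and the author pays a karma penalty.
# KARMA_PENALTY is an illustrative number, not a real Slashdot value.

KARMA_PENALTY = 5

def cancel_post(posts, users, post_id, requester_id):
    """Replace a post's body with a tombstone and charge the author karma."""
    post = posts[post_id]
    if post["author_id"] != requester_id:
        return False  # only the author may cancel
    if post.get("cancelled"):
        return False  # can't cancel twice for extra confusion
    post["body"] = "[comment removed by its author]"  # the tombstone
    post["cancelled"] = True
    users[requester_id]["karma"] -= KARMA_PENALTY
    return True
```

          Keeping the tombstone row is what defeats the karma-whoring trick described upthread: the flames still visibly reply to *something*, so nobody can pretend the flamed post never existed.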

  • Rules of our website (Score:5, Informative)

    by wrinkledshirt ( 228541 ) on Saturday December 01, 2001 @01:22PM (#2641269) Homepage
    I belong to a website where there's tons of political talk, personal sharing, advice etc. being posted all the time. The basic rules are:

    (a) You cannot out anybody. If you give out a name or location, that post gets edited or deleted. People who post that sort of thing are often warned about it, and have the option to fix it themselves within the 30-minute "edit window" for a post.

    (b) Hate speech is usually deleted. This is a sticky situation, and usually it requires a ton of people complaining to the site administrator that such and such a post is offensive. We don't automatically filter out any words, and each post is often treated separately.

    (c) Spam. Nobody wants it there, so it's toast the moment it goes up.

    (d) Copyright violations. This is one of the regulations for the hosting corporation, and so we usually have to replace text with a link to it. Sometimes we get away with it if we're siting a literary passage for a debate or something.

    (e) Every now and then, if something is truly indecent, it'll get cut. That's too bad, because I had this really great run of posts that said "Don't click this!" and pointed to our goatsex friend. It was quite funny, but one silly twit who couldn't take a joke complained and it got taken down. Fortunately, that was almost two months after the fact so nobody there was liable to read that post again anytime soon anyway.

    (f) Every now and then we self-police, and gang up on somebody if they're being really cruel. Many people enjoy their anonymity there, and use the opportunity to talk about a lot of personal stuff, so if a particularly mean poster uses that stuff against them, they'll usually face criticism and pressure to be a little nicer.

    (g) We also have a board dedicated to flaming. This is great because once discussion gets heated, every poster on that particular board who isn't interested in hearing it can redirect the posters in question to the flame board to air out grievances. Needless to say, our flame board is pretty popular.

    I think the important thing isn't so much what gets a user edited, but whether or not that user knows about it beforehand and is given fair warning. Yeah, it ends up being subjective, but one of the reasons people like to go to this place is because they can safely discuss things. Our administrator is great about leaving political talk alone -- I've been ranting and raving about how stupid this whole Afghanistan war is, for instance, and there's been no deleting of any of my posts. That said, I've had to stand up to some pretty harsh criticism, but that's okay -- as far as political speech goes, it's really free. Even though we do self-police, we never ask someone to change their opinions on issues in debate.

    One other method that gets used: new users go through a trial period where they can't post on every board, even though they can read them all. This gives them a chance to see how our particular dynamic goes before they are allowed to post. It's arbitrary (two weeks), but it does filter out many people who aren't genuinely interested in the site (spammers, trolls, etc.). This is a new measure we've taken up, and it's pretty controversial right now, so I wouldn't necessarily recommend it to anyone unless they KNOW something like this could fix some problems they're having.

    As a website administrator, you've got to dedicate yourself to figuring out your own site's needs and getting everyone to stick to them. Oh yeah, and be prepared to be underappreciated and called a fascist pig if you ever do edit, even if it is the right thing for your site.
    • (d) Copyright violations. This is one of the regulations for the hosting corporation, and so we usually have to replace text with a link to it. Sometimes we get away with it if we're siting [sic] a literary passage for a debate or something.

      What country do you live/operate in?

      Even in the US (where, IMHO, we have nutso IP laws) this wouldn't be "getting away" with something; it would unequivocally be fair use.

      Of course, I assume you mean "citing" above.

    • by zenyu ( 248067 )
      (a) You cannot out anybody. If you give out a name or location, that post gets edited or deleted. People who post that sort of thing are often warned about it, and have the option to fix it themselves within the 30-minute "edit window" for a post.

      This is the one thing about the "either you censor everything or nothing" idea that bothers me. If someone posted the home address of someone who wasn't a public figure, tagged with some hate, it could actually endanger that person, even if it was mod'ed to -1. I can see how in a public square you could say it, and you could even make 1000 copies and paste them all over town, but there are controls there. People could take them down off the telephone posts, and we should have the same ability in cyberspace without opening ourselves up to litigation. It shouldn't be required -- I can see how posting the same private address for a public protest would be legit -- but it should be safe to take it down.
  • by Tony Shepps ( 333 ) on Saturday December 01, 2001 @02:51PM (#2641482) Homepage
    There are a lot of comments talking about the legal ramifications, but you don't want to forget the personal/social ramifications to your community.

    Your community is not about you; it's about your subject first, then about all the people who find your subject interesting, and about getting them together to communicate. This is important to remember. A lot of community owners find themselves so entranced by their status as benevolent dictator that they quit being benevolent. It's usually an ego-related thing. This is the worst-case scenario. Avoid it.

    If you delete posts that people generally expect to be deleted, you'll find your community happy and rewarding. This includes spam, obvious mistake posts with no content, personal information that shouldn't have been communicated, and cases where someone set out to purposefully cause trouble with the system or the community.

    If you delete posts that even one person finds useful, you'll find yourself in the middle of a controversy. Think about it from the user's point of view. A user may spend hours developing a post, even days contemplating what to say in a situation. Maybe they didn't take hours to write the post that you edited or deleted, but users don't want to even think about the possibility that their words may disappear. Delete a few posts without warning, even in a site that announces that it's heavily moderated, and you may find the community goes quiet for a few days. This sort of thing happens all the time.

    This goes triply for editing posts instead of removing them. I would never participate in a system where my own, attributed words could be changed around as the site owner sees fit. Would you? Why would you? Why would anyone?

    Also remember that a good, strong community will police itself to a degree. This sort of thing is not possible on someplace like /. where its popularity has led to effective anonymity. When most /. readers read most /. posts, they don't know or care who wrote it. This isn't true of smaller forums where there is a stronger sense of community.

    For a long time, newsgroups were the only net community going, and they were so prone to abuse that the communities in them had to develop a combination of thick skin and newbie-flaming. In fact, many people wrote that the flame was an important, necessary tool for the survival of these communities; if people wrote things that the community didn't like, they flamed, and this was their only defense mechanism. And for a while, it worked, until the net grew all out of proportion...

    The point is, you may feel that you desperately need to take action as the site owner and moderator, but your best action may well be to leave well-enough alone and let your community take care of it.
  • This may be only marginally on topic, but... I help run a very small, out-of-the-way weblog / community, which is basically just a site for people to get together, talk about whatever they want, and bullshit. It's not tied to a particular genre or ideology.

    In general, we're small enough that we've never had problems with abusive / troublesome users, and so there's never been any call to edit users or delete posts, except for one.

    Someone ran a story on Mohamed Atta [], one of the terrorists on the planes that smashed into the WTC. Someone, apparently having searched for Atta's name online, found his way to our site and anonymously posted a link reading "Here is my message of patriotism!" The link led to a Shockwave animation saluting the "heroes" who destroyed the WTC and declaring "they died for justice."

    I deleted the post. The guy came back, created an account, and reposted the link. I deleted the account and the post. He went away after that.

    A couple of other users complained about my "censorship," but I would absolutely do it again under the same circumstances, without hesitation. It's a free country -- he's free to say what he pleases, and I'm free to nuke whatever he says from the board if I find it inappropriate. It says so right up front, when you click to the comments page. And that definitely falls outside the boundaries of what I will accept on my web site.
  • by rudy_wayne ( 414635 ) on Saturday December 01, 2001 @03:15PM (#2641531)
    There is no need for moderation/censorship/editing on a message board. None. Zero. Zip. Zilch. Nada.

    As a participant in a forum or message board, if you see something "offensive" - IGNORE IT - DO NOT REPLY. If you are the owner of a message board and you are not willing to accept posts that you don't like, then DO NOT RUN A PUBLICLY ACCESSIBLE MESSAGE BOARD.

    It's that simple. Period.

    If your ego is so big that you really MUST be in control of what people say, then draw up a bunch of rules and institute a registration process requiring a valid e-mail address. Then, when someone says something you don't like, or violates one of your silly rules, you can play dictator and revoke their posting ability.

    The real problem here is ego. Trolls, flamers, assholes, etc. post crap in order to get a reaction and get attention. 99% of them do not have the patience and/or attention span to conduct a long term campaign. Ignore them and they will go away. IGNORE THEM AND THEY WILL GO AWAY. Unfortunately, too many people are unable/unwilling to follow this simple advice.

    I've seen it a million times in usenet newsgroups and various message boards. As soon as people see an "offensive" post their ego immediately kicks into high gear and they launch a retaliatory attack. The whole place becomes mired in attacks and responses to attacks. In the end, the "regulars" blame the trolls and flamers and cite this as another good reason for moderation, conveniently ignoring the fact that all they had to do was ignore the idiots and they would go away.
    • Presumably you can ignore all offensive posts; I can too. Presumably everyone with enough BBS/Fido/Usenet/web-board experience can, along with a few people who have insight into abuser dynamics without that experience. But not all users belong to these groups; we -- if you won't be offended by the pronoun -- aren't even the majority. A few users will respond to abuse, and other rookies will follow. So your scheme won't work. They WILL get a reaction, they WON'T be ignored, unless you are running a board for the "elite."
      As a participant in a forum or message board, if you see something "offensive" - IGNORE IT - DO NOT REPLY. If you are the owner of a message board and you are not willing to accept posts that you don't like, then DO NOT RUN A PUBLICLY ACCESSIBLE MESSAGE BOARD.

      It's that simple. Period.

      No it isn't. As someone with a public guestbook myself, I know the difference between something "offensive" and something "abusive." When someone posts the same idiotic joke hundreds of times in a row, I delete them all. You go ahead and ignore messageboard abuse and see how fast your board fails.


      • Any software I've written to take orders, comments, etc. does a hash against all those already in the database and doesn't allow duplicates ... this can be avoided, but it's simple
      • Restrictions on the time, manner and place of speech have been upheld countless times in meatspace. Your right to speak on the lawn of the courthouse does not give you the right to set up a PA system in a residential area at 3 AM.

        This is partly because of the tradeoff between freedom of speech and the right to peaceful enjoyment of personal property and life.

        But it's also because "freedom of speech" does not protect the physical act of speaking, it protects the right to express a dissenting view. The majority requires no explicit protection precisely because it's the majority. But the minority, especially the lone dissenter, *does*. That's why some cities have laws requiring that protestors stand in specific "boxes" when they make their speech - it's partly to prevent others from attempting to drown out their voice!

        The same thing applies in cyberspace. If you have *no* moderation and attempt to discuss controversial issues, you *will* have an asshole appear who doesn't mind posting hundreds of marginally pertinent responses to drown out "objectionable" content. Just look at alt.scientology (or something like that) sometime. While it's technically true that the original messages are still there, and it's not an exact analogue of the real-world situation where the lone protester may not be heard at all, in practice few people will bother to search for meaningful content and the protester(s) will have succeeded in suppressing speech.

        It's ironic, but sometimes the only way to guarantee that everyone has a voice is to be willing to silence those who would use theirs as a weapon.
  • This is not a complex issue unless you wish it to be.

    First of all, let us not act like angry monkeys throwing our feces at each other. Let us not fall into the trap of hostile hypocrisy that only hurts us and our 'causes' more than anything else.

    That said, I believe that self-filtering/censoring is up to each individual. Some use the phrase, "if you don't like what is posted, don't read it". This is a good if simplistic representation of the entire issue. However, it is often used by people who are frankly nothing but parrots, repeating words without understanding either the words' meanings or the collective meaning behind the phrase, thus relegating it quickly to the knee-jerk cliche trash heap.

    I see many situations where this phrase comes in handy. After all, it does no good to get all worked up because of some flamer that is just pathetically attempting to get a rise out of people. But before the rhetoric spouters begin their little crusades of mentioning how "if you don't like what is posted, then don't read it", let us look at what is ruffling the feathers first.

    If I have a forum site that polices topics in specific threads, and perhaps even has a 'general thread' for offtopic posts, is it then bad to filter out offtopic posts relative to the section posted in? What if I have only one topic and the stated rules about 'appropriate behavior' clearly let everyone know to keep on subject due to the very nature of the board?

    Now, let us say that I police content that is considered uncivilized, like personal attacks, slander, cussing, etc. Is this bad? In this situation, it is easy to see how many defending it would say, "If you don't like it then you don't have to be a part of the forum." See how that sounds so similar? Wouldn't someone who is truly 'tolerant' extend that tolerance towards those that he views as intolerant? Am I to claim enlightenment and tolerance by letting any subject be posted regardless of the topic at hand, or how negatively or positively it is posted, yet ONLY if I agree with said posts? Guess what: that is NOT tolerant. No matter how many fancy words, quotes, etc. I throw at it, it is intolerant by my very own definition. It is the worst sort of individual that cannot even stand the judgement of his own criteria that he applies so readily towards others.

    If you want an open board, then good for you. If you believe that is morally and ethically superior, then continue to do so, confident in that knowledge. Let education and your actions inspire others to do the same. If, however, you attack others (and I will expand on that below) in an attempt to free them, then by your own definition (and that of history) you are a tyrant. Attacks consist of direct attacks such as slander, malicious statements, etc., but also very much include actions that attempt to shut others down (if you choose to, good for you; if you 'organize' others to sheepishly follow you through fancy words and hateful rhetoric, that is much different). Also included is an inconsistent application of ethics or morals. You must be better than those you attack and must police yourselves first before you jump on any bandwagons to burn, rape and pillage others.

    I am curious how many here have ever defended someone they do not agree with, because they did not wish to see an opponent's rational addition of opinions and ideas be trampled under the draconian boots of some intolerant moderators. I also wonder how many would support laws, people, ideas (ATTACKS) that would take away the choices of forum maintainers and creators to filter their boards for what they themselves believe is important. I then wonder how many of the people that support the above would do so, ironically, under the banner of tolerance and being open-minded. How many would admit that they simply wish to get rid of those they do not like or agree with? (It would be more respectful in that case.)

    This can be applied to so many other aspects of life too. I remember a time before the draconian laws restricting smoking in many private domains were gaining in popularity. I remember many smokers saying that not only were such laws bad, but the 'constitution' protected smokers from being "oppressed" in private restaurants and the like. Oppressed, for them, meant that I as a shop/restaurant owner could not restrict anyone from smoking. So, once again it became a lawyers' game between two bands of zealots who, when looked upon with even the slightest scrutiny, were seen for what they were... two different shades of brown from the same pile of manure.

  • These would be my rules:

    a) An account is activated 16 hours after application

    b) In order to post, one must subscribe

    c) All subscribers must supply a valid e-mail address, no other information is collected

    d) All subscribers must have a unique e-mail address

    e) In case of offensive/inappropriate posts, one might be banned from posting. The ban might be timed or permanent

    f) An IP can apply for multiple accounts only if the IP cannot be proved to belong to the same person

    These rules can keep abusers away without deleting or editing posts. Since you do not delete/edit any post in any case, you probably won't be responsible for their content. An obvious drawback of this scheme is that an abuser might accumulate a set of accounts, in case one of the accounts is banned. If you can replace rules d & f with better rules that can be more strongly linked to identity, the system would work better. It would not be bulletproof in any case, but I doubt any other mechanism can be either.
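
    A sketch of how rules (a), (c), (d) and (e) might be enforced, in Python. The e-mail regex, data layout, and function names are my own illustrative assumptions, not part of the rules as stated:

```python
import re

ACTIVATION_DELAY = 16 * 3600  # rule (a): 16 hours, in seconds

def register(accounts, email, now):
    """Rules (c) and (d): require a plausible, unique e-mail address.
    Returns a new account id, or None if registration is refused."""
    if not re.match(r"[^@\s]+@[^@\s]+\.[^@\s]+$", email):
        return None  # rule (c): must supply a valid-looking address
    if any(a["email"] == email for a in accounts.values()):
        return None  # rule (d): one account per address
    account_id = len(accounts) + 1
    accounts[account_id] = {"email": email, "applied_at": now, "banned_until": 0}
    return account_id

def may_post(accounts, account_id, now):
    """Rules (a) and (e): the activation delay has passed and no ban is active."""
    a = accounts[account_id]
    return now - a["applied_at"] >= ACTIVATION_DELAY and now >= a["banned_until"]
```

    Rule (f) is deliberately left out: as the reply below this comment points out, tying an IP to a person is the hard part, and no simple check captures it.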

    • There are several flaws in your rules, although they may make trolls' work more difficult.
      a) fine, but you can just start laying down accounts regularly in advance
      b) no argument here, but I think anonymous comments in a few select cases are a good thing; we don't want someone being able to subpoena Slashdot for records just because some disgruntled employee has told the truth about his/her company.
      c) & d) most ISPs will give you as many aliases as you like
      e) one man's offense is another's humour. Censorship is not a good thing
      f) With address translation, company proxy servers, etc. it's getting increasingly difficult to tie addresses to specific computers. You COULD perhaps block multiple account setups from the same IP for a limited time [24 hrs maybe] but even this has its problems
  • This is a very US-centric discussion so far, it seems. Certainly in the UK, there has been some legal history in this area. Anyone planning on running any sort of on-line message board should be well acquainted with things like the Godfrey vs. Demon case, what constitutes being a "publisher", and so forth. I am not a lawyer, but I suspect that many of the comments made here would hold little water in UK courts with the current legal position, even as unclear as that may be.

  • This is not a good time to join the rush to Big Brother.
  • Your first mistake was running a political weblog. I know, because I made the same mistake []. Though I still believe that self-regulation can work.

    My best advice is to walk away, and let some other sucker take the fall.

    • I've read the article you linked to. Can you post any updates on what happened to you afterwards?
      • Sure:

        Having the article posted at several sources really pissed them off. They did their best to discredit me, acting like I fabricated the whole thing.

        The administration and student government tried to give me $500 to shut up. They confiscated my equipment, removed all references to SOS on posters, sites, etc. Plus removed all evidence that I had ever done anything for student government. They instructed all staff in ASUU to call the police if they saw me there.

        My wife was expelled from the University of Utah, and now attends school elsewhere. Any mention of the U around her usually brings cursing and condemnation out of her mouth. I was suspended for a period of one year, but I was able to give a big sob story, and was reinstated on strict probation just before fall semester started.

        I generally must stay out of computer labs on campus. I was officially banned from all labs on campus for some time, but that has been rescinded for a couple labs. I cannot volunteer for anything, and my graduation has been pushed back one year.

        ASUU Student Opinion Survey is gone for good. I can't get a group anywhere near campus to touch the project with a ten foot pole. A few people see me around, but even some of my closest friends from a scant year ago, today refuse even simple eye contact with me.

        I lost both of my jobs thanks to the fiasco which ensued, and it took me six months to finally get hired for a workstudy job. I had to convince tens of people that I wasn't going to crack into their computers if they hired me.

        I am one of the most notorious and hated people around there.. and I can't wait to get my degree and get the hell out.

        Letting the public have an open forum is a great idea, and there needs to be more forums. But if you run one and answer to anyone other than yourself.. be ready to get fucked.

  • by kimihia ( 84738 ) on Saturday December 01, 2001 @09:31PM (#2642546) Homepage

    Simply: Speech on a message board is worthless and not legally binding. If you want freedom of speech, yell out your window - and you're more likely to get in trouble for that.

    This was on Tomalak's Realm [] a few days ago.

    Newsbytes: California Appeals Court Upholds Message Board Speech [].

    The appellate court found that postings on an Internet message board constituted a "public forum," as defined in the anti-SLAPP statute. The court further ruled the defendants posted opinions as shareholders of ComputerXpress, not competitors, and the matter was therefore "an issue of public interest."

    Also another link: SJ Mercury: From November 28, 1999; `Cybersmear' lawsuits raise privacy concern [].

    PS, please read the articles and understand them. I know it is a very hard thing to do, but I've even made them hyperlinks.

  • When *SHOULD* you?

    Easy. When you feel that this is not something you want on *YOUR* web site. Use your common sense, your personal morals and values, and stand up for what you believe in.

    But, be fair and honest about it. State that this message has been edited, and tell the author *why*.

    It's your wall. What graffiti do you want written on it?

  • This is an interesting question. I had a similar problem on a humor website that I run, Knowumsayin' [].

    We had a feature called the Name Game, in which we would post pictures of weird looking people and users could give them nicknames. We expected some rude comments -- no problems there, but after a while the section was starting to make the whole site look really bad. People were simply posting names like "nigger" on this one black guy. There was nothing clever about it, and most of the names on all the pictures were nothing more than offensive and juvenile.

    I know, "offensive" can be subjective. But, here's the catch: we also had a voting system in place so that people could moderate names. And each night names with votes below 5 (on a scale from 1-10) would be automatically deleted.

    Well, it didn't help. Very few people were interested in voting on names (except the people that put them up in the first place), so the nasty stuff just stayed at the top. The feature was pretty popular, but only with dumbbells.

    So we took it down. That was almost a month ago, and I still haven't found a solution.
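The nightly purge described above could look something like this sketch. The function names are invented, and the use of an average score is an assumption, since the comment doesn't say how multiple votes were combined:

```python
# Hypothetical sketch of the nightly vote-threshold purge described above.
# Whether the site averaged votes or used some other aggregate is assumed.

THRESHOLD = 5  # names scoring below this (on a 1-10 scale) are deleted

def nightly_purge(submissions):
    """Keep only submissions whose average vote meets the threshold.

    `submissions` maps a submitted name to its list of votes (1-10).
    Unvoted names are kept, which mirrors the failure mode the comment
    describes: if almost nobody votes, almost nothing gets purged.
    """
    kept = {}
    for name, votes in submissions.items():
        if not votes or sum(votes) / len(votes) >= THRESHOLD:
            kept[name] = votes
    return kept

names = {
    "clever pun": [8, 9, 7],
    "juvenile insult": [1, 2],
    "unvoted entry": [],
}
print(sorted(nightly_purge(names)))  # ['clever pun', 'unvoted entry']
```

This makes the comment's point concrete: a deletion threshold only works if enough disinterested users actually vote; when the only voters are the submitters themselves, the filter selects for exactly the content it was meant to remove.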

Perfection is achieved only on the point of collapse. - C. N. Parkinson