When Should a Website Edit Its Users?
rw2 asks: "Can a weblog edit users' comments without opening itself up to liability in case of a slander suit? I run a political weblog and have a policy similar to Slashdot's, in that the comments posted belong to their owners. I'm worried about instituting something like lameness filters: it seems like as soon as you start regulating what your users post, you have agreed to edit them for other reasons as well. Can someone point me to a good resource on issues like this? Those of us who aren't owned by publicly traded companies are better off avoiding potential problems rather than hiring lawyers to help us wiggle out later." Honestly, this greatly depends on the type of weblog you run and the community behind it. I don't think a one-answer-suits-all-sites solution exists, particularly because what may be inappropriate for one site may be more than appropriate for another. What say you?
Moderation (Score:3, Redundant)
Moderation is not the same as editing. IOW, delete the lame crap, but don't alter any posts. Lots of places delete inappropriate stuff; no big deal.
The Gardener
Re:Moderation (Score:1)
It may seem like an insignificant point, but it is important nonetheless.
domc
Re:Moderation (Score:5, Interesting)
There is obviously useless stuff: "f1r57 p057", links to inappropriate websites, and the like. But if moderation isn't an automated process, then subjectivity can interfere with it.
What happens when someone simply pisses you off? Do you abuse your power and delete their post? What if the users start to withhold posting out of fear of being "edited" or censored?
Perhaps write a clearly defined policy regarding what is and what is not acceptable. Adhere to that policy very strictly and make sure everyone is completely aware of it. Then, when some big-wig company asks you to censor/change something, just wave your policy at them.
I guess.
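One way to make such a written policy hard to argue with is to express it as explicit, mechanical rules, so every removal traces back to a published rule rather than a moderator's mood. A minimal sketch (the rule names and patterns here are invented for illustration):

```python
# Hypothetical policy filter: each published rule is an explicit pattern,
# so a removal can always be justified by pointing at the rule it matched.
import re

POLICY = {
    "no first-post junk": re.compile(r"f1r57\s*p0?57", re.IGNORECASE),
    "no shock links":     re.compile(r"goatse", re.IGNORECASE),
}

def violates_policy(comment: str):
    """Return the name of the first rule a comment breaks, or None."""
    for rule_name, pattern in POLICY.items():
        if pattern.search(comment):
            return rule_name
    return None

print(violates_policy("F1r57 P057!!!"))     # "no first-post junk"
print(violates_policy("Thoughtful reply"))  # None
```

The point is less the patterns themselves than that the decision procedure is public and repeatable, so "why was my post removed?" always has a concrete answer.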
Re:Moderation (Score:3, Interesting)
I fell into this trap myself. I had no moderation for two years, then all of a sudden, some jerk kid started posting things ranging from racial slurs to out and out attacks on what others wrote. My "regular" participants started writing to me off the list complaining, wondering what was going on.
I posted a request to keep it clean. That only sparked a bunch of personal attacks on my character. So, I started deleting the moron's more offensive posts. When that didn't deter him, I started deleting some of his less offensive posts to show him that I meant it. Some of those posts were pretty good, too, showing some insight in between the insults. Looking back, I regret deleting some of them, but...
I've now switched to a moderation system of approve or throw out. I've calmed down quite a bit since then and don't throw out anything slightly insulting any more - if there is a good argument behind it. If it isn't adding anything, like "You don't know what you're talking about, idiot," then it's gone.
Since I started moderating, the fool tried posting a great deal for the first couple of weeks, with a lot of insults toward me. He seems to have finally gotten the idea and now tries only once every week or two.
Deciding to moderate was a very hard decision. I didn't want to censor anybody, and I still don't. But some of the other readers made a distinction between "free speech" and appropriate behavior. Free speech is vital when it comes to being able to talk about a governing body. However, the example one person gave of where free speech is not absolute was somebody coming into my home and verbally abusing me. To do so would be begging to be kicked out.
Nonetheless, I tried to be reasonable with him, but he obviously doesn't bow to any kind of authority whatsoever. I would have liked to have had a dialog with him off-line, but since I don't require valid e-mail addresses, and he didn't supply any, I was unable to contact him other than by writing articles "to" him.
Also, right from the start he used anonymisers and/or hacked into cable modems. That got me very interested in securing my box as best I could. I shut down FTP (only one person was using it), and pretty much everything else in
Other than the usual MS CodeRed and MS Nimda attacks, there doesn't appear to be anything out of the ordinary, so I could let out a sigh of relief that he's just a kid who knows how to use a limited range of tools (anonymisers to cause havoc), and not one who understands how things work (like a cracker). Nonetheless, my paranoia level has risen above the black-helicopter level since then.
What did I learn? Don't bother trying to reason with the morons. Just moderate them away without acknowledging their existence. They seem to live to insult others and watch their reactions. If there are no reactions (other than their obnoxious posts disappearing), they should eventually go away. (I'm hoping so, anyway.)
Re:Moderation (Score:1)
It appears as though it's more of a legal question, and you're all talking "fairness".
He's asking if he's more liable if he starts deleting/editing posts than if he doesn't.
Re:Moderation (Score:2)
In an _automated_ system, you are a carrier of information and you have no control over it. Because you have no control over the content I believe that you are not liable for it.
In a system where you start editing/deleting/censoring posts, you essentially endorse the ones that remain and are, therefore, liable for the content.
According to this [newsbytes.com], there is now precedent for a "public forum" rule.
Under the DMCA section 512,
"§ 512. Limitations on liability relating to material online
"(a) TRANSITORY DIGITAL NETWORK COMMUNICATIONS.--A service provider shall not be liable for monetary relief, or, except as provided in subsection (j), for injunctive or other equitable relief, for infringement of copyright by reason of the provider's transmitting, routing, or providing connections for, material through a system or network controlled or operated by or for the service provider, or by reason of the intermediate and transient storage of that material in the course of such transmitting, routing, or providing connections, if--
"(1) the transmission of the material was initiated by or at the direction of a person other than the service provider;
"(2) the transmission, routing, provision of connections, or storage is carried out through an automatic technical process without selection of the material by the service provider;
"(3) the service provider does not select the recipients of the material except as an automatic response to the request of another person;
"(4) no copy of the material made by the service provider in the course of such intermediate or transient storage is maintained on the system or network in a manner ordinarily accessible to anyone other than anticipated recipients, and no such copy is maintained on the system or network in a manner ordinarily accessible to such anticipated recipients for a longer period than is reasonably necessary for the transmission, routing, or provision of connections; and
"(5) the material is transmitted through the system or network without modification of its content."
In other words: as long as you don't modify, censor, alter, redirect, or otherwise tamper with the content you are a public forum and are safe from lawsuits (well, they can sue, but you'll have a legal defense).
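The five statutory conditions quoted above read as a simple conjunction: fail any one and the 512(a) safe harbor is gone. A sketch of that reading (illustrative only, not legal advice; the field names are invented):

```python
# Illustrative model of the five 512(a) conditions as a conjunction.
# This is a reading aid, not a legal test; field names are invented.
from dataclasses import dataclass

@dataclass
class Transmission:
    initiated_by_user: bool        # (1) started by someone other than the provider
    automatic_process: bool        # (2) no selection of material by the provider
    recipients_not_selected: bool  # (3) recipients chosen automatically
    no_lingering_copies: bool      # (4) intermediate copies not kept too long
    content_unmodified: bool       # (5) content passed through unmodified

def qualifies_for_512a(t: Transmission) -> bool:
    """All five conditions must hold; failing any one loses the safe harbor."""
    return all([t.initiated_by_user, t.automatic_process,
                t.recipients_not_selected, t.no_lingering_copies,
                t.content_unmodified])

print(qualifies_for_512a(Transmission(True, True, True, True, True)))   # True
print(qualifies_for_512a(Transmission(True, True, True, True, False)))  # False
```

Notice that condition (5) is exactly the one a site gives up the moment it starts editing posts, which is the parent poster's point.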
Message from a metamoderator (Score:3, Insightful)
Consider the following two comments, which let's say I found listed as "Flamebait":
Comment #1
Linux is no good. Microsoft is much better.
-That would be flamebait because it has no qualification - it is just to make people angry.
Comment #2
Linux is no good because there are no browsers that do as much as IE. Microsoft is much better.
-That would be valid - I would metamoderate a flamebait rating as "unfair."
Hopefully, I'm not alone in using criteria other than my opinion to moderate and metamoderate. But you know...I've been moderated down before despite adhering to my "make a qualification" policy.
Re:Message from a metamoderator (Score:1)
Just as an aside, though... I find that on Slashdot, moderation generally appears to be on par. So at least it's working. I think.
Re:Terms and Conditions... (Score:1)
Same with copyright infringement. Under the DMCA, you have to be put on notice that someone is using your site to infringe on another's rights. Once you are on notice, the DMCA (17 U.S.C. Sec. 512) spells out the specific procedure to hide behind the safe harbor so you are not liable, provided you follow certain procedures.
Overall, while this isn't specific legal advice: generally, you should react quickly to notifications, and otherwise keep a hands-off policy on all other comments to weaken any argument of your complicity.
Moderation + Disclaimer (Score:1)
However, even Slashdot has, I believe, removed stories under threat of legal action. It's just a matter of cost. Unfortunate, but true.
Do you Yahoo!?! (Score:2, Informative)
To Quote:
"Messages that are unlawful, harmful, threatening, abusive, harassing, tortious, defamatory, vulgar, obscene, libelous, invasive of another's privacy, hateful, or racially, ethnically or otherwise objectionable may be removed and may result in the loss of your Yahoo! ID (including e-mail). Please do not post any private information unless you want it to be available publicly."
Re:Do you Yahoo!?! (Score:1)
That's exactly the problem - when you let one person (or several) define what is "vulgar, obscene, etc." you start to do what the submitter didn't want to do - censor. It's been said before, but who decides the specific criteria that make something vulgar? (As an aside, I was under the impression that vulgar means "common, as in boorish," not obscene.)
A few of the criteria they list are so vague as to allow almost any message to be removed. Hateful towards what? Half the posts on
Freedom Of Speech (Score:4, Insightful)
If I write a book, I'd probably have to go through dozens of publishers before being accepted. Certainly they're not forced to publish my work. Why should any other medium be any different?
Re:Freedom Of Speech (Score:2, Insightful)
In particular, I'd worry about, say, harassment law (maintaining a "hostile environment" -- remember that if an employer fails to act on the idiotic actions of his employees, the employer itself may be held liable).
Re:Freedom Of Speech (Score:2)
If users say they wish to remove their comments from your website then remove them... if they want their comments delivered to them then charge them.
Quotes of a user's comments should be considered the property of the person incorporating the quote into their post, the original poster having granted the right to quote by virtue of the fact that they posted in your forum.
Make sure you post the Terms and Conditions (or whatever you want to call them) for all to see prior to making decisions based on those T&C's. Make sure the users know that the Terms and Conditions are subject to change without notice. etc... etc... etc...
I hate RedTape(tm), but in a litigious world, how can you get around it?
do what adequacy.org does (Score:3, Funny)
You don't want that.
Re:do what adequacy.org does (Score:1)
But we all know how much mods like to think they "get it" when someone slams them.
IANAL (Score:5, Interesting)
That said, if memory serves you lose your status as the equivalent of a common carrier and become responsible for the content as soon as you perform subjective modification or exclusion.
Dropping messages which violate an established set of rules is one thing, as was recently upheld in a lawsuit against Yahoo. But if memory serves, subjectively editing and dropping posts is what made a slander lawsuit against Prodigy successful. By having selectively removed posts, Prodigy was, in effect, endorsing the remainder.
Google should be your friend on both cases - the Prodigy case made a fairly big buzz in its time, and I have to think there must have been a dozen more since.
Re:IANAL (Score:2)
Likewise, the Compuserve suit of the same era ended up with a Compuserve victory because unlike Prodigy, they didn't guarantee the quality of the environment by filtering content.
Course, I'm still waiting for the proxy/filter lawsuits to start cropping up. "My kid's school allowed them to see pr0n so I'm suing for $250 million" stuff...
*scoove*
Surgeon General's Warning: "Internet contains porn, violence, bad language, lousy spelling, and unreliable service providers, and may cause addiction or offense. Turn off your PC and stick to NPR if you may be offended."
Re:IANAL (Score:2)
Slashdot, on the other hand, steadfastly refuses to censor on the basis of semantics (meaning), which is where libel lies.
Besides -- the lameness filter (as I've encountered it) is pretty lame, anyways.
erm... (Score:2, Interesting)
And our rule is NEVER edit a post, only delete it... I've been told it's against the DMCA.
Editing, etc. (Score:2)
User comments should not be touched, and in fact Slash does not permit this. You would have to access the MySQL files and edit the comments directly if you wanted to do that. This can be inconvenient.
That being said, posters should be responsible for their own comments. If they post something against site policy, or illegal, then the site should retain the option to delete the comments.
I happen to like the moderation system, because otherwise you can devolve into a sea of moronic cluelessness. It will do until something else comes along. Things like the open publishing system seen at Indy Media [indymedia.org] are great, but they do not scale well.
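The point above about Slash is worth making concrete: the application layer exposes deletion but no edit operation, so altering a comment means going behind the software's back into the database. A rough sketch of that design (using sqlite3 as a stand-in for Slash's MySQL; the schema is invented):

```python
# Sketch: the only moderation primitive the software exposes is removal.
# Editing a stored comment would require raw SQL against the database,
# which is deliberate friction. Table and column names are invented.
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE comments (cid INTEGER PRIMARY KEY, body TEXT)")
db.execute("INSERT INTO comments VALUES (1, 'original words')")

def delete_comment(cid: int) -> None:
    """Remove a comment outright; there is no function to alter one."""
    db.execute("DELETE FROM comments WHERE cid = ?", (cid,))

delete_comment(1)
remaining = db.execute("SELECT COUNT(*) FROM comments").fetchone()[0]
print(remaining)  # 0 -- the comment is gone, but no words were ever changed
```

Making alteration inconvenient by design means a site's moderation history can only ever show absences, never rewritten speech.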
This is when slashdot did it... (Score:5, Interesting)
Free speech costly (Score:1)
I worked for an independent University newspaper that published a column that made some questionable comments about the current Miss America. By definition, the column could not be considered libel. Proving it in the US Court system, however, would cost more money than the newspaper could make in several years.
In my specific case, the columnist was no longer allowed to contribute to the newspaper, in order to circumvent current and future problems. As an editor at that paper, I loathed knowing that a voice -- one that often aggressively challenged popular thought -- was silenced.
According to others in the media business, this kind of censorship by threatened legal action happens all the time. Sad but true.
Re:This is when slashdot did it... (Score:2)
Slashdot probably safer than AOL (Score:4, Interesting)
It's good that you're thinking about this now, because I suspect political arenas would attract more lawyers and highly inflammatory idiots than most. That combination is asking for lawsuits, IMHO.
Re:Slashdot probably safer than AOL (Score:2)
Lameness Filter? (Score:5, Interesting)
Too bad they don't have a lameness filter on the submission box though, that would theoretically keep most Jon Katz articles from ever making the front page
The potential upside, in reference to your question, is that the lameness filter acts before the comment becomes a post and part of the static page (at least here on Slashdot; I'm not sure about your site, since I don't have an account and you can't post unless you do). You probably won't be sued unless it's by someone who's going to sue you anyway.
Just my 2 cents.
Re:Lameness Filter? (Score:1)
Durn straight. If /. ever institutes account filters like Linux Today [slashdot.org], that's my first target.
Oh, I guess I'd miss a good discussion or two. Every year or two. ;)
Re:Some honest tips from a troll (Score:3, Insightful)
Slashdot has not been ruined. If you think it's been ruined, you must be reading with your score set to zero. Don't do that. Read with a minimum of one. A comment from an Anonymous Coward is almost *never* worth reading.
-russ
Consider yourself as a publisher (Score:3, Insightful)
No editting, filtering OK (Score:3, Insightful)
Filtering out whole posts based on some ranking (think /. moderation) is just as all right, as it's a method of ranking entire posts, not editing within a particular post.
However, if you are in the habit of editing or posting snippets of postings, then you are exerting editorial control and perhaps are liable.
Usually, as long as the posting mechanism is automated without passing through a human being, you can claim to be a common carrier. Newspapers and dead-tree editions don't have this benefit, as they pick and choose which stories they carry because they have limited print space. An online forum doesn't do this; it accepts everything.
Once again, IANAL, so take all of this with a pinch of salt.
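The whole-post-ranking distinction above is easy to see in code: a Slashdot-style threshold hides or shows entire posts, and no post's text is ever touched. A minimal sketch (data and threshold values are illustrative):

```python
# Sketch of whole-post filtering by score: posts are ranked and hidden
# as units, never edited within. Scores and bodies are made up.
posts = [
    {"id": 1, "score": 5,  "body": "Insightful analysis"},
    {"id": 2, "score": -1, "body": "Obvious troll"},
    {"id": 3, "score": 2,  "body": "Decent point"},
]

def visible(posts, threshold=1):
    """Return the posts at or above the reader's chosen threshold, untouched."""
    return [p for p in posts if p["score"] >= threshold]

print([p["id"] for p in visible(posts)])                # [1, 3]
print([p["id"] for p in visible(posts, threshold=-1)])  # [1, 2, 3]
```

Note that the threshold is the reader's choice, not the site's: lowering it brings every post back intact, which is precisely what distinguishes ranking from editorial control.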
Re:No editting, filtering OK (Score:1)
Technically, this is not true. IAAL, and in most jurisdictions it comes down to your notice that material is infringing, contains trade secrets, is defamatory, etc... the usual stuff that invokes third-party liability.
A newspaper retains a bunch of lawyers exactly because they DO have liability as a publisher. With defamation, for example, if they published it with a reckless disregard for the truth, they can be liable, unless the article involved a public figure.
Bottom line - be careful what you exercise editorial control over, because it is evidence of "notice" and deliberation.
Re:No editting, filtering OK (Score:1)
Let Your Conscience Be Your Guide (Score:3, Insightful)
I know that sounds overly simplistic, but for anything that falls outside the scope of protecting yourself legally, you can decide what goes and what stays. Whether that means letting people stray into OT conversations via moderation or lack thereof is up to you. If you feel you have a legal issue to deal with, consult a lawyer who specializes in libel and slander.
Again, concerning the non-legal issues: if you feel strongly enough about something that bothers you on your BBS (note I didn't say something you disagree with), wield your authority. If you do your best to be fair, people will appreciate that, and anyone who doesn't like it can be reminded that another discussion board just like yours is only a Google search away.
Editing posts is rude and possibly libelous (Score:4, Insightful)
Editing someone else's words without their express permission will highly annoy a significant fraction of those who get edited.
It also could open you up to a civil suit on libel charges if the edited post changes the sense of the post in a way that defames or injures the reputation of the poster.
Newspapers do edit letters and opinions before publishing them without express consent but they (1) use professional editors (2) have lawyers (3) have limited page space. Even so, they often annoy opinion writers and risk lawsuits by changing the writers' original statements.
If you are running a bulletin board your best practice is to let people speak for themselves.
The Problem - Taking Credit (Score:5, Informative)
Because of this, the remaining portion is now just as much your work as it is theirs. It's like touching just a single paintbrush to the Mona Lisa: while you can't claim you've painted the Mona Lisa, you could claim that you've done "art". In essence, by altering it, you've created something else, and that represents you and your views.
Now then, back to your blog. I say that no one could hold you liable for posts you didn't edit, but then there's a problem - namely, that people against the material on your site can ask why you didn't exercise the right to edit the material, and claim that everything represents your opinion if you have the ability to edit and aren't exercising it.
Oh well, it's a tough call. Just some feedback.
you can *not* publish derivative works (Score:1)
Of course you can make something new that is merely inspired by the original work, and the line between that and a derivative work can be hard to draw - but what you're suggesting is certainly derivative.
Not as good as a lawyer but.... (Score:3, Informative)
Editing comments (Score:5, Interesting)
I've noticed that I tend to moderate up most things, and only mod down Goat Sex-type posts. I don't even mod down the "First Post!" type comments. The Goat Sex guy may have had a point at one time, but it's been made; let's move on now. Nothing to see here.
On the other hand, someone is always going to get ticked off no matter what you do, sometimes even if you do exactly what they espouse they want. This is called Damned if you do, Damned if you don't, and Damn them all anyway.
Part of the problem, as I see it, is that if you give yourself an out to edit or remove comments, that same out conversely gives you a liability to do so on demand from someone else. I was reading the other day that a judge ruled that, as a general rule, postings to forum sites are generally accepted to be opinion, not statements of fact (IANAL). As such, these are for the most part not actionable in any case, though you can START an action anyway.
The real problem here is a legal system that allows suit for just about any reason. You may not win, but for (in Texas) $144.00 you can submit a complaint to a court, send a Sheriff to drop off papers to appear in court, and scare the living bejesus out of almost everyone involved. Take a walk through case law on a site like FindLaw, and you will see the most amazing suits for what seem to you and me to be the silliest reasons. One guy's family sued a plane manufacturer for not putting in the plane's operating manual that gas was required to fly, and his family won the case. (I think it was Cessna; it might have been Piper. The guy was killed when the plane crashed after running out of gas. It may have been overturned later, but look at the cost of fighting it!) I don't know that making the filing of a suit harder is the answer. A more technologically clueful bench would be a start, and perhaps sanctions against those lawyers and their clients who bring silly stuff to court may help. I don't have an answer for this problem, and I don't pretend that I do.
I guess this all boils down to this: no matter how you do it, be consistent. No exceptions to posted rules at all, ever, unless ordered by a court. And no matter what you do, someone sometime will bring an action against you.
Remember, I am not a lawyer, this is not legal advice. Some restrictions apply.
Aristotle would say "beauty and the other beauty" (Score:1)
That makes it less valuable in my mind.
Re:Aristotle would say "beauty and the other beaut (Score:1)
To be honest... (Score:3, Interesting)
Beyond that, Slashdot-like moderation by users is the way to go. Slashdot's system has its flaws (the amount and direction of moderation should be independent of description, though there's definitely a need for both), but it's the best general idea that I've seen.
DMCA section 512 (Score:5, Informative)
The DMCA section 512 [loc.gov]
Re:DMCA section 512 (Score:1)
Please resubmit your search
Search results are only retained for a limited amount of time. Your search results have either been deleted, or the file has been updated with new information.
OOh, cool!
Ask the owners of the Home Theater Forum (Score:2, Interesting)
That site is the absolute BEST discussion forum I have ever seen in my life. Take a look at their rules/policies, and you'll quickly see why. And the moderation is extremely fair. I have not seen ANY evidence of abuse or hypocrisy anywhere on that site.
Quite frankly, it frequently puts Slashdot to shame in the quality of content and signal-to-noise ratio.
Still, I find Slashdot an amusing place. Sure, most Slashdot folk don't have a clue about home theater hardware hacking, but hey, it's fun!
So far, the HTF has not been threatened by any lawsuits that I know of, even though they deal with movie studios and their employees.
Arbitrary!!! (Score:2)
An idea: why not just say it's a buggy system? (Score:2, Interesting)
Sure, maybe you have a backdoor that lets you delete things you don't like, if you don't have the ability to implement such a "feature" directly. It would naturally be something you wouldn't want to do all the time, but if someone starts goatse'ing your site, just delete their posts using your backdoor. So the system "loses" posts of a certain character length, or that contain the word goatse, or that are from a user whose username is a certain combination of characters? And who's to say that it's NOT a bug that's causing the posts to be deleted? (Of course, I'm assuming your source code isn't available by request.)
I realize there are a lot of moral issues with this idea, but hey, I'm just trying to think of a way you could delete things. I don't know that I agree with my own idea; feel free to knock it down or improve it. But you know, I don't know of anyone who's held MS liable when Word crashes, thus "censoring" what I'm typing. I don't give it a second thought.
Re:An idea: why not just say it's a buggy system? (Score:1)
I think this would fall into the same category as governments putting people in mental hospitals because of their political opinion.
Don't worry, I'm not comparing you to the government.
"If you can be nothing else, at least be honest"
Re:An idea: why not just say it's a buggy system? (Score:1)
Plus, it doesn't have to be 100% proof, even in a criminal case (civil will have a LOWER burden of proof) -- only "reasonable doubt". And humans are pretty darn good at pattern matching, so if it looks like there's an intentional pattern behind "accidental" deletions, you're opening yourself up to perjury (lying to the court for gain) and perhaps fraud/false advertising (if that's the excuse you give to your forum's patrons).
allowing comments to be edited (Score:1)
I've occasionally wished that I could rewrite some of the hasty stuff I've written. Of course, I can also see where editing after the fact could change the nature of any thread that follows. Maybe it isn't such a good idea after all.
Comment cancellation as on Usenet, Real Life (Score:3, Interesting)
You said:
This is why I believe it should be possible for a user to retract his comment - not edit, retract - just as it is possible to cancel a Usenet post. People may have seen the post, quoted it in their replies, and perhaps even archived it, but the post will no longer be available on the newsgroup itself. In fact, the unavailability of a post at the top of a thread is a common phenomenon on Usenet, where posts simply expire without the intervention of the author, so this feature needn't be shocking to Slashdot users if ever it were implemented.
This is a lot like what happens in Real Life (I choose that phrase because Taco likes to use it when defending his site policies), where you can't unsay what you said, and some people may never let you live it down as long as their memory serves them -- but you can certainly stop saying it and, if you're humble enough, you can take it back. Now, you might say that, in real life, one takes something back by saying something else, and that's true enough; however, in real life, one has the option of no longer saying something, whereas, in Slashdot, whatever you say is repeated every time a request for the page containing your comment is served, even if you later change your mind. I think the ability to take something back (post cancellation/removal) would compensate for the inability to change one's position (post editing) as clearly as in Real Life.
Now, it seems to me that if Slashdot were to honor the poster's copyright, as the notice at the bottom of each Slashdot page claims it does, then it would have to comply with a user's request to remove a comment of which she herself was both the author and the copyright owner. In light of that consideration, would it not be simplest for this functionality (removal of a post by its author) to be available on the board so that administrator intervention is not required? Given that, in the recent Slashdot review of a book on the design of community websites [slashdot.org], defined by the author as websites where users interact with one another directly, our very own CmdrTaco is interviewed as an expert, I think it's safe to assume that he's already thinking about this sort of stuff. ;-)
Now, I can't know how easy or how difficult it would be to add post removal functionality to Slashdot because I've never looked at the code, but I think this would be a welcome Slashdot feature -- one that would make this community seem more like the ones in so-called Real Life, and indeed more like others on the Internet itself.
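The retract-but-never-edit distinction argued above is simple to model: the author (and only the author) can make a post unavailable, but there is no path to rewriting it in place. A sketch under those assumptions (the data model and names are invented):

```python
# Sketch of Usenet-style cancellation: an author may retract a post,
# leaving an absence rather than an edited replacement. Names invented.
posts = {}

def publish(pid: str, author: str, body: str) -> None:
    posts[pid] = {"author": author, "body": body, "cancelled": False}

def cancel(pid: str, requester: str) -> bool:
    """Only the original author may retract; the body becomes unavailable."""
    post = posts.get(pid)
    if post is None or post["author"] != requester:
        return False
    post["cancelled"] = True
    post["body"] = None  # no rewritten text, just absence
    return True

publish("msg1", "alice", "Something hasty")
print(cancel("msg1", "bob"))    # False -- not the author
print(cancel("msg1", "alice"))  # True  -- retracted, not edited
```

Replies that quoted the post keep their quotes, just as on Usenet; the only thing that changes is that the original stops being served.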
I disagree on comment deletion (Score:1)
1. Karma whore an account up to 50.
2. Post some really annoying and abusive stuff at +2: obfuscated goatse.cx links, rotten.com pix, *BSD is dying trolls, even (shudder) Adequacy citations, you name it.
3. Get flamed and then modded down.
4. Delete your post.
5. Post a perfectly informative comment at root level (or in reply to a flame).
6. With other accounts, harass the responders mercilessly for flaming nothing (or a perfectly informative post that just needed to be replaced due to a simple typo or two.)
(Hmm, maybe I agree, I'm an occasional troll myself. Bring it on!)
put cancellation tracer in discussion, --karma (Score:1)
I don't know if such a thing could come to pass just as you describe, but I think I get the gist of what you're telling me, and it seems to me that your objections could be addressed by the following two implementation details:
In such a system, all users are able to regret their unfortunate words, but those with especially bad karma must first show compunction and make some worthy posts first in order to earn the karma that will, so to say, persuade the community to forget -- just like in Real Life.
Now, I am not claiming that artificially raising the activation energy of the rehabilitation process is a good strategy for any given community. But it seems to be the Slashdot way to make redemption expensive, when not impossible, and that's why I was proposing the above implementation details. I believe that, if we were to agree that users should be able to cancel their own posts, then it would just be a matter of finding a way to make it all work as smoothly as possible. Of course, I am sure that some people are opposed to the very idea of post cancellation, and will raise objections to the implementation details ad infinitum for their aesthetic prejudice to hide behind -- but that sort of device would be apparent, and people would see through it.
[At this point I wish to acknowledge the origin of the karma penalty idea: I learned of this from a private communication with the sometimes dogmatic, but always keen, yerricde [slashdot.org].]
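The karma-penalty idea above can be sketched in a few lines: cancellation costs karma, so habitual abusers must earn goodwill back before they can retract again. The cost and the numbers here are arbitrary illustrations, not anything Slashdot actually implements:

```python
# Sketch of karma-gated cancellation: retracting a post debits karma,
# so users with bad karma must contribute before they can retract.
# CANCEL_COST and the karma values are arbitrary.
CANCEL_COST = 5

def try_cancel(user: dict) -> bool:
    """Allow cancellation only if the user can afford the karma penalty."""
    if user["karma"] < CANCEL_COST:
        return False
    user["karma"] -= CANCEL_COST
    return True

troll = {"name": "troll", "karma": 2}
regular = {"name": "regular", "karma": 30}
print(try_cancel(troll))    # False -- must earn karma first
print(try_cancel(regular))  # True
print(regular["karma"])     # 25
```

This captures the "activation energy" framing: redemption is possible but not free, and the price is paid in the community's own currency.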
Rules of our website (Score:5, Informative)
(a) You cannot out anybody. If you give out a name or location, that post gets edited or deleted. People who post that sort of thing are often warned about it, and have the option to fix it themselves within the 30-minute "edit window" for a post.
(b) Hate speech is usually deleted. This is a sticky situation, and usually it requires a ton of people complaining to the site administrator that such and such a post is offensive. We don't automatically filter out any words, and each post is often treated separately.
(c) Spam. Nobody wants it there, so it's toast the moment it goes up.
(d) Copyright violations. This is one of the regulations from the hosting corporation, and so we usually have to replace text with a link to it. Sometimes we get away with it if we're citing a literary passage for a debate or something.
(e) Every now and then, if something is truly indecent, it'll get cut. That's too bad, because I had this really great run of posts that said "Don't click this!" and pointed to our goatsex friend. It was quite funny, but one silly twit who couldn't take a joke complained and it got taken down. Fortunately, that was almost two months after the fact so nobody there was liable to read that post again anytime soon anyway.
(f) Every now and then we self-police, and gang up on somebody if they're being really cruel. Many people enjoy their anonymity there, and use the opportunity to talk about a lot of personal stuff, so if a particularly mean poster uses that stuff against them, they'll usually face criticism and pressure to be a little nicer.
(g) We also have a board dedicated to flaming. This is great because once discussion gets heated, every poster on that particular board who isn't interested in hearing it can redirect the posters in question to the flame board to air out grievances. Needless to say, our flame board is pretty popular.
I think the important thing isn't so much what gets a user edited, but whether or not that user knows about it beforehand and is given fair warning. Yeah, it ends up being subjective, but one of the reasons people like to go to this place is because they can safely discuss things. Our administrator is great about leaving political talk alone -- I've been ranting and raving about how stupid this whole Afghanistan war is, for instance, and there's been no deleting of any of my posts. That said, I've had to stand up to some pretty harsh criticism, but that's okay -- as far as political speech goes, it's really free. Even though we do self-police, we never ask someone to change their opinions on issues in debate.
One other method that gets used: new users go through a trial period where they can't post on every board, even though they can read them all. This gives them a chance to see how our particular dynamic goes before they are allowed to post everywhere. It's arbitrary (two weeks), but it does filter out many people who aren't genuinely interested in the site (spammers, trolls, etc.). This is a new measure we've taken up, and it's pretty controversial right now, so I wouldn't necessarily recommend it to anyone unless they KNOW something like this could fix some problems they're having.
As a website administrator, you've got to dedicate yourself to figuring out your own site's needs and getting everyone to stick to them. Oh yeah, and be prepared to be underappreciated and called a fascist pig if you ever do edit, even if it is the right thing for your site.
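The two purely mechanical rules in this comment -- the 30-minute edit window from rule (a) and the two-week read-mostly trial period -- are simple enough to sketch. This is my own illustration, not the site's actual code; all names here are hypothetical:

```python
from datetime import datetime, timedelta

EDIT_WINDOW = timedelta(minutes=30)   # rule (a): authors may fix their own posts
TRIAL_PERIOD = timedelta(weeks=2)     # new accounts read everything, post on few boards

def can_edit(post_time, now):
    """Author self-edit is allowed only inside the 30-minute window."""
    return now - post_time <= EDIT_WINDOW

def can_post(joined, now, board_open_to_new_users):
    """During the trial period, new accounts may post only on designated boards."""
    if now - joined >= TRIAL_PERIOD:
        return True
    return board_open_to_new_users
```

The point of keeping both checks this dumb is that users can predict exactly how they behave, which is the "fair warning" the comment argues for.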
Re:Rules of our website (Score:2)
What country do you live/operate in?
Even in the US (where, IMHO, we have nutso IP laws) this wouldn't be "getting away" with something; it would unequivocally be fair use.
Of course, I assume you mean "citing" above.
-Peter
Re:Rules of our website (Score:2, Interesting)
This is the one thing about the "either you censor everything or nothing" idea that bothers me. If someone posted the home address of someone who wasn't a public figure, tagged with some hate, it could actually endanger that person, even if it was mod'ed to -1. I can see how in a public square you could say it, and you could even make 1000 copies and paste them all over town, but there are controls there. People could take them down off the telephone poles, and we should have the same ability in cyberspace without opening ourselves up to litigation. It shouldn't be required -- I can see how posting the same private address for a public protest would be legit -- but it should be safe to take it down.
Re:Rules of our website (Score:1)
Let's say we're protesting the mayor's policy of releasing the home addresses of voters without letting people know who is looking at the address. I think that protest should be at the mayor's house; you may disagree, but you're wrong.
Other ramifications (Score:3, Insightful)
Your community is not about you; it's about your subject first, then about all the people who find your subject interesting, and about getting them together to communicate. This is important to remember. A lot of community owners find themselves so entranced by their status as benevolent dictator that they quit being benevolent. It's usually an ego-related thing. This is the worst-case scenario. Avoid it.
If you delete posts that people generally expect to be deleted, you'll find your community happy and rewarding. This includes spam, obvious mistake posts with no content, personal information that shouldn't have been communicated, and cases where someone set out to purposefully cause trouble with the system or the community.
If you delete posts that even one person finds useful, you'll find yourself in the middle of a controversy. Think about it from the user's point of view. A user may spend hours developing a post, even days contemplating what to say in a situation. Maybe they didn't take hours to write the post that you edited or deleted, but users don't want to even think about the possibility that their words may disappear. Delete a few posts without warning, even in a site that announces that it's heavily moderated, and you may find the community goes quiet for a few days. This sort of thing happens all the time.
This goes triply for editing posts instead of removing them. I would never participate in a system where my own, attributed words could be changed around as the site owner sees fit. Would you? Why would you? Why would anyone?
Also remember that a good, strong community will police itself to a degree. This sort of thing is not possible on someplace like
For a long time, newsgroups were the only net community going, and they were so prone to abuse that the communities in them had to develop a combination of thick skin and newbie-flaming. In fact, many people wrote that the flame was an important, necessary tool for the survival of these communities; if people wrote things that the community didn't like, they flamed, and this was their only defense mechanism. And for a while, it worked, until the net grew all out of proportion...
The point is, you may feel that you desperately need to take action as the site owner and moderator, but your best action may well be to leave well-enough alone and let your community take care of it.
Cry censorship (Score:2)
In general, we're small enough that we've never had problems with abusive / troublesome users, and so there's never been any call to edit users or delete posts, except for one.
Someone ran a story on Mohamed Atta [cnn.com], one of the terrorists on the planes that smashed into the WTC. Someone, apparently having searched for Atta's name online, found his way to our site and anonymously posted a link reading "Here is my message of patriotism!" The link led to a Shockwave animation saluting the "heroes" who destroyed the WTC and declaring "they died for justice."
I deleted the post. The guy came back, created an account, and reposted the link. I deleted the account and the post. He went away after that.
A couple of other users complained about my "censorship," but I would absolutely do it again under the same circumstances, without hesitation. It's a free country -- he's free to say what he pleases, and I'm free to nuke whatever he says from the board if I find it inappropriate. It says so right up front, when you click to the comments page. And that definitely falls outside the boundaries of what I will accept on my web site.
There is no issue here (Score:4, Insightful)
As a participant in a forum or message board, if you see something "offensive" - IGNORE IT - DO NOT REPLY. If you are the owner of a message board and you are not willing to accept posts that you don't like, then DO NOT RUN A PUBLICLY ACCESSIBLE MESSAGE BOARD.
It's that simple. Period.
If your ego is so big that you really MUST be in control of what people say, then draw up a bunch of rules and institute a registration process requiring a valid e-mail address. Then, when someone says something you don't like, or violates one of your silly rules, you can play dictator and revoke their posting ability.
The real problem here is ego. Trolls, flamers, assholes, etc. post crap in order to get a reaction and get attention. 99% of them do not have the patience and/or attention span to conduct a long term campaign. Ignore them and they will go away. IGNORE THEM AND THEY WILL GO AWAY. Unfortunately, too many people are unable/unwilling to follow this simple advice.
I've seen it a million times in usenet newsgroups and various message boards. As soon as people see an "offensive" post their ego immediately kicks into high gear and they launch a retaliatory attack. The whole place becomes mired in attacks and responses to attacks. In the end, the "regulars" blame the trolls and flamers and cite this as another good reason for moderation, conveniently ignoring the fact that all they had to do was ignore the idiots and they would go away.
Re:There is no issue here (Score:2, Interesting)
Re:There is no issue here (Score:3, Interesting)
It's that simple. Period.
No it isn't. As someone with a public guestbook myself, I know the difference between something "offensive" and something "abusive." When someone posts the same idiotic joke hundreds of times in a row, I delete them all. You go ahead and ignore messageboard abuse and see how fast your board fails.
-Legion
Re:There is no issue here (Score:2)
Re:There is no issue here (Score:1)
Restrictions on time, manner and place.... (Score:2)
This is partly because of the tradeoff between freedom of speech and the right to peaceful enjoyment of personal property and life.
But it's also because "freedom of speech" does not protect the physical act of speaking, it protects the right to express a dissenting view. The majority requires no explicit protection precisely because it's the majority. But the minority, especially the lone dissenter, *does*. That's why some cities have laws requiring that protestors stand in specific "boxes" when they make their speech - it's partly to prevent others from attempting to drown out their voice!
The same thing applies in cyberspace. If you have *no* moderation and attempt to discuss controversial issues, you *will* have an asshole appear who doesn't mind posting hundreds of marginally pertinent responses to drown out "objectionable" content. Just look at alt.scientology (or something like that) sometime. While it's technically true that the original messages are still there, and it's not an exact analogue of the real-world situation where the lone protester may not be heard at all, in practice few people will bother to search for meaningful content and the protester(s) will have succeeded in suppressing speech.
It's ironic, but sometimes the only way to guarantee that everyone has a voice is to be willing to silence those who would use theirs as a weapon.
consistency and tolerance (Score:2, Insightful)
First of all, let us not act like angry monkeys throwing our feces at each other. Let us not fall into the trap of hostile hypocrisy that only hurts us and our 'causes' more than anything else.
That said, I believe that self filtering/censoring is up to each individual. Some use the phrase, "if you don't like what is posted, don't read it". This is a good if simplistic representation of the entire issue. However, it is often used by those who are frankly nothing but parrots who repeat words without understanding either the words' meanings or the collective meaning behind the phrase, thus relegating it quickly to the knee-jerk cliche trash heap.
I see many situations where this phrase comes in handy. After all, it does no good to get all worked up because of some flamer that is just pathetically attempting to get a rise out of people. But before the rhetoric spouters begin their little crusades of mentioning how "if you don't like what is posted, then don't read it", let us look at what is ruffling the feathers first.
If I have a forum site that polices topics in specific threads, and perhaps even has a 'general thread' for offtopic posts, is it then bad to filter out offtopic posts relative to the section posted in? What if I have only one topic and the stated rules about 'appropriate behavior' clearly let everyone know to keep on subject due to the very nature of the board?
Now, let us say that I police content that is considered uncivilized, like personal attacks, slander, cussing, etc. Is this bad? In this situation, it is easy to see how many defending it would say, "If you don't like it then you don't have to be a part of the forum." See how that sounds so similar? Wouldn't someone who is truly 'tolerant' extend that tolerance towards those that he views as intolerant? Am I to claim enlightenment and tolerance by letting any subject be posted regardless of the topic at hand, or how negatively or positively it is posted, yet ONLY if I agree with said posts? Guess what: that is NOT tolerant. No matter how many fancy words, quotes, etc. I throw at it, it is intolerant by my very own definition. It is the worst sort of individual that cannot even stand the judgement of his own criteria, the criteria he applies so readily towards others.
If you want an open board, then good for you. If you believe that is morally and ethically superior, then continue to do so confident in that knowledge. Let education and your actions inspire others to do the same. If however, you attack others (and I will expand that below) in an attempt to free them, then by your own definition (and that of history's) you are a tyrant. Attacks consist of direct attacks such as slander, malicious statements, etc., but also very much include actions that attempt to shut others down (if you choose for yourself, good for you; if you 'organize' others to sheepishly follow you through fancy words and hateful rhetoric, that is much different). Also included is an inconsistent application of ethics or morals. You must be better than those you attack, and must police yourselves first before you jump on any bandwagons to burn, rape and pillage others.
I am curious how many here have ever defended someone who they do not agree with, but did not wish to see an opponent's rational addition of opinions and ideas be trampled under the draconian boots of some intolerant moderators. I also wonder how many would support laws, people, ideas (ATTACKS) that would take away the choices of forum maintainers and creators to filter their boards for what they themselves believe is important. I then wonder how many of these people that support the above would do so, ironically, under the banner of tolerance and being open minded. How many would admit that they simply wish to get rid of those they do not like or agree with? (It would be more respectful in that case.)
This can be applied to so many other aspects of life too. I remember a time before the draconian laws restricting smoking in many private domains were gaining in popularity. I remember many smokers saying that not only were such laws bad, but the 'constitution' protected smokers from being "oppressed" in private restaurants and the like. Oppressed, for them, meant that I as a shop/restaurant owner could not restrict anyone from smoking. So, once again it became a lawyers' game between two bands of zealots who, when looked upon with even the slightest scrutiny, were seen for what they were... two different shades of brown from the same pile of manure.
If I was in charge, (Score:1)
a) An account is activated 16 hours after application
b) In order to post, one must subscribe
c) All subscribers must supply a valid e-mail address, no other information is collected
d) All subscribers must have a unique e-mail address
e) In case of offensive/inappropriate posts, one might be banned from posting. The ban might be timed or permanent
f) An IP can apply for multiple accounts only if the IP cannot be proved to belong to the same person
These rules can keep abusers away without deleting or editing posts. Since you do not delete/edit any post in any case, you probably won't be responsible for their content. The obvious drawback of this scheme is that an abuser might accumulate a set of accounts, in case one of them is banned. If you can replace rules d & f with better rules that can be more strongly linked to identity, the system would work better. It would not be bulletproof in any case, but I doubt any other mechanism can be, either.
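Rules (a) through (f) are concrete enough to sketch in code. This is only an illustration of the scheme as described -- every class, field, and function name below is my own invention, not anyone's real system:

```python
import time

ACTIVATION_DELAY = 16 * 3600  # rule (a): 16 hours between application and activation

class Registry:
    def __init__(self):
        self.accounts = {}   # username -> account record
        self.emails = set()  # rule (d): each e-mail may back only one account
        self.bans = {}       # username -> ban expiry time (None = permanent)

    def apply(self, username, email, ip, now=None):
        now = now if now is not None else time.time()
        if email in self.emails:  # rule (d): unique e-mail address
            raise ValueError("e-mail already in use")
        # Rule (f) would need some proof that a shared IP really is several
        # people; absent that proof, refuse a second account per IP.
        if any(a["ip"] == ip for a in self.accounts.values()):
            raise ValueError("IP already has an account")
        self.emails.add(email)
        self.accounts[username] = {
            "email": email, "ip": ip,             # rule (c): nothing else collected
            "active_at": now + ACTIVATION_DELAY,  # rule (a): delayed activation
        }

    def ban(self, username, seconds=None, now=None):
        # rule (e): timed ban if `seconds` is given, permanent otherwise
        now = now if now is not None else time.time()
        self.bans[username] = None if seconds is None else now + seconds

    def may_post(self, username, now=None):
        now = now if now is not None else time.time()
        acct = self.accounts.get(username)
        if acct is None or now < acct["active_at"]:  # rule (b): subscribers only
            return False
        if username in self.bans:
            expiry = self.bans[username]
            if expiry is None or now < expiry:       # rule (e)
                return False
        return True
```

Note that `may_post` never touches post content -- which is exactly the poster's point: all the enforcement happens at the account level, so no post ever needs to be edited or deleted.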
Re:If I was in charge, (Score:2)
a) fine, but an abuser can just start laying down new accounts regularly in advance
b) no argument here, but I think anonymous comments in a few select cases are a good thing; we don't want someone being able to subpoena Slashdot for records just because some disgruntled employee has told the truth about his/her company.
c)& d) most ISPs will give you as many aliases as you like
e) one man's offense is another's humour. Censorship is not a good thing
f) With address translation, company proxy servers, etc., it's getting increasingly difficult to tie addresses to specific computers. You COULD perhaps block multiple account setups from the same IP for a limited time [24 hrs maybe], but even this has its problems
Meanwhile, across the Atlantic... (Score:2)
This is a very US-centric discussion so far, it seems. Certainly in the UK, there has been some legal history in this area. Anyone planning on running any sort of on-line message board should be well acquainted with things like the Godfrey vs. Demon case, what constitutes being a "publisher", and so forth. I am not a lawyer, but I suspect that many of the comments made here would hold little water in UK courts with the current legal position, even as unclear as that may be.
Not a good time for censorship.... (Score:1)
Your first mistake. (Score:2)
Your first mistake was running a political weblog. I know, because I made the same mistake [slashdot.org]. Though I still believe that self-regulation can work.
My best advice is to walk away, and let some other sucker take the fall.
Re:Your first mistake. (Score:1)
Re:Your first mistake. (Score:2)
Having the article posted at several sources really pissed them off. They did their best to discredit me, acting like I fabricated the whole thing.
The administration and student government tried to give me $500 to shut up. They confiscated my equipment, removed all references to SOS on posters, sites, etc., plus removed all evidence that I had ever done anything for student government. They instructed all staff in ASUU to call the police if they saw me there.
My wife was expelled from the University of Utah, and now attends school elsewhere. Any mention of the U around her usually brings cursing and condemnation out of her mouth. I was suspended for a period of one year, but I was able to give a big sob story, and was reinstated on strict probation just before fall semester started.
I generally must stay out of computer labs on campus. I was officially banned from all labs on campus for some time, but that has been rescinded for a couple labs. I cannot volunteer for anything, and my graduation has been pushed back one year.
ASUU Student Opinion Survey is gone for good. I can't get a group anywhere near campus to touch the project with a ten foot pole. A few people see me around, but even some of my closest friends from a scant year ago, today refuse even simple eye contact with me.
I lost both of my jobs thanks to the fiasco which ensued, and it took me six months to finally get hired for a workstudy job. I had to convince tens of people that I wasn't going to crack into their computers if they hired me.
I am one of the most notorious and hated people around there.. and I can't wait to get my degree and get the hell out.
Letting the public have an open forum is a great idea, and there need to be more forums. But if you run one and answer to anyone other than yourself... be ready to get fucked.
Re:Your first mistake. (Score:1)
Upon graduation, leave Utah. I think New Hampshire is a pretty nice state.
A legal precedent: No you don't have to (Score:3, Informative)
Simply: Speech on a message board is worthless and not legally binding. If you want freedom of speech, yell out your window - and you're more likely to get in trouble for that.
This was on Tomalak's Realm [tomalak.org] a few days ago.
Newsbytes: California Appeals Court Upholds Message Board Speech [newsbytes.com].
Also another link: SJ Mercury: From November 28, 1999; `Cybersmear' lawsuits raise privacy concern [mercurycenter.com].
PS, please read the articles and understand them. I know it is a very hard thing to do, but I've even made them hyperlinks.
When SHOULD you edit? Whose site is it? (Score:2)
Easy. When you feel that this is not something you want on *YOUR* web site. Use your common sense, your personal morals and values, and stand up for what you believe in.
But, be fair and honest about it. State that this message has been edited, and tell the author *why*.
It's your wall. What graffiti do you want written on it?
mindslip
When the masses get dumb (Score:1)
This is an interesting question. I had a similar problem on a humor website that I run, Knowumsayin' [knowumsayin.com].
We had a feature called the Name Game, in which we would post pictures of weird looking people and users could give them nicknames. We expected some rude comments -- no problems there, but after a while the section was starting to make the whole site look really bad. People were simply posting names like "nigger" on this one black guy. There was nothing clever about it, and most of the names on all the pictures were nothing more than offensive and juvenile.
I know, "offensive" can be subjective. But, here's the catch: we also had a voting system in place so that people could moderate names. And each night names with votes below 5 (on a scale from 1-10) would be automatically deleted.
Well, it didn't help. Very few people were interested in voting on names (except the people that put them up in the first place), so the nasty stuff just stayed at the top. The feature was pretty popular, but only with dumbbells.
So we took it down. That was almost a month ago, and I still haven't found a solution.
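The nightly cleanup described above -- names deleted when their average vote falls below 5 on a 1-10 scale -- is trivial to write; the hard part, as this poster found, is that it only works if enough people actually vote. A rough sketch (the structure and names are my own, not the site's):

```python
def nightly_prune(names, threshold=5):
    """Keep only names whose average vote meets the threshold.

    `names` maps each submitted nickname to its list of votes (1-10).
    Names with no votes at all are kept, which is exactly the loophole
    described above: if nobody but the submitter cares to vote, the
    nasty stuff never gets pruned.
    """
    kept = {}
    for name, votes in names.items():
        if not votes or sum(votes) / len(votes) >= threshold:
            kept[name] = votes
    return kept
```

A variant that deletes unvoted names after a grace period would close the loophole, but at the cost of throwing away legitimate entries on a quiet board; there is no purely mechanical fix for an apathetic electorate.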
Re:When Should Website Moderates Its Users? (Score:4, Insightful)
We do mod comments, yes, but we're fair about it.
I can say this with some certainty because, like all moderations, ours get metamoderated -- so if we start unfairly modding people up or down, we get email a couple of days later letting us know we screwed up!
I can't speak for the other Slashdot editors, but as for me -- of all my mods in the last several months, only two have gotten Unfair judgements. Both were trolls that had posted links that looked like they went somewhere informative but didn't. Apparently the metamoderators didn't bother to check the links, oh well. So I stand by my record of massive Fairness.
Basically I spend mod points where I see that I can save our regular moderators some time. Slashdot gets a lot of crap posted anonymously that is obvious trolling, flamebaiting, or offtopickism, and it would get itself modded down to -1 anyway if we flooded the system with mod points. My taking care of it lets our users focus a little more on picking out what they consider to be the good stuff to mod up, rather than just having a troll cost them a point (and the opportunity to participate in the discussion).
In short, I do a little bit of grunt-work, so that our users can be more choosy and careful, genuinely improving the quality and controlling the tenor of the site. And the built-in feedback of our M2 system will let me know if I ever stray too far from how the users think the site should be run.
Also, for the record, "bitchslap" refers to a specific script [slashcode.com] in the codebase which retroactively sets all of a user's comments to score:-1. Important point: it's only ever been used on user accounts that posted using scripts. And it hasn't been used in months, AFAIK, since the existing moderation/metamod system has been working so well.
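In spirit, the script Jamie describes amounts to one retroactive sweep over a user's comment history. The actual Slash codebase is Perl over a SQL database, so this Python sketch (with invented names) is only an analogy, not the real thing:

```python
def bitchslap(comments, uid, floor=-1):
    """Retroactively set every comment by `uid` to the floor score.

    `comments` is a list of dicts with 'uid' and 'score' keys; the real
    script presumably does the equivalent with a single SQL UPDATE
    restricted to one user id.
    """
    for c in comments:
        if c["uid"] == uid:
            c["score"] = floor
    return comments
```

The key property is that it touches only scores, never text: even this bluntest of instruments edits visibility, not content, which is consistent with the "moderate, don't alter" position taken elsewhere in this thread.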
Re:When Should Website Moderates Its Users? (Score:1)
Re:When Should Website Moderates Its Users? (Score:4, Insightful)
Jamie, my former colleague, you may be most sincere, but there is a logical flaw in your argument. To wit: I doubt there is anyone who would ever post:
This is the classic "Who watches the watchers?" question. In one's own mind, almost certainly, everything one does is fair. This is not to criticize you personally. However, I think you miss the fact that your statement doesn't establish anything objectively. Again, the logical flaw is this: suppose you didn't care what those e-mails said? Suppose you believed you were RIGHT, and any email simply failed to recognize your obvious correctness?
"For Brutus is an honourable man; So are they all, all honourable men," (Marcus Antonius meant that sarcastically, the idea being that even if Jamie, err, Brutus, was an honorable man, it didn't necessarily mean that the other editors, err, Romans, were honorable men).
Suppose a skeptical person doubted your philosopher-king status? For example, we know that Michael Sims had a very different view of the "fairness" of his actions with regard to slamming down comments about his destruction of the censorware.org website [google.com]. He would undoubtedly argue that all his actions were justified, that every comment he slammed as a troll was a troll, and so on. This is the essence of the conflict of interest. I know some of the anti-spam activists have doubts about comments of theirs criticizing your coverage, which got marked down. Can you blame them for their doubts? (Even if you are in fact an honorable man.)
Y'know, you may not realize it, but Slashdot looks a lot different from "down here". Especially when one thinks an editor is abusive about an issue which affects one personally.
I have suggested that editorial moderations be clearly marked [slashdot.org]. And I agree with other (anonymous) writers here that the fact that editors have infinite moderation points (which of course they only use morally, justly, and with great wisdom ...) deserves mention in the FAQ. These changes would alleviate some understandable distrust.
Well, I've rambled, perhaps way too much here. Too many topics which struck a chord in me, and perhaps it's not worth the effort. But definitely, I suggest again making clear where editorial moderations have been done.
Sig: What Happened To The Censorware Project (censorware.org) [sethf.com]
Re:When Should Website Moderates Its Users? (Score:1)
I suggested that this was a bug, because the Messaging system would say that you have been Moderated by a User when an Editor moderates you.
The response was that there are no Editors at Slashdot, only Users. You can read the full bug report by clicking the link in my signature.
But suffice it to say you're not the only one who thinks this would be a great idea.
Re:When Should Website Moderates Its Users? (Score:1)
Jamie, I must respectfully nitpick. When the bitchslap script was used on this account, it was because I had posted over 100 comments at the +2 level by clicking on submit a whole bunch (the multipost bug). I did not use a script to do this, and I was bitchslapped to -25 karma anyway.
I'm not complaining here about fairness or unfairness, per se, but rather about the accuracy of your comment. I was IP banned and bitchslapped even though I had not used a script, nor broken your "scripted abuse" rule. If you can point me to the rule I broke, I will publicly stand corrected. Furthermore, I have never ever used a script to access Slashdot. Please understand that I am not calling you a liar; you may have been unaware of these facts. I just hope that you can be more careful with the facts in the future.
Anyway to everyone who is worried about editors moderating posts, stop worrying. Editors are users too, they only get 5 mod points just like us. You can read more about this in the link in my signature.
That's all,
--sllort.
It's too late. (Score:1, Redundant)
Depends on the mission, IMO (Score:1)
In general I abhor censorship, but censorship is only relevant IMO when you're talking about an open public forum. Editorial judgment is quite appropriate and even necessary in a forum with a specific purpose.
Re:Depends on the mission, IMO (Score:1)
> some people don't see relevance anywhere, whilst others see everything as interconnected issues
Well, I think the point here is that if you are running a site with a particular focus, and you take what you're doing seriously, then it's your call what's inside the fence and what isn't. If you see relevance everywhere, or hardly anywhere, it's up to you to make the call. If the folks who visit the site don't agree with your judgment, they'll let you know, and if you are consistently an idiot then they'll split, and you'll wind up conducting a monologue. But if your judgment is reasonably close to theirs, then the community will thrive.
The bottom line is whether it's your site or an unmoderated public site. If it's your site, then you have the right, ability, and duty to keep the discussion on topic. In my experience, a well-moderated site has very little trouble with abuse, yet still maintains an open forum with no perception of censorship, because the posts that get removed are the ones that piss everybody off.
But again, this works in direct proportion to how focused the topic is for the site. If you're discussing banana farming, to use your example, and everybody on the site is interested in banana farming, it will be obvious to everybody if some newcomer starts talking trash and should get booted.
Re:Slashdot itself needs a few user deletions (Score:1)
Re:Slashdot itself needs a few user deletions (Score:1)
I don't want policed OR traditional, for the record. I just thought that putting the trolls first was, well: "way over the line. You're so far past the line, you can't even see the line.. THE LINE IS A DOT to you."
Hey, I like to read a good troll occasionally, but let's not lose our heads and say "tradition above all".. I think Rome fell that way. Continue to change for the better, or stagnate, right?
Besides, you're talking about San Francisco. That's where trolls come from, right? Most people who live there look and act like trolls, anyway.