


Ask Slashdot: Is There an Ethical Way Facebook Can Experiment With Their Users? 141
An anonymous reader writes: This summer, news broke that Facebook had conducted an experiment on some of their users, tweaking which posts showed up in their timeline to see if it affected the tone of their later posts. The fallout was extensive — Facebook took a lot of flak from users and the media for overreaching and violating trust. (Of course, few stopped to think about how Facebook decided what to show people in the first place, but that's beside the point.) Now, Wired is running a somewhat paranoid article saying Facebook can't help but experiment on its users. The writer says this summer's blowback will only show Facebook they need to be sneakier about it.
At the same time, a study came out from Ohio State University saying some users rely on social media to alter their moods. For example, when a user has a bad day, he's likely to look up acquaintances who have it worse off, and feel a bit better that way. Now, going on social media is going to affect your mood in one way or another — shouldn't we try to understand that dynamic? Is there a way Facebook can run experiments like these ethically? (Or Twitter, or Google, or any similarly massive company, of course.)
Let me handle this one guys... (Score:1)
No.
Re:Let me handle this one guys... (Score:5, Insightful)
Actually, if they got VOLUNTEERS, it would be ethical.
As with experimental drug studies, the subjects need to know they are being experimented on in order for it to be ethical.
Re:Let me handle this one guys... (Score:4, Insightful)
They did get volunteers. Everyone who uses FB is a volunteer. They have agreed to FB's terms of service, which allows FB to do this.
Don't like how they run the show? Then don't use FB. If enough other people don't, it will die the death it deserves. But if you keep allowing all your data to be sold for profit, they'll keep doing so. It's not that complex of a concept.
Re:Let me handle this one guys... (Score:4, Insightful)
Your point might technically cover Facebook legally, but Zuckerberg has a history of proving that not all legal acts are ethical. It's the philosophy that his career is founded upon.
The ethical way to do research is to get explicit, informed consent. If you tried at a university to pass off that consent in a TOS buried in a site related to totally different subject matter then you'd probably be expelled.
Re: (Score:3)
Re: (Score:2)
There's this thing in the scientific community called "informed consent" - basically, if you're going to experiment on someone, you have to explicitly say what you're doing for every single experiment
Not really, no. You can completely change the experiment (new primary and secondary endpoints, etc.) without getting the subjects' consent. The samples and data collected for one experiment can be used for unspecified future experiments, so long as that possibility is disclosed. The legalities are getting tighter, but unless you are related to Henrietta Lacks (HeLa cells), blanket consent for use of samples, complete with intellectual property rights, is still the norm.
Buyer beware (Score:1)
Facebook's very existence is based on extracting and modeling user data. If you are too naive to understand this and what it implies, then you should not be using a computer. That's why 13-year-olds are off limits. How did you come to believe that Facebook was a charity?
Re: (Score:1)
It may be that Facebook's business strategy results in something other than what most people use it for, and it may be that the uses it is put to should drive its design more than its business strategy. That implies that somebody ought to bury Facebook, develop an alternative that undoes the wrongs in the design and that undercuts the economics of the design. It takes an analysis of the model Facebook uses and a means to undercut its viability and offer an alternative that gains traction by better supporti
Re: (Score:2)
Sadly, the US doesn't seem to have a "reasonable expectation" clause in its consumer protection laws.
Re: (Score:2)
I, for one, see a difference between "nanny state" and "not having to get a law doctorate just to buy something if I don't want to get ripped off".
Re: (Score:1)
Yes, and Fuck You Too! America is a business-powerful state, not a democracy, not even a decent representative republic, because business has TOO much power to influence law and taxation, the members of Congress are not bought by workers, but by billionaires and an increasingly powerful plutocracy who would destroy personal expression and political freedom if they could. Facebook is a good example of how one elitist, spoiled, rich kid, sociopathically manipulates people. The model is Harvard and the Face B
Re: (Score:3)
For it to be ethical it would have to be a very clear opt-in procedure, not something buried in a ToS somewhere.
Re: Let me handle this one guys... (Score:1)
Re: (Score:2)
No they didn't. That's not how informed consent works.
Re: (Score:3)
TOS-speak is not accepted by anyone in any of the various corners of science that perform human subjects research.
Nope! When subjects volunteer for a specific study or survey, sure. But most consumer research doesn't require informed consent unless there is direct interaction between the researcher and the subjects. A/B advertising research, OkCupid's experiments ... informed consent is buried in the boilerplate or completely absent. That's even partially true for some medical research: you aren't required to give informed consent for each research project that uses a sample taken from you if you gave a blanket consen
Re: (Score:1)
Facebook performed a specific study and published it in an academic journal. It wasn't consumer research, nor was it passive observation - they intentionally subjected particular customers to an intervention in order to (try to) prove something about emotional contagion. They acknowledge in the paper that they *know* such an intervention could cause adverse events. If you think that's the same as A/B testing to decide where a button goes, you're completely insane.
Re: (Score:1)
Re: (Score:2)
How about opt in? (*Not* opt out, because companies can be sneaky about where and how to opt out.)
If someone wants to be experimented on, who are we to say no? But that would be the only way. And if this muddies the data, too bad.
Comment removed (Score:5, Insightful)
Re: (Score:3)
It depends on what everyone considers ethical.
I mean Facebook itself is an advertising platform. You as a user really should know that. It sells advertising, you get a place to tell your friends how much you like your cat.
Knowing this, you need to expect Facebook to try to get the best possible ads in front of your face so they can sell these ads for more money to the buyers.
Now Facebook isn't hurting people and the fact that the ads are labeled as such and not tricking you into thinking your friend endo
Re: (Score:2)
Don't confuse ethical for moral.
People do A/B testing all the time; that's perfectly legitimate and therefore ethical. What's different when Facebook does it?
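For context, the kind of A/B testing the comment refers to usually amounts to nothing more than a deterministic random split of users into variants. A minimal sketch (the function and parameter names here are invented for illustration, not any real platform's API):

```python
import hashlib

def ab_bucket(user_id: str, experiment: str, variants=("A", "B")) -> str:
    """Deterministically assign a user to a test variant.

    Hashing the user ID together with an experiment name keeps the
    assignment stable for each user, but independent across experiments.
    """
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    return variants[int(digest, 16) % len(variants)]
```

The ethical question in the thread is not about this mechanism itself, but about what the two variants are allowed to differ in.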
Re: (Score:2)
Because there is seldom anything ethical in the law.
Laws are external regulators, ethics come from within.
External regulation requires a host of legislation, enforcement and judiciary.
Internal regulation is self contained, costs nothing and wastes no time of others.
Yes, Facebook needs to be ethical over law abiding. I don't care to throw money at it to keep it law abiding. Money needs to go to IMPORTANT things instead.
Ethinomics (Score:1)
Because that's what people expect from every person and every company in the world?
Moreover, in case of experiments there are traditionally institutionalized ethical codes above the law in the form of scientific honor committees and ethics commissions as part of the (supposed) academic self-control in science. For example, a university can take away your PhD degree in case of plagiarism even if it has no direct legal implications because e.g. no copyright claims are made and the plagiarizer has not sworn an
Re: (Score:1)
Informed consent? (Score:4, Informative)
Re: (Score:1)
The thing is that the order of posts you get etc is always the effect of applying some algorithm
If FB only messed with the ordering in which posts are displayed, folks wouldn't be so pissed. But FB filters posts, and has ever since they started accepting "sponsored" posts, and that means they are hiding information that your friends tried to show to you.
When they start hiding information to fuck with your mental state for any purpose other than convincing you to buy a product/brand, then they become a culpable actor and potentially contributory to injury or death. It isn't the scientists inside FB w
Re: (Score:2)
When they start hiding information to fuck with your mental state for any purpose other than convincing you to buy a product/brand, then they become a culpable actor and potentially contributory to injury or death.
Now that's an interesting distinction to make. If a marketing strategy was actually injurious, would it still get a bye? Also - this research was still ultimately aimed at getting people to use a consumer product (Facebook) as profitably as possible. So shouldn't it be eligible for that exemption?
Re: (Score:2)
The funny thing is, this goes back to how facebook originally was set up. People wrote on each others walls. People had to go to a friend's page to see what they've done.
Then they added the News Feed, to compete with Twitter, to help people update. Then it became overwhelming. Assume random people have 500+ friends. Those friends update 2-3 times a day. You need an algorithm to filter what is going on.
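The filtering problem this comment describes can be sketched in a few lines. The scoring weights below are invented purely for illustration; they bear no relation to Facebook's actual (undisclosed) ranking signals:

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

@dataclass
class Post:
    author: str
    text: str
    posted_at: datetime
    likes: int = 0

def rank_feed(posts, now, limit=50):
    """Score each post by recency and engagement, keep the top `limit`.

    Illustrative only: a real feed ranker would combine far more
    signals, which is exactly why users can't see what was filtered out.
    """
    def score(post):
        age_hours = (now - post.posted_at).total_seconds() / 3600
        recency = max(0.0, 24 - age_hours)   # newer posts score higher
        return recency + 0.5 * post.likes    # engagement bonus
    return sorted(posts, key=score, reverse=True)[:limit]
```

Even this toy version shows the point made elsewhere in the thread: any choice of weights already "manipulates" what you see.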
Re: (Score:3)
Exactly.
The experiment should be opt in.
NOT opt out or in Fazebook's (sic.) case "not given a choice to opt in/out."
Re: Informed consent? (Score:1)
The Nuremberg Code (yes, that Nuremberg) begs to differ.
Facebook hurts the Internet (Score:3, Informative)
Stop using it.
Re: Facebook hurts the Internet (Score:1)
Please tell me an alternative that's popular but doesn't fuck its users. Hint: you won't find one.
Re: (Score:3)
Re: (Score:3)
That is a somewhat clunky lifestyle, which in turn is the reason why so many want to be on Facebook despite the datamining concerns.
Is it? A few 5-minute phone calls, a few text messages, a few emails as opposed to people watching a deluge of irrelevant posts for an hour or more a day and not being able to remember 99% of them? Interrupting meals (as *real* time to socialize) to check facebook, to post, etc. because they might "miss something."
Re: (Score:2)
I fear for the human race (or at least the part that is logged into Facebook).
Using a modicum of time and energy (and not much of either) to keep one of the most important aspects of your life (social interactions) intact?
Oh look! A squirrel!
Re: (Score:2)
There are other ways to communicate with people such as email and the Real World(tm).
Re: (Score:2)
Pffft email? Real men keep in touch with people they don't like via telegram.
Re: (Score:2)
Re: (Score:2)
And there won't be one until people start using something else.
Re: (Score:1)
And you won't build it? What ideas do you have for the alternative? I have many, see my other posts in this thread. In a nutshell, I would use a less monolithic CMS, regionalize it, distribute it, cloud it. Secondly, I would allow for more conversation complexity. I would allow for Markdown format in replies with quoting of previous comments and allow for topic change. I would allow for users to define their own layout and themes. Facebook had to be extremely stingy with comment formats, ruthlessly squeezi
Re: (Score:2)
Because I don't want to be the weird guy who insists on using e-mail for communications. I would just be a nuisance to everyone.
Funny. In my circle of friends, anybody who would insist on communicating only via Facebook would be the "weird guy" (actually, as far as I know, most of them do not even have a Facebook account, specifically because of privacy concerns). We communicate via e-mail, or, if we need to talk to someone Right Now, via telephone. It works, try it.
Re: (Score:1)
That is easy, but if you want to keep track of family, for example, it is quite hard to ignore it. I use it with greater and greater caution, and I don't post much to it. I believe that FB is totally manipulative and that most of its features are annoying. I'd like to leave it, but what I would really like is competition with it. I'd like to see someone come along and destroy its business and drive it out of existence. I have thought about how I think it fails and why. I know that someone could do a much b
Every commercial website experiments on its users (Score:3, Funny)
Including this one. [slashdot.org]
Re: (Score:1)
MOD THE PARENT UP!
No way. He linked to Beta!!!! I'd rather mod up a goatse link.
"They trust me — dumb fucks" (Score:5, Informative)
-Zuckerberg.
http://gawker.com/5636765/face... [gawker.com]
if you give FB the power.... (Score:1)
If you make FB the gatekeeper of all your communications, don't bitch when they act as such.
Yes (Score:5, Insightful)
Yes, it's very simple:
"Dear Facebook user, Facebook is conducting a study to better understand our users. The study will last 2 weeks and no personally identifying information will be recorded. You will likely not notice any difference to Facebook while it's going on. By helping with this study you will help to improve Facebook for everyone! Do you consent to be a part of this study?" Y/N
It's called "informed consent"
Re: (Score:1)
Re: (Score:3)
This. Almost. When someone is made aware that they are being observed, their behaviour changes. The study is biased from the outset.
Re: (Score:2)
Re: (Score:2)
Simply ask for permission without disclosing the nature of the study or the objectives.
Of course, nobody can be expected to use a free single-provider service and honestly agree to the terms of such use without fraudulently agreeing to a contract they have no intention of adhering to. Facebook is a human entitlement at this point, like water in Detroit. The courts should create an obligation on the part of the Facebook employees (enforced by the gun squad of the Marshal's Office) to provide that service t
Re: (Score:1)
Simply ask for permission without disclosing the nature of the study or the objectives. This is how scientists do it (of necessity) when they get a bunch of subjects in a room for a test. The subjects perform their tasks without knowing what the actual study is, hence there's little or no bias.
To be more exact: merely watching somebody who is in an environment where being watched doesn't require permission. The moment you start manipulating their environment in ways like Facebook did, or observing them in places that are supposed to be private, you need permission. You are however not obligated to tell them before the fact the true nature of the experiment, if they ask after (or if you're just observing a public space, while) you give an honest answer, and it's always going to go past the rev
Re: (Score:2)
Remember, it's a free service. As long as you're not paying Facebook anything for it, you don't have a right to anything - Facebook is under no obligation to give you anything since you aren't providing them compensation. They can put whatever restrictions or requirements on the service that they want, including being able to perform A/B testing. Your only choice if
Re: (Score:2)
I said informed consent. Anyone doing a scientific study knows exactly what that means. The fact that some chose to do previous studies without informed consent is rather shocking. Intentionally manipulating people's lives and decisions is despicable. I'll never be in marketing for that very reason. But scientists doing it? That's unforgivable. They should be sanctioned by whatever body it is that governs their work.
Don't see the problem. (Score:2)
Doctors give placebos to deathly sick patients all the time for research purposes.
Patients agree beforehand, just as FB users do - and those aren't even (that) sick.
Re: (Score:1)
This. FB users have already agreed to this, so there is no ethical issue.
If you don't want to agree to FB's terms, then by all means, don't use FB. Nobody is forcing you to.
Re: (Score:2)
Alright - then show me the signed waivers from the Facebook users they experimented on showing that they have given the INFORMED CONSENT considered a standard requirement for human experimentation, be it physical or psychological. A paragraph buried down in the terms of service nobody reads doesn't cut it.
Your feeeeeeelings? (Score:1)
Awww poor baby.
The Hound (Score:3)
To quote The Hound..."Social media is for cunts."
solution: call it something else (Score:2)
Facebook is crap for a number of reasons, but not because they do what most people with webpages do.
Hmmm (Score:4, Funny)
For example, when a user has a bad day, he's likely to look up acquaintances who have it worse off, and feel a bit better that way.
That sounds automatable. Schadenfreude, the browser extension.
Re: (Score:2)
Everything is an experiment (Score:1)
Carl Sagan once commented (can't recall where) on the general aversion people hold toward the government conducting experiments in public policy. He then detailed how every change of law was an experiment of sorts, although often without proper controls.
If Facebook should be required to inform consumers of how they experimentally manipulate them, then should Kellogg's reveal the details of how they use marketing to manipulate kids into buying Froot Loops?
Re: (Score:2)
That would be like pushing an experimental kernel as a stable one. Public policy should be the result of experiments, not the experiment. Though I don't think this happens.
We all know (or should know) public policy is written by issue specific think tanks with an agenda. An experiment presumes a curiosity about the outcome. People writing policy know which way they want "the experiment" to go. When HCI/Brady group write gun control legislation, they have no desire to see the net effects of their legislation
ethical science (Score:2)
...where the subject's behaviour changes when he knows he's being observed is the simple expedient of not letting the subject know he's being experimented on.
The simple solution to this is to append a clause to the TOU that allows Facebook to conduct blind studies into behaviour by shaping traffic *on a randomised and anonymous basis*. Individual users are NOT informed when their account is being used in a study - they've already agreed to let it happen.
*For definition of "randomised and anonymous" in cont
Re: (Score:2)
No, you don't get to blanketly avoid informed consent simply because it makes your experiment *hard*.
You ask an IRB for a waiver, each time, like you're supposed to. The whole point is that you are not supposed to be running experiments *on humans* without supervision. We've had way too many problems with that in the past, and so the requirement of getting a 3rd party to sign off first was invented as an incentive against running unethical experiments.
For some reason, there are a lot of people that are
Random (Score:2)
Re: (Score:2)
It's not beside the point (Score:5, Insightful)
(Of course, few stopped to think about how Facebook decided what to show people in the first place, but that's beside the point.)
No, it's the entire point. Your stream, to the extent that it's "your" stream, is already manipulated in ways you're not told, based on what Facebook thinks you will find interesting, funny, or engaging enough to come back and see more ads. Experiments have to happen as part of Facebook's desire to expand, if only to see which manipulations mean more ads displayed. The only difference is that now the people who are interested in something closer to science than ad sales at Facebook won't tell us about their results. Are you happy?
No. (Score:2)
When asking these questions, stop using "Facebook" and substitute "Facebook stockholders."
That answers that question.
Get approval of an IRB, like everybody else (Score:3)
The whole point here is that you need somebody else who is not the experimenter to sign off on the experiment when humans are involved. We call those people the IRB.
What's that, tech industry? You think A/B testing would be impacted? That's a popularity contest, not an experiment. Facebook's experiment had the specific goal of being able to manipulate the emotions of their users, which goes far beyond simply asking which website layout they find more attractive or useful.
What's that, tech industry? You think it would take way too much time if you had to get approval for experiments? Then throw together a multi-company group to found your own IRB. I'm sure there are universities that would be willing to partner with that group to lend their advice and help the group get started quickly.
What's that, tech industry? You think that there is no way you could conduct your experiments if you had to get proper informed consent (which has specific criteria - an EULA or TOS does not count)? First: welcome to the club. Sometimes, doing proper and ethical experiments is hard. Many disciplines have to deal with that, and I guarantee it is easier to find alternative ways to test your theories about "social media" than it is for the psychologist trying to investigate complex mental health issues, and both of those areas of research get to skip the whole "untested, unknown, and probably horribly dangerous new drug" mess that some doctors have to find a way to test without killing the participants.
Worse - and this betrays the total and complete ignorance of the people at Facebook who ran this experiment - if they had bothered to ask an IRB like you are supposed to, there is a good chance that some requirements, such as having to get informed consent in advance, could have been waived. Their experiment simply wasn't that risky, compared to most experiments involving human testing.
TL;DR - If the tech industry decided to work with the process and bothered to ask an IRB, they would have avoided a lot of bad PR. Their failure to do this - and their insistence afterwards that even a trivial "trust but verify" is the kind of thing that only applies to *other people* - only serves to make people fear the entire industry. Justifiably. Would you want to buy stuff from people who avoid every ethics regulation?
Of course, I haven't addressed any of the state laws, some of which have even stronger requirements...
Re: (Score:2)
hear, hear.
Yes. It's called "informed consent." (Score:1)
That's all there is to it.
Re: (Score:2)
Changing how your website performs text output is not experimenting with users. It's really annoying when /.ers start buying into a misnomer. There's no need for consent when I move a button, nor when Facebook changes an algorithm. Take a breath and reconsider.
Re: (Score:2)
This is true, of course.
Which is why so many people are angry at Facebook - they went far beyond simply changing the layout or pagination or similar features. Instead, they set out to see if they could manipulate the emotions of their users, by the indirect means of selecting bits of content that the user requested that had certain properties, and changing how they would be presented (effectively hiding those items for the testing period).
Now, they may have intended to improve users' emotions. By most externa
Re: (Score:2)
> Of course not. That kind of change is totally off topic.
I totally disagree. It's specifically the same. Many companies regularly perform macro/micro experiments (Digg, /. beta, etc), if we are going to call them such. There is only a question of degree and depth of analysis. You should really take up the ethical ramifications of paint colors chosen by market chains to influence human behavior.
> Why is it that people who are supposedly highly educated, experienced [observation: your low /. UID] and us
Re: (Score:2)
> There is only a question of degree and depth of analysis.
I take it you didn't read the first half of my post. That is not the only difference between what is effectively a passive popularity contest that you happen to measure in revenue or subscriptions and an attempt to effect a change in emotions as an active result of my actions as a specific goal. Seriously, this is not a complicated distinction.
> The Common Rule does not apply here.
I see you are not up to date on various state laws. I suggest
Re: (Score:2)
> I take it you didn't read the first half of my post. Seriously, this is not a complicated distinction.
I don't agree with the distinction. I stand by my statement. There is only a question of degree and depth of analysis.
> Remember, the entire point I'm trying to make
That's not what you are communicating and obviously not your point, as the majority of what you're saying is trying to convince me that my conclusions are spurious.
> So, not only are you not listening to what those people are saying,
Something better than facebook? (Score:2)
Anonymized From The Start (Score:2)
Sure: take an anonymized, randomized set of users at the beginning, ensure they remain anonymous throughout the study, then do the study. The question is whether FB can truly anonymize the data they are studying. I would place a wager that they cannot. There is so much information creep in FB that anonymizing the data may not be possible.
Second solution: give the research projects to people who truly have no interest in the data or the results.
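Even the first step is harder than it sounds. A minimal sketch of the kind of de-identification described above (field names are hypothetical): salted hashing removes the direct identifier, but, as the comment argues, rich behavioral data can often still be re-identified, so this is pseudonymization rather than true anonymization.

```python
import hashlib
import secrets

def pseudonymize(records, salt=None):
    """Replace user IDs with salted hashes before handing data to researchers.

    Caveat (per the comment above): this strips the direct identifier only;
    the remaining behavioral data may still allow re-identification.
    """
    salt = salt or secrets.token_hex(16)  # discard the salt to prevent reversal
    out = []
    for rec in records:
        rec = dict(rec)  # copy, so the original records are untouched
        digest = hashlib.sha256((salt + rec["user_id"]).encode()).hexdigest()
        rec["user_id"] = digest[:16]
        out.append(rec)
    return out
```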
Re: (Score:2)
I for one welcome our Schadenfreuding overlords!
what "rights"? (Score:2)
For all we know it could be designed as a parody website, and shows us things that our friends looked at, modulated by some sarcasm filter.
Everything that Facebook shows us is an experiment. And you object because they adj
Re: (Score:2)
Isn't anybody screening these calls? (Score:1)
Yes: anonymized, "read only" data (Score:2)
IMO, it's ethical to collect and use data on people that has been stripped of identifying information -- census data, for example, is a major element of sociology research. You still need an institutional review board, but it can be OK. Where Facebook went wrong was by changing things for people to try to manipulate them.
In short: anonymous "read only" experiments on human subjects are OK; "read/write" experiments are a no-go without explicit individual consent and monitoring.
("But if we can't manipulate
Sure. Obtain consent first. (Score:2)
That is all... just ask people to opt in. If they do... go crazy with it. And try to reward them proportionately for it.
Never (Score:1)
Sure (Score:2)
All they have to do is put up a new login/greeting page that reads "Welcome to Facebook, test subject 42. All your actions are belong to us."
Re: (Score:1)
They already are... (Score:1)
pondering the effects of their site.
testing it
publishing results
great - hopefully the insane backlash won't stop them showing future tests publicly.
if you don't want FB to find out the effects of their site on you... don't use it.
What about FB's meme-munching "data scientists"? (Score:1)
YES (Score:1)
Re: (Score:2)
No, they haven't. That's why we have the concept of "informed consent" instead of just "consent".