
Ask Slashdot: Is There an Ethical Way Facebook Can Experiment With Their Users? 141

An anonymous reader writes: This summer, news broke that Facebook had conducted an experiment on some of their users, tweaking which posts showed up in their timeline to see if it affected the tone of their later posts. The fallout was extensive: Facebook took a lot of flak from users and the media for overreaching and violating trust. (Of course, few stopped to think about how Facebook decided what to show people in the first place, but that's beside the point.) Now, Wired is running a somewhat paranoid article saying Facebook can't help but experiment on its users. The writer says this summer's blowback will only show Facebook that it needs to be sneakier about it.

At the same time, a study came out from Ohio State University saying some users rely on social media to alter their moods. For example, when a user has a bad day, he's likely to look up acquaintances who have it worse off, and feel a bit better that way. Now, going on social media is going to affect your mood in one way or another — shouldn't we try to understand that dynamic? Is there a way Facebook can run experiments like these ethically? (Or Twitter, or Google, or any similarly massive company, of course.)
  • by Anonymous Coward

    No.

    • by flyneye ( 84093 ) on Sunday October 05, 2014 @08:59AM (#48067527) Homepage

      Actually, if they got VOLUNTEERS, it would be ethical.
      Kind of like experimental drug studies: the subjects need to know they are being experimented on for it to be ethical.

      • by Anonymous Coward on Sunday October 05, 2014 @09:14AM (#48067581)

        They did get volunteers. Everyone who uses FB is a volunteer. They have agreed to FB's terms of service, which allows FB to do this.

        Don't like how they run the show? Then don't use FB. If enough other people don't, it will die the death it deserves. But if you keep allowing all your data to be sold for profit, they'll keep doing so. It's not that complex of a concept.

        • by duck_rifted ( 3480715 ) on Sunday October 05, 2014 @10:41AM (#48067927)
          That's duplicitous. People use Facebook to keep in touch with family and play little web video games. We both know that grandma isn't going to fetch her spectacles to read pages of legalese in tiny print.

          Your point might technically cover Facebook legally, but Zuckerberg has a history of proving that not all legal acts are ethical. It's the philosophy that his career is founded upon.

          The ethical way to do research is to get explicit, informed consent. If you tried, at a university, to pass off consent buried in the ToS of a site about totally different subject matter, you'd probably be expelled.
          • by pepty ( 1976012 )
            But Facebook is just doing the same things other consumer product/service providers do to their customers. It was only newsworthy because it was Facebook.
          • by Anonymous Coward

            Facebook's very existence is based on extracting and modeling user data. If you are too naive to understand this and what it implies, then you should not be using a computer. That's why 13-year-olds are off limits. How did you come to believe that Facebook was a charity?

            • It may be that Facebook's business strategy results in something other than what most people use it for, and it may be that the uses it is put to should drive its design more than its business strategy. That implies that somebody ought to bury Facebook, develop an alternative that undoes the wrongs in the design and that undercuts the economics of the design. It takes an analysis of the model Facebook uses and a means to undercut its viability and offer an alternative that gains traction by better supporti

        • Sadly, the US doesn't seem to have a "reasonable expectation" clause in its consumer protection laws.

        • "Sign up for other services and you volunteer to participate in random experiments" is NOT ethical.

          For it to be ethical it would have to be a very clear opt-in procedure, not something buried in a ToS somewhere.
        • Um, no. I don't buy this line of reasoning. People do not agree to participate in experiments when they sign up for social media. Even if, by some legal handwavery, this can be construed through an acceptance of the terms of use, it's still unethical. Facebook is a big company. If they do not hold themselves to high ethical standards then perhaps they need to be held to those standards by the society upon which they are constituted. So to answer OP: yes, Facebook can ask users if they want to participate
        • No they didn't. That's not how informed consent works.

    • How about opt in? (*Not* opt out, because companies can be sneaky about where and how to opt out.)

      If someone wants to be experimented on, who are we to say no? But that would be the only way. And if this muddies the data, too bad.

  • Comment removed (Score:5, Insightful)

    by account_deleted ( 4530225 ) on Sunday October 05, 2014 @08:31AM (#48067453)
    Comment removed based on user account deletion
    • It depends on what everyone considers ethical.
      I mean, Facebook itself is an advertising platform. You as a user really should know that. It sells advertising; you get a place to tell your friends how much you like your cat.
      Knowing this, you need to expect Facebook to try to get the best possible ads in front of your face so they can sell those ads for more money to the buyers.
      Now, Facebook isn't hurting people, and the fact that the ads are labeled as such and not tricking you into thinking your friend endo

    • Don't confuse ethical for moral.

      People do A/B testing all the time; that's perfectly legitimate and therefore ethical. What's different when Facebook does it?
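
      For what it's worth, the mechanics are simple. A minimal sketch in Python (hypothetical names; assumes assignment works by hashing a stable user ID):

          import hashlib

          def ab_bucket(user_id, experiment, treatment_fraction=0.5):
              # Hash (experiment, user) so each user gets a stable,
              # per-experiment assignment with no extra state stored.
              digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
              score = int(digest[:8], 16) / 0xFFFFFFFF  # map onto [0, 1]
              return "treatment" if score < treatment_fraction else "control"

          print(ab_bucket("user-12345", "new-feed-layout"))  # 'treatment' or 'control'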

  • Informed consent? (Score:4, Informative)

    by DeadDecoy ( 877617 ) on Sunday October 05, 2014 @08:37AM (#48067469)
    Scientists request it all the time through their institutional review board. This isn't really a complex issue, which is why the approach Facebook took is considered underhanded and skeevy.
    • Exactly.

      The experiment should be opt in.

      NOT opt out, or in Fazebook's (sic) case, "not given a choice to opt in/out."

  • by koan ( 80826 ) on Sunday October 05, 2014 @08:38AM (#48067473)

    Stop using it.

    • by Anonymous Coward

      Please tell me an alternative that's popular but doesn't fuck its users. Hint: you won't find one.

      • by S.O.B. ( 136083 )

        There are other ways to communicate with people such as email and the Real World(tm).

      • "Please tell me an alternative to cake that's sweet but won't make me fat." ... ?
      • by Imrik ( 148191 )

        And there won't be one until people start using something else.

      • And you won't build it? What ideas do you have for the alternative? I have many; see my other posts in this thread. In a nutshell, I would use a less monolithic CMS, regionalize it, distribute it, cloud it. Secondly, I would allow for more conversation complexity. I would allow for Markdown format in replies, with quoting of previous comments, and allow for topic change. I would allow users to define their own layout and themes. Facebook had to be extremely stingy with comment formats, ruthlessly squeezi

    • That is easy, but if you want to keep track of family, for example, it is quite hard to ignore it. I use it with greater and greater caution, and I don't post much to it. I believe that FB is totally manipulative and that most of its features are annoying. I'd like to leave it, but what I would really like is competition with it. I'd like to see someone come along and destroy its business and drive it out of existence. I have thought about how I think it fails and why. I know that someone could do a much b

  • by Anonymous Coward on Sunday October 05, 2014 @08:40AM (#48067477)

    Including this one. [slashdot.org]

  • by koan ( 80826 ) on Sunday October 05, 2014 @08:40AM (#48067479)

    -Zuckerberg.
    http://gawker.com/5636765/face... [gawker.com]

  • by Anonymous Coward

    If you make FB the gatekeeper of all your communications, don't bitch when they act as such.

  • Yes (Score:5, Insightful)

    by Charliemopps ( 1157495 ) on Sunday October 05, 2014 @08:48AM (#48067495)

    Yes, it's very simple:

    "Dear Facebook user, Facebook is conducting a study to better understand our users. The study will last 2 weeks and no personally identifying information will be recorded. You will likely not notice any difference to Facebook while it's going on. By helping with this study you will help to improve facebook for everyone! Do you consent to be a part of this study?" Y/N

    It's called "informed consent"

    • This. It takes a minute to type up, less to send out to any concerned parties. Guaranteed, the vast majority of people, presented with this message, would click "yes". Most understand that Facebook is a business, and needs to do certain things to keep that business successful. Now, if you don't like that... stop using it. You could always go to Google+ (the only reason it isn't successful is because you all are too lazy to learn something new), or just message the real friends you have, or just stick with T
    • by ihtoit ( 3393327 )

      This, almost. When someone is made aware that they are being observed, their behaviour changes. The study is biased from the off.

        • Simply ask for permission without disclosing the nature of the study or the objectives. This is how scientists do it (of necessity) when they get a bunch of subjects in a room for a test. The subjects perform their tasks without knowing what the actual study is, hence there's little or no bias.
        • Simply ask for permission without disclosing the nature of the study or the objectives.

          Of course, nobody can be expected to use a free single-provider service and honestly agree to the terms of such use without fraudulently agreeing to a contract they have no intention of adhering to. Facebook is a human entitlement at this point, like water in Detroit. The courts should create an obligation on the part of the Facebook employees (enforced by the gun squad of the Marshal's Office) to provide that service t

        • Simply ask for permission without disclosing the nature of the study or the objectives. This is how scientist do it (of necessity) when they get a bunch of subjects in a room for a test. The subjects perform their tasks without knowing what the actual study is, hence there's little or no bias.

          To be more exact: merely watching somebody in an environment where being watched is expected doesn't require permission. The moment you start manipulating their environment in ways like Facebook did, or observing them in places that are supposed to be private, you need permission. You are, however, not obligated to tell them the true nature of the experiment before the fact, as long as you give an honest answer if they ask afterwards (or, if you're just observing a public space, while it's happening), and it's always going to go past the rev

    • I have a sneaking suspicion every Facebook user out there already consented to that in the long EULA when they created a Facebook account.

      Remember, it's a free service. As long as you're not paying Facebook anything for it, you don't have a right to anything - Facebook is under no obligation to give you anything since you aren't providing them compensation. They can put whatever restrictions or requirements on the service that they want, including being able to perform A/B testing. Your only choice if
      • I said informed consent. Anyone doing a scientific study knows exactly what that means. The fact that some chose to do previous studies without informed consent is rather shocking. Intentionally manipulating people's lives and decisions is despicable. I'll never be in marketing for that very reason. But scientists doing it? That's unforgivable. They should be sanctioned by whatever body it is that governs their work.

  • Doctors give placebos to deathly sick patients all the time for research purposes.
    Patients agree beforehand, just as FB users did, and FB users aren't even (that) sick.

    • by Anonymous Coward

      This. FB users have already agreed to this, so there is no ethical issue.

      If you don't want to agree to FB's terms, then by all means, don't use FB. Nobody is forcing you to.

    • Alright - then show me the signed waivers from the Facebook users they experimented on, showing that they gave the INFORMED CONSENT considered a standard requirement for human experimentation, be it physical or psychological. A paragraph buried down in the terms of service nobody reads doesn't cut it.

  • Awww poor baby.

  • by BlackHawk-666 ( 560896 ) on Sunday October 05, 2014 @09:00AM (#48067533)

    To quote The Hound..."Social media is for cunts."

  • Say that you're "trying out something new on the server". End result is that it's a potential improvement to the service that you are "trying out" rather than human experimentation which sounds scary and stuff.

    Facebook is crap for a number of reasons, but not because they do what most people with webpages do.
  • Hmmm (Score:4, Funny)

    by TheSpoom ( 715771 ) <slashdot&uberm00,net> on Sunday October 05, 2014 @09:11AM (#48067573) Homepage Journal

    For example, when a user has a bad day, he's likely to look up acquaintances who have it worse off, and feel a bit better that way.

    That sounds automatable. Schadenfreude, the browser extension.

    • Ha! Awesome. Thanks for that -- you made my day. But in all seriousness, I bet that would sell... after all, it works for the evening news.
  • Carl Sagan once commented (can't recall where) on the general aversion people hold toward the government conducting experiments in public policy. He then detailed how every change of law was an experiment of sorts, although often without proper controls.

    If Facebook should be required to inform consumers of how they experimentally manipulate them, then should Kellogg's reveal the details of how they use marketing to manipulate kids into buying Froot Loops?

    • That would be like pushing an experimental kernel as a stable one. Public policy should be the result of experiments, not the experiment. Though I don't think this happens.

      We all know (or should know) public policy is written by issue-specific think tanks with an agenda. An experiment presumes a curiosity about the outcome. People writing policy know which way they want "the experiment" to go. When the HCI/Brady group writes gun control legislation, they have no desire to see the net effects of their legislation

  • ...where the subject's behaviour changes when he knows he's being observed is the simple expedient of not letting the subject know he's being experimented on.

    The simple solution to this is to add a clause to the TOU that allows Facebook to conduct blind studies into behaviour by shaping traffic *on a randomised and anonymous basis*. Individual users are NOT informed when their account is being used in a study - they've already agreed to let it happen.

    *For definition of "randomised and anonymous" in cont

    • by Endymion ( 12816 )

      No, you don't get to blanketly avoid informed consent simply because it makes your experiment *hard*.

      You ask an IRB for a waiver, each time, like you're supposed to. The whole point is that you are not supposed to be running experiments *on humans* without supervision. We've had way too many problems with that in the past, and so the requirement of getting a 3rd party to sign off first was invented as an incentive against running unethical experiments.

      For some reason, there are a lot of people that are

  • Just add a small randomness factor for everybody. Afterwards, try to find patterns in people's behaviour.
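
    Roughly, as a hypothetical Python sketch (field names made up): perturb every ranking score slightly for everyone, log the noise, and look for correlations with behaviour afterwards.

        import random

        def rank_with_jitter(posts, epsilon=0.05):
            # posts: list of (post_id, score). Everyone gets the same small
            # random perturbation; the noise is logged so behaviour patterns
            # can be correlated with it later.
            jittered = []
            for post_id, score in posts:
                noise = random.uniform(-epsilon, epsilon)
                jittered.append({"post": post_id,
                                 "shown_score": score + noise,
                                 "logged_noise": noise})
            jittered.sort(key=lambda p: p["shown_score"], reverse=True)
            return jittered

        print(rank_with_jitter([("a", 0.91), ("b", 0.90), ("c", 0.40)]))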
    • "Is there an ethical way for Facebook to experiment on its users?" asks Slashdot, which randomly dumps users in Beta and says give it a try, you might like it...
  • by cjc25 ( 1961486 ) on Sunday October 05, 2014 @09:52AM (#48067733)

    (Of course, few stopped to think about how Facebook decided what to show people in the first place, but that's beside the point.)

    No, it's the entire point. Your stream, to the extent that it's "your" stream, is already manipulated in ways you're not told, based on what Facebook thinks you will find interesting, funny, or engaging enough to come back and see more ads. Experiments have to happen as part of Facebook's desire to expand, if only to see which manipulations mean more ads displayed. The only difference is that now the people who are interested in something closer to science than ad sales at Facebook won't tell us about their results. Are you happy?

  • When asking these questions, take out "Facebook" and substitute "Facebook stockholders."

    That answers that question.

  • The whole point here is that you need somebody else who is not the experimenter to sign off on the experiment when humans are involved. We call those people the IRB.

    What's that, tech industry? You think A/B testing would be impacted? That's a popularity contest, not an experiment. Facebook's experiment had the specific goal of being able to manipulate the emotions of their users, which goes far beyond simply asking which website layout they find more attractive or useful.

    What's that, tech industry? You think it would take way too much time if you had to get approval for experiments? Then throw together a multi-company group to found your own IRB. I'm sure there are universities that would be willing to partner with that group to lend their advice and help the group get started quickly.

    What's that, tech industry? You think there is no way you could conduct your experiments if you had to get proper informed consent (which has specific criteria - an EULA or TOS does not count)? First: welcome to the club. Sometimes, doing proper and ethical experiments is hard. Many disciplines have to deal with that, and I guarantee it is easier to find alternative ways to test your theories about "social media" than it is for the psychologist trying to investigate complex mental health issues, and both of those areas of research get to skip the whole "untested, unknown, and probably horribly dangerous new drug" mess that some doctors have to find a way to test without killing the participants.

    Worse - and this betrays the total and complete ignorance of the people at Facebook who ran this experiment - if they had bothered to ask an IRB like you are supposed to, there is a good chance that some requirements, such as having to get informed consent in advance, would have been waived. Their experiment simply wasn't that risky, compared to most experiments involving human testing.

    TL;DR - If the tech industry decided to work with the process and bothered to ask an IRB, they would have avoided a lot of bad PR. Their failure to do this - and their insistence afterwards that even a trivial "trust but verify" is the kind of thing that only applies to *other people* - only serves to make people fear the entire industry. Justifiably. Would you want to buy stuff from people that avoid every ethics regulation?

    Of course, I haven't addressed any of the state laws, some of which have even stronger requirements...

  • That's all there is to it.

    • by Jack9 ( 11421 )

      Changing how your website performs text output is not experimenting with users. It's really annoying when /.rs start buying into a misnomer. There's no need for consent when I move a button, nor when Facebook changes an algorithm. Take a breath and reconsider.

      • by Endymion ( 12816 )

        This is true, of course.

        Which is why so many people are angry at Facebook - they went far beyond simply changing the layout or pagination or similar features. Instead, they set out to see if they could manipulate the emotions of their users, by the indirect means of selecting bits of content that the user requested that had certain properties, and changing how they would be presented (effectively hiding those items for the testing period).

        Now, they may have intended to improve users' emotions. By most externa

        • by Jack9 ( 11421 )

          > Of course not. That kind of change is totally off topic.

          I totally disagree. It's specifically the same. Many companies regularly perform macro/micro experiments (Digg, /. beta, etc), if we are going to call them such. There is only a question of degree and depth of analysis. You should really take up the ethical ramifications of paint colors chosen by market chains to influence human behavior.

          > Why is it that people who are supposedly highly educated, experience [observation: your low /. UID] and us

          • by Endymion ( 12816 )

            > There is only a question of degree and depth of analysis.

            I take it you didn't read the first half of my post. That is not the only difference between what is effectively a passive popularity contest that you happen to measure in revenue or subscriptions, and an attempt to effect change in emotions as the specific, active goal of my actions. Seriously, this is not a complicated distinction.

            > The Common Rule does not apply here.

            I see you are not up to date on various state laws. I suggest

            • by Jack9 ( 11421 )

              > I take it you didn't read the first half of my post. Seriously, this is not a complicated distinction.

              I don't agree with the distinction. I stand by my statement. There is only a question of degree and depth of analysis.

              > Remember, the entire point I'm trying to make

              That's not what you are communicating and obviously not your point, as the majority of what you're saying is trying to convince me that my conclusions are spurious.

              > So, not only are you not listening to what those people are saying,

  • Why can't we have something (a program?) like Facebook, without the need for a centralized server? Whoever writes that program first will be rich.
  • Sure: take an anonymized, randomized set of users at the beginning, ensure they remain anonymous throughout the study, then do the study. The question is whether FB can truly anonymize the data they are studying. I would place a wager that they cannot. There is so much information creep in FB that anonymizing the data may not be possible.

    Second solution, give the research projects to people who truly have no interest in the data or the results.
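
    For concreteness, the usual first step looks something like this hypothetical Python sketch (field names made up), which also hints at why the wager is safe: a salted hash hides the ID, but any quasi-identifiers left in the record can still leak identity.

        import hashlib
        import secrets

        SALT = secrets.token_bytes(16)  # generated per study, discarded afterwards

        def pseudonymise(record):
            # Replace the user ID with a salted hash and keep only study fields.
            # This is pseudonymisation, not anonymisation: quasi-identifiers
            # (location, friend graph, timestamps) can often re-identify people.
            token = hashlib.sha256(SALT + record["user_id"].encode()).hexdigest()[:16]
            return {"subject": token,
                    "post_length": record["post_length"],
                    "sentiment": record["sentiment"]}

        print(pseudonymise({"user_id": "alice", "post_length": 42, "sentiment": 0.7}))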

  • by Prune ( 557140 )
    "look up acquaintances who have it worse off, and feel a bit better"

    I for one welcome our Schadenfreuding overlords!
  • Forgive me for raising a stupid question, but what right do we have to expect that Facebook is a certain way, or behaves in a certain way? Or shows you content without adulteration? It's not a fundamental government service, or anything we even paid for, after all.

    For all we know, it could be designed as a parody website, showing us things that our friends looked at, modulated by some sarcasm filter.

    Everything that Facebook shows us is an experiment. And you object because they adj
  • To the OP: Yes, of course Facebook can run experiments ethically; they're doing it all the time. All corporations do - "which color packaging on our box of soap sells more?" Count sales, over, done, next. They're a corporation, which is a legal entity, so their studies need to be legal, or some government will make it impossible for them to do business. Can they run experiments "like these" ethically - probably, but defining "ethics" is a lot harder - call the philosophers! According to the ultimate b
  • IMO, it's ethical to collect and use data on people that has been stripped of identifying information -- census data, for example, is a major element of sociology research. You still need an institutional review board, but it can be OK. Where Facebook went wrong was by changing things for people to try to manipulate them.

    In short: anonymous "read only" experiments on human subjects are OK; "read/write" experiments are a no-go without explicit individual consent and monitoring.

    ("But if we can't manipulate

  • That is all... just ask people to opt in. If they do... go crazy with it. And try to reward them proportionately for it.

  • How can the means be ethical if the goal is not? Whether the means are ethical or not is irrelevant in that case.
  • by msobkow ( 48369 )

    All they have to do is put up a new login/greeting page that reads "Welcome to Facebook, test subject 42. All your actions are belong to us."

  • Comment removed based on user account deletion
  • As much as I dislike Facebook... they did nothing wrong here.

    pondering the effects of their site.
    testing it.
    publishing results.

    Great - hopefully the insane backlash won't stop them from showing future tests publicly.

    If you don't want FB to find out the effects of their site on you... don't use it.
  • Last month Mother Jones ran this fluff article about a Facebook meme. Nobody seemed to think it was unusual that "Facebook's data scientists" are watching everything that is posted and collating the results.

    Recently, a status update ran around Facebook asking people to "List 10 books that have stayed with you in some way. Don't take more than a few minutes, and don't think too hard. They do not have to be the 'right' books or great works of literature, just ones that have affected you in some way." Facebook

  • It's called https://en.m.wikipedia.org/wiki/Oxymoron
