I wonder if Facebook KILLED anyone with their emotion manipulation stunt. At their scale and with depressed people out there, it's possible.
Honestly, the first thing that jumped into my mind.
Ethics. Privacy. Just a bunch of words to Emperor Zuckerberg.
I'm usually an apologist for services such as FB or Google. I have no issue with the idea of targeted marketing as, in theory, it would be a win-win for both the advertisers and the consumers. The consumer gets to use a service that allows communication with friends and family while receiving ads that are relevant to them. The advertisers get to see the monetary value of a well-targeted ad. But this manipulation is beyond the pale. We're already giving them outlandish amounts of data about ourselves; do they have to trick us to help improve their algorithms?
Looks like the research was approved by an institutional review board. That's the gold standard.

I wonder if users can complain directly to the IRB the way you would in other IRB-reviewed studies.
At this point, with the aggregate revelations about Facebook, it's kind of like hanging out at the beach after the tsunami warning has been fairly given.
Just who are these institutional review boards?
I'd like to study how these review boards would feel if their accreditation were yanked.
It's one thing to run experiments to see what sort of user interface is preferred. That's done all the time by all manner of services. But to purposely manipulate the news people see to try to make them sad is not only atrocious, but incredibly dangerous. Everybody involved should be condemned and held responsible.
Sounds like rendition. The US can't torture (though they do anyway) but they can outsource it, and likewise here bona fide research institutions can't do this kind of research themselves but journals can publish results obtained by ethically-challenged corporations? 
The "test subjects" may have agreed to the TOS, but they certainly did not sign a medical consent form that stated the risks involved in the experiment.
Meh! If this research is valid, then newspapers (especially tabloids) should be accountable for manipulating emotional content every time they choose to put a "hell-in-a-handbasket" article on the front page.

So now that Facebook knows it can manipulate emotions, should it be obliged to maximise happiness by censoring unhappy stories?
The sample size of the study was apparently n=680,000. Statistically speaking, someone somewhere had to have at least contemplated suicide as a result of the sadmaking by Facebook. 
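The back-of-envelope reasoning behind that claim can be sketched in a few lines. The sample size comes from the comment above; the prevalence figure is an assumption for illustration (roughly the annual rate of serious suicidal ideation reported for US adults in national surveys), not a number from the study:

```python
# Base-rate arithmetic for the n=680,000 sample mentioned above.
# The 4% ideation rate is an ASSUMED illustrative figure, not from
# the Facebook study itself.

n = 680_000            # study sample size (from the thread)
ideation_rate = 0.04   # assumed annual prevalence of suicidal ideation

expected = n * ideation_rate
print(f"Expected users with suicidal ideation in sample: {expected:.0f}")
```

Even if the assumed rate is off by an order of magnitude, the expected count is in the thousands, which is the commenter's point: at this scale, some affected people were almost certainly in the experimental group.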
Well, one of my friends has been raging at Facebook because it deliberately did not put his surname first, even though he told it to.

EDIT: I'm close to beating him over the head with a frying pan every time he starts on about it. That's how insulted and ragey he is about it.

I'm quite glad we don't live in the same house right now.
The tone-deafness from FB and its researchers in this case is appalling, IRB approval or not. The research is quite fascinating though.
+Roberto Bayardo This is the kind of stunt that could be the last straw when it comes to eliminating whatever residual trust exists between users and social media in general.
At the end of the day, people are still responsible for their own actions, so I don't think Facebook "killed" anyone.
+Greg Nixon Your apparently deep background in suicide prevention and situational psychology is truly awe-inspiring. 
Trouble is, you can get really tangled in knots if you don't keep a sense of perspective with big numbers. With a large enough reach, and sufficiently robust studies, you could demonstrably cause a countable number of suicides while testing the right shade of blue to use in your stylesheet. At some point, you have to accept that putting soft padding on the entire universe isn't the right way to prevent pain and suffering.

Here's another thought experiment: should Facebook be allowed to search for warning symptoms in your posts/browsing habits, and alert the Samaritans / local welfare organisations to pay you a visit (or just show you more happy kitten videos)? Arguably that could save thousands of lives (38,000 deaths by suicide in the US in 2010).
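To make the thought experiment concrete, here is a deliberately naive sketch of what such screening might look like at its crudest: a keyword scan over post text. The phrase list and function names are made-up illustrations, not a clinical instrument or anything Facebook actually runs:

```python
# Naive keyword screen over post text, purely to illustrate the
# thought experiment above. Real screening would be far more
# sophisticated; the phrases here are assumed examples.

WARNING_PHRASES = ["can't go on", "no reason to live", "end it all"]

def flag_post(text: str) -> bool:
    """Return True if the post contains any assumed warning phrase."""
    lowered = text.lower()
    return any(phrase in lowered for phrase in WARNING_PHRASES)

posts = [
    "Great day at the beach!",
    "Honestly, there's no reason to live anymore.",
]
flagged = [p for p in posts if flag_post(p)]
print(len(flagged))  # only the second post matches
```

The ethical question in the thread is exactly about this step: whether a platform should act on such flags (alerting welfare organisations, reshaping feeds) without explicit consent, given that the same pipeline enables the manipulation being criticised.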