Building Global Community - Mark Zuckerberg
https://www.facebook.com/notes/mark-zuckerberg/building-global-community/10154544292806634

Too much all under one roof, too far left, definitely into censorship, outright creepy data mining and psychosocial testing and engineering (search YouTube for Facebook talking heads on this), shaping and pushing their narrative down onto the user base, with, some say, an insecure Zuckerberg seeking validation. Above and beyond the billions already made exploiting you 'for free'. Would hundreds of millions of Candy Crush-playing idiots all rising up at once on command feel like a "safe space" to you? Facebook is a closed corporation, said to have TOP SECRET government involvement, and the same goes for Zuckerberg. Trust? If you want to enter politics, vie for good on your own, and solicit followers, great, go for it. But don't go engineering your captive users into your own private army. Good that Zuck and Chan sort of spent their fortune on medical disease research, but only if it's free for all, open-sourced, free of patent, royalty, copyright and license fees, and goes on to foster the same. Facebook did open-source some datacenter things, but that's negligible. Trap? Huge. At least this time it was announced in public. Who will listen? You decide.
Facebook emotion study breached ethical guidelines, researchers say
Lack of 'informed consent' means that the Facebook experiment on nearly 700,000 news feeds broke rules governing tests on human subjects, say scientists
Researchers have roundly condemned Facebook's experiment in which it manipulated nearly 700,000 users' news feeds to see whether it would affect their emotions, saying it breaches ethical guidelines for "informed consent".
James Grimmelmann, professor of law at the University of Maryland, points out in an extensive blog post that "Facebook didn't give users informed consent" to allow them to decide whether to take part in the study, as required under US rules on human subjects research.
"The study harmed participants," because it changed their mood, Grimmelmann comments, adding "This is bad, even for Facebook."
But one of the researchers, Adam Kramer, posted a lengthy defence on Facebook, saying it was carried out "because we care about the emotional impact of Facebook and the people that use our product." He said that he and his colleagues "felt that it was important to investigate the common worry that seeing friends post positive content leads to people feeling negative or left out."
The experiment altered the news feeds of 689,003 people – about 0.04% of users, or 1 in 2,500 – over the course of one week in 2012. It hid "a small percentage" of posts containing emotional words from people's news feeds, without their knowledge, to test what effect that had on the statuses or "Likes" that they then posted or reacted to.
The results found that, contrary to expectation, people's emotions were reinforced by what they saw - what the researchers called "emotional contagion".
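To make the design concrete, here is a minimal sketch of the kind of manipulation and outcome measure the article describes: withhold a fraction of posts containing emotional words from a user's feed, then score the emotional tone of what that user subsequently posts. Everything in it is an illustrative assumption rather than a detail from the PNAS paper - the toy word lists, the function names (filter_feed, emotion_score) and the omission rate are all hypothetical.

```python
import random

# Hypothetical word lists standing in for a real sentiment lexicon;
# the actual study used a proper text-analysis tool, not this toy set.
POSITIVE_WORDS = {"happy", "great", "love", "fun"}
NEGATIVE_WORDS = {"sad", "awful", "hate", "lonely"}

def has_positive_words(post: str) -> bool:
    """Return True if the post contains any word from the positive list."""
    return any(word in POSITIVE_WORDS for word in post.lower().split())

def filter_feed(posts, omit_fraction=0.1, rng=random):
    """Withhold a fraction of posts containing positive words - a stand-in
    for the news-feed manipulation described above; the real percentage
    and mechanism are not spelled out in the article."""
    filtered = []
    for post in posts:
        if has_positive_words(post) and rng.random() < omit_fraction:
            continue  # this post is hidden from the user's feed
        filtered.append(post)
    return filtered

def emotion_score(posts):
    """Crude outcome measure: net positive-minus-negative word count
    per post in what the user later writes."""
    pos = sum(sum(w in POSITIVE_WORDS for w in p.lower().split()) for p in posts)
    neg = sum(sum(w in NEGATIVE_WORDS for w in p.lower().split()) for p in posts)
    return (pos - neg) / max(len(posts), 1)

if __name__ == "__main__":
    feed = ["had a great day", "feeling lonely tonight", "love this song", "meeting at 9"]
    later_updates = ["so happy today", "fun weekend"]
    print(filter_feed(feed, omit_fraction=0.5, rng=random.Random(42)))
    print(emotion_score(later_updates))
```

Under a "contagion" hypothesis, users shown feeds with positive posts withheld would be expected to produce a lower emotion_score in their own subsequent posts; the reported finding was that emotions did track what people saw.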
But the study has come in for severe criticism because unlike the advertising that Facebook shows - which arguably aims to alter people's behaviour by making them buy products or services from those advertisers - the changes to the news feeds were made without users' knowledge or explicit consent.
Max Masnick, a researcher with a doctorate in epidemiology who says of his work that "I do human-subjects research every day", says that the structure of the experiment means there was no informed consent - a key element of any study on human subjects.
"As a researcher, you don’t get an ethical free pass because a user checked a box next to a link to a website’s terms of use. The researcher is responsible for making sure all participants are properly consented. In many cases, study staff will verbally go through lengthy consent forms with potential participants, point by point. Researchers will even quiz participants after presenting the informed consent information to make sure they really understand.
"Based on the information in the PNAS paper, I don’t think these researchers met this ethical obligation."
Kramer does not address the topic of informed consent in his post. But he says that "my co-authors and I are very sorry for the way the paper described the research and any anxiety it caused. In hindsight, the research benefits of the paper may not have justified all of this anxiety."
When asked whether the study had had an ethical review before being approved for publication, the US National Academy of Sciences, which published the controversial paper in its Proceedings of the National Academy of Sciences (PNAS), told the Guardian that it was investigating the issue.