Facebook mind control experiments linked to DoD research on civil unrest
Facebook’s Unethical Experiment Intentionally Manipulated The Emotions Of Hundreds Of Thousands Of Its Users
Facebook’s experiment on over half a million unsuspecting users has taken a new twist with the revelation that a researcher connected to a Department of Defense-funded program on using the military to quell civil unrest also participated in the study.
Social media sites exploded over the weekend after it was revealed that Facebook, no stranger to controversy of late, secretly manipulated posts being seen by nearly 700,000 users in 2012 in order to allow researchers to study how emotional states are transmitted over the platform.
Source: Pentagon-facebook-study
The results of a secret psychological experiment conducted on nearly 700,000 Facebook users without their consent recently came to light. The fact that the experiment studied the emotions of the social network’s members without their knowledge has caused online outrage.
In January 2012, the number of positive and negative posts displayed in the news feeds of 690,000 Facebook users was secretly manipulated. The next step was to collect and analyze comments and statuses written by the unsuspecting participants of the experiment. The study concluded that those who had less positive content in their news feeds went on to express more negative moods themselves. Conversely, those who saw less negative content tended to use fewer negatively charged words in their own posts. The experiment was conducted on a random sample of Facebook users, and the results were published in the journal PNAS on June 17, 2014.
In other words, Facebook intentionally made well over half a million of its users sad.
The study’s authors concluded that the emotions users experience online are influenced by the posts they read, which can also affect their behavior in real life. As they put it: “Emotions expressed by friends, via online social networks, influence our own moods, constituting, to our knowledge, the first experimental evidence for massive-scale emotional contagion via social networks, and providing support for previously contested claims that emotions spread via contagion through a network.”
Here is the relevant section of Facebook’s data use policy: “For example, in addition to helping people see and find things that you do and share, we may use the information we receive about you … for internal operations, including troubleshooting, data analysis, testing, research and service improvement.”
Facebook’s methodology raises serious ethical questions. Moreover, the fact that Facebook uses its members as guinea pigs raises questions about how many behavioral experiments of this kind, and of what content, may already have been conducted on the social network by similar methods.
Source: Facebook-conducts-a-secret-psychological-experiment-on-its-unsuspecting-users
Experiment: Facebook Defends Manipulation of Status Updates
For a study, Facebook showed nearly 700,000 users more positive or more negative status updates from their friends. The psychology study was designed to examine how this affected their mood. Facebook has now defended the study.
Facebook has defended the experiment in which the status-update feeds of nearly 700,000 users were manipulated. It is important for the social network to understand how members react to different content, a Facebook researcher explained.
In the experiment, the news feeds of 689,003 users of the English-language version of Facebook were manipulated without warning. More than three million posts were evaluated by software that matched keywords and assigned them to particular emotions.
During the week-long experiment in January 2012, posts were pre-filtered: one group was shown more positive messages, the other more negative ones. The study found that people who saw more positive messages were somewhat more likely to publish posts with positive content themselves, and vice versa.
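The keyword-matching approach described above can be sketched roughly as follows. This is an illustrative stand-in only: the word lists and function below are invented for demonstration, and the researchers reportedly used the LIWC word-count software rather than anything like this.

```python
# Illustrative sketch of keyword-based emotion coding of posts.
# The word lists are placeholders, NOT the lists used in the actual study.

POSITIVE_WORDS = {"happy", "great", "love", "wonderful", "fun"}
NEGATIVE_WORDS = {"sad", "angry", "terrible", "hate", "awful"}

def classify_post(text: str) -> str:
    """Label a post 'positive', 'negative', or 'neutral' by keyword counts."""
    words = [w.strip(".,!?") for w in text.lower().split()]
    pos = sum(w in POSITIVE_WORDS for w in words)
    neg = sum(w in NEGATIVE_WORDS for w in words)
    if pos > neg:
        return "positive"
    if neg > pos:
        return "negative"
    return "neutral"

print(classify_post("What a wonderful, fun day!"))    # positive
print(classify_post("I hate this terrible weather"))  # negative
```

In the experiment, a filter along these lines would then decide whether a given post counted as emotionally positive or negative before deciding whether to suppress it from a user’s feed.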
Source: Experiment: Facebook Defends Manipulation of Status Updates