STAFF NEWS & ANALYSIS
Despite the Latest Scandal, Facebook Won't Change
By Staff News & Analysis - July 02, 2014

9 answers about Facebook's creepy emotional-manipulation experiment … Facebook's explosive study on user feelings and the newsfeed reads like something straight out of a bad sci-fi movie. There's "emotional contagion." Dire ethical qualms. The cold, thoughtless manipulation of hundreds of thousands of vulnerable humans by the oppressive Facebook machine. It's terrifying, right? Overwhelming? Confusing, even? But fortunately, rather than let you stew in your manipulated feels, we have synthesized a quick explainer. It's not entirely exhaustive, by any means — but it does come in the form of nine vaguely conversational questions that you may or may not have been embarrassed to ask. – Washington Post

Dominant Social Theme: Facebook – it's good for you.

Free-Market Analysis: The Washington Post has leapt to the defense of Facebook. Both Facebook and the Washington Post are affiliated with US intel, from what we can tell, so this is probably not surprising.

Facebook recently ran an experiment on its users to see if what they read over a month could influence their mood in a negative or positive way. The result: If users read negative posts online, they apparently grew slightly more negative themselves.

The proximate cause of the ruckus involves the use of the data: It ended up in an article published in a prestigious scientific journal. Thus, Facebook was revealed to have allowed the larger scientific community to perform experiments on its user base.

The Post article defends the experiment on the grounds that just before it was conducted, Facebook added the word "research" to the fine print that users agree to but never read when they sign up. Notifying users that they could be used for "research" gave Facebook more legal cover.

Here's more from the Post article:

What is Facebook? The world's largest social network is many things: a technological revolution, an advertising mammoth, the star of a hit movie scripted by Aaron Sorkin. But many casual users don't realize that Facebook is also a profound source of data for academic researchers, both inside and outside the company.

… What's stopping Facebook from doing this all again? Well, nothing, for all the aforementioned reasons: it's legal, it benefits Facebook, and users do (technically!) opt in.

… In the two hours it's taken me to write this, Facebook has probably run dozens of "experiments" on new features and design changes that will make the site more user-friendly, by showing different versions of the site to different users and seeing how they react. This is called A/B testing, in industry parlance. Framed another way, it looks a whole lot like psychological research.

… I'm still mildly horrified. I should delete my Facebook ASAP, right? No, that's a total overreaction. You don't need to delete your Facebook, but you do need to consider this a wake-up call on a number of pretty critical fronts.

First: Facebook owns your data, and your data is valuable. It's tempting to see Facebook as some kind of confection or distraction — a quick thing to scroll through on your phone while you're waiting in line for coffee, or a place to share a #hilarious inside joke. But in reality, everything you do on Facebook is being recorded and, potentially, observed.

Second: Facebook can, and does, manipulate what you see in your newsfeed. That isn't necessarily malicious, and it can improve your experience on the site. (Have you noticed fewer Buzzfeed quizzes lately?) That said, Facebook's control over the newsfeed gives it a profound amount of influence over the information you consume, particularly if you use Facebook as your main source of news. Recently, an article in the New Republic theorized that Facebook could swing entire national elections through the newsfeed, if it wanted to.

Third: Facebook is not the only one doing all this. Almost every Web site you visit collects data on you, and many of the sites you use every day — Google, Amazon, Netflix, online dating — operate according to impenetrable algorithms that can be, and are, changed at any time. Frequently, those algorithms are intentionally deployed to manipulate your behavior, whether that's clicking an ad, watching a movie, or "liking" a particular post.

… If you pay attention to some of the bigger issues at play here — big data, big algorithms, the profound influence of corporations over ordinary people — this is an important case. The rabbit-hole, it turns out, goes pretty deep.

Some of this may sound entirely reasonable, but the Washington Post article leaves out a point made by Infowars in an article entitled "Facebook Emotional Experiment Linked To Pentagon Research On Civil Unrest."

The point made in the article is that the study "has direct ties to research funded by the Department of Defense concerning the likelihood of civil unrest."

The salient linkage has to do with Cornell University's Jeffrey T. Hancock, who is listed as a study author. Infowars adds the following:

Hancock is also listed on the Pentagon's Minerva initiative website, where it is noted that he received funding from the Department of Defense for a study called "Cornell: Modeling Discourse and Social Dynamics in Authoritarian Regimes". The section of the website devoted to that study includes a visualization program that models the spread of beliefs and disease.

What Pentagon researchers are after, apparently, is the ability to use social media to influence the emotions of users and thus influence behavior.

We're tempted to ask, "What's the fuss about?" After all, the alternative 'Net media in particular has been reporting on Facebook/intel links for years. You can read some of our articles here:

Facebook: Changing the Face of Mercantilism

Facebook IPO Is US Intel Operation?

Our point in these articles, amply confirmed elsewhere, is that Facebook and other large tech firms are at this point definitive assets of the US military-industrial complex.

Belatedly, people like Facebook's Mark Zuckerberg have woken up to the dangers of being so closely affiliated with the Pentagon and US intel. As whistleblower Edward Snowden has released more information about how US intel has combined corporate and defense resources to create a seamless web of spying and oversight, Zuckerberg and other CEOs have begun registering their displeasure with how they are being perceived.

In March, Reuters and other news outlets reported that Zuckerberg had personally called President Barack Obama to complain about US surveillance practices. "When our engineers work tirelessly to improve security, we imagine we're protecting you against criminals, not our own government," Zuckerberg wrote in a personal post.

"I've called President Obama to express my frustration over the damage the government is creating for all of our future. Unfortunately, it seems like it will take a very long time for true full reform," Zuckerberg added.

Zuckerberg was complaining because the NSA had apparently mimicked Facebook pages to trick users into communicating personal data directly to the agency. But as has been reported in many alternative media posts, Facebook's relationship with US intel likely goes back to the company's formative years, when it reportedly received CIA seed money to help it grow.

There is probably little Zuckerberg can do at this point to shake the perception of many that Facebook is inextricably linked to Western intel, as are Google, Microsoft and a slew of other high-tech giants. This is unfortunate because it retards the growth of tech firms generally. US tech firms have gone from being considered trusted providers worldwide to being viewed with suspicion. Contracts have been cancelled as a result, and hope for a seamless Internet is also gone.

This latest Facebook fiasco once again points up the problems of corporate personhood mentioned in the other article in today's issue. It is Facebook's sheer size that makes social experiments both tempting and viable. Corporations of this size are entirely artificial entities, but that doesn't make them any less dangerous.

After Thoughts

Given the kind of power wielded by social media, perhaps people will eventually tune out or find other facilities. Facebook et al. are not going to change …
