Barack Obama's 2008 and 2012 presidential campaigns used the same tactics and techniques for which Cambridge Analytica and the Trump campaign were criticized in 2018.
In March 2018, social media giant Facebook and psychographics firm Cambridge Analytica came under heavy scrutiny for improperly accessing data from millions of users in an effort to manipulate their behavior in the 2016 presidential election, reportedly handing an unlikely victory to U.S. President Donald Trump.
In the wake of the controversy, the company and conservative pundits engaged in what has become a pervasive use of “whataboutism” to defend the Trump administration from criticism by pointing fingers at Trump’s predecessors, including President Barack Obama. A 20 March 2018 op-ed in the Capitol Hill news publication The Hill by DailyWire.com editor Ben Shapiro was perhaps representative of the phenomenon:
On Sunday, The Guardian reported on the supposedly nefarious workings of President Trump’s data-gathering team at Cambridge Analytica. The report suggested that Cambridge Analytica had essentially issued questionnaires through a third party; those questionnaires, which were personality quizzes, requested that you use your Facebook login. Cambridge Analytica then compiled data regarding those who completed the quiz and cross-referenced that data with political preferences in order to target potential voters.
This isn’t particularly shocking. In 2012, The Guardian reported that President Obama’s reelection team was “building a vast digital data operation that for the first time combines a unified database on millions of Americans with the power of Facebook to target individual voters to a degree never achieved before.”
Based on what is known publicly from statements by a whistleblower, thorough investigative reports by the New York Times, the Guardian, and the Observer, and a documentary series by Britain's Channel 4, attempts to paint the Obama campaign's digital efforts as equal in scope and nefariousness minimize the apparent rule-breaking and potential criminality of what Cambridge Analytica is accused of doing.
Although the Obama campaign in 2012 did target potential voters using information gathered from Facebook profiles, there were key differences. The Obama for America organization accessed voters’ Facebook information when they logged on to the campaign web site via Facebook. Obama supporters were given a permission screen in which they could approve or deny the request, which clearly came from the Obama campaign.
Although Obama for America did collect data on users' friends, that practice was in line with Facebook policy at the time. A Facebook spokesperson told us that both Obama and his Republican opponent, Mitt Romney, had access to the same tools. In 2015, Facebook changed the rules so that apps could no longer target the friends of users who downloaded them.
In the case of Cambridge Analytica, information was gathered from users and given to a third party under false pretenses. According to Facebook, University of Cambridge psychologist Aleksandr Kogan created a personality quiz which users could download in an app called “thisisyourdigitallife.” Kogan presented the app as a tool that would be used for academic research — but the work was paid for by Cambridge Analytica. Facebook users were not informed that their data (and that of their friends) would be deployed by a political firm hired by the Trump campaign for psychographic profiling in the upcoming election.
As a result of the scandal, Facebook suspended whistleblower Christopher Wylie, along with Cambridge Analytica, its London-based parent company SCL Group, and Kogan. Kogan denies knowingly violating Facebook's policy and said that he is being used as a scapegoat.
Much of the information the Obama campaign compiled was publicly available and contained in voter files, or records of registered voters and their electoral activity kept by secretaries of state. Rayid Ghani, chief data scientist for Obama’s 2012 campaign, wrote:
We did not build any complex (certainly not the so-called psychographic) models of facebook users using their facebook data. Most of the models we built were using the publicly available “voter file” that contains information people typically provide when filling out their voter registrations forms. We did build models to understand which of a supporter’s friends we want to ask to register to vote, or to get them to vote and how likely the friend was to take action based on the ask.
We only contacted the people who had given us access and permission to get their own email address. We did not get any contact information for their friend and did not (and could not) contact any of their friends directly. All we could do was ask our “primary” supporters to contact their friends and we would recommend who those friends were based on the data they allowed us to access.
While the Obama campaign essentially used data gathered from Facebook to create a digital hub of information about potential volunteers and voters to contact and mobilize, Wylie describes Cambridge Analytica and SCL Group's tactics quite differently. Wylie, in fact, calls the tactics the firms employed on the mostly unwitting American electorate weapons of psychological warfare:
So [Breitbart.com, Trump campaign, and Cambridge Analytica executive Steve] Bannon came to SCL because he believes in something called the Breitbart Doctrine, which is that in order to change politics you first have to change culture because politics is downstream from culture. And in order to fight a culture war, you need an arsenal of information weapons, and who better to go to than a company like SCL which is a military contractor based in the U.K. to help set up those information weapons.
And so what we worked on at SCL and then later at Cambridge Analytica was data harvesting programs where we would pull data from users of apps and all of their friend networks and run that data through algorithms that could profile their personality traits and other psychological attributes so that we would know exactly what kind of information we would need to seed on to online platforms to exploit mental vulnerabilities that our algorithms showed that they had.
An investigation by Channel 4 exposed Cambridge Analytica officials discussing dirty tricks like blackmailing rival candidates and baiting them with sex workers. In one exchange, executive Mark Turnbull tells an undercover reporter that the firm anonymously feeds toxic and misleading information into the “bloodstream” of the Internet, without leaving any fingerprints behind:
Sometimes you can use proxy organizations who are already there. You feed them. They are civil society organizations, like charities or activist groups, and we use them, feed them the material, and they do the work. We just put information into the bloodstream of the Internet and then watch it grow. Give it a little push every now and again to watch it take shape. And so this stuff infiltrates the online community and expands, but with no branding, so it’s unattributable, untrackable.
In another exchange, he tells the reporter that Cambridge Analytica played on the fears of voters to manipulate them emotionally:
The two fundamental human drivers when it comes to taking information onboard effectively are hopes and fears and many of those are unspoken and even unconscious. You didn’t know that was a fear until you saw something that just evoked that reaction from you. And our job is to get, is to drop the bucket further down the well than anybody else, to understand what are those really deep-seated underlying fears, concerns.
One Trump campaign official told Bloomberg in October 2016 that there was an active effort afoot to suppress potential votes for Trump’s rival, Hillary Clinton — though it remains unclear whether it was effective.
Aviv Ovadya, chief technologist at the Center for Social Media Responsibility at the University of Michigan School of Information, told us that how the information was used adds another layer of context:
It’s also a question of what type of messaging you are using that might sway whether this is a good thing for a society. Is your goal to create anger and hatred? Is it to spread denigrating stereotypes? Is it to incite violence? Or is it a message about how we can make the world a better place?
Another distinction to consider—is the message attempting to get people to vote, or is it trying to get people not to vote? Those are two very different approaches.
As a result of the ongoing fallout from the reports, Cambridge Analytica has suspended chief executive officer Alexander Nix, while inquiries into the matter have been opened in the UK, Israel, and the United States. Facebook's stock has tumbled as a result of the scandal, and U.S. legislators are calling on CEO Mark Zuckerberg to testify before Congress.
Meanwhile, key Cambridge Analytica figures have already set up a new data company, Emerdata, whose employees include Nix, Rebekah Mercer, and her sister Jennifer Mercer, and which is linked to Erik Prince, the former mercenary (and brother to Education Secretary Betsy DeVos) who created the private military company Blackwater, now known as Academi.