Pink Spy-angle

Published June 29, 2015

NEWS: Facebook's rainbow "Celebrate Pride" profile picture filter is rumored to be a data experiment initiated by the social network to gather intelligence on its users.

On 26 June 2015, Facebook marked the historic Supreme Court decision in Obergefell v. Hodges, which held that the Constitution guarantees a right to same-sex marriage, with a new "Celebrate Pride" page through which users of the social network could apply a rainbow filter to their profile photos. Users across the service (gay and straight alike) soon began proudly adding rainbows to their Facebook profile images to demonstrate their support for LGBT rights and the Court's ruling legalizing marriage equality in all fifty states.

Given the scope of the Facebook rainbow profile photo meme and its rapid spread, a variety of rumors soon followed in the wake of the trend. Some were clearly of a trolling nature, such as one suggesting that sex offenders were obliged to apply a rainbow filter to their social media profile photos. Others were more substantive, such as Boing Boing's article ("What does Facebook learn about you when you rainbowify your profile pic?") positing that the rainbow profile option was a ruse on Facebook's part to lure users into a "creepy" social experiment and mine them for valuable data:

When Facebook offered a "rainbow filter" for images, following last week's landmark Supreme Court decision in favor of gay marriage, people joked that it was probably another creepy social experiment. Well, probably, yes.

That article quoted a 28 June 2015 Atlantic piece (titled and subtitled "Were All Those Rainbow Profile Photos Another Facebook Study? The social network learns more about its users than they might realize."), which was far less accusatory and more speculative, focusing less on Facebook's intent in introducing its "Celebrate Pride" rainbow profile photo filter and more on what the social network might have learned when a similar meme organically swept the service in March 2013:

In March, the company published a paper that got little outside attention at the time, research that reveals some of the questions Facebook might be asking now. In “The Diffusion of Support in an Online Social Movement,” Bogdan State, a Stanford Ph.D. candidate, and Lada Adamic, a data scientist at Facebook, analyzed the factors that predicted support for marriage equality on Facebook back in March 2013. They looked at what factors contributed to a person changing his or her profile photo to the red equals sign, but the implication of their research is much larger: At stake is our understanding of whether groups of citizens can organize online — and how that collective activity affects larger social movements.

The passage quoted above simply spoke to the extensive amount of data generated by a collective action on the part of Facebook users and not the social network's intent in enabling such an action. More important, another passage in the Atlantic's piece contradicted the suggestion that Facebook's "Celebrate Pride" filter was created with the goal of compiling data on the sly by quoting Facebook spokesman William Nevius, who maintained the add-on was the result of in-house experimentation and initially was not intended for use by a wider audience:

Is Facebook’s Celebrate Pride an experiment on users? Nevius, the Facebook spokesman, told me the feature was designed by two interns at a recent company hackathon. When it became popular with employees, Facebook made it available to all users globally, just in time for the Supreme Court decision and other global pride events. But it’s not part of an experiment that involves tinkering with what various Facebook users see, a spokesman said. That makes it different from Facebook's “I voted” study or its “emotion contagion” research, which tested effects by varying the experience of randomly-chosen users.

International outcry in June 2014 over Facebook's use of algorithms to elicit (or inhibit) emotional responses in its users led to an October 2014 statement by the social network regarding ethics in data science. In that statement, Facebook promised that similar future research would be published publicly in a single location and subjected to multiple levels of review before work begins:

Guidelines: we’ve given researchers clearer guidelines. If proposed work is focused on studying particular groups or populations (such as people of a certain age) or if it relates to content that may be considered deeply personal (such as emotions) it will go through an enhanced review process before research can begin. The guidelines also require further review if the work involves a collaboration with someone in the academic community.

The rumor resurfaced when the fake news site NY Meta published an article titled "EVERYONE WHO CHANGED THEIR FACEBOOK PHOTOS TO RAINBOW JUST GOT DUPED" on 2 July 2015. That site's "About" page explains:

NY Meta is an entertainment website. We post the latest trending articles circulating the web. Whether interesting, controversial, abnormal, thought provoking or satirical we aim only to entertain with the stories we publish.

Although Facebook may indeed ultimately glean valuable insights about its user base from the "Celebrate Pride" rainbow profile photo filter meme, available evidence doesn't suggest that the social network introduced the feature primarily to trick users into participating in a social experiment. Facebook's own transparency guidelines (assuming the company upholds them) would reveal the nature of any such experiment, since they require that research of this kind undergo enhanced review before it begins and be published publicly afterward. A covert study would also likely draw objections similar to those raised over the 2014 "emotion contagion" experiment that prompted Facebook's review of its data science and research practices. That "probably, yes" looks a bit more like a "maybe later."

Kim LaCapria is a former writer for Snopes.
