A once pro-Trump Facebook page, World USA, built a following of nearly 1 million users between October 2016 and late September 2019 by sharing generically patriotic, Norman Rockwellian remembrances of a bygone America. Far from being a homegrown testament to American patriotism, however, the page, according to Facebook's page transparency tool, was actually run by two individuals in Ukraine.
In what may illustrate a broader digital influence strategy, the page appeared to be targeting older Americans. No doubt a minor player in the overall digital landscape, World USA could be representative of a larger problem facing Facebook: an inability to police clearly inauthentic pages that purport to represent the views of Americans but are actually run by people outside the United States.
World USA is far from the only example of foreign users creating pages designed to look American. Facebook itself (or its algorithms) recommended several pages we should follow based on our interest in World USA, and they included myriad pro-America pages run by users in other countries. Facebook recommended we follow, for example, the President Of The United States Fan Club (run from Armenia), Melania Trump’s Fan (India-based), USA for President Trump 2020, and Stand With President Trump 2020 (both Vietnamese operations), Barron Trump Fan (Pakistan), and We Love Donald Trump (Morocco).
Each of these pages, along with World USA, was taken down on Sept. 30, 2019, following our request for a comment from Facebook. “We invested in transparency features so that people can flag potentially suspicious activity for us to investigate and take action when we find violations,” a Facebook spokesperson told us by email. “We’ve removed the Pages that violate our policies and will continue enforcing when we determine misuse.”
On Sept. 23, Judd Legum, the former editor-in-chief of ThinkProgress, reported on “a complex network of Facebook pages” in his Popular Information newsletter. These pages, all managed by people in Ukraine, amassed large audiences by posting memes “about [American] patriotism, Jesus, and cute dogs.” The problem, Legum argued, was that these pages were “now being used to funnel large audiences to pro-Trump propaganda,” including re-shared content first produced by the Russian Internet Research Agency (IRA) troll farm. Facebook ultimately removed those pages after Legum’s reporting, as well.
World USA was formed on Oct. 29, 2016, less than two weeks before the 2016 presidential election. In those early days, it posted ample pro-Trump content, linking to dubious clickbait political websites while embracing the term “deplorable” with pride. In late 2016 and early 2017, the page's content was dominated by such overtly pro-Trump material.
Overtly political content tapered off on World USA in early 2017, replaced by broader, sentimental patriotism similar to the posts described in Legum’s reporting.
Strategically, this makes sense if the goal was to build a follower base, according to Joshua Tucker, a co-founder and co-director of the NYU Social Media and Political Participation (SMaPP) laboratory, who has researched the manipulation efforts of Russian trolls in the 2016 election. Those Russian pages, he says, built a follower base by tweeting neutral local news articles, potentially, he argues, because they appeared trustworthy. “The behavior you are describing sounds similar: build up followers with non-political content and/or politically neutral content, and then pivot to a particular message at a more politically salient period of time,” he told us via email.
Another feature of the Russian troll operations, according to Clemson University Professor of Communications Darren Linvill, was that they sought to target specific demographics with their ostensibly benign content. In the case of World USA, the posts seemed to be targeting specifically older Americans. Via email, Linvill, whose research has informed several U.S. intelligence and security agencies, wrote that “with the Russian [Internet Research Agency] we saw ample evidence that they attempt to build following among a specific demographic,” playing up different themes to different audiences.
To be clear, outside of Facebook identifying this page as being operated by two individuals in Ukraine, we do not know who, specifically, was behind World USA, nor do we know whether their intent was political, financial, or something else. Some evidence suggests the pages may have been purely profit-driven. In its early political days, World USA on multiple occasions sent viewers to an ad-driven, now-defunct clickbait news website, Usanewsworld.com, reminiscent of the myriad pro-Trump political pages run for profit by Macedonian teens in the run-up to the 2016 presidential election. World USA’s flip from being political to being apolitical was, however, unique, according to Linvill. “We have seen them switch from benign to political, but not the other way around,” he said.
But even if World USA’s intent was apolitical, it is significant that these Ukrainian Facebook users, over nearly three years, successfully gathered nearly a million followers by presenting themselves as an ardently American page. Further, the page did so, it appears, by targeting the demographic most likely to share misinformation online. Tucker, the NYU professor, and his colleagues published a study in January 2019 suggesting that “users over 65 shared nearly seven times as many articles from fake news domains” as people aged 18-29. The associated World USA Facebook group (which is still online as of this writing) is full of dubious or false claims, including those of “Jihadi training camps” in the United States. Such a group, wittingly or not, could have significant reach in the spread of misinformation.
When we reached out to Facebook regarding World USA, we asked whether pages that present themselves as an outlet for patriotic Americans but are run by non-Americans would constitute what Facebook calls “inauthentic” behavior, a violation of the platform’s terms of service. Though all the pages we identified were removed, we never received a direct answer to that question, nor did we learn specifically which of Facebook’s terms of service each of these pages violated. Facebook, in its response to us, lauded the transparency tool it developed as a way for people to “flag potentially suspicious activity for us to investigate,” seemingly putting the onus on journalists and users to report violations.
The transparency tools are helpful, but the greatest tool might be Facebook’s own algorithm. After all, it recommended all but one of the pages we discovered.