The hoaxes that now infamously spread across social media like a contagion after major news events returned in force in the immediate aftermath of the 15 April 2019 fire that heavily damaged the iconic Notre Dame cathedral in Paris, France.
Conspiracy theorists instantly attributed the fire to Islamic terrorists and compared the event to 9/11. Some shared audio recordings they falsely claimed captured Muslims shouting Islamic slogans at the scene. Others touted videos and images supposedly showing Muslims celebrating and laughing at the destruction of the cathedral. Still others asserted a nonexistent link between the Notre Dame fire and a foiled terrorist plot that actually took place in 2016.
The hoaxes probably aren’t going away any time soon, said David Carroll, associate professor of media design at the New School in New York City and a prominent critic of social media platforms. After past mass-casualty events like the Las Vegas shooting and the deadly white supremacist rally in Charlottesville, tech companies such as Facebook, Twitter, and Google drew bad press for letting hoaxes spread in their wake. Although the technology giants have made efforts to improve, the Notre Dame fire showed those efforts haven’t been enough.
“It’s the attention economy that is the economic motivation and incentive at the heart of this,” Carroll told us. “If we just think of [the platforms’] economic incentives, they were designed to maximize and monetize attention. Attention is the resource that is harvested and mined without any care for whether it is good or bad.”
In a 16 April 2019 blog post, Twitter executives Donald Hicks and David Gasca said the platform was taking a more proactive approach toward addressing abusive content. However, as Poynter reported, “Twitter doesn’t have a policy strictly aimed at decreasing the reach of false posts. Among the actions that the company does take is removing bogus accounts posing as news organizations. But that policy can be gamed — and it isn’t applied uniformly.”
And conspiracy theories have a way of capturing people’s attention. “You can establish a good audience for yourself as a conspiracy theorist” on social media platforms, Carroll added.
Sometimes the hoax themes are explicitly racist, targeting certain groups. In the case of the Notre Dame fire, media gadflies pointed the finger at the Muslim community and claimed without evidence that the fire was an act of terror, even as French authorities announced the incident was likely the result of an accident and was not being investigated as arson.
“There is an actual terrorist that burned down three black churches in Louisiana. How many times have you seen his face and his name in the news?” said Omar Suleiman, Islamic studies professor at Southern Methodist University and founder of the Yaqeen Institute for Islamic Research in Texas. “But because the idea of Muslim terrorism is so prevalent, we get blamed now collectively for something that not even a single Muslim had anything to do with. This has reached a level of ridiculousness that is beyond guilt by association. It’s just made up.”
Authorities charged Holden Matthews, 21, on 15 April 2019 with hate crimes in relation to fires that destroyed three historically black churches in Louisiana over 10 days.
Suleiman pointed to conspiratorial comments made by media personality Glenn Beck, who speculated on his radio show that, “If this [fire] was started by Islamists, I don’t think you’ll find out about it” and compared the event to the 9/11 terrorist attack on the World Trade Center in New York.
“A guy like [Glenn Beck] can associate Muslims with the cathedral burning and have no facts and no analysis behind it, yet it perpetuates the vicious cycle of keeping the Muslim community under a cloud of suspicion,” Suleiman said. “It’s hard to disassociate this with what’s happening with [U.S. Rep.] Ilhan Omar (D-Minnesota). If even a Muslim congresswoman cannot be free from being associated with 9/11 by the president of the United States, then how can we expect more from bloggers who thrive on sensationalism and conspiracy?”
Suleiman was referring to a 12 April 2019 tweet posted by President Donald Trump that juxtaposed Omar, one of the first two Muslim women to serve in the U.S. Congress and the first to wear a hijab or religious head covering, with video footage of the 9/11 attacks. Omar received an increased number of death threats after Trump’s tweet was posted.
In March 2019, we asked Twitter about anti-Muslim hate speech on the platform following the mass shooting in Christchurch, New Zealand in which a white supremacist has been accused of killing 50 people in a shooting rampage at two mosques. Spokesman Ian Plunkett told us at that time that “our policies do not allow hateful conduct and/or the extremism,” but when we asked why, for example, known white supremacist David Duke still has an active account, we were told the company doesn’t have any comment on individual cases.
Hoaxes reflecting the baseless theme that the Notre Dame fire was the result of terrorism spread quickly. A Twitter user posted a video from the scene of the fire and falsely claimed that Muslims were shouting the Arabic phrase “Allahu akbar,” which means “God is great.” In fact, the recording captured French authorities shouting “allez en avant” to bystanders — roughly, an instruction to move along, out of harm’s way.
Are they yelling Allahu Akbar in Paris? pic.twitter.com/XxDzORrkUB
— Magaphobia (@MAGAphobia) April 15, 2019
In another video that accompanied a now-deleted tweet, audio of people shouting “Allahu akbar” was superimposed on video footage of the cathedral burning to again give the misleading impression that Muslim people were celebrating the fire. As NBC News pointed out, “The audio comes from a years-old video, which is the first result when a user searches for ‘Allahu Akbar Scream’ in Google.”
Anti-Muslim blogger Pamela Geller also perpetuated baseless claims that Muslim people were celebrating in the fire’s aftermath. Geller shared a post purporting to show that people with “Arab [Muslim] names” had reacted to the event with smiling or laughing emojis on Facebook.
But as BuzzFeed News pointed out, trying to lay culpability at the feet of the Muslim community using Facebook reactions is both facile and unverifiable. “During the Notre Dame fire the laughing face emojis were clearly in the minority and it’s impossible to know why people chose a specific emoji, or for that matter the religion of people reacting to a Facebook video. It’s also difficult to verify the authenticity of the accounts. Bottom line: Facebook emojis on a video do not tell us anything about a group of people.”
That same post from Geller also featured an image of two smiling men ducking under police tape with the burning cathedral in the background, claiming without evidence they were “jihadists” celebrating the destruction.
A number of Twitter accounts shared a 2016 story about a foiled terror plot that involved a car carrying gas canisters that was parked near the cathedral, giving the false impression that the incident happened in the days leading to the fire.
Not surprised. I already thought it was Islamic terrorism. 👇
— SHERI 🇺🇸 (@Sheri_Hischild) April 17, 2019
In yet another instance, conspiracy mongers claimed that a firefighter seen on camera was a nefarious robed figure.
— Katie Hopkins (@KTHopkins) April 16, 2019
In one of the most high-profile social media platform snafus, people watching livestreamed footage of the Notre Dame fire on YouTube were mistakenly fed background information for the 9/11 terror attacks by YouTube’s algorithm. As The Associated Press reported, “The background note was posted by a system YouTube recently put in place to combat well-known conspiracies about such events as the moon landing or 9/11. In this case, the algorithm might have had the opposite effect, fueling speculation about the cause of the fire and who might be behind it.”
— Joshua Benton (@jbenton) April 15, 2019
Carroll concurred: “It seems like a better technical solution could have been that as the breaking news event is occurring, to remind people about simple rules, like wait for authorities to provide verified reports and be aware that conspiracy theories are going to happen, don’t fall for them when they do,” he said. “They tried to automatically correct [the problem] which ended up making things worse because the algorithm misfired.”
Carroll said he’s hopeful that the looming presidential campaign will push these issues into a high-profile political environment. Debates need to happen, he said, around what the platforms are responsible for doing, how best to address the issue, and what types of regulations could resolve some of the problems.
“Some of it comes down to the fact that [the platforms] don’t employ humans to do what reporters do, meaning it doesn’t seem to me that there are people employed at these companies that sit on the platforms and look for threat models and proactively deal with them accordingly,” Carroll told us. “It’s all being outsourced. Where is the in-house defense layer? Doesn’t seem to be there.”
YouTube (which is owned by Google) said in a statement to The New York Times that, “These [fact checking] panels are triggered algorithmically and our systems sometimes make the wrong call.” The panels, as the Times reported, were announced in summer 2018, “as part of a broader effort to root out misinformation” and would appear “alongside videos on a small number of well-established historical and scientific topics that have often been subject to misinformation, like the moon landing and the Oklahoma City Bombing.”
For Suleiman, who in March 2019 traveled to Christchurch to help bury mosque shooting victims, the issue is both pressing and a matter of public safety. He has seen firsthand the effects of hateful online rhetoric.
“You cannot see this issue disconnected from the elements around it,” he told us. “We’re coming off the brink of the worst terrorist attack against a Muslim minority in the West, and also the president is inciting his followers against one of our first two Muslim women in Congress and connecting them to terrorists.”