Late in the evening on April 11, the Texarkana, Texas, Police Department started receiving 911 calls about an imminent attack on one of their own. A man in a black Chevy truck was crisscrossing the area looking for a lone police officer to “ambush and execute” and streaming his search to Facebook Live. Using the video, police were able to quickly locate the truck. After a high-speed chase, 36-year-old Aaron Swenson surrendered to police, a search of his truck turning up several loaded firearms.
The ensuing investigation revealed that Swenson had been deeply immersed in the online culture of the so-called boogaloo bois: heavily armed men, often clad in armored vests and incongruously festive Hawaiian shirts, who in recent months have appeared at protests around the country against both COVID-19 lockdowns and police brutality. Swenson isn’t the only member to have embraced violence: on May 30, in Las Vegas, three boogaloo bois on their way to a Black Lives Matter protest were arrested with numerous firearms and Molotov cocktail ingredients—the trio have military backgrounds, and according to the Federal Bureau of Investigation, two had plotted to firebomb a power substation. Last week, two men associated with the movement were charged in the killing of a courthouse guard in Oakland, California.
Part meme, part subculture, the boogaloo is a mash-up of antigovernment apocalyptic screed, Second Amendment evangelism, and dark-humored satire. The term itself refers to a hoped-for civil war that will bring about the collapse of society—and, in some adherents’ vision, its replacement by a white ethnostate. (The origins of the name are a bit complicated, but they trace back, through countless message board conversations and in-jokes, to the 1984 break-dancing movie Breakin’ 2: Electric Boogaloo.)
Even before the coronavirus pandemic hit, white supremacist terrorism was a growing menace. Although the specter of jihadism has received more attention, the threat of racially motivated extremism—of which white supremacy is a part—has been rising steadily over the last few years. As recently as 2016, it accounted for only 20 percent of terrorism-related deaths in the United States, according to the Anti-Defamation League. By 2018, that figure had increased to 98 percent. In February, FBI Director Christopher Wray testified before Congress that racially and ethnically motivated extremists had been the “primary source of ideologically motivated lethal incidents and violence” over the last two years. Wray also noted that 2019 marked the deadliest year of white supremacist violence since the Oklahoma City bombing in 1995.
Just as they have for food delivery services and videoconferencing platforms, lockdowns have proved to be a time of growth and opportunity for white supremacists. Indeed, violent extremists across the ideological spectrum have exploited the pandemic to take advantage of people who are at their most vulnerable, desperate, and available—relegated to their homes (or their parents’ homes) with little to distract them aside from surfing the Web. The dearth of large public gatherings and crowds moved the terrorism battle space inside and online. But with an antigovernment message designed for online virality, twenty-first-century white supremacists were especially well positioned to profit from this shift. And the evidence so far suggests that they succeeded in doing so, with results that, as the recent arrests show, can all too easily become offline threats.
As with terrorist groups such as the Islamic State (or ISIS), today’s white supremacist threat is both global and virtual. The war in Ukraine, for example, has attracted hundreds of foreign fighters with ties to the far right who use the battlefield as a networking space. That includes dozens of Americans, some of whom have come home with new contacts and fighting experience. Outside Ukraine, white supremacist training camps exist in Poland, Bulgaria, and even the United Kingdom, and many white supremacist organizations operate transnationally.
But whereas for ISIS the Internet is a tool to create and grow the caliphate, for white supremacists the Internet is the caliphate: a headquarters, a virtual training camp, and a staging ground all in one. This reliance on the Internet has served the movement well during the pandemic. But it may also be its Achilles’ heel.
In the early afternoon of March 15, 2019, Brenton Tarrant sat parked in his car in Christchurch, New Zealand. Recording himself with his smartphone, he told viewers to “subscribe to PewDiePie,” a popular and deliberately provocative Swedish YouTuber with an ambiguous relationship to the far right. Tarrant then got out of his car and embarked on a shooting spree at two local mosques, killing 51 people. With a helmet-mounted GoPro camera, he livestreamed the mass murder on Facebook. Within a day of the attacks, Facebook had reportedly blocked or removed some 1.5 million copies of the video.
Livestreaming crimes in progress, as Tarrant expertly did—and as Swenson, the man arrested in Texarkana, tried to do—is a tactic brought into the mainstream by ISIS. Amedy Coulibaly, the perpetrator of the attack on the Hyper Cacher supermarket in Paris in 2015, tried desperately to upload GoPro footage that he had taken of his assault, enlisting the help of one of his hostages in the failed effort; the next year, in the French city of Magnanville, the ISIS adherent Larossi Aballa posted a 13-minute live video on Facebook after stabbing to death a police officer and slitting the throat of the officer’s girlfriend. The idea was to leverage the viral power of violence, individual foot soldiers becoming content creators and influencers and using social media as a force multiplier. Turning a physical attack into a piece of propaganda in real time lets the perpetrator control the narrative.
At its height, ISIS deployed social media as effectively as it deployed violence. As the group’s flagship magazine, Dabiq, put it, “The information campaign is indistinguishable from the military campaign. . . . Violence is itself a message when you use it correctly.” In addition to the beheading, immolation, and post-attack promotional videos designed to horrify the public and draw headlines, ISIS’s information campaign produced slick biopics lionizing individual fighters. Content was spread across official and unofficial channels to ensure its longevity. And to this day, followers worldwide are instructed to take the initiative and carry out whatever acts of violence they are able to, using whatever weapon is available (a knife, gun, or car)—and attesting, preferably on video, that they are doing it for ISIS.
ISIS’s media strategy was a significant step up from that of previous terrorist groups, but white supremacists have continued to innovate. In form, much of their propaganda and media strategy resembles that of ISIS: skillfully produced videos; bold graphics that use manipulated imagery from video games and movies to issue crude threats; the deliberate channeling of new followers from open social media platforms—effective for recruitment—to encrypted ones, where bomb-making instructions are shared, potential targets discussed, and the real mobilization takes place.
With white supremacists, however, the culture of the Internet suffuses the entire movement. In a 74-page manifesto that Tarrant posted to the extremist online messaging board 8chan before his attack, he writes, “Memes have done more for the ethno-nationalist movement than any manifesto.” The entire document is marbled with sarcasm, and Tarrant at one point jokingly suggests that he was radicalized by the video games Spyro the Dragon and Fortnite. This is unusual manifesto territory. As with most content on the 8chan platform, the tone is not that of a religious zealot but that of a winking online troll—a “shitposter.” The text is emblematic of a subculture that cloaks its extremist ideology in layers of ironic detachment and nihilistic jocularity, mocking mainstream liberal values of inclusivity, equality, and democracy as repressive.
This peculiar form of discourse binds the movement together. Unlike al Qaeda and ISIS—organizations with hierarchy, territory, franchises, mission, vision, and values—the white supremacist world is dispersed and highly dynamic. There are plenty of groups, some operating across borders, but they are often small, disorganized, and quick to fragment and reconstitute. Group members may belong to more than one group at once, which can sometimes mean little more than participating in multiple group chats on different online platforms. In this world of shifting, overlapping allegiances, the memes and the irony—who is in on the joke and who is not—are the connective tissue.
Meanwhile, the violence spreads by contagion rather than direction—another reason white supremacists are particularly well suited to take advantage of a global pandemic. Counterterrorism analysts tend to break down global jihadism based on how connected threat actors are to a particular organization—whether they are directed, enabled, or inspired by that organization. The taxonomy of white supremacy, on the other hand, centers on how individual threat actors influence one another. In addition to being a terrorist, Tarrant is a social media influencer, and his manifesto is as much a recruitment speech as an explanation of his worldview. Multiple attackers have cited Tarrant as an influence over the 15 months since his attack—including John Earnest, who attempted to livestream his attack on a synagogue in Poway, California; Patrick Crusius, who killed 22 people in a Walmart in El Paso, Texas, in August 2019; and Philip Manshaus, who attacked the Al-Noor mosque near Oslo, Norway, in August 2019. Tarrant himself cites Anders Breivik, perpetrator of a horrific attack in Norway in 2011 that killed 77, most of them young people at a summer camp. The most influential figureheads are not organizational or spiritual leaders, such as al Qaeda’s Anwar al-Awlaki or ISIS’s Abu Muhammad al-Adnani, but individual attackers, who are often rhetorically sanctified online—“Hail Saint Tarrant,” “Saint Crusius,” or “Saint Breivik,” and so on.
All of this complicates the job of law enforcement. The veil of irony and memes makes it hard to identify who poses an imminent threat and who is simply joking—which, of course, is precisely the point. The remit of law enforcement is to find future Brenton Tarrants before they carry out their attacks, but when it comes to flagging incitement to violence, it is hard to tell signal from noise. And if irony generates static, contagion is often silent. The United States government is well equipped to prevent violence organized by groups or structured networks that communicate, travel, train, and carry out other activities that trip wires. Predicting which lone actor will mobilize and carry out a violent attack is much more difficult.
Ultimately, however, the online nature of white supremacist terrorism may be its weak point. For one, it’s an open question how scalable a movement based on leaderless resistance, influence, and contagion can be. And a movement so dependent on the Internet cannot easily survive without it. Efforts to deplatform and remove white supremacist actors and content lag behind similar measures against jihadist groups, but if applied, they will likely be even more effective. Al Qaeda and ISIS are bigger, more lethal, and more powerful organizations thanks to social media, but they would be able to do much of what they do without it. This is not the case for racist extremists. Shutting down their content, message boards, forums, profiles, and sites can deal them a serious blow. Denying these sites protection from cyberattacks is another powerful tool, already used against Gab, a social media site popular among racially motivated extremists, in the aftermath of Robert Bowers’s attack against the Tree of Life synagogue in Pittsburgh, and against 8chan after the El Paso shooting.
These measures are no cure-all. Committed extremists will migrate to new, often “dark” or encrypted platforms and groups or simply return under new names after their profiles and sites have been taken down. But making it harder for uninitiated users to engage with extremist content, or for radicalized individuals to find one another online, checks the movement’s spread. Deplatforming can reduce the speed of contagion. It can also deter those who aren’t serious, who are in it for the memes but not the violence—it can shut out the noise so that law enforcement can concentrate on the signal.
Social media companies have dedicated significant resources to removing content associated with groups such as ISIS and al Qaeda. They are starting to do the same for white supremacists, with a particular focus on content related to COVID-19, such as disinformation (guidance, for example, purporting to be from the Centers for Disease Control and Prevention suggesting sanitizing one’s house with combinations of household-cleaning materials that are toxic or explosive) or tactical and targeting advice (directives to intentionally infect minorities, law enforcement, or health-care workers by coughing, sneezing, or spitting on them). Last week, Facebook removed hundreds of accounts linked to white supremacist groups that had discussed bringing weapons to protests over the killing of George Floyd. Overall, however, these content removal efforts have not kept up with content production. In the fall of 2019, the encrypted messaging app Telegram, with the help of Europol, completed the largest and to date most successful ISIS purge, removing thousands of accounts from its platform. This action was a significant setback for the group, sending the online jihadist community reeling and scattering it across multiple alternative platforms as it tried to reconstitute. Social media companies have not taken similar action against racially motivated violent extremist accounts.
Until they do, the onus is on law enforcement to identify the radicalized before they mobilize. The window of opportunity, however, is often short. Right before Robert Bowers attacked the Tree of Life synagogue in Pittsburgh in October 2018, he posted on Gab, “I can’t sit by and watch my people get slaughtered. Screw your optics, I’m going in.” Tarrant’s manifesto went up moments before his attack. “Well lads, it’s time to stop shitposting,” it read, “and time to make a real life effort post.” Law enforcement cannot be all that stands between the virtual and the real, between the optics, the shitposting, and the real-life effort.