(The Hill) – Domestic extremists have adapted their online strategies to push disinformation and conspiracy theories despite crackdowns by social media platforms in the year since a pro-Trump mob attacked the Capitol.

Online extremist groups and far-right influencers are using more coded language to slip through gaps in mainstream content moderation and remain active on alternative platforms that have grown in popularity since the Jan. 6, 2021, riot.

Experts say efforts to counter domestic extremism must adapt as well; otherwise, the spread of disinformation online poses real risks as the November midterm elections and the 2024 presidential election approach.

“There will always be this synergistic relationship between the content moderation failures of Facebook, Twitter and alternative tech platforms like Parler,” said Candace Rondeaux, director of the Future Frontlines program at the think tank New America. “So we absolutely have to expect that as we approach the 2022 midterms, especially in battleground states where things are extremely polarized, we will see a similar dynamic.”

Even more worrying is what could happen as the 2024 presidential election approaches, when the capital available to candidates, parties and interested stakeholders is greater and platforms can be used to “sway the debate to the point where things can get violent,” Rondeaux said.

Research reports released this week have detailed the spread of disinformation among far-right groups on platforms such as Parler, Gab and Telegram. The Department of Homeland Security has also warned partners of increased chatter on extremist pages online.

While fringe platforms differ from traditional social media, they are closely tied to the broader online conversation. Experts have cautioned against dismissing the influence of alternative platforms despite their smaller user bases.

“Much of the activity that takes place on these platforms is still responsive to things that happen on the mainstream platforms. So really understanding that dynamic, and not treating it as a completely separate and distinct factor when we think of the internet, I also think is important,” said Jared Holt, resident fellow at the Digital Forensic Research Lab (DFRLab).

New America released a report on Wednesday with researchers at Arizona State University analyzing millions of posts on Parler before and immediately after last year’s insurrection.

“While more research is needed, our preliminary analysis suggests that as long as regulatory loopholes for social media platforms persist, the United States will face the prospect of a continuing crisis lasting several months, or worse, several years, as the public revolt against right-wing political violence pushes extremist cadres deeper into marginal areas of the internet and deeper into the dark web,” the report said.

New platforms billing themselves as safe havens for conservatives fed up with content moderation on mainstream sites have emerged since the riot.

In July, former Trump campaign aide Jason Miller launched Gettr, which describes itself as “the social media platform for free speech that fights cancel culture.” Gettr announced on Friday that it had reached 4 million users.

Trump is considering launching his own social media network called “TRUTH Social.” While details about the platform remain vague, Trump’s media company announced a deal in December with Rumble, a YouTube alternative popular with some right-wing audiences.

A DFRLab report released this week detailed the ways extremist groups have adapted their strategies, including what Holt described as a “giant game of musical chairs,” with influential far-right figures moving across a range of fringe sites and bringing their followers with them.

While Parler and other alternative platforms target a far-right audience and boast about their lack of content moderation, mainstream platforms including Facebook and Twitter were also filled with posts amplifying election misinformation and with groups organizing ahead of the insurrection.

Since last year’s violent attack, major platforms have cracked down on disinformation. But they may not be doing enough, experts say.

Paul Barrett, deputy director of the NYU Stern Center for Business and Human Rights, said the platforms’ tendency to be reactive makes it “a lot more likely that in 2022 and 2024 we are seeing new chaos online and in the real world.”

After the insurrection, Facebook took action against content that amplified election misinformation and QAnon conspiracy theories. The company banned content using the phrase “Stop the Steal” and removed pages, groups and accounts related to the QAnon conspiracy theory.

Facebook and Twitter have both also suspended the accounts of former President Trump, although Facebook has left open the possibility of restoring his account next year.

“Even after January 6, and even after banning President Trump indefinitely or permanently, it seems to me that platforms are still more prone to reacting to what they see as public relations crises than to tackling these very serious issues in a forward-looking and comprehensive way,” Barrett said.

“There just hasn’t been a concerted industry-wide effort to say, ‘Look, we’re part of the problem. Not because we intend to be part of the problem, but it turns out that we have inadvertently created tools that are misused in relation to absolutely vital institutions and processes,’” he added.

A Facebook spokesperson pushed back on the claim that the company’s actions were reactive.

“Facebook has taken extraordinary steps to tackle harmful content, and we will continue to do our part,” the spokesperson said in a statement. “We continue to actively monitor threats on our platform and will respond accordingly.”

A Twitter spokesperson said the company took “vigorous enforcement action” against accounts that “incite violence” before and after the insurrection.

“We recognize that Twitter has an important role to play and we are committed to doing our part,” the spokesperson said in a statement.

While conspiratorial disinformation is no longer “as prevalent” on Facebook and Twitter, the platforms have not “completely solved the problem of disinformation,” Holt said.

“Hyperpartisan” media content still performs well across platforms, and much of that content incorporates “the same ideology and rhetoric from sources that are otherwise prohibited,” he said.

Disinformation has also continued to circulate on mainstream platforms through coded language designed to evade moderation, said Bret Schafer, head of the Alliance for Securing Democracy’s information manipulation team.

“You still see a lot of the same kinds of stories out there; it’s just a little more subtle, and certainly the group names aren’t quite so on the nose,” Schafer said.

Many far-right extremists have also encouraged their followers to channel their efforts into more local events, such as protesting COVID-19 restrictions or challenging school curricula, which can be more difficult to monitor and detect online than a nationwide event, Holt said.

By targeting local events, they need fewer participants to cause disruption, meaning groups and accounts with smaller followings that may fly under the radar still pose a risk.

“[If] the goal of an extremist group is not to rally 1,000 people in Washington but to rally 10 people at a school board meeting, they don’t need a channel or an account with a million followers. They could be successful with 30 if they’re the right 30 people,” Holt said.

“We’re not necessarily looking for the big fish anymore,” he added. “Those big fish will continue to be important, but in this evolving situation, even small groups that might not have a particularly large reach might still be able to organize.”