
Facebook Report Names Russia Leader Of Disinformation


(CNN Business) – Russia and Iran are the top two sources of coordinated inauthentic behavior on Facebook, according to a new report released by the company. Facebook’s report, released Wednesday, shows how covert influence operators, both foreign and domestic, have changed their tactics and become more sophisticated in response to efforts by social media companies to crack down on fake accounts and influence operations.

Facebook has removed more than 150 networks of coordinated inauthentic behavior since 2017, according to the report. Twenty-seven networks have been linked to Russia and 23 to Iran. Nine originated within the United States.

The United States remains the top target for foreign influence campaigns, according to the Facebook report, highlighting 26 such efforts by a variety of sources from 2017 to 2020 (Ukraine is a distant second).

Yet during the 2020 election season, those responsible for sowing disinformation were increasingly domestic US actors, not foreign agents. In the run-up to the election, Facebook removed as many US-based networks targeting the United States with so-called coordinated inauthentic behavior (CIB) as Russian or Iranian ones, according to the company’s report.

“In particular, one of the CIB networks that we found was operated by Rally Forge, a US-based marketing company, working on behalf of its clients, including the Turning Point USA Political Action Committee,” says the report. “This campaign tapped into authentic communities and recruited a team of teenagers to run fake and duplicate accounts posing as unaffiliated voters to comment on news pages and political actor pages.”

That campaign was first reported by The Washington Post in September 2020.

A Turning Point spokesperson, in a statement to the newspaper at the time, described the effort as “sincere political activism by real people who passionately hold the beliefs they describe online, not an anonymous troll farm in Russia.” At the time, the group declined to comment in response to a CNN request.

Another US network, which Facebook announced it had removed in July 2020, had links to Roger Stone, a friend and political adviser of former President Donald Trump. The network maintained more than 50 accounts, 50 pages, and four Instagram accounts, and it reached 260,000 accounts on Facebook and more than 60,000 on Instagram. (After Facebook’s takedown, Stone shared news of his ban on the alternative social media site Parler, along with a statement: “We have exposed railroading that was so profound and so blatant in my trial, for which they must silence me. As they will soon learn, they cannot and will not silence me.”)

The presence of false and misleading content on social media became the dominant story haunting tech platforms such as Facebook, Twitter, and YouTube following the 2016 elections, when Russia’s attempts to meddle in the US democratic process came to light. Foreign influence campaigns have attempted to sow division in the electorate by posing as American voters, targeting voters with misleading digital advertisements, and creating fake news, among other techniques.

The discovery of these campaigns has led to intense political and regulatory pressure on big tech companies and has raised persistent questions about the industry’s disproportionate power in politics and the wider economy.

Since then, many critics have called for the breakup of the big tech companies and for legislation regulating how social media platforms moderate content on their sites.

Tech companies like Facebook have responded by hiring more content moderators and establishing new platform policies on inauthentic activity.

In a separate announcement Wednesday, Facebook said it will expand the penalties it applies to individual Facebook users who repeatedly share misinformation that has been debunked by its fact-checking partners. Currently, when a user shares a post containing debunked claims, Facebook’s algorithms downgrade that post in the news feed, making it less visible to other users. With Wednesday’s change, repeat offenders risk having all of their posts downgraded in the future.

Facebook has already been applying broad account-level downgrades to pages and groups that repeatedly share debunked misinformation, the company said, but Wednesday’s announcement covers individual users for the first time. (Politicians’ accounts are not covered by the change because political figures are exempt from Facebook’s fact-checking program.)

Although Facebook has improved its moderation efforts, many covert purveyors of disinformation have evolved their tactics in response, according to the report. From creating more tailored and targeted campaigns that can evade detection to outsourcing their campaigns to third parties, threat actors are trying to adapt to Facebook’s surveillance in an increasingly complex game of cat and mouse, according to the company.

“So when four years of covert influence operations come together, what are the trends?” Ben Nimmo, a co-author of the report, wrote on Twitter Wednesday. “There are more operators trying, but there are also more operators getting caught. The challenge is to keep moving forward to stay ahead and catch them.”
