
Russia is the king of disinformation on Facebook, the company says


According to a new report released by the company, Russia and Iran are the top two sources of coordinated fake activity on Facebook (FB).

Facebook's report, which was published Wednesday, demonstrates how foreign and domestic covert influence operators have adapted their tactics and become more sophisticated in response to social media companies' efforts to crack down on fake accounts and influence operations.

Since 2017, Facebook has deactivated over 150 networks of coordinated fake activity, the report stated. Twenty-seven networks are believed to be connected to Russia, while 23 are believed to be connected to Iran. Nine originated in the United States.

The United States continues to be the primary target of foreign influence campaigns, according to Facebook's report, which highlighted 26 such efforts from a variety of sources between 2017 and 2020. (Ukraine comes in a distant second place.)

However, during the 2020 election season, domestic actors in the United States, rather than foreign operatives, were increasingly responsible for disinformation. Facebook removed nearly as many American networks targeting the US with so-called coordinated inauthentic behavior (CIB) as it did Russian or Iranian networks in the run-up to the election, according to the company's report.

"Most notably, one of the CIB networks we discovered was operated by Rally Forge, a marketing firm based in the United States that works on behalf of clients such as the Political Action Committee Turning Point USA," the report stated. "This campaign tapped into authentic communities and hired a staff of teenagers to run bogus and duplicate accounts posing as unaffiliated voters to comment on news pages and political actors' pages."

The Washington Post first reported on this campaign in September 2020. A Turning Point spokesman described the effort to the Post at the time as "sincere political activism conducted by real people who passionately hold the beliefs they describe online, not an anonymous Russian troll farm." The group declined to comment in response to a request from CNN at the time.

Another US network, which Facebook announced it had shut down in July 2020, had ties to Roger Stone, former President Donald Trump's friend and political adviser. The network comprised more than fifty Facebook accounts, fifty pages, and four Instagram accounts, and it reached 260,000 Facebook users and more than 60,000 Instagram users. (Stone shared news of his Facebook ban on the alternative social media site Parler, along with the following statement: "We've been exposing the railroad collusion that was so pervasive and obvious during my trial, which is why they're attempting to silence me. As they will soon discover, I am unable and unwilling to be silenced.")

Following the 2016 election, the presence of fake and misleading content on social media platforms such as Facebook, Twitter, and YouTube became the dominant story, as revelations about Russia's attempts to meddle in the US democratic process surfaced. Foreign influence campaigns have attempted to sow division within the electorate by posing as US voters, targeting voters with misleading digital advertisements, fabricating false news stories, and other techniques.

The revelation of those campaigns has resulted in increased political and regulatory pressure on Big Tech, while also raising persistent concerns about the industry's disproportionate influence in politics and the wider economy. Numerous critics have since called for the dismantling of large technology companies and the enactment of legislation regulating how social media platforms moderate content on their platforms.

 

Facebook's Response

Facebook and other technology companies have responded by hiring additional content moderators and enforcing new platform policies against fake activity.

Separately, Facebook announced Wednesday that it is expanding the penalties it imposes on individual Facebook users who share misinformation that has been debunked by fact-checking partners. Currently, when a user shares a post that contains debunked claims, Facebook's algorithms demote the post in the news feed, reducing its visibility to other users. Under Wednesday's change, however, repeat offenders risk having all of their future posts demoted.

Facebook had already been degrading pages and groups that shared fact-checked misinformation on a regular basis, it said, but Wednesday's announcement covers individual users for the first time. (The change does not apply to politicians' accounts, as political figures are exempt from Facebook's fact-checking program.)

 

Conclusion

However, even as Facebook's moderation efforts have improved, many covert purveyors of misinformation have refined their tactics, the report stated. From more tailored and targeted campaigns that evade detection to outsourcing campaigns to third parties, threat actors are attempting to adapt to Facebook's enforcement in an increasingly complex game of cat and mouse, the company reports.

"So, when four years' worth of covert influence ops are combined, what trends emerge?" Ben Nimmo, a co-author of the report, stated Wednesday on Twitter. "While more operators are attempting, more operators are also being caught. The challenge is to keep advancing in order to stay level with them, and ahead of them."

 
