Why did Facebook remove 3 bn accounts in six months?
Facebook released its third Community Standards Enforcement Report on May 23. It provides insight into some of the tech giant’s biggest problem areas: adult nudity and pornography, bullying and harassment, sexual exploitation of children, fake accounts, hate speech, spam, global terrorist propaganda, and violence and graphic content.
Fake accounts
In the last six months, Facebook has removed over three billion fake accounts.
Facebook estimates that five percent of monthly active accounts are fake, and it noticed a sudden increase in attempts to create abusive, fake accounts over the last six months. The transparency report said, “We catch most of these accounts within minutes of registration. However, automated attacks have resulted in more of these accounts making it past our initial detection, which increased prevalence. The number of actioned accounts also increased, since we removed more accounts that should not be on Facebook. (sic)”
Fake accounts are dealt with in three ways: blocking them before they are created, removing them during sign-up, and removing accounts that already exist on the platform. The report said, “We have two main goals with fake accounts. Preventing abuse from fake accounts but also giving people the power to share through authentic accounts. We have to strike the right balance between these goals.”
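A rough way to picture that ordered, three-stage process is sketched below. The signals, score names and thresholds are invented purely for illustration and say nothing about how Facebook’s actual systems work.

```python
# Hypothetical sketch of a staged fake-account pipeline. All signal names and
# thresholds are made up for illustration; they do not reflect Facebook's systems.
from enum import Enum


class Action(Enum):
    BLOCK_AT_CREATION = "blocked before the account is created"
    REMOVE_AT_SIGNUP = "removed during sign-up"
    REMOVE_EXISTING = "removed after detection on the platform"
    ALLOW = "allowed"


def enforce(ip_reputation: float, signup_behavior: float, activity_score: float) -> Action:
    """Apply the checks in order; the earliest stage that fires wins."""
    if ip_reputation < 0.2:       # e.g. traffic from known bulk-registration sources
        return Action.BLOCK_AT_CREATION
    if signup_behavior < 0.3:     # e.g. automated form-filling during registration
        return Action.REMOVE_AT_SIGNUP
    if activity_score < 0.4:      # e.g. abusive behaviour after the account is live
        return Action.REMOVE_EXISTING
    return Action.ALLOW
```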
Content removal
Antigone Davis, Facebook’s Global Head of Safety, announced in a blog post on March 15 that the company was focused on developing artificial intelligence systems that catch violating material without users having to report it first.
When an image or video is found to violate the company’s policy, a digital fingerprint of it is created. That fingerprint lets the system detect and automatically remove matching or near-matching content in the future.
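Facebook does not publish the details of its fingerprinting technology, but the general idea can be sketched with a simple perceptual “average hash”: reduce an image to a tiny grayscale grid, turn it into a bit string, and compare new uploads against fingerprints of previously removed material. The code below is only a minimal illustration of that concept, not Facebook’s method.

```python
# Minimal illustration of fingerprint-based matching using an "average hash".
# This is a toy analog of industrial systems, not Facebook's actual technology.
from PIL import Image


def average_hash(path, hash_size=8):
    """Compute a 64-bit perceptual fingerprint for an image file."""
    img = Image.open(path).convert("L").resize((hash_size, hash_size))
    pixels = list(img.getdata())
    avg = sum(pixels) / len(pixels)
    bits = "".join("1" if p > avg else "0" for p in pixels)
    return int(bits, 2)


def is_known_violation(path, blocklist, max_distance=5):
    """Flag an upload whose fingerprint is close to any previously removed image."""
    h = average_hash(path)
    for known in blocklist:
        # Hamming distance between fingerprints: small distance = near-duplicate.
        if bin(h ^ known).count("1") <= max_distance:
            return True
    return False
```

A production system would rely on far more robust hashes (for video as well as images) and indexed lookups over billions of fingerprints rather than a linear scan, but the matching principle is the same.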
Davis said, “Finding these images goes beyond detecting nudity on our platforms. By using machine learning and artificial intelligence, we can now proactively detect near-nude images or videos that are shared without permission on Facebook and Instagram.” The post added that victims, afraid of retribution, are often reluctant to report the content themselves or are unaware that it has been shared.
For every 10,000 views on Facebook, 11 to 14 contained content that violated the company’s adult nudity and sexual activity policy, and 25 contained content that violated its violence and graphic content policy.
Less than 0.03 percent of views were of content that exploited children. In other words, fewer than 3 of every 10,000 views on Facebook contained content violating its Child Nudity and Sexual Exploitation of Children policy.
The company said that between January and March 2019 it took action on 34 million posts for hosting violent or graphic content. The report said, “We found 98.4 percent of violent and graphic content before users reported it in October to December 2018 and 98.9 percent in January to March 2019.” Of those 34 million posts, fewer than 24,000 were later restored.
Despite these numbers, the New Zealand mosque shooting on March 15 was live-streamed on Facebook for a total of 17 minutes before it was taken down.
Facebook is under growing pressure to combat hate on its platform. A year ago, its artificial intelligence systems could detect just 38 percent of hate speech content; between January and March 2019, that figure rose to 65 percent. The report said, “In the first quarter of 2019, we took down four million hate speech posts.”
By the end of 2018, Facebook had grown its safety and security team to 30,000 people, a move widely interpreted as an attempt to respond more quickly to offensive content reported on the platform. Despite this costly investment, Facebook’s reputation has suffered in recent years. The numbers show that Facebook is trying hard to regulate content, but it is not yet enough; the social giant still has a long way to go.