Facebook tagged no fewer than 50 million pieces of content relating to coronavirus on its website as “misinformation” in April alone
The tech giant also removed 2.5 million items listed for sale, including hand sanitizers, surface disinfecting wipes, masks, and Covid-19 test kits. The company is taking all of these actions to prevent the spread of false and misleading information about the pandemic.
Facebook has started attaching these warnings to posts sharing articles that its independent fact-checking partners have reviewed. So far, the company reports that the warnings have drastically reduced the number of people who view the original content: once they see the warning, most people do not go on to view it.
On Tuesday, Facebook released its latest transparency report, which includes these figures on misinformation. This is the fifth such report the social media giant has released, and the first since the outbreak of the pandemic.
The social networking site has been relying more heavily on artificial intelligence to police the content circulating on it. Facebook says it can now detect about 90% of violating content before the general public sees it, allowing the company to vet material before it reaches users.
Mark Zuckerberg, Facebook’s CEO, said on a press call: “I know our systems aren’t perfect. They have certainly been impacted by having less human review during Covid-19, and we do unfortunately expect to make more mistakes until we’re able to ramp everything back up.”
In the past, Facebook drew heavy backlash from critics and regulators for helping spread misinformation online. In a bid to stop this, the company has used a combination of machine learning systems and human workers, who together root out content that violates its community standards.
So far, most of the company’s employees have been working from home, and this has forced Facebook to rely heavily on artificial intelligence.
Guy Rosen, Facebook’s VP of Integrity, wrote in a blog post: “When we temporarily sent our content reviewers home due to the Covid-19 pandemic, we increased our reliance on these automated systems and prioritized high-severity content for our teams to review in order to continue to keep our apps safe during this time.”
The company has also agreed to a landmark settlement of $52 million, to be shared among current and former moderators who developed PTSD while doing their jobs.
Meanwhile, Facebook is not the only social media platform tagging misinformation about the coronavirus; Twitter recently announced that it will start labeling tweets about the coronavirus with a warning message.
COMPARISON WITH LAST QUARTER
Facebook has shared a comparison between the last quarter of 2019 and the first three months of 2020 to illustrate its efforts so far.
On the Facebook platform, moderators removed 4.7 million pieces of content connected to organized hate, an increase of over 3 million pieces from the previous quarter.
On Instagram, moderators removed 175,000 such pieces of content, an increase of 139,800 from the last quarter of 2019.
The company also said that its proactive detection rate for this content increased from 89.6% to 96.7% on Facebook, while on Instagram it increased from 57.6% to 68.9%.
FURTHER ACTIONS TAKEN BY FACEBOOK
Aside from removing misinformation, Facebook has also taken extraordinary steps to promote reliable information, sourced from public health agencies and organizations such as the WHO and the U.S. Centers for Disease Control and Prevention.
In addition, the company has given the World Health Organization as many free ads as it needs for its coronavirus response.
A Bloomberg reporter asked whether Facebook plans to hire additional moderators, given the strain on its review processes. Rosen replied that the company is focused on getting its existing moderators back to work.
In all, it is good that Facebook has been taking strict measures to protect its users from harmful misinformation until the pandemic is over.