Meta platforms hit by a series of errors

After a historic outage that hit Facebook, WhatsApp and Instagram, these social networks are suffering another blow. Their engineers and leaders are portrayed as irresponsible technocrats incapable of managing their own immensely powerful creation. Many users are currently reporting that the Facebook login gateway is down or not working: they cannot access services, games or apps using their Facebook account due to error code 2 (Login failed).

Meta attributed the surge to a flaw in its media-matching technology. A chart from its quarterly enforcement report showed that the social network restored more than 400,000 pieces of content that had been mistakenly flagged as terrorist. Facebook took action against 2.5 million pieces of content reported for organized hate in the first quarter of 2022, up from 1.6 million in the fourth quarter of 2021. It also removed 16 million pieces of terrorist content during that first quarter, more than double the 7.7 million of the previous period.

Outage and problem charts for Meta, published daily on the specialized platform Downdetector, provide an overview of recorded problem reports against a typical baseline volume for each hour of the day. As the platform puts it, "Some problems are usually reported throughout the day." Downdetector flags an incident only when the number of problem reports is significantly higher than that usual volume.
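Downdetector does not publish its exact algorithm, but the description above (compare the current hour's report count against the typical volume for that hour) can be sketched as a simple baseline-plus-threshold check. The function name and the sigma threshold below are assumptions for illustration, not Downdetector's actual implementation:

```python
from statistics import mean, stdev

def is_incident(reports_this_hour, baseline_history, sigma=3.0):
    """Flag an incident only when reports significantly exceed the
    usual volume for this hour of the day (hypothetical threshold)."""
    base = mean(baseline_history)       # typical report count for this hour
    spread = stdev(baseline_history)    # normal hour-to-hour variation
    return reports_this_hour > base + sigma * spread

# Usage: ~50 reports is typical for this hour on past days.
history = [40, 55, 48, 60, 52, 45, 50]
is_incident(900, history)   # a large spike is flagged as an incident
is_incident(58, history)    # slightly elevated, but within normal variation
```

The key design point is that the baseline is per-hour, so routine evening peaks are not mistaken for outages.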

Most reported problems

On June 6, for example, 45% of users reported being unable to log in to their accounts, 15% had difficulty with their posts (errors, preview not available, media import error, etc.), and 14% could not connect to the server, receiving the response: "Make sure you have the latest version of the Facebook app or delete the app and then reinstall it…"

Demotion, a headache for Meta

For years, Facebook has promoted demotion as a way to improve the quality of the News Feed, and it has steadily expanded the types of content its automated systems act on. Demotions have been used in response to wars and controversial political stories, raising concerns about shadow bans and prompting calls for legislation. Despite its growing importance, Facebook has yet to disclose its impact on what people see and, as this incident shows, what happens when the system goes wrong.

An unusual mistake

A group of Facebook engineers identified a "massive ranking failure" that exposed as much as half of all News Feed views to potential "integrity risks" over the past six months, according to an internal incident report obtained by The Verge. Engineers first noticed the problem last October, when a sudden wave of misinformation began circulating on the News Feed. Instead of suppressing posts from repeat misinformation offenders flagged by its network of external fact-checkers, News Feed was distributing those posts, increasing their views by as much as 30% worldwide.
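Facebook's actual ranking code is not public, but the failure mode described (a demotion that ends up amplifying) can be illustrated with a toy scoring function. Everything here is hypothetical: the function name, the multiplier, and the specific bug (an inverted factor) are assumptions chosen only to show how a demotion path can turn into a boost:

```python
def rank_score(base_score, flagged, demotion_factor=0.5, bug=False):
    """Toy ranking sketch: flagged posts should be scored down.

    A correct demotion multiplies the score by a factor < 1.
    The hypothetical bug inverts the factor, so flagged posts
    are boosted in the feed instead of suppressed.
    """
    if not flagged:
        return base_score
    factor = (1 / demotion_factor) if bug else demotion_factor
    return base_score * factor

rank_score(100, flagged=True)             # demoted to 50.0
rank_score(100, flagged=True, bug=True)   # boosted to 200.0
```

The sketch shows why such a regression is dangerous: the pipeline still "applies" the demotion signal, so nothing obviously crashes, while the effect on distribution is the opposite of the intended one.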
