Meta admitted to CNBC that Instagram is experiencing an error that has been flooding user accounts with Reels videos its algorithms would not normally surface. "We are fixing an error that caused some users to see content in their Instagram Reels feed that should not have been recommended," the company told the news organization. "We apologize for the mistake." Users have taken to social media platforms to ask whether others have also recently been flooded with Reels containing violent and sexual themes. One user on Reddit said their Reels page was flooded with school shootings and murder.
Others said they were getting back-to-back gore videos, including stabbings, beheadings and castration, as well as nudity, uncensored porn and outright rape. Some said they continued to see similar videos even with their Sensitive Content Control enabled. Social media algorithms are designed to show you videos and other content similar to what you usually watch, read, like or interact with. In this case, however, Instagram was showing graphic videos even to users who hadn't interacted with similar Reels, and sometimes even after a user had taken the time to click "Not interested" on a Reel with violent or sexual content.
A Meta spokesperson didn't tell CNBC what exactly caused the error, but some of the videos people said they saw shouldn't have been on Instagram in the first place, based on the company's own policies. "To protect users … we remove the most graphic content and add warning labels to other graphic content so that people are aware it may be sensitive or disturbing before they click through," the company's policy reads. Meta's rules also state that it removes "real photographs and videos of nudity and sexual activity."