Meta removes over 23 million pieces of ‘bad content’ on Facebook and Instagram in India: Meta’s monthly compliance report for India for November details content moderation efforts across 13 policies for Facebook and 12 for Instagram, the removal of more than 23 million pieces of content across both platforms, and the company’s responses to user complaints received through the Indian grievance mechanism. The report is published every month in accordance with India’s IT Rules, 2021.
Meta, the parent company of Facebook and Instagram, has released its monthly compliance report for India for the month of November. The company revealed that it removed more than 23 million pieces of content from the two platforms during the month.
Notably, the company had removed over 37 million pieces of ‘bad content’ in the previous month, so the November figure represents a significant month-on-month decline.
The report provides additional detail on content moderation efforts across 13 policies for Facebook and 12 policies for Instagram.
It also covers Meta’s response to user complaints received through the Indian grievance mechanism.
Removing Content on Facebook
According to the company, more than 18.3 million pieces of content were removed from Facebook. The platform received 21,149 reports through the grievance mechanism and provided tools for users to resolve their issues in 10,710 cases.
Specialized review: Of the remaining 10,439 reports, 4,538 required specialized review and were resolved with the removal of content.
Removing Content on Instagram
Like Facebook, Instagram also saw more than 4.7 million pieces of content removed. The photo- and video-sharing platform received 11,138 reports and provided tools for users to resolve their issues in 4,209 cases.
Specialized review: ‘Of the other 6,929 reports where specialized review was needed, we reviewed the content as per our policies and took action on 4,107 reports in total. The remaining 2,822 reports were reviewed but not acted upon,’ the company said.
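These figures form a simple funnel: reports received, minus cases resolved with self-help tools, leave the reports that needed specialized review, of which a subset was actioned. The short Python sketch below tallies that funnel from the numbers quoted above; the funnel arithmetic is an illustrative assumption for reading the report, not Meta’s stated methodology.

```python
# Tally of the grievance-report funnel described above. The figures come
# from the report as quoted in this article; the funnel arithmetic itself
# is an illustrative assumption, not Meta's stated methodology.

reports = {
    "Facebook":  {"received": 21_149, "tools_provided": 10_710, "actioned": 4_538},
    "Instagram": {"received": 11_138, "tools_provided": 4_209,  "actioned": 4_107},
}

for platform, r in reports.items():
    specialized = r["received"] - r["tools_provided"]  # left for specialized review
    not_actioned = specialized - r["actioned"]         # reviewed but not acted upon
    print(f"{platform}: {specialized:,} needed specialized review, "
          f"{r['actioned']:,} actioned, {not_actioned:,} not actioned")
```

Running this reproduces the Instagram figures quoted above (6,929 reports needing specialized review and 2,822 not acted upon), which supports this funnel reading of the report.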
Meta emphasized that the number of actioned pieces reflects content found to violate its standards; action can include removing the content or covering disturbing material with a warning.
Since the implementation of the IT Rules, 2021 in India, large social media companies, including Meta and X (formerly Twitter), must publish a compliance report every month.
How does Meta identify and remove ‘bad content’ on Facebook and Instagram in India?
Meta identifies and removes ‘bad content’ on Facebook and Instagram in India through a combination of automated systems and human review. The automated systems use artificial intelligence and machine learning models to analyze and detect content that violates the platforms’ community standards. In addition, Meta has a team of content reviewers who manually review reported content and take appropriate action, such as removing the content or restricting access to it. This multi-layered approach helps Meta maintain a safe and positive environment for users in India.
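To make that two-stage flow concrete, here is a minimal, hypothetical sketch in Python: an automated classifier scores each piece of content, high-confidence violations are removed automatically, and uncertain cases are queued for human review. The thresholds, the toy classifier, and all names here are illustrative assumptions; this is not Meta’s actual system.

```python
from dataclasses import dataclass, field

# Assumed thresholds for illustration only.
REMOVE_THRESHOLD = 0.90  # auto-remove above this violation score
REVIEW_THRESHOLD = 0.50  # queue for human review above this score

@dataclass
class ModerationPipeline:
    review_queue: list = field(default_factory=list)

    def classify(self, text: str) -> float:
        """Stand-in for an ML model that scores policy-violation likelihood."""
        flagged_terms = {"spam", "scam", "hate"}  # toy heuristic, not a real model
        words = text.lower().split()
        return sum(w in flagged_terms for w in words) / max(len(words), 1)

    def moderate(self, text: str) -> str:
        score = self.classify(text)
        if score >= REMOVE_THRESHOLD:
            return "removed"                # automated removal, no human needed
        if score >= REVIEW_THRESHOLD:
            self.review_queue.append(text)  # escalate to human reviewers
            return "queued_for_review"
        return "allowed"

pipeline = ModerationPipeline()
print(pipeline.moderate("spam scam hate"))     # -> removed (score 1.0)
print(pipeline.moderate("spam scam message"))  # -> queued_for_review (score ~0.67)
print(pipeline.moderate("a harmless post"))    # -> allowed (score 0.0)
```

In a real deployment the classify step would be a trained model and the queue would feed reviewer tooling, but the layered structure, an automated first pass with human escalation, matches the approach described above.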
What measures does Meta take to prevent the spread of harmful content on its platform in India?
Meta takes several measures to prevent the spread of harmful content on its platform in India. These include enforcing strict community guidelines, employing advanced artificial intelligence algorithms for content moderation, and collaborating with local authorities and organizations to address specific concerns. Additionally, Meta encourages users to report any harmful or inappropriate content they encounter, and dedicated teams review reported content and take immediate action.
What will be the consequences for users posting ‘bad content’ on Facebook and Instagram in India?
Consequences for users posting ‘bad content’ on Facebook and Instagram in India could include account suspension, content removal and legal action.
How does Meta work with local authorities in India to address the issue of ‘bad content’ on their platforms?
Meta works closely with local authorities to address the issue of ‘bad content’ on its platforms by actively collaborating with law enforcement agencies and government bodies in India. Through regular engagement, Meta ensures that local authorities are informed of any content that violates local laws or regulations. This partnership allows for a quick and efficient response to reports of harmful or illegal content, enabling Meta to take appropriate action, such as removing or restricting access to it, in order to maintain a safe and responsible online environment.