(IANS) Meta, the newly-formed parent company of Facebook and its family of apps, on Monday said that it removed over 30 million pieces of content on Facebook and Instagram in India in September, as it faces intense scrutiny over user data privacy.
The social network acted upon 26.9 million pieces of content across 10 policies for Facebook and over 3.2 million pieces of content across 9 policies for Instagram in compliance with the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021, the company said in its monthly report.
“This report will contain details of the content that we have removed proactively using our automated tools and details of user complaints received and action taken,” a Meta spokesperson said in a statement.
“In accordance with the IT Rules, we’ve published our fourth monthly compliance report for the 30-day period of September 1 to September 30,” the spokesperson added.
In September, Meta received 708 reports through the Indian grievance mechanism, and responded to all of those reports. Of these reports, Facebook provided tools for users to resolve their issues in 589 cases.
The social network took action on 33,600 pieces of content related to hate speech and 516,800 pieces of content in the adult nudity and sexual activity category in the country.
Meta also took action on 307,000 pieces of content related to bullying and harassment in India.
“We use a combination of Artificial Intelligence, reports from our community and review by our teams to identify and review content against our policies,” said the Meta spokesperson.
All tech giants have been directed to publish monthly compliance reports under the new IT Rules, 2021.
WhatsApp banned 20.7 lakh accounts in India in August. In the earlier period from June 16 to July 31, WhatsApp had banned 30.2 lakh accounts in India in compliance with the new IT rules.