Facebook took action against 1.62 crore pieces of content in India in November

Leading social media company Meta says it is continuing to act against content that violates its rules. The company said that in November it took action on more than 1 crore 62 lakh pieces of content on Facebook in India, across 13 violation categories. During the same period, its photo-sharing platform Instagram took action against more than 32 lakh pieces of content across 12 categories.

Under the IT rules that came into force earlier this year, large digital platforms with more than 50 lakh users are required to publish monthly compliance reports. These reports detail the action taken on complaints received by the platforms, and also include details of content removed or disabled using automated tools.

In October, Facebook had taken action on more than 1 crore 88 lakh pieces of content across 13 categories, while Instagram acted against more than 30 lakh pieces of content across 12 categories.

In its latest report, Meta said that between November 1 and November 30, Facebook received 519 user reports through its Indian grievance mechanism. Of these, 461 cases were resolved by providing tools to the users.

According to the report, of the more than 1 crore 62 lakh pieces of content Facebook acted on in November, about 1 crore 10 lakh related to spam, 20 lakh to violent content, and 15 lakh to adult nudity and sexual activity. Action was also taken on more than one lakh pieces of hate speech. Other categories included bullying and harassment (102,700 pieces of content) and suicide and self-harm (370,500 pieces of content), and action was also taken on content involving threats to children.

On Instagram, among the 12 categories in which action was taken on more than 32 lakh pieces of content, the largest share related to suicide and self-harm, with 815,800 such pieces acted on. Another 333,400 pieces involving violent and graphic content were removed.
