IT Ministry Urges Social Media to Remove Obscene Content Promptly

The IT Ministry has issued an advisory urging social media platforms to proactively remove obscene content before it is reported, stressing adherence to the IT Rules, 2021. Platforms that do not deploy detection technology face legal risks, including the loss of safe harbour protections.

The Ministry of Electronics and Information Technology has issued a formal advisory directed at companies hosting unlawful and pornographic content, asking them to be stricter in removing such material. Issued on 29 December 2025, the advisory outlines the requirements under the IT Rules, 2021 and warns of severe legal repercussions for non-compliance, including the loss of safe-harbour protections.

What the advisory requires

The advisory makes clear that large platforms, meaning those with a user base of more than 50 lakh, must use automated technology to detect and remove obscene, pornographic, vulgar or paedophilic content. Companies are also asked to urgently review and harden their compliance mechanisms and moderation standards.

Platforms must act swiftly to remove or disable access to unlawful content once they become aware of it through a court order or a proper intimation from an authorised government agency. The IT Rules, 2021 lay down the timelines and procedures intermediaries must follow to remain compliant.

Safe harbour under Section 79

The advisory cautions intermediaries that failure to observe due diligence makes them ineligible for the safe harbour under Section 79 of the IT Act. Failing to take down prohibited content can strip away this exemption, leaving the platform civilly and criminally liable for third-party user content. The government warns that non-compliance could result in prosecution under the IT Act, the Indian Penal Code, and other criminal laws.

Rule 3(2)(b): 24-hour removal window

Under Rule 3(2)(b) of the IT Rules, intermediaries must prevent their platforms from hosting content that prima facie depicts a person in a sexual act or conduct, and must remove such content within 24 hours of a complaint from the affected person or their representative. This speeds up individual redress and strengthens the obligation to act rapidly once a complaint is filed.

The advisory follows recent enforcement action in which the government blocked platforms dealing mainly in erotic content, banning nearly 25 internet-based OTT services operating in India. It also comes after the judiciary raised alarm about the spread of obscene content online, reflecting regulators' growing willingness to use blocking, prosecution and other measures against illegal content.

Large platforms are expected to deploy automated recognition tools specifically to detect and delete inappropriate and adult content. This points toward a policy of machine-assisted moderation, supported by human reviewers for appeals and similar escalations. As part of procedural due diligence, platforms must also provide accessible complaint redressal channels and appoint compliance officers in India.

Challenges and rights considerations for companies and users

Automated removal brings operational challenges: misidentification, overblocking of lawful speech, and inherent limits in recognising manipulated or context-dependent material. Platforms will have to moderate quickly without sacrificing accuracy, take special care to protect privacy and free expression, and maintain robust appeals systems to minimise wrongful takedowns.

Impact on users and the market

Faster takedown timelines will benefit end users seeking safer content, particularly protection against child exploitation. For intermediaries and platforms, the advisory raises legal risks and compliance costs: companies will need to rework their moderation algorithms, invest in local compliance teams, and fine-tune reporting mechanisms to meet the prescribed timelines.

What to expect

Follow-up may include external audits or enforcement action against platforms that cannot demonstrate adequate controls. Policymakers may also pursue statutory amendments or issue further guidance as contentious areas, such as borderline sexual content and the proper scope of automated moderation, become clearer.

The advisory signals an important trend: online decency laws will be strictly enforced, and regulatory scrutiny is increasing. Social media platforms that align their technology, policy and legal practices will be best positioned to comply, protect user rights, and avoid enforcement action.