Updated on: Nov 28, 2025 10:30 pm IST
The Supreme Court’s directive on regulating user-generated content is timely, but the policy must be nuanced enough to filter out harmful content without throttling free speech
When the Supreme Court directed the Centre this week to draft guidelines within four weeks for regulating user-generated content, it acknowledged a problem technology companies have long preferred to ignore. Harmful content, spreading hate, defamation, and the targeting of vulnerable communities, circulates with impunity while platforms hide behind inadequate self-regulation. The apex court observed that there must be accountability for content uploaded on platforms and that existing mechanisms have proven ineffective; it also spoke about the need for an independent oversight body. The same bench ordered comedians who mocked people with disabilities to host fundraisers twice a month, after petitions detailed how India’s Got Latent, a purported comedy-talent show, mocked families desperately raising funds for children with spinal muscular atrophy. A key part of the problem in the recent cases was the format, video and audio, where detection proves most difficult. By the time objectionable material is removed, typically 48 to 72 hours after it is first flagged, it has already gone viral. Automated systems trained predominantly on English-language datasets struggle with context, cultural nuance, and linguistic complexity in other languages.
Studies show that content moderation accuracy drops precipitously for non-English speakers. The India’s Got Latent show crystallises the failure. Crass humour is not illegal, but the show had long flirted with hateful notions before its creators cancelled it in February amid a backlash. The cancellation came after podcaster Ranveer Allahbadia made remarks about parental sex in February, triggering multiple police cases against the show. It was only then that YouTube intervened, and Samay Raina, the host of India’s Got Latent and one of its judges, deleted all its content. The distinction matters: edgy comedy that challenges conventions differs fundamentally from content that normalises harm.
The apex court’s intervention drew understandable concerns about free speech. Senior advocates representing the platforms warned against overreach, noting that existing frameworks are already under challenge in the high courts. The court itself said it seeks regulations that are not meant to “throttle” anybody but to create a “sieve” that filters out harmful content. Here, it is worth recalling India’s constitutional architecture, an approach that rejects absolutist interpretations of free speech. Article 19(2) permits reasonable restrictions on speech because unfettered expression can inflict measurable harm. This is where the Big Tech platforms, more than fame-seeking internet users such as Allahbadia and Raina, must be held accountable.
Big Tech relies on self-regulatory bodies for most content-policing decisions (apart from outright illegality, such as gore or pornography). When self-regulation fails to prevent months of accumulated harm to vulnerable communities, regulatory intervention becomes necessary. The platforms’ inability to moderate audio-visual content in non-English languages at scale is not merely a technical limitation; it reflects prioritisation choices about where to invest resources. The Court’s directive is an overdue call for accountability over these systemic failures. A well-thought-out policy must follow, preferably after the executive considers the views of all stakeholders and after due deliberation in Parliament.
















