The necessity of moderating OTT platforms in India: Sahil Chopra
The new rules will play a vital role in fighting fake news, hate speech, cybercrime, sexual harassment and explicit content, and will make social media a safer place to be.
The beginnings of the public internet can be traced back to the 1980s, and since then it has succeeded in penetrating every branch of our lives.
In contrast to television, print and radio, which comply with the guidelines of the I&B ministry, social media and over-the-top (OTT) platforms had little or no regulation of the content they offer. Anyone, anywhere, can write and post anything instantly, without review or moderation. The few policies that exist operate only after the fact: content is flagged once it has been published and is then subject to review and possible removal. These platforms thus embody post-moderation of content, which in lay language can be described as “publish first, moderate later”.
After realizing that content posted with bad intentions could go viral and reach millions of people before the platforms were able to review and remove it, the OTT operators decided to take corrective steps before it was too late and adopted voluntary self-regulatory codes for the content displayed on their platforms. In January 2019, OTT players like Netflix, Hotstar and Alt Balaji, among others, signed a Code of Best Practices. The Code was created with the sole aim of empowering audiences to make informed decisions about age-appropriate content and protecting their interest in choosing what to view at their own time, will and convenience.
With all these steps, it felt like things were falling into place. But the recent controversy surrounding Tandav tells a different story. The controversies and Twitter trends made the Indian government jump into the fray and announce new rules to regulate social media platforms, OTT service providers and digital content providers in India, known as the “Intermediary Guidelines and Code of Ethics for Digital Media”.
Take a look at the changes, and then the challenges, that the new guidelines bring. First, the changes:
1. The OTT platforms must now classify content into five age-based categories: U (Universal), U/A 7+, U/A 13+, U/A 16+ and A (Adult).
2. The OTT platforms must provide a parental-control mechanism for content rated U/A 13+ or higher, and reliable age-verification mechanisms for content rated “A”, in order to increase safety for children.
3. The social media platforms must remove flagged posts within 36 hours of receiving notification, and encrypted messaging apps must be able to trace the originator of controversial messages. They must also delete flagged sexual content within 24 hours.
4. There will be a three-tier grievance redressal mechanism: the first level is self-regulation by the publishers, the second level is self-regulation by the publishers’ self-regulating bodies, and the third level is an oversight mechanism.
5. The social media intermediaries must comply with the due diligence prescribed in the IT Rules, 2021. If this due diligence is not observed, the “safe harbour” provisions of Section 79 of the IT Act will not apply to the social media intermediary. Safe harbour is the legal protection for intermediaries that host user-generated content: it relieves them of liability for the actions of users on their platform so long as they adhere to the guidelines prescribed by the government.
6. Digital media must adhere to the Press Council of India’s norms of journalistic conduct and the programme code under the Cable Television Networks (Regulation) Act.
7. Companies must appoint an India-based grievance officer responsible for redressing complaints. The officer must decide on any complaint received by the company within 15 days.
8. There will be one or more self-regulating bodies of publishers, each headed by a retired judge of the Supreme Court or a High Court, or an independent eminent person, with no more than six members.
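For platform engineers, rules 1–3 reduce to checkable logic. The sketch below is a minimal, hypothetical model of the age-rating gate and the takedown deadlines; none of the function or parameter names come from the guidelines or from any real platform API.

```python
from datetime import timedelta

# Rule 1: the five age-based categories and the minimum age each implies.
RATING_MIN_AGE = {"U": 0, "U/A 7+": 7, "U/A 13+": 13, "U/A 16+": 16, "A": 18}

def can_view(rating, child_lock_on=False, pin_entered=False, age_verified=False):
    """Rule 2, sketched: 'A' content needs a reliable age-verification pass,
    and content rated U/A 13+ or higher sits behind a parental-control PIN
    whenever the child lock is enabled. All parameter names are hypothetical."""
    if rating == "A" and not age_verified:
        return False
    if RATING_MIN_AGE[rating] >= 13 and child_lock_on and not pin_entered:
        return False
    return True

def takedown_deadline(notified_at, sexual_content=False):
    """Rule 3, sketched: flagged posts must be removed within 36 hours of
    notification; flagged sexual content within 24 hours."""
    return notified_at + timedelta(hours=24 if sexual_content else 36)
```

Under this model, a U/A 16+ title stays locked on a child-locked profile until the PIN is entered, and a post flagged at noon on 1 March must be gone by midnight at the start of 3 March.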
Now, the challenges:
1. According to the new guidelines, social media platforms must identify the originator of any message that the authorities classify as anti-national. This is in direct conflict with freedom of speech and the privacy of citizens, so it will be a challenge for social media platforms to maintain end-to-end encryption and keep user data private.
2. The OTT platforms and digital media companies must put in place a three-tier mechanism for resolving complaints. The third and top level of this structure will consist of representatives from various ministries and government departments, giving bureaucrats the power to censor and block content and thereby limiting the creative freedom of content curators.
3. The new guidelines also extend to digital news media, bringing with them challenges to freedom of the press. An inter-ministerial bureaucratic committee will examine what can and cannot be published on a media platform, putting digital media under tight government scrutiny and affecting free and unhindered reporting.
4. The “Code of Ethics” for OTT streaming platforms places the content broadcast on them under government scrutiny, creating regulatory oversight that directly affects the nature and quality of the content created and viewed.
5. The next challenge is the kind of tools social media platforms will use under the new rules to filter out objectionable content. Such technology-based tools and automated filters can suffer from accuracy issues that lead to function creep. Furthermore, there is no accountability or transparency in such forms of surveillance, which will affect users’ freedom of speech and expression.
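To make the accuracy concern concrete, here is a deliberately naive keyword filter of the kind the last point warns about; the blocklist and names are invented for illustration. Because it matches words with no sense of context, it flags harmless figurative speech just as readily as genuinely objectionable posts.

```python
import re

BLOCKLIST = {"bomb", "attack"}  # hypothetical blocked terms

def flagged_terms(post):
    """Return any blocklisted words found in the post, ignoring context."""
    words = set(re.findall(r"[a-z']+", post.lower()))
    return words & BLOCKLIST

# A film review tripped up by a figurative use of 'bomb':
flagged_terms("Critics predict the movie will bomb at the box office")
# An ordinary news post passes:
flagged_terms("The new guidelines were announced today")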
The new guidelines have been welcomed by almost all digital media agencies and companies. It is believed that new rules will play a vital role in fighting fake news, hate speech, cybercrime, sexual harassment and explicit content, and make social media a safer place. However, there would be some challenges to their implementation given a very thin line between privacy and user security. However, going from post-review moderation to pre-review moderation can certainly bring some changes to our digital universe. Let’s see how these changes are welcomed by the nation.
[This is an authored article by Sahil Chopra, Founder & CEO of iCubesWire. All views, opinions and expressions are personal and limited to the author.]