The Amendment to the Information Technology Rules is not intended to suppress criticism of the Government: Central Government to the Bombay High Court

Kunal Kamra vs. State of Maharashtra and Ors.

CORAM: Justices GS Patel and Neela Gokhale

Background of the case

The Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Amendment Rules, 2023, which give the Union government the authority to establish a distinct unit to fact-check digital media content and to order its removal if found to be “fake, false, or misleading,” were challenged by comedian Kunal Kamra in a petition filed before the Bombay High Court on April 6, 2023. The petition challenged the provision on the grounds that it violates Section 79 of the Information Technology Act, 2000, and Articles 14, 19(1)(a), and 19(1)(g) of the Constitution.

As a political satirist, Kamra asserted that he is compelled to comment on the conduct of the Union government and its functionaries, and that he relies on the reach of internet and social media platforms to disseminate his work. His capacity for political parody would be unjustly and excessively restricted if his content were subjected to a plainly subjective, hand-picked “fact check” by a unit chosen by the Union government. Satire, he argues, does not lend itself to such fact-checking by its very character: if political comedy were examined by the Union government and banned for being “fake, false, or misleading,” the point of the genre would be destroyed entirely.

Kamra argues that the ambiguity of the phrases “in respect of any business” and “reasonable efforts” will create a chilling effect, wherein intermediaries simply remove any content flagged by the Union government’s fact-checking unit rather than risk losing safe harbour.

Recent Developments in the Case

During the hearing of the petitions contesting the 2023 amendments to the Information Technology Rules, the Central government informed the Bombay High Court on Tuesday that humour or satire of any kind directed at any governing body is always acceptable, so long as it is not offensive and does not contain obscenity.

Solicitor General Tushar Mehta, appearing for the Ministry of Electronics and Information Technology (MeitY), stated that the Rules are simply meant to control false news and do not forbid any expression of opinion or criticism against the government.

“Whether we like it or not, any sarcasm or humour directed towards the political establishment has nothing to do with this rule. Unless the humour crosses lines with inappropriate content like abuse or pornography. In any form, humour satire is always welcome. It can’t be forbidden. The administration is solely concerned about the spread of false information and the anonymity of the media. There is not even the slightest chance that any humour or satire would fall under this law.”

The Rules call for the creation of fact-checking units (FCUs), and the petitioners have specifically challenged the power Rule 3 confers on FCUs to identify and tag what they deem to be “false or fake online news” in relation to government business.

The Solicitor General outlined how the fundamental rights of five different stakeholders (the internet user, the intermediary, the recipient, the government, and the general public) were taken into account when drafting the Rules. He also made clear that the 2023 Rules make nothing illegal and contain no penal provisions: they only govern content and settle disagreements between the content provider and the aggrieved party.

According to him, the intermediary has three alternatives when an FCU flags content: remove it; retain it but append a statement that it has been flagged; or disregard the FCU’s communication altogether. The court questioned the need for the amendment if the government was not going to oblige intermediaries to abide by the FCU’s instructions.

On the one hand there is a wider public interest; on the other, a statute with confusing language. Can the court read in qualifiers that do not actually exist? How can the word “shall” in Rule 3 be given a different colour in order to preserve the safe harbour clause, and how can “information” be read to mean “fact”? Is it legal to interpret the provisions in this way? The court posed these queries as the session drew to a close. The government’s position was that if a person is aggrieved by content, they may take the intermediary to court, which would ultimately determine whether or not the content was true, and that the amendment was necessary because it would prevent intermediaries from invoking the “safe harbour” defence under Section 79 of the Information Technology Act to avoid liability.

The recent clarification given by the Central Government to the Bombay High Court is a significant reassurance. However, there is still cause for concern due to the Rules’ vagueness, especially with regard to the operation of the fact-checking units and their potential chilling effect on intermediaries. Our legal system must strike a fair and just balance as this case progresses, protecting both the freedom to criticise the government and the need to combat the dissemination of false information. This case illustrates the importance of ensuring the appropriate and accountable application of democratic values, and the free expression they entail, even in the digital era.

“PRIME LEGAL is a full-service law firm that has won a National Award and has more than 20 years of experience in an array of sectors and practice areas. Prime legal fall into a category of best law firm, best lawyer, best family lawyer, best divorce lawyer, best divorce law firm, best criminal lawyer, best criminal law firm, best consumer lawyer, best civil lawyer.”

Written by: Shivanshi Singh

Social Media and Freedom of Speech: The Legal Boundaries in India

Introduction:

Social media platforms have become integral to modern communication, enabling people to express their opinions and engage in public discourse. However, the intersection of social media and freedom of speech raises important legal considerations. This article explores the legal boundaries of free speech on social media platforms, the challenges of content moderation, and the delicate balance between freedom of expression and harmful speech in India.

Freedom of Speech in India:

 India, as a democratic country, values freedom of speech as a fundamental right enshrined in the Constitution. Article 19(1)(a) guarantees the right to freedom of speech and expression, subject to reasonable restrictions for the protection of public order, decency, and morality.

Legal Boundaries on Social Media:

While individuals enjoy the right to express their opinions on social media, certain restrictions exist within the legal framework. The Information Technology Act, 2000, and its subsequent amendments regulate online activities, including social media platforms.

The legal boundaries on social media platforms involve a combination of legislation, regulations, and court interpretations. Here are some key aspects that define the legal boundaries on social media:

  1. Information Technology Act, 2000, and its Amendments:

 The Information Technology Act (IT Act) in India governs various aspects of online activities, including those related to social media. Some key legal provisions that set these boundaries include:

  1. Section 69A: Under this section, the government has the power to block online content that threatens national security or public order, or incites violence. It empowers the government to issue directions to block access to specific content or websites.
  2. Section 79: This section deals with the liability of intermediaries, including social media platforms. It requires intermediaries to observe due diligence and promptly remove or disable access to illegal content upon receiving notice from the appropriate authorities.
  3. Section 505(2) of the Indian Penal Code: Sharing or spreading content that promotes hatred, enmity, or ill-will among religious or social groups is punishable under this section.

  2. Intermediary Guidelines and Digital Media Ethics Code (2021):

 The Intermediary Guidelines and Digital Media Ethics Code were introduced by the government of India in 2021 to regulate social media intermediaries and digital media platforms. These guidelines impose certain obligations on intermediaries, including social media platforms, such as:

  1. Appointment of a Chief Compliance Officer, Grievance Redressal Officer, and Nodal Contact Person.
  2. Implementation of a robust content moderation mechanism, including the removal of specific categories of prohibited content within 36 hours of receiving a court order or notification from appropriate authorities.
  3. Establishment of a grievance redressal mechanism to address user complaints within a specified timeframe.

Court Interpretations and Precedents:

Indian courts have played a significant role in defining the legal boundaries on social media through their interpretations and judgments. Some key principles established by courts include:

  1. Balancing Fundamental Rights: Courts strive to strike a balance between freedom of speech and other fundamental rights such as the right to privacy, reputation, and public order. They often weigh the context, intent, and potential harm caused by online speech while evaluating its legality.
  2. Clarity and Proportionality: Courts emphasize the need for clear and specific legal provisions to restrict online speech. They also stress that restrictions must be proportionate to the harm sought to be prevented and should not be overly broad or vague.
  3. Due Diligence and Content Moderation: Courts have recognized the responsibility of social media platforms to implement effective content moderation mechanisms. Platforms are expected to have clear policies, guidelines, and mechanisms to remove or disable access to illegal or harmful content.

It’s important to note that the legal boundaries on social media platforms are subject to ongoing debates, legislative changes, and judicial interventions. As technology evolves and new challenges emerge, the legal landscape continues to develop and adapt to address the complexities of social media and freedom of speech.

Challenges of Content Moderation:

Content moderation on social media platforms poses several challenges due to the scale, diversity, and constantly evolving nature of user-generated content.

Here are some key challenges faced in content moderation:

  1. Volume and Speed: Social media platforms receive an enormous volume of user-generated content every second. Moderating this massive amount of content within a short time frame is a significant challenge. The speed at which content is shared and spread on social media requires platforms to employ efficient systems and technologies to identify and moderate harmful or objectionable content promptly.
  2. Contextual Nuances: Content moderation becomes challenging due to the need to understand and interpret the contextual nuances of user-generated content. Differentiating between legitimate expression and harmful speech requires considering the specific context, intent, cultural factors, and local sensitivities. The lack of contextual understanding can lead to the misinterpretation or wrongful removal of content.
  3. Varying Legal and Cultural Standards: Social media platforms operate globally, but legal standards and cultural norms differ across jurisdictions. Compliance with diverse legal frameworks and striking the right balance between different cultural sensibilities poses a challenge. Platforms must navigate these variations to ensure consistent and fair content moderation practices.
  4. Automation and Human Review: To cope with the volume of content, platforms often rely on automated systems for content moderation. However, relying solely on automation can lead to errors, false positives, and removal of legitimate content. Striking the right balance between automated systems and human review is crucial to ensure accurate and context-sensitive content moderation.

Recent Legal Developments:

In recent years, India has taken steps to regulate social media platforms and address concerns related to harmful speech. Notable developments include:

  1. Intermediary Guidelines and Digital Media Ethics Code (2021): These guidelines introduce obligations for social media intermediaries, including appointing a Chief Compliance Officer and implementing a grievance redressal mechanism.
  2. Court Decisions: Indian courts have intervened in cases involving freedom of speech on social media platforms, aiming to strike a balance between free expression and protecting individuals from harm.

Balancing Freedom of Expression and Harmful Speech:

Balancing freedom of expression and the need to combat harmful speech is crucial. Striking the right balance involves creating transparent content moderation policies, considering contextual factors while assessing content, and promoting public awareness about responsible digital citizenship.

Here are some key considerations in balancing freedom of expression and harmful speech:

  1. Legal Framework and Restrictions: Freedom of expression is not an absolute right and is subject to reasonable restrictions. Legal frameworks, including national constitutions, human rights conventions, and local laws, define the boundaries within which freedom of expression operates. These restrictions are typically aimed at protecting public order, national security, the rights and reputations of others, and preventing harm. Identifying and enforcing these restrictions effectively is essential to maintain the delicate balance between free expression and preventing harm.
  2. Clear Definitions and Standards: One of the challenges in balancing freedom of expression and harmful speech lies in defining and interpreting what constitutes harmful content. Vague or ambiguous definitions can lead to inconsistencies and subjective decision-making in content moderation. It is crucial to establish clear standards and guidelines that provide clarity on the types of speech that are considered harmful and warrant intervention.
  3. Contextual Evaluation: Context plays a vital role in determining the potential harm caused by speech. The intention behind the speech, its social and cultural context, and the potential impact on individuals or marginalized groups must be carefully considered. Contextual evaluation helps distinguish between legitimate expressions of opinion and speech that incites violence, promotes hate, or targets individuals or communities. Platforms need to develop sophisticated moderation mechanisms that account for contextual factors while assessing and handling content.
  4. Proportionality and Consistency: Ensuring proportionate responses to harmful speech is crucial. The severity of the harm caused, the intent of the speaker, and the potential impact on society should be taken into account when determining appropriate measures. Responses to harmful speech should be consistent and applied uniformly to avoid allegations of bias or unfair treatment.

Finding the right balance between freedom of expression and harmful speech on social media platforms is an ongoing challenge. It requires continuous evaluation, refinement of policies and practices, and an understanding that the digital landscape is constantly evolving. By incorporating legal frameworks, clear definitions and contextual evaluation, a more inclusive and responsible digital environment can be fostered.

Case Laws

Some of the case laws that provide important legal precedents and interpretations in the context of social media and freedom of speech in India include:

  1. Shreya Singhal v. Union of India (2015): In this landmark case, the Supreme Court struck down Section 66A of the Information Technology Act as unconstitutional, holding that its vague and overbroad language had a chilling effect on free speech. The court held that online speech could be restricted only on the grounds permitted by Article 19(2), such as incitement to violence or a threat to public order, and emphasized the importance of striking a balance between free speech and maintaining public order.
  2. Kamlesh Vaswani v. Union of India (2015): This case dealt with the issue of blocking websites hosting objectionable content, particularly child pornography. The Supreme Court held that intermediaries like social media platforms have a responsibility to proactively identify and block access to such content to protect children from exploitation.
  3. Faheema Shirin R.K. v. State of Kerala (2019): The Kerala High Court ruled in this case that the freedom of choice and expression of an individual cannot be curtailed merely based on objections raised by others on social media. It emphasized the importance of allowing individuals to express their opinions freely without fear of retaliation or censorship.
  4. Maheshwari v. Union of India (2020): This case involved a plea seeking quashing of an FIR filed against a social media user for allegedly posting objectionable content. The Supreme Court emphasized that social media users cannot be held liable for the mere forwarding or sharing of content unless there is a clear intent to promote hate speech or incite violence.

Conclusion:

Social media platforms have transformed the way people communicate, making freedom of speech a critical issue in the digital age. While India recognizes freedom of speech as a fundamental right, it also imposes reasonable restrictions to protect public order and harmony. The legal boundaries of free speech on social media platforms must be navigated carefully, ensuring a balance between the right to express opinions and the prevention of harmful speech. Ongoing discussions, regulatory developments, and judicial interventions contribute to shaping the legal landscape and finding this delicate balance in India.

References

Case Laws

  1. Shreya Singhal v. Union of India, (2015) 5 SCC 1
  2. Indian National Congress (I) v. Union of India, (2014) 16 SCC 1
  3. Kamlesh Vaswani v. Union of India, (2015) 2 SCC 701
  4. Faheema Shirin R.K. v. State of Kerala, (2019) SCC Online Ker 529
  5. Maheshwari v. Union of India, (2020) SCC Online SC 1223

Statutes

  • Information Technology Act, 2000

ARTICLE WRITTEN BY AMIT ARAVIND