Breaking News: Access to the Freedom of Information and Expression in the Digital Economy in BRICS Countries

Gunnar Larson g at
Thu Nov 24 10:22:50 PST 2022



Digital Economy Blog Series: Part 6

   - Dhruv Somayajula
   - Jai Vipra


This is the last post in our series studying access to the digital economy
in BRICS countries. In previous posts, we have studied the access to the
internet, the access to decent work, the access to do digital business, the
access to consumer rights, and the access to welfare. In keeping with our
attempt to broaden the understanding of access to include access for people
in all their roles, in this post we examine the access to the freedom of
information and expression in the digital economy in BRICS countries.

Free access to information and expression are both independently important
for a healthy digital economy. In addition, both these types of access are
interdependent. Without free access to information, the right to freedom of
expression is ineffectual. Article 19 of the Universal Declaration of Human
Rights states:

“Everyone has the right to freedom of opinion and expression; this right
includes freedom to hold opinions without interference and to seek, receive
and impart information and ideas through any media and regardless of
frontiers.”

This right contains the right to seek and receive information regardless of
frontiers as part of the right to freedom of expression.
Freedom of access to information is impeded by both state and non-state
actors, including through restrictive intellectual property regimes,
vigilante-led intimidation, the prevalence of illiteracy, and overt state
censorship. Thus, we examine some impediments to free access to information
along with expression in BRICS countries.

The notion of intellectual property (IP) has its roots in censorship. Scott
(2001) has shown that early examples of patents include patents granted to
printing presses that would print only approved books. Copyright law, along
with the custom of including the printer’s and author’s name in a book, was
developed to assign accountability under laws against heresy, sedition and
treason.

Mainstream economic analysis too recognises that IP rights often create
inefficiency. Boldrin and Levine’s (2002) seminal work shows how current IP
regimes confuse the property rights over objects embedded with ideas with
the property rights over ideas themselves. That is, current IP regimes
allow for control over the use
of ideas after the sale of an object, creating an information monopoly
ostensibly to reward innovation. This strand of enquiry recognises that IP
rights are a government-enforced monopoly on the use and spread of ideas.

IP regimes in their implementation exhibit a clear Global North and Global
South divide. Global South countries, particularly BRICS countries, have
been at the forefront of defending public healthcare against IP rights. The
most recent example of such action has been the demand by India, South
Africa and other southern countries for a temporary waiver on patents for
Covid-19 vaccines, so as to increase access. There was a clean divide
between North and South countries at the WTO over this issue, with northern
countries arguing for the continuation of patents. The same divide was
evident in the debate on IP rights on genetic resources, with corporations
and research institutions from northern countries patenting biological and
traditional knowledge from southern countries on the basis of the claim
that they discovered this knowledge. In general, empirical work
demonstrates that the adoption of IPRs in developing countries is due to
external pressure rather than domestic needs, or that developing countries
are policy takers in the domain of IPR.

Another significant impediment to access to digital information is the lack
of accessibility for persons with disabilities. Further analysis on this
impediment can be found in Vidhi’s other research, including research on
making cities accessible by design and an upcoming project on digital
accessibility for persons with disabilities.

The right to freedom of expression can also be restricted directly,
including through liability on intermediaries like social media companies.
Intermediary liability describes the extent to which intermediaries are
liable for content created by their users. An unreasonably high level of
intermediary liability incentivises intermediaries to delete or disallow
content that could potentially be illegal, hampering free speech. An
unreasonably low level of intermediary liability can allow harassment and
hate speech to flourish, chilling free speech. We analyse the different
levels of intermediary liability chosen by BRICS countries, differentiating
by the type of intermediary. We also briefly analyse content moderation
laws and rules in BRICS countries and how they work to either stifle speech
or protect people from online harms.

Part I of our post broadly examines the access to freedom of information in
the digital economy in BRICS countries in the context of digital
information, through the lens of copyright laws. Part II of our post
broadly examines the access to freedom of expression in the digital economy
in BRICS countries.
Part I: Access to freedom of information

Brazil

The Law on Copyright and Neighbouring Rights, 1998 (Brazil Copyright Law or
BCL) regulates copyright law in
Brazil. Brazil’s laws on copyright infringement create a very restrictive
framework for the digital sharing of information. Under the BCL, a
copyrighted work may not be reproduced without the express consent of the
author of the work, save for a few limited exceptions.

The BCL protects the author’s economic rights first and foremost. Under
Section 29, any form of reproduction, translation, distribution on the
internet, indirect use in broadcasting or storage on a computer system must
have the express consent of the
author. These provisions set out a high standard of copyright protection
for various use cases. Additionally, reproducing a work with an intent to
explain or comment on it without taking permission from the author,
generally permitted under the doctrine of fair dealing, is expressly
prohibited under Section 33 of the BCL. Under the BCL, any annotations or
commentaries must be published separately. This distinguishes the BCL from
more liberal copyright jurisdictions such as India, which permit reviews,
comments or explanations as part of fair dealing of copyrighted works for
non-commercial purposes.

Brazil’s laws provide safe harbour provisions and a clearly defined
intermediary liability framework. Article 18 of the *Marco Civil da
Internet*, 2014 (the Civil Rights Framework or CRF) offers
protections to any intermediary for any damages resulting from actions of
third parties. The intermediary is only liable when a court order directing
the intermediary to block access to the infringing content has not been
carried out. However, Article 19(2) of the CRF does note that intermediary
liability and takedown of content relating to copyright offences must be
balanced against the freedom of expression provided in Brazil’s
constitution. In 2015, Brazil’s Superior Court of Justice held that Orkut
could not be held liable for its users sharing links that violate copyright
laws. Applying the doctrine of contributory negligence, the court ruled
that Orkut did not provide any infrastructure to download content or share
files. Therefore, it merely acted as a platform and did not play a part in
its users’ actions. This robust approach towards offering safe harbour
protection to intermediaries is part of a series of verdicts protecting
platforms from liability, balancing the need to protect the right to
expression against the right to privacy. In a landmark decision in 2018,
the Court held that the CRF requires intermediaries to take down content
specified in a court
order. It required Facebook to provide a URL locator to link the specific
content to be taken down per the court order, failing which, Facebook could
be personally held liable for continuing to host the content.

Russia

Russia follows continental European law on copyrights, as opposed to the
common law form of copyright law followed in England and other common law
countries. As its provisions show, Russian jurisprudence permits the
digital sharing of information, though such sharing under copyright law is
moderately restricted, with a textual approach to copyright as opposed to a
doctrinal approach that would allow greater flexibility. Recent
introductions of intermediary
liability set out stringent penalties for enabling digital sharing of
information violative of Russian copyright law.

Under Article 15.2 of the Federal Law on Information, Information
Technologies and the Protection of Information, 2006 (Russia IT Law), a
copyright holder may approach the *Roskomnadzor* (the Russian Federal
Service for Supervision of
Communications, Information Technology and Mass Media) notifying the
copyright infringement by any service provider on the Internet illegally
offering access to the copyrighted work. Within 3 days of providing this
notice, the *Roskomnadzor* shall send a notice to the entity hosting the
copyrighted work, informing them of the infringement and directing them to
block access to the infringing content. If the hosting service provider or
the owner of the site refuses to block the content within the given
timelines, the internet service provider is obliged to do the same within
24 hours of receiving a notice from the *Roskomnadzor*. The copyright
holder may also directly approach the owner of the website seeking takedown
of the copyrighted material, who is
obliged to take down the infringing content within 24 hours, under Article
15.7 of the Russia IT Law.

Part IV of the Civil Code of the Russian Federation (CCRF) provides for
intellectual property regulations. This code defines copyrights and the
rights of authors, while simultaneously providing for limited cases of fair
dealing. Section 1274 of the CCRF sets out the conditions under which
copyrighted information can be distributed freely
for informational, scientific, educational or cultural purposes. This
provision also covers the adaptation of copyrighted works to allow persons
with disabilities to access the work. Significantly, this allows the fair
dealing of information for distribution in terms of parodies or
caricatures, with court decisions recognising parody or satire as exempt
from copyright. Fair dealing exceptions under Russian copyright law are
specified in the legal text and, as such, are limited in their scope,
unlike the doctrine of fair dealing followed in some other jurisdictions.

Section 1299 of the CCRF protects the concept of digital rights management
(DRM), wherein the copyright holder
may use technological measures to prevent or restrict the sharing or
storage of copyright works without the permission of the owner. While
violations of DRM are not permitted, this provision accommodates the idea
of fair dealing to an extent. Users seeking to remove the DRM from a
copyrighted work for uses covered under fair dealing clauses may require
the author to lift the DRM. This is not mandatory in all cases, and the
author is only required to lift the DRM for fair dealing cases if it is
technologically possible and does not require significant costs.

Under the Russian Anti-Piracy Law, 2013, courts are authorised to pass
preliminary interim orders protecting the rights of copyright holders of
films and television works from being circulated on the internet or other
telecommunication networks. These provisions are similar to the Digital
Millennium Copyright Act, 1998 applicable in the US. Article 15 allows
copyright holders to seek take-down orders from courts if copyrighted works
are distributed on the Internet or other telecommunication networks in
violation of the exclusive rights held by the copyright holders.
Additionally, this law sets out the liability of intermediaries involved in
carrying such works, and provides safe harbour provisions based on their
level of involvement and response to take-down requests from the copyright
holders. However, the Anti-Piracy Law has prescribed stringent penalties
for intermediaries that violate copyright laws by carrying or distributing
such content illegally, including the power to order the relevant hosting
service to permanently block the intermediary’s site on the internet.

India

The Copyright Act, 1957 (Copyright Act India or CAI) regulates the
protection of copyrights in India. The CAI has been amended six times since
it was notified, with the latest major amendments in 2012. Section 52 of the
CAI provides for a wide range of fair dealing exceptions to copyright
infringement, including review, commentary, educational and research uses,
increasing accessibility, making three-dimensional objects from a
two-dimensional artwork, and allowing reverse engineering of technical devices.
These instances allow various levels of access to copyrighted works for
research, reporting or criticism. In 1995, the Delhi High Court interpreted
Section 52 as a tool that protects
the freedom of expression guaranteed under the Indian Constitution. In
Section 65A, the CAI prohibits circumvention of technological protection
measures on copyrighted works. However, the provision excludes any
circumvention for the purposes covered under fair dealing under the CAI.

In 2016, the Delhi High Court delivered a landmark verdict regarding the
law of fair dealing in *The Chancellor, Masters & Scholars of the
University of Oxford v. Rameshwari Photocopy Services*. The appellants were
a consortium of major international publishing houses seeking a permanent
injunction on the defendant’s photocopy services that reproduced, printed,
sold and distributed the appellants’ publications as course packs in the
University of Delhi. Under Section 52(1)(i) of the CAI, the reproduction of
any copyrighted work by a pupil in the course of instruction was permitted
as fair dealing. Keeping this in mind, a single-judge bench ruled in favour
of the defendants, and refused to grant a permanent injunction. While a
division bench has set aside this ruling to allow the parties to appeal,
the parties in this case have since chosen to withdraw and not to pursue
this case anymore.

Section 52(1)(c) permits intermediaries to host or store copies of
copyrighted works, provided that the intermediary is not aware that it is
an infringing copy. However, this awareness must be actual knowledge
regarding the specific infringing content. In 2016, the Delhi High Court
upheld the safe harbour provisions
enjoyed by intermediaries under the Information Technology Act, 2000, in
the case *MySpace v. Super Cassettes Industries Ltd*. The court held that
unless intermediaries have actual knowledge, they cannot be held liable for
merely providing a social media website that was subsequently used by its
users to share infringing content. Under the CAI, the owner of the
copyright is required to provide this knowledge to the intermediaries
through a written complaint. Upon receiving a written complaint on the
infringement, the intermediary must block access for a maximum of
twenty-one days while the owner may obtain a court
order confirming the blocked access. If the intermediary does not obtain
the court order, access to the content can be permitted by the intermediary.

China

The Copyright Law of the People’s Republic of China, 1990 (Copyright Law of
China or CLC) sets out the protections for copyrighted works. The CLC was
first notified in 1990, with major
amendments in 2001, 2010 and most recently in 2020. The latest amendments
to the CLC, effective from June 2021, expand the scope of copyrightable
works, reshape the scope of fair dealing, and drastically increase the
penal consequences of copyright violations.

Article 3 of the CLC defines works as original expressions of intellectual
achievements in certain fields such as written, musical, photographic or
graphic works, computer software etc. Previously an exhaustive list, the
definition of works has been expanded by the latest amendment to the CLC to
include ‘other intellectual achievements conforming to the characteristics
of works’. This expansion acknowledges that advances in technology allow
for developing intellectual property works in myriad ways, and avoids
hard-coding certain forms of works as ‘protected works’.

Article 24 of the CLC allows the use of copyrighted works without obtaining
permission from the author in certain circumstances. Unlike Brazil’s
copyright law, the CLC allows fair dealing of copyrighted works as long as
the copyrighted work being commented on has been appropriately cited.
Additionally, changes to the fair dealing exceptions allow dissemination of
copyrighted works to persons with print disabilities such as visual
impairments. The copyright holder’s rights have also been strengthened
under the 2021 amendments, requiring that the exceptions for fair dealing
must not in any way affect the normal use or legitimate rights of the
copyright holder.

Like the Russian CCRF, the CLC recognises the concept of DRM as a
legitimate means to protect copyrighted works by rights holders. Unlike the
Russian copyright law which requires permission from the author for fair
dealing exceptions to DRMs, the CLC expressly permits unilateral
circumvention of DRMs in certain limited cases. This includes the use of
published works for classroom teaching or scientific research, providing
published works to print-disabled individuals without making profits,
testing of security standards or encryptions in computers, and research on
reverse engineering computer software.

The Regulations on Protecting Right to Dissemination via Information
Networks, 2006 were amended in 2013 to clarify
the role of intermediaries in copyright protection. Under Article 22, an
intermediary or a network service provider providing or allowing storage of
works or audio-visual recordings would not be liable if it was not aware of
any infringement of copyright, did not gain any
economic benefit from it, and deleted the works or recordings upon
receiving a notice from the rights holder. Article 23 reflects similar safe
harbour provisions for any network service provider that allows searching
or provides links to any URLs linking to any works being disseminated in a
manner that infringes the holder’s rights. However, the intermediary would
be jointly liable if it was aware of the infringement and continued
permitting the infringing work to be hosted and available.
South Africa

The Copyright Act, 1978 (South Africa Copyright Act or SACA) sets out the
copyright law in South Africa. The SACA was amended in 2013 to introduce
provisions that protect any form of works recognised as
having indigenous origin by South Africa’s indigenous communities. The
Amendment Act of 2013 creates a National Trust Fund for Indigenous
Knowledge and a National Database for Indigenous Knowledge to further
protect these works.

Section 12 of the SACA prescribes exceptions to the general protection of
copyrighted works under the fair dealing doctrine. The practices permitted
are materially similar to the provisions under Indian law, with a few
distinctions. For example, while the use of a literary work for teaching
purposes is permitted, the law is not as liberally worded as India’s
provisions. In 2016, the Pretoria High Court decided a landmark case on the
interpretation of fair dealing in news reporting under the SACA, in the
case of *Moneyweb v Media24*. In this case, the appellants alleged that
seven articles published by the defendants were copies of their original
work. In the case, the Court ruled that only three of the seven articles
published by the appellants were original, and warranted copyright
protection. Of those three, the Court held that the portions used by the
defendants in two articles were not substantial portions, and would
therefore not be considered an infringement of copyright of the original
articles. The last article was copied verbatim, requiring a consideration
of fair dealing as a defence. The Court ruled that, for the infringing
article, the defendants had created a substitute of the original: it was
published within seven hours of the original article and reproduced it
verbatim. This ruling marks the beginning of fair dealing
jurisprudence in South Africa, and may be highly relevant for future cases
in discussing the ‘fairness’ of an infringement.

The ‘fair dealing’ provision in the SACA is silent on adaptations made for
copyrighted works to be accessible for the differently abled. In 2021,
this was challenged in the Pretoria High Court by Blind SA, an advocacy
group, in *Blind SA v Ministry of Trade, Industry and Commerce*. Blind SA
argued that the SACA was unconstitutional in its current form as it
violated the rights of
those suffering from print or visual impairments by barring or severely
limiting parties from adapting work into more accessible formats, including
digital formats such as screen readers or audio descriptors for broadcasted
works. It argued that
less than 0.5% of all published works are available in accessible formats
like Braille. In this case, the High Court held the provisions of the SACA
that bar adaptations for the differently abled as unconstitutional. In
September 2022, the Constitutional Court of South Africa upheld the High
Court’s verdict, but
suspended the verdict for 24 months to allow the Parliament to cure this
defect. In the interim, the Constitutional Court read in
definitions and a new provision allowing persons with print or visual
impairments to immediately be able to make accessible adaptations without
being prosecuted for copyright infringement. This landmark verdict provides
an example of the urgency needed for approaching questions of access to
information and education.

The South Africa Copyright Amendment Bill, 2017 (SACAB) attempts a major
overhaul of the SACA, in line with the principles of equality and
protection to the marginalised under the South African Bill of Rights. The
SACAB introduces provisions permitting adaptations that allow increased
accessibility for persons with print or visual impairments, paving the way
for South
Africa to sign and ratify the Marrakesh Treaty to Facilitate Access to
Published Works for Persons who are Blind, Visually Impaired or Otherwise
Print Disabled, 2013, and to formalise the Constitutional Court’s
ruling in *Blind
SA*. Additionally, the SACAB introduces exceptions to copyright
infringement in terms of educational materials and resources, by permitting
course packs to be reproduced in the course of instruction. This clause
aims to provide widespread access to affordable education for marginalised
groups in South Africa. Parallels can
be drawn between this clause and the ruling in the DU photocopy case by the
Delhi High Court in India, which cited similar reasons for its verdict
holding that course packs published in the course of instruction did not
amount to copyright infringement. In September 2022, the SACAB was passed
by the National Assembly (the lower house of the SA Parliament), and has now
been sent to the National Council of Provinces.
Part II: Access to freedom of expression

Brazil

In Brazil, the CRF offers a regulatory framework for conduct on the
internet, setting out rights of users and obligations of intermediaries. As
discussed above, the CRF describes the parameters of intermediary liability
and offers safe harbour clauses for takedown of content when the
intermediary receives a court order. The CRF also protects victims of
revenge pornography, or other unauthorised publication of personal images.
Article 21 of the CRF states that the intermediary platform shall be held
liable if it fails to block access or
take down any images, videos or other material containing nudity or sexual
acts, after it has so been notified by the person in question. In addition
to this existing internet regulatory framework, Brazil has had several
court decisions and key legislations proposed in recent years.

Brazil’s courts have taken an approach favouring freedom of speech and
expression online, particularly from the vantage points of journalistic
rights and public interest. In the case of Fernando Capez v. Juca Kfouri,
the appellant, an elected representative, had approached the court stating
that the defendant, a sports journalist, had written various blogs that
harmed his reputation and honour. The appellant requested the court to
order the defendant to abstain from writing any further offensive pieces.
While this prayer was granted by the trial court, it was overturned by the
São Paulo Court of Appeals, which reasoned that ordering the defendant
to abstain from future posts that may hypothetically be offensive would
amount to censorship.

Recent court decisions have expanded the right to freedom of expression in
response to issues raised by digital content. In the case of Manoel Conde
Neto v. Folha de São Paulo, the appellant, a politician, had obtained a
takedown order against a popular newspaper regarding an article published
by the latter. The São Paulo Court of Appeals reversed the injunction,
holding that suppressing the publication of true and socially relevant news
stories could not be justified, and was akin to censorship. In 2021, the
case of
Nelson Curi et al v. Globo Comunicação e Participações brought forward the
issue of the right to be forgotten against the public’s right to know. The
appellants requested takedown of images of their relative’s murder in 1958,
broadcast by the defendants in a television program. The court declined to
issue the takedown order, stating that the relevant program was of public
interest
and could not be suppressed under the right to be forgotten or the broader
right to privacy.

However, the recent elections in Brazil have brought forward interesting
developments regarding content takedown and misinformation. In October
2022, Brazil’s Supreme Electoral Court approved a resolution empowering
itself to unilaterally order tech companies to take down online posts,
videos and attack advertisements that spread misinformation. Companies that
fail to comply with these immediate takedown orders (issued where the
content has already been declared as misinformation, but has been published
through a new channel or account) within two hours risk suspension of their
services across Brazil. These checks have been defended as attempts to
reduce the damage caused by the virality of misinformation. However,
critics have noted that this dramatic expansion of the electoral court’s
powers sets a concerning

Brazil has also introduced two legislative measures that may redefine this
space. On September 9, 2021, the president of Brazil signed off on new
provisions to the CRF. Under these provisions, intermediaries were
restricted from acting against accounts that violated their terms of
service, and were only authorised to act against any account or digital
content in limited cases, including violations of intellectual property
rights or in response to a court order. However, these new provisions have
been rejected by the National Congress of Brazil, and the changes
introduced to the CRF under
the measure have been nullified.

Bill 2630/2020 (formally titled the Brazilian Law of Freedom,
Responsibility and Transparency on the Internet, and informally known as
the ‘Fake News Bill’ or ‘FNB’) seeks to curb misinformation on the
internet, by imposing a requirement on digital platforms to engage in
content moderation. Applicable to social media and messaging platforms
servicing over two million users in Brazil, Article 6 of the FNB proposes
that platforms must curb the creation of fake accounts (including bot accounts)
and highlight sponsored content that the platform has received money to
promote. Article 7 of the FNB requires users to authenticate their
accounts. Citing the right to access information and freedom of expression,
Article 12 of the FNB requires platforms to moderate content based on
complaints received or *suo moto* in the event of breach of its terms of
use. Article 15 requires platforms that carry election-related
advertisement or electoral propaganda to furnish additional details to
election officials and courts, for greater transparency. Article 25 seeks
to set out an Internet Council responsible for monitoring compliance with
the FNB, and to draft codes of conduct for social networks and private
messaging services. Violations under this law are proposed to be fined up
to 10 percent of the income of the entity within Brazil in the past year.
The FNB is currently pending within the Chamber of Deputies (Brazil’s lower
house of the Parliament), the government having recently failed in a vote
to push this measure through in an expedited manner. While the outcome
of this legislation is pending, it offers insights into active forms of
digital content regulation.

Russia

In Russia, the Russia IT Law regulates online content and its
dissemination. Under Article 10.4.1, platforms offering encrypted messaging
services must provide federal agencies the ability to decode
encrypted messages. Blocking or requiring deletion of online content is
preceded by entry into an automated information system register, containing
all network addresses of sites and pages that carry prohibited
information. Sites that distribute content relating to drugs, child
pornography, suicide, gambling and lotteries, the online sale of restricted
medicines or alcohol, and home-made explosives are included in this
registry. Under Article 15.1, the hosting service provider must inform the
owner of the website to take down the page containing the prohibited
content, failing which, the hosting service provider is obliged to block
access to the site. If the hosting provider fails to act within the stated
timelines, the internet service provider is required to block access to the
site. Within 3 days of removing the prohibited content or receiving a court
order cancelling the inclusion within the prohibited content registry,
access to the website shall be unblocked. In 2019, similar takedown
procedures have been implemented under Article 15.1-1 for information that
shows ‘a clear disrespect for society,
the state or the bodies exercising state power in the Russian Federation’.
Such provisions, which are vaguely worded and present a danger of arbitrary
use, introduce a chilling effect on the freedom of expression within Russia.

In recent years, Russia has moved to curtail freedom of expression through
amendments in its criminal code pertaining to various current developments.
In December 2020, Russia introduced amendments punishing libel committed in
the context of spreading false information to undermine someone’s
reputation, libel committed through the use of the Internet, libel
regarding accusations of someone having committed crimes against sexual
freedom of a person and, lastly, slander regarding accusations of someone
suffering from a disease that poses a danger to others. These newly
criminalised forms of speech carry sentences ranging between two and five
years of imprisonment, fines ranging between 500,000 to 5 million roubles,
or forced labour. While it is not known whether the specific references to
libel regarding allegations of sexual misconduct, the use of the Internet
or the effect on someone’s reputation are connected to the #MeToo movement,
these amendments result in criminalising the very form and freedom of
expression used throughout that movement. The timing of the amendment
regarding criminalising slander that accuses another person of having a
disease that may cause harm to others, may be noted in line with the
COVID-19 pandemic. In the same period, Russia introduced amendments to its
code on
administrative offences, inserting fines for entities that refuse to comply
with directions regarding deletion of web pages or blocking access to sites
or pages carrying prohibited content. For legal entities, these fines range
from 3 million roubles to 8 million roubles for the first offence, and
between 10 percent to 20 percent of the revenues of the entity in the
previous year for repeated violations. Pursuant to these amendments, the
*Roskomnadzor* notified TikTok and the Russian social media network
VKontakte in January 2021, asking them to take down content that involves
minors in illegal actions, including calls to participate in ‘illegal’ or
‘unauthorised’
rallies. These instances clearly indicate the use of such amendments to
censor calls for rallying in favour of Russian opposition figures, and
evidence the use of civil and criminal penalties to censor and suppress
free expression.

In March 2022, Russia introduced amendments in its
criminal code, adding new punishments for spreading false information or
discrediting the use of armed forces by Russia, along with publicly calling
for restrictive measures by foreign states/entities against Russia. These
provisions have ostensibly been included in the context of the
Russia-Ukraine conflict, and negatively affect the freedom of expression
and freedom of the press reporting from the frontlines in Russia. As
recently as October 2022, reports indicate the use of speech laws to fine
platforms such as TikTok, Twitch and the
Wikimedia Foundation for offences such as ‘promoting LGBT propaganda’ and
discrediting the Russian war effort.

In India, digital content is regulated by the Information Technology Act,
2000 (IT Act). The IT Act sets out penal provisions for certain forms of
digital content. Section 66E of the IT Act protects against violations of
privacy by punishing any sharing of intimate pictures of a person without
their consent with a fine or imprisonment of up to three years. Similarly,
Sections 67 and 67A punish the sharing of any obscene or lascivious
material, or content involving sexually explicit acts, with fines or
imprisonment of up to five or seven years. Section 67B
criminalises the creation or sharing of child pornography content with
fines or imprisonment of up to five or seven years. Section 69A empowers
government agencies to direct intermediaries to block access to content on
grounds such as sovereignty, security of the state, and the preservation of
public order. Section 79 of the IT Act offers safe harbour provisions for
intermediaries, granting them refuge from liability
if they acted as a mere conduit. However, a platform may be held liable if
they have been notified by a government agency to delete or block access to
content hosted by the intermediary. The provisions relating to blocking
access to content or content takedown, in the context of intermediary
liability, have been actively used by the Indian government in recent
years. Information disclosed by Twitter for the period between July 2021
and December 2021 shows India as the world’s fifth-largest requestor of
content removal. The figures released by Twitter also indicate the
proactive use of the IT Act for taking down content posted by verified
journalists and news outlets, with India leading the world with
114 such demands in that period, followed by Turkey (78) and Russia (55).

Section 66A of the IT Act sought to punish certain forms of digital content
such as grossly offensive messages or messages sent to cause annoyance,
inconvenience, danger, criminal intimidation, enmity, hatred or ill-will.
However, in the landmark verdict of *Shreya Singhal v. Union of India*,
the validity of this provision was challenged before the Supreme Court.
The Court remarked on the chilling effect caused by such over-broad
grounds, as well as the subjective nature of offensive or annoying content.
Stating that the provision criminalises several forms of speech that are
legal under the right to free speech in the Indian constitution (Article
19(1)(a)), the Court ruled this provision wholly unconstitutional and void.

In 2021, India issued the Information Technology (Intermediary Guidelines
and Digital Media Ethics Code) Rules (IT Rules 2021). The IT Rules 2021 set
out extensive obligations for digital platforms. Rule 3(1)(d) requires
intermediaries to delete or disable access to unlawful content within
thirty-six hours of receiving a court order or directions from a government
agency under Section 79 of the IT Act. Rule 4 sets out additional
requirements for large intermediaries, such as tracing the first originator
of information based on a court order or directions issued under Section 69
of the IT Act, having a local office in India, appointing nodal contact
persons for coordination with law enforcement agencies, and proactively
monitoring digital content for information regarding rape or child abuse.
Lastly, the IT Rules 2021 regulate the dissemination of news by digital
publication houses or news aggregators, and set out a code of conduct and
an oversight mechanism for news regulation by the central government. This
latter provision was stayed by various high courts across India, with the
Madras HC expressing concerns about the potential for government censorship
over journalism.

In October 2022, the IT Rules were amended to include a grievance
appellate committee (GAC), which is responsible for handling grievances
against intermediaries and is composed of three members appointed by the
central government. The GAC offers a direct route for aggrieved persons to
appeal decisions taken by the intermediary, as opposed to the alternative
route of approaching the court. However, the completely executive nature of
this committee overseeing all intermediary grievances, and the lack of
natural justice principles such as fair hearings for both parties, raise
concerns regarding the power of the central government to influence the
conduct of intermediary platforms.

China takes a restrictive approach towards content moderation, freedom of
expression and online anonymity. Under Article 24 of the Cybersecurity Law
of the People’s Republic of China, 2017 (‘China Cybersecurity Law’ or
‘CCL’), network operators that provide messaging services, social media
services or website registration services are required to obtain the real
identities of their users prior to signing the agreement to use these
services.

This measure was followed in 2017 by regulatory changes regarding internet
posts, comments on internet posts, user accounts, and content moderation
requirements. Article 8 of the Provisions on the Management of Internet
Forum Community Services, 2017 similarly imposes a ‘real name on file, user
choice on profile’ approach, requiring service providers to register and
authenticate users in
approach, requiring service providers to register and authenticate users in
order to make posts on their forum. In the same year, China issued the
Provisions on the Administration of Internet Thread Commenting Services
(China
Comments Regulation) to regulate content being shared through comments on
internet posts. Under Article 5(1), platforms must similarly register and
verify the real identity of the commenter, who may then choose to use any
name for their public-facing profiles. Users who do not submit their real
identities shall not be permitted to comment on internet posts. Article
5(3) requires all websites, apps and other intermediary platforms that
allow comments to be posted by users to pre-review comments on news-related
posts. This is a marked departure from the ‘mere conduit’ approach taken
towards several intermediary platforms across the world.

This requirement to review comments *ex ante* is currently limited to news
information threads. In June 2022, the Cyberspace Administration of China
issued draft guidelines seeking to amend the China Comments Regulation.
Under Article 5(4) of the draft guidelines, a
pre-review of comments on all online posts must be conducted by the entity
offering comment services. This obligation requires platforms to maintain
teams that can conduct this editorial exercise on user-submitted comments,
and report unlawful comments to the relevant authorities.

Additionally, Article 8 requires platforms to perform credit assessments
of users’ commenting behaviour and to designate the scope of services and
functionality of users based on this credit score. Article 8 further
requires platforms to blacklist users with low credit scores and prevent
them from making new accounts to post comments. Such measures are very
susceptible to being misused in the context of internet censorship,
particularly given the vagueness of the phrase ‘unlawful’ information as a
ground for comments not being published, as opposed to defining clear
criteria under which comments shall be rejected by the platform prior to
publication.

In 2021, China introduced the Provisions on the Administration of Public
Account Information Services for Internet Users, aiming to regulate
‘public’ user accounts on social media platforms.
Under these regulations, public accounts refer to any user account whose
content is freely available to the public, and is not restricted to limited
people (such as the ‘private’ mode on Instagram). In addition to the
existing registration and authentication requirements for creating a public
account, platforms that allow public accounts must verify the
descriptions, names and avatars used by public accounts to ensure their
accuracy. This move is designed to ensure that public accounts are not run
by impersonators and do not falsely describe themselves or affiliate
themselves with the government. Under Article 9, users creating public
accounts to post content relating to economics, education, medical health,
or justice must furnish proof of their
professional background and qualifications to the platform. These
requirements are meant to tackle the spread of
misinformation and to ensure the traceability of users publishing content
for public consumption.

Lastly, in August 2022, the Provisions on the Management of Internet User
Account Information came into effect. This regulation seeks to create a
regulatory framework for user accounts across all internet services.
Reiterating existing real-name verification and registration requirements
for creating user accounts, Article 12 of this regulation states that ISPs
shall display the IP address of user accounts on their profile page, in
order to ‘facilitate the public’s supervision in the public interest’.

These measures introduce a grave chilling effect on the freedom of
expression and limit the ordinary citizen’s access to only state-approved
information. With every aspect of user interaction (account creation,
public accounts, posts and comments) within a social media platform
structure being regulated, China has a robust, yet restrictive and
limiting, set of provisions pertaining to access to information and freedom
of expression.

In terms of protecting civil rights of internet users online, Article 36 of
the Tort Liability Law of the People’s Republic of China, 2010 states that
the users and intermediaries bear tort liability for infringing on the
civil rights of other users. Under this law, the person
affected may inform the intermediary to take measures such as content
takedown, blocking access or suspension of services to the infringing
material. The intermediary shall be held liable if they do not act on such
information in a timely manner, or if they are aware of a user using their
services to infringe civil rights of other users without taking the
necessary measures. These provisions were further bolstered with respect to
minors through the Law of the People’s Republic of China on Protection of
Minors (China Minor Protection Law or CMPL), effective from
2021. Article 69 of the CMPL requires facilities providing internet access
to children to install child-friendly software and technological measures.
Digital service providers must obtain consent from parents and guardians in
the event of the child user being below the age of 14. Article 74 prohibits
digital service providers from providing products and services designed
to induce internet addiction in child users. Under this provision, video
game providers, social media and audio/video content providers are required
to build in mechanisms to curb usage by children, including through time
limits, additional authority and spending limits. Children playing online
games must be registered on a central electronic identity authentication
system. Lastly, Article 77 protects children from cyber-bullying and harms
to children by authorising affected minors or their guardians to require
the game provider to stop the harm through blocking, disconnection of
services or deletion of harmful content.
South Africa

The process for content takedown requests in South Africa is set out in
the Electronic Communications and Transactions Act, 2002 (ECTA). Under
Section 73 of the ECTA, ISPs are exempted from
liability regarding content shared on their platform, provided that the
intermediary acted as a mere conduit and did not modify the data being
shared. Similarly, an ISP is not liable for content hosted by
a user, provided that it was not aware of such hosting having infringed the
rights of a third party, and upon receiving a takedown notice from the
victim has acted quickly to delete or block access to that content. These
takedown notices can also be shared with representative bodies for ISPs
such as the ISPA (the Internet Service Providers’ Association), which is
the recognised representative body for South African ISPs, and limits the
liability of
its members. Lastly, keeping in mind the notice-and-takedown structure of
the ECTA, ISPs are specifically exempted from any general obligation to
monitor data being transmitted or stored. This provision allows for greater
trust regarding privacy and data protection in communications within South
Africa, positively affecting the freedom of expression for South African
internet users.

In 2016, South Africa released the Prevention and Combating of Hate Crimes
and Hate Speech Bill for public
comments. Clause 4 of this Bill defines hate speech as any communication
that, for a reasonable person, can be understood to incite harm or promote
hatred based on certain grounds including race, disability, gender
identity, nationality or religion. The provision punishes the creation or
spread of hate speech on the internet with fine and/or imprisonment up to 3
years for a first offence, and fine and/or imprisonment up to 5 years for
subsequent offences. However, this Bill continues to be in legislative
limbo, with parliamentarians stating, in September 2022, that they
continue to work on the provisions to finalise a working draft.

However, recent legislative introductions raise concerns regarding their
effect on the right to free expression. In March 2022, amendments to the
Films and Publications Act, 1996 (SAFPA) came into effect, seeking to
regulate cyber-harms perpetrated through the sharing of videos and
photos. Section 18F of the SAFPA prohibits sharing of any private sexual
photos or films without the consent of the individuals involved. Section
18G prohibits distribution of films depicting sexual violence and violence
against children while section 18H prohibits distribution of games, films
or publications that incite imminent violence, advocate hate speech or
serve as war propaganda. These prohibitions include distribution in any
medium, including the internet as a whole. Section 15A of the SAFPA
authorises compliance officers to pull access to any commercially
distributed films and games that fail to comply with classification or
legal requirements. Additionally, section 18E of the SAFPA allows any
person to complain to the Films and Publications Board (FPB) regarding any
unclassified content,
prohibited content, or even for potentially prohibited content uploaded
online, including by non-commercial online distributors, i.e., content
creators sharing online content for personal purposes. If the FPB finds
merit in the complaint – in terms of legal violations or a lack of
classification – it may order the content to be taken down following the
process under the ECTA. For this reason, critics have raised concerns
about the wide-ranging requirement that content creators seek
classification from the FPB for all video content, which may include
content streamed or shared on popular social media platforms such as
Twitch, TikTok, Facebook or Instagram.

While numerous barriers exist to the right to freedom of expression in
BRICS countries, the right to freedom of information also faces
significant challenges. A Global North push in this direction seems to have
made the BRICS intellectual property landscape much more restrictive. As
for the freedom of expression, barriers seem to be growing but are
sometimes met with resistance. Much of the law in these areas has been
defined by courts, demonstrating that governments find it difficult to
frame law for all scenarios in the digital economy.

As this post brings our series to a close, we have realised that such
descriptive work relating to the digital economy in BRICS countries can be
useful. As more countries look to join the BRICS alliance and it rises in
prominence again, it is important for researchers to continue to study
these countries comparatively. Analysis of India’s policy actions in
comparison to countries that are placed somewhat similarly has also been
fruitful in different ways in comparison to analysis that positions India
alongside the US or the EU. We hope to continue this work in more detail,
and we welcome feedback and suggestions.

(*The authors would like to thank Prachi Mathur and Diksha Singh for their
valuable research assistance towards this article.*)

Dhruv is a Research Fellow with the Centre for Applied Law and Technology
Research at Vidhi. He is interested in the interplay between law,
technology and civil liberties. He graduated from the NALSAR University of
Law, Hyderabad in 2019. Prior to joining Vidhi, he worked with Majmudar &
Partners, Mumbai and has interned with the Centre for Communication
Governance, NLU-Delhi, and the Centre for Internet and Society. He enjoys
writing poetry, playing chess, and reading on Indian and world history in
his spare time.


Jai is a Senior Resident Fellow at the Centre for Applied Law and
Technology Research (ALTR) at Vidhi. Her work focuses on the economics of
platforms and its implications for regulation. Jai has previously worked at
IT for Change and the National Institute for Public Finance and Policy
(NIPFP) in the areas of fintech, cryptocurrency, data commons, platforms
and digital trade. Jai completed a Master of Public Policy from the
University of Oxford.

© 2022 Vidhi Centre for Legal Policy
