
Pavel A. Karasev
Institute of Information Security Issues,
Lomonosov Moscow State University

While the Internet as a global phenomenon is built on diverse hardware devices and wired and wireless networks, it has become a domain where a huge amount of information is created, transmitted and stored. The people who create that content, however, pursue a variety of goals, including destructive ones.

Today, Internet users, submerged in an endless flow of information, often cannot fully perceive and understand the content they consume, and consequently are unable to assess its impact on their minds and behavior. Furthermore, users of social networks and other services that allow uploading of various content are often unaware that once they have uploaded their personal data for public use, they will never be able to delete it completely. Thus, confidential information or personal data can end up in the hands of criminals who might use it to their advantage. Minors are especially vulnerable to this threat.

The essential prerequisite to preventing the impact of «destructive» information is to establish certain reasonable safeguards, such as censorship, which could be implemented in one form or another through transparent mechanisms, provided they are under public control.

Paragraph 2, Article 29 of the Universal Declaration of Human Rights states: «In the exercise of his rights and freedoms, everyone shall be subject only to such limitations as are determined by law solely for the purpose of securing due recognition and respect for the rights and freedoms of others and of meeting the just requirements of morality, public order and the general welfare in a democratic society.» Article 19 of the International Covenant on Civil and Political Rights permits restrictions on freedom of expression only: (a) for respect of the rights or reputations of others; and (b) for the protection of national security or of public order, or of public health or morals.

The first priority in this matter is to maintain the balance between the rights and freedoms of Internet users and the security of the information space (cyberspace). We must recognize that every person has the right to protection from content that he or she deems harmful to himself or herself, his or her family, or society as a whole. At the same time, we must preserve all the positive potential of information and communication technologies. Many countries of the world, including developed Western democracies, have recognized this problem and developed their own frameworks to classify certain content as harmful, along with methods of protection against such content.

Below are examples of content which is prohibited for distribution or filtered in one way or another. This review recognizes three types of content subject to filtering:

1. Discrimination and defamation;

2. Content filtered to protect the youth;

3. Content filtered by social or cultural criteria and seen as contrary to public order and national security.

Discrimination and defamation

We should begin with Article 2 of the Universal Declaration of Human Rights which states that “everyone is entitled to all the rights and freedoms set forth in this Declaration, without distinction of any kind, such as race, color, sex, language, religion, political or other opinion, national or social origin, property, birth or other status.”

The Additional Protocol to the Council of Europe Convention on Cybercrime (Budapest Convention) requires the signatory states to “take steps to criminalize content in text, visual or other form that justifies, encourages or causes hatred, discrimination or violence against an individual or a group of individuals based on his/her or their race, color, origin or religion.”

Section 320.1 of the Canadian Criminal Code makes dissemination of online hate propaganda a criminal offense.

The Indonesian Law on Information and Electronic Transactions of 2008 prohibits defamation and any content inciting hatred or hostility towards a person or a group of people on the grounds of race, ethnicity or religion.

Content filtered to protect the youth

Child pornography is unconditionally recognized as a crime, and many states, including the United Kingdom, Sweden, Finland, Denmark, Germany, France and Italy, filter websites offering such content by using a network of “hotlines” which pass data on harmful content to ISPs, who then blacklist such websites. Another important challenge of the present day is to protect children from harmful information they might find on the Internet.

In 2006, the National Assembly of Venezuela adopted a law concerning the protection of children from harmful Internet content. Under the law, Internet providers are required not only to monitor the content stored on their servers, but also to provide users with content-filtering software.

Content filtered by social or cultural criteria and seen as contrary to public order and national security

In a number of countries, certain types of content are considered a threat to national security. Today, many nations face the erosion of traditional values which form the cultural and moral foundation of society. Certain Internet content may be unacceptable in a particular socio-cultural environment.

In 2009 the Indian Parliament amended the law regulating information technologies and gave the government broader rights to block websites considered a threat to national security.

In 2010 the Chinese government amended the State Secrets Act, which now requires Internet service providers to cooperate with the state whenever a state secret leaks onto the Internet.

The Venezuelan Law on Social Responsibility in Radio and Television, as amended in 2010, prohibits content, including Internet content, which aims to undermine public peace or induce actions posing a threat to national security, and establishes liability for its dissemination.

Content related to terrorism or extremism is also recognized as dangerous. In 2003, Kenya passed a Terrorism Act which makes it illegal “to collect, create and transfer (including via the Internet) information useful for terrorist organizations.”

The UAE Cybercrime Act (2006) prohibits content desecrating objects of worship or religious ceremonies, content opposing the teachings of Islam, content denying family values and principles, and content disturbing public peace or promoting the ideology of terrorism. The law also criminalizes the creation of websites promoting the ideology of terrorism, financing terrorist activities and spreading instructions on how to manufacture explosives.

The South Korean Telecommunications Business Act prohibits dissemination of content that violates the honor or rights of other persons or undermines public morals or social ethics.

Under the Films, Videos, and Publications Classification Act of New Zealand it is prohibited to distribute or possess materials considered to be “detrimental to the public good.”

Article 57 of the Bangladesh Information and Communication Technologies Act (2006) prohibits content that is “vulgar, slanderous or offending religious feelings”.

Article 211 of the Malaysian Communications and Multimedia Act (1998) prohibits posting on the Internet any content which is “indecent, obscene, defamatory or threatening.”

The Russian Federation has also adopted a number of laws governing the filtering of certain categories of content. On July 28, 2012, Federal Law No. 139-FZ On Amending the Federal Law “On Protection of Children from Information Detrimental to Their Health and Development” and Other Legal Acts of the Russian Federation was adopted. It introduced the following significant changes into Federal Law No. 436-FZ of December 29, 2010 On Protection of Children from Information Detrimental to Their Health and Development.

First and foremost, the law now requires an information product to be marked with a sign identifying the category of such information product, while information products themselves are now required to undergo an expert review in the manner prescribed by law.

Federal Law No. 126-FZ of July 7, 2003 On Telecommunication was amended with a paragraph stating that “telecommunication operators who provide access to the information and telecommunication network ‘the Internet’ are required to restrict and resume access to information disseminated through the network” as regulated by Federal Law No. 149-FZ of July 27, 2006 On Information, Information Technologies and Protection of Information.

The Federal Law No. 149-FZ On Information, Information Technologies and Protection of Information has also been amended. Now it requires that a unified website register is maintained with the purpose of restricting access to harmful content posted on certain websites. Websites are added to the register subject to a decision taken by a competent federal authority (the Federal Supervision Agency for Information Technologies and Communications, Roskomnadzor) or under a court ruling.

The jurisdiction of courts includes rulings on any category of content prohibited for distribution in the territory of the Russian Federation. The Federal Law No. 114-FZ On Countering Extremist Activities of July 25, 2002 (as amended in 2008) prohibits distribution of extremist materials, as well as their production or storage for the purposes of distribution. The Law No. 2124-1 of December 27, 1991 On Mass Media (as amended on July 28, 2012) prohibits disclosure of any information constituting a state secret or other secrets protected by law, distribution of materials containing public calls for terrorism or public justification of terrorism and other extremist materials, as well as materials promoting pornography, violence and cruelty.

The Internet must undoubtedly remain a space of freedom and a common heritage of mankind; everyone must have the right to access information and the right to freedom of expression. Article 19 of the Universal Declaration of Human Rights states that “everyone has the right to freedom of opinion and expression: this right includes freedom to hold opinions without interference and to seek, receive and impart information and ideas through any media and regardless of frontiers.”

At the same time, it must be remembered that freedom does not mean total permissiveness; rather, it means responsibility for one’s actions both in physical reality and in virtual space. Absolute freedom to disseminate any information would make the coexistence of people impossible.

The table below results from the discussions at the Sixth and Seventh Conferences of the International Information Security Research Consortium (IISRC).

Summary table illustrating the approaches exercised by various countries and the respective legislative instruments governing the filtering of harmful Internet content:

Country: Russia
Legal framework: Federal Law of the Russian Federation No. 139-FZ of July 28, 2012 «On Amendments to the Federal Law On Protecting Children from Information Harmful to Their Health and Development and Certain Legislative Acts of the Russian Federation»
Filtered content: child pornography or solicitation to participate in it; information about methods of making, using, obtaining or locating narcotic drugs and psychotropic substances or their precursors, or growing plants containing narcotic substances; information about methods of suicide and calls for suicide
Regulating authority: Federal Service for Supervision in the Sphere of Telecom, Information Technologies and Mass Communications (Roskomnadzor)
Filtration framework: in order to limit access to websites containing information whose dissemination is prohibited in the Russian Federation, the law provides for the establishment of a Unified Register of Domain Names, Universal Page Selectors and Internet Addresses that Allow for the Identification of Websites Containing Information whose Dissemination is Prohibited in the Russian Federation. Websites are included in the Register based on decisions made by Roskomnadzor or by a court ruling.

Country: Canada
Legal framework: Section 320.1 of the Criminal Code; Section 163 of the Criminal Code
Filtered content: online hate propaganda (Section 320.1); child pornography (Section 163)
Regulating authority: Canadian Radio-Television and Telecommunications Commission; Project Cleanfeed Canada
Filtration framework: removal and blocking of content by a court order

Country: Venezuela
Legal framework: 2004 Law of Social Responsibility for Radio and Television (LSR), as amended in 2010; Ley para la Protección del Niño y Adolescentes en Salas de Uso de Internet, Videojuegos y Otros Multimedias
Filtered content: content intended to adversely affect public order, promote crime, or incite actions detrimental to Venezuela’s national security (LSR); content harmful to youth
Regulating authority: the Municipal Council of Children’s Rights

Country: Kenya
Legal framework: The Kenya Communications (Amendment) Act 2009
Filtered content: content harmful to youth
Regulating authority: Communications Commission of Kenya

Country: Bangladesh
Legal framework: Article 57 of the Information and Communication Technologies Act; Pornography Control Act 2011
Filtered content: defamatory content, content that may harm law and order, and content that attacks religious beliefs (Article 57); pornography and content harmful to youth (Pornography Control Act)

Country: Indonesia
Legal framework: Law of the Republic of Indonesia No. 11 of 2008 on Information and Electronic Transactions; Law of the Republic of Indonesia No. 44 of 2008 on Pornography
Filtered content: defamatory content; content intended to invoke hatred or hostility toward individuals or groups of people based on race, ethnicity or religion; pornography
Regulating authority: Ministry of Communications and Information Technology

Country: South Korea
Legal framework: Article 21 of the Constitution of Korea; Telecommunications Business Act, as amended by Amendment No. 8867 of 2008; Juvenile Protection Act; Act on Protection of Youth from Sexual Exploitation
Filtered content: content that violates the honor or rights of other persons or undermines public morals or social ethics; content that aims at or abets a criminal act, aims at committing anti-state activities, or impedes good customs and other aspects of social order; content harmful to youth; child pornography
Regulating authority: Ministry of Information and Communication; Korean Communications Standards Commission; National Election Commission
Filtration framework: removal and blocking of content by a court order at ISPs

Country: Malaysia
Legal framework: Section 211 of the Communications and Multimedia Act of 1998
Filtered content: online content that is indecent, obscene, false, menacing or offensive in character, with intent to annoy, abuse, threaten or harass any person
Regulating authority: Malaysian Communications and Multimedia Commission

Country: Brazil
Legal framework: Press Law No. 5.250/67, Article 75; Law No. 11.829/2008 on Child Pornography on the Internet
Filtered content: defamatory content; child pornography
Regulating authority: National Reporting Center of Cyber Crimes; Public Ministry
Filtration framework: removal and blocking of content by a court order at ISPs

Country: Oman
Legal framework: Article 29 of Oman’s constitution; 1984 Press and Publication Law; Omantel’s Terms and Conditions
Filtered content: content that leads to public discord, violates the security of the State, or abuses a person’s dignity and rights

Country: UAE
Legal framework: Cyber-Crime Law No. 2 of 2006
Filtered content: content that defames Islamic places of worship and traditions, insults any recognized religion, or promotes «sinful acts»
Regulating authority: Telecommunications Regulatory Authority
Filtration framework: removal and blocking of content at ISPs

Country: Algeria
Legal framework: Article 14 of a 1998 telecommunications decree
Filtered content: material contrary to public order and morality
Filtration framework: removal and blocking of content at ISPs



This article is based on a presentation delivered at the 7th Scientific conference of the International Research Consortium on Information Security, as part of the International Forum on «Partnership of state authorities, civil society and business community in ensuring international information security», held on 22-25 April 2013 in Garmisch-Partenkirchen, Germany. It is published on Digital.Report with an explicit permission from the conference organizers.

About the author

Pavel A. Karasev, Institute of Information Security Issues, Lomonosov Moscow State University.
