
European Parliament 2014-2019

Committee on Civil Liberties, Justice and Home Affairs

2018/0331(COD)

25.2.2019

AMENDMENTS 36 - 178

Draft report
Daniel Dalton
(PE633.042v01-00)

Preventing the dissemination of terrorist content online

Proposal for a regulation
(COM(2018)0640 – C8-0405/2018 – 2018/0331(COD))


Amendment 36
Birgit Sippel, Anna Hedh, Josef Weidenholzer
Proposal for a regulation
Title 1

Text proposed by the Commission | Amendment

Proposal for a
REGULATION OF THE EUROPEAN PARLIAMENT AND OF THE COUNCIL
on preventing the dissemination of terrorist content online
A contribution from the European Commission to the Leaders’ meeting in Salzburg on 19-20 September 2018

Proposal for a
DIRECTIVE OF THE EUROPEAN PARLIAMENT AND OF THE COUNCIL
on preventing the dissemination of terrorist content online
A contribution from the European Commission to the Leaders’ meeting in Salzburg on 19-20 September 2018

(This amendment applies throughout the text. Adopting it will necessitate corresponding changes throughout.)

Or. en

Justification

It is questionable whether the choice of legal basis (Art. 114 TFEU, internal market) is adequate in light of the stated goal: “This Regulation contributes to the protection of public security while establishing appropriate and robust safeguards to ensure protection of the fundamental rights at stake.” (recital 7). The CJEU has previously rejected the use of Art. 114 TFEU for a measure that aims at contributing to public security (C-318/04, paras 54-61 and para 67). The legal basis should therefore be Art. 83 TFEU, which only allows for the adoption of directives.

Amendment 37
Cornelia Ernst, Barbara Spinelli, Marie-Christine Vergiat
Proposal for a regulation
Title 1

Text proposed by the Commission | Amendment

Proposal for a
REGULATION OF THE EUROPEAN PARLIAMENT AND OF THE COUNCIL
on preventing the dissemination of terrorist content online
A contribution from the European Commission to the Leaders’ meeting in Salzburg on 19-20 September 2018

Proposal for a
REGULATION OF THE EUROPEAN PARLIAMENT AND OF THE COUNCIL
on limiting the dissemination of terrorist content online
A contribution from the European Commission to the Leaders’ meeting in Salzburg on 19-20 September 2018


Or. en

Amendment 38
Birgit Sippel, Anna Hedh, Josef Weidenholzer
Proposal for a regulation
Citation 1

Text proposed by the Commission | Amendment

Having regard to the Treaty on the Functioning of the European Union, and in particular Article 114 thereof,

Having regard to the Treaty on the Functioning of the European Union, and in particular Article 83 thereof,

(This amendment applies throughout the text. Adopting it will necessitate corresponding changes throughout.)

Or. en

Justification

It is questionable whether the choice of legal basis (Art. 114 TFEU, internal market) is adequate in light of the stated goal: “This Regulation contributes to the protection of public security while establishing appropriate and robust safeguards to ensure protection of the fundamental rights at stake.” (recital 7). The CJEU has previously rejected the use of Art. 114 TFEU for a measure that aims at contributing to public security (C-318/04, paras 54-56 and para 67). The legal basis should therefore be Art. 83 TFEU, which only allows for the adoption of directives.

Amendment 39
Eva Joly
Proposal for a regulation
Recital 1

Text proposed by the Commission | Amendment

(1) This Regulation aims at ensuring the smooth functioning of the digital single market in an open and democratic society, by preventing the misuse of hosting services for terrorist purposes. The functioning of the digital single market should be improved by reinforcing legal certainty for hosting service providers, reinforcing users’ trust in the online environment, and by strengthening safeguards to the freedom of expression and information.

(1) This Regulation aims at ensuring the smooth functioning of the digital single market in an open and democratic society, by addressing the misuse of hosting services for terrorist purposes and contributing to the investigation of criminal offences. The functioning of the digital single market should be improved by reinforcing legal certainty for hosting service providers, reinforcing users’ trust in the online environment, and by strengthening safeguards to ensure the rule of law and fundamental rights, in particular the freedom of expression and information, the right to freedom and pluralism of the media, the freedom to conduct a business and the rights to privacy and the protection of personal data.

Or. en

Amendment 40
Nicolas Bay, Gilles Lebreton
Proposal for a regulation
Recital 1

Text proposed by the Commission | Amendment

(1) This Regulation aims at ensuring the smooth functioning of the digital single market in an open and democratic society, by preventing the misuse of hosting services for terrorist purposes. The functioning of the digital single market should be improved by reinforcing legal certainty for hosting service providers, reinforcing users’ trust in the online environment, and by strengthening safeguards to the freedom of expression and information.

(1) This Regulation aims at ensuring the security of Member States’ citizens in democratic societies, by preventing the misuse of hosting services for terrorist purposes.

Or. en

Amendment 41
Birgit Sippel, Anna Hedh, Josef Weidenholzer
Proposal for a regulation
Recital 1

Text proposed by the Commission | Amendment

(1) This Regulation aims at ensuring the smooth functioning of the digital single market in an open and democratic society, by preventing the misuse of hosting services for terrorist purposes. The functioning of the digital single market should be improved by reinforcing legal certainty for hosting service providers, reinforcing users’ trust in the online environment, and by strengthening safeguards to the freedom of expression and information.

(1) This Directive aims at ensuring the protection of public security while establishing appropriate and robust safeguards to ensure protection of the fundamental rights in an open and democratic society, by preventing the use of hosting services for spreading terrorist content. The functioning of the digital single market should be improved by reinforcing legal certainty for hosting service providers, reinforcing users’ trust in the online environment, and by strengthening safeguards for the protection of fundamental rights, including the freedom of expression and information, as well as the respect for private and family life and the protection of personal data.

(The change of the word “misuse/abuse” to the term “use” should apply throughout the text.)

Or. en

Justification

The goal of this text is primarily to contribute to public security and the reference to the digital single market only seems to exist to justify using Art. 114 TFEU as a legal basis (and therefore the choice of a regulation instead of a directive).

Amendment 42
Josef Weidenholzer
Proposal for a regulation
Recital 1

Text proposed by the Commission | Amendment

(1) This Regulation aims at ensuring the smooth functioning of the digital single market in an open and democratic society, by preventing the misuse of hosting services for terrorist purposes. The functioning of the digital single market should be improved by reinforcing legal certainty for hosting service providers, reinforcing users’ trust in the online environment, and by strengthening safeguards to the freedom of expression and information.

(1) This Regulation aims at ensuring the smooth functioning of the digital single market in an open and democratic society, by preventing the abuse of hosting services for the purposes of disseminating illegal terrorist content, by content providers. The functioning of the digital single market should be improved by reinforcing legal certainty for hosting service providers, reinforcing users’ trust in the online environment, and by strengthening safeguards for the protection of fundamental rights, including the freedom of expression and freedom to receive and impart information, as well as the respect for private life and the protection of personal data.

Or. en

Amendment 43
Cornelia Ernst, Barbara Spinelli, Marie-Christine Vergiat
Proposal for a regulation
Recital 1

Text proposed by the Commission | Amendment

(1) This Regulation aims at ensuring the smooth functioning of the digital single market in an open and democratic society, by preventing the misuse of hosting services for terrorist purposes. The functioning of the digital single market should be improved by reinforcing legal certainty for hosting service providers, reinforcing users’ trust in the online environment, and by strengthening safeguards to the freedom of expression and information.

(1) This Regulation aims at ensuring the smooth functioning of the digital single market in an open and democratic society, by preventing the misuse of hosting services for terrorist purposes. The functioning of the digital single market should be improved by reinforcing legal certainty for hosting service providers, reinforcing users’ trust in the online environment, and by strengthening safeguards to fundamental rights, in particular the right to freedom of expression and information, the right to privacy and to protection of personal data.

Or. en

Amendment 44
Caterina Chinnici
Proposal for a regulation
Recital 1

Text proposed by the Commission | Amendment

(1) This Regulation aims at ensuring the smooth functioning of the digital single market in an open and democratic society, by preventing the misuse of hosting services for terrorist purposes. The functioning of the digital single market should be improved by reinforcing legal certainty for hosting service providers, reinforcing users’ trust in the online environment, and by strengthening safeguards to the freedom of expression and information.

(1) This Regulation aims at ensuring the smooth functioning of the digital single market in an open and democratic society, by preventing the misuse of hosting services for terrorist purposes and by delivering an efficient tool to achieve long term public security in our societies. The functioning of the digital single market should be improved by reinforcing legal certainty for hosting service providers, reinforcing users’ trust in the online environment, and by strengthening safeguards to the freedom of expression and information.

Or. en

Amendment 45
Michał Boni, Róża Gräfin von Thun und Hohenstein
Proposal for a regulation
Recital 1

Text proposed by the Commission | Amendment

(1) This Regulation aims at ensuring the smooth functioning of the digital single market in an open and democratic society, by preventing the misuse of hosting services for terrorist purposes. The functioning of the digital single market should be improved by reinforcing legal certainty for hosting service providers, reinforcing users’ trust in the online environment, and by strengthening safeguards to the freedom of expression and information.

(1) This Regulation aims at ensuring the smooth functioning of the digital single market in an open and democratic society, by preventing the misuse of hosting services for terrorist purposes. The functioning of the digital single market should be improved by reinforcing legal certainty for hosting service providers, reinforcing users’ trust in the online environment, and by strengthening safeguards to fundamental rights, in particular freedom of expression and information.

Or. en

Amendment 46
Maite Pagazaurtundúa Ruiz, Cecilia Wikström, Nathalie Griesbeck, Gérard Deprez, Louis Michel
Proposal for a regulation
Recital 1

Text proposed by the Commission | Amendment

(1) This Regulation aims at ensuring the smooth functioning of the digital single market in an open and democratic society, by preventing the misuse of hosting services for terrorist purposes. The functioning of the digital single market should be improved by reinforcing legal certainty for hosting service providers, reinforcing users’ trust in the online environment, and by strengthening safeguards to the freedom of expression and information.

(1) This Regulation aims at ensuring the smooth functioning of the digital single market in an open and democratic society, by preventing the misuse of hosting services for terrorist purposes. The functioning of the digital single market should be improved by reinforcing legal certainty for hosting service providers, reinforcing users’ trust in the online environment, and by strengthening safeguards to fundamental rights, including the freedom of expression and information.

Or. en

Amendment 47
Monika Beňová
Proposal for a regulation
Recital 1

Text proposed by the Commission | Amendment

(1) This Regulation aims at ensuring the smooth functioning of the digital single market in an open and democratic society, by preventing the misuse of hosting services for terrorist purposes. The functioning of the digital single market should be improved by reinforcing legal certainty for hosting service providers, reinforcing users’ trust in the online environment, and by strengthening safeguards to the freedom of expression and information.

(1) This Regulation aims at ensuring the smooth and transparent functioning of the digital single market in an open and democratic society, by preventing the misuse of hosting services for terrorist purposes. The functioning of the digital single market should be improved by reinforcing legal certainty for hosting service providers, reinforcing users’ trust in the online environment, and by strengthening safeguards to the freedom of expression and information.

Or. en

Justification

It is essential that transparent use of the internet remain a priority. It is by default difficult to misuse the internet if a high degree of transparency is maintained. As a result, unverifiable content linked to extremism will stand out.

Amendment 48
Eva Joly
Proposal for a regulation
Recital 1 a (new)

Text proposed by the Commission | Amendment

(1a) Regulation of hosting service providers can only complement Member States’ strategies and actions to address illegal terrorist offences, which must emphasize offline measures, such as criminal investigations and cross-border cooperation, as well as preventive measures, including investment in education, social cohesion, and violence prevention, among others.

As many studies show, the process of radicalisation very rarely happens online only. Actual violent radicalisation entails several complex processes, including person-to-person communication in conjunction with other offline factors.


The role that the internet and social media may play in this process should, however, not be underestimated.

Or. en

Amendment 49
Auke Zijlstra
Proposal for a regulation
Recital 1 a (new)

Text proposed by the Commission | Amendment

(1a) Considering that measures making it impossible in advance to publish on the internet, such as upload filters, are in violation of Article 7.3 of the Dutch Constitution.

Or. en

Amendment 50
Caterina Chinnici
Proposal for a regulation
Recital 2

Text proposed by the Commission | Amendment

(2) Hosting service providers active on the internet play an essential role in the digital economy by connecting business and citizens and by facilitating public debate and the distribution and receipt of information, opinions and ideas, contributing significantly to innovation, economic growth and job creation in the Union. However, their services are in certain cases abused by third parties to carry out illegal activities online. Of particular concern is the misuse of hosting service providers by terrorist groups and their supporters to disseminate terrorist content online in order to spread their message, to radicalise and recruit and to facilitate and direct terrorist activity.

(2) Hosting service providers active on the internet play an essential role in the digital economy by connecting business and citizens, providing learning opportunities and by facilitating public debate and the distribution and receipt of information, opinions and ideas, contributing significantly to innovation, economic growth and job creation in the Union. However, their services are, in certain cases, abused by third parties to carry out illegal activities online which constitute a criminal offence according to the existing EU legal framework. Of particular concern is the misuse of hosting service providers by terrorist groups and their supporters to disseminate terrorist content online in order to spread their message, to radicalise and recruit and to facilitate and direct terrorist activity.


Or. en

Amendment 51
Maria Grapini
Proposal for a regulation
Recital 2

Text proposed by the Commission | Amendment

(2) Hosting service providers active on the internet play an essential role in the digital economy by connecting business and citizens and by facilitating public debate and the distribution and receipt of information, opinions and ideas, contributing significantly to innovation, economic growth and job creation in the Union. However, their services are in certain cases abused by third parties to carry out illegal activities online. Of particular concern is the misuse of hosting service providers by terrorist groups and their supporters to disseminate terrorist content online in order to spread their message, to radicalise and recruit and to facilitate and direct terrorist activity.

(2) Hosting service providers active on the internet play an essential role in the digital economy by connecting business and citizens and by facilitating public debate and the distribution and receipt of information, opinions and ideas, contributing significantly to innovation, economic growth and job creation in the Union. However, their services are in certain cases abused by third parties to carry out illegal activities online. Of particular concern is the misuse of hosting service providers by terrorist groups, organisations and individuals and their supporters to disseminate terrorist content online in order to spread their message, to radicalise and recruit and to facilitate and direct terrorist activity.

Or. ro

Amendment 52
Josef Weidenholzer
Proposal for a regulation
Recital 2

Text proposed by the Commission | Amendment

(2) Hosting service providers active on the internet play an essential role in the digital economy by connecting business and citizens and by facilitating public debate and the distribution and receipt of information, opinions and ideas, contributing significantly to innovation, economic growth and job creation in the Union. However, their services are in certain cases abused by third parties to carry out illegal activities online. Of particular concern is the misuse of hosting service providers by terrorist groups and their supporters to disseminate terrorist content online in order to spread their message, to radicalise and recruit and to facilitate and direct terrorist activity.

(2) Hosting service providers active on the internet play an essential role in the digital economy by connecting business and citizens and by facilitating public debate and the distribution and receipt of information, opinions and ideas, contributing significantly to innovation, economic growth and job creation in the Union. However, their services are in certain cases abused to carry out illegal activities online. Of particular concern is the abuse of the services of hosting service providers by terrorist groups and their supporters to disseminate illegal terrorist content online in order to spread their message, to radicalise and recruit and to facilitate and direct terrorist activity.

Or. en

Amendment 53
Cornelia Ernst, Barbara Spinelli, Marie-Christine Vergiat
Proposal for a regulation
Recital 2

Text proposed by the Commission | Amendment

(2) Hosting service providers active on the internet play an essential role in the digital economy by connecting business and citizens and by facilitating public debate and the distribution and receipt of information, opinions and ideas, contributing significantly to innovation, economic growth and job creation in the Union. However, their services are in certain cases abused by third parties to carry out illegal activities online. Of particular concern is the misuse of hosting service providers by terrorist groups and their supporters to disseminate terrorist content online in order to spread their message, to radicalise and recruit and to facilitate and direct terrorist activity.

(2) Hosting service providers active on the internet play an essential role in the digital economy by connecting business and citizens and by facilitating public debate and the distribution and receipt of information, opinions and ideas, contributing significantly to innovation, economic growth and job creation in the Union. However, their services are in certain cases abused by third parties to carry out illegal activities online. Of particular concern is the misuse of hosting service providers by terrorist groups and their supporters to disseminate terrorist content online in order to spread their message, to recruit and to facilitate and direct terrorist activity.

Or. en

Amendment 54
Caterina Chinnici
Proposal for a regulation
Recital 2 a (new)

Text proposed by the Commission | Amendment

(2a) Social media platforms may be used as a driver of radicalisation leading to violent extremism as well as a tool to disseminate hate speech and illicit contents, especially among young internet users.

Or. en

Amendment 55
Cornelia Ernst, Barbara Spinelli, Marie-Christine Vergiat
Proposal for a regulation
Recital 3

Text proposed by the Commission | Amendment

(3) The presence of terrorist content online has serious negative consequences for users, for citizens and society at large as well as for the online service providers hosting such content, since it undermines the trust of their users and damages their business models. In light of their central role and the technological means and capabilities associated with the services they provide, online service providers have particular societal responsibilities to protect their services from misuse by terrorists and to help tackle terrorist content disseminated through their services.

deleted

Or. en

Amendment 56
Caterina Chinnici
Proposal for a regulation
Recital 3

Text proposed by the Commission | Amendment

(3) The presence of terrorist content online has serious negative consequences for users, for citizens and society at large as well as for the online service providers hosting such content, since it undermines the trust of their users and damages their business models. In light of their central role and the technological means and capabilities associated with the services they provide, online service providers have particular societal responsibilities to protect their services from misuse by terrorists and to help tackle terrorist content disseminated through their services.

(3) Although the presence of terrorist content online does not constitute the only factor leading to violent extremism, it has serious negative consequences for users, especially the young ones, for citizens and society at large as well as for the online service providers hosting such content, since it undermines the trust of their users and damages their business models. In light of their central role and the technological means and capabilities associated with the services they provide, online service providers, while fully safeguarding freedom of expression and pluralism, have particular societal responsibilities to protect their services from misuse by terrorists and to help tackle terrorist content disseminated through their services.

Or. en

Amendment 57
Nicolas Bay, Gilles Lebreton
Proposal for a regulation
Recital 3

Text proposed by the Commission | Amendment

(3) The presence of terrorist content online has serious negative consequences for users, for citizens and society at large as well as for the online service providers hosting such content, since it undermines the trust of their users and damages their business models. In light of their central role and the technological means and capabilities associated with the services they provide, online service providers have particular societal responsibilities to protect their services from misuse by terrorists and to help tackle terrorist content disseminated through their services.

(3) The presence of terrorist content online has serious negative consequences for users, for citizens and society at large as well as for the online service providers hosting such content, since it undermines the trust of their users and damages their business models. In light of their central role and the technological means and capabilities associated with the services they provide, online service providers have particular societal responsibilities to protect their services from misuse by terrorists and to help tackle terrorist content disseminated through their services without any prejudice to the freedom of expression and information of Member States’ citizens.

Or. en

Amendment 58
Eva Joly
Proposal for a regulation
Recital 3

Text proposed by the Commission | Amendment

(3) The presence of terrorist content online has serious negative consequences for users, for citizens and society at large as well as for the online service providers hosting such content, since it undermines the trust of their users and damages their business models. In light of their central role and the technological means and capabilities associated with the services they provide, online service providers have particular societal responsibilities to protect their services from misuse by terrorists and to help tackle terrorist content disseminated through their services.

(3) The presence of terrorist content online has serious negative consequences for users, for citizens and society at large as well as for the online service providers hosting such content, since it undermines the trust of their users and damages their business models. In light of their central role and in proportion to the technological means and capabilities associated with the services they provide, online service providers have particular societal responsibilities to protect their services from misuse by terrorists and to help competent authorities to address terrorist offences committed through their services.

Or. en

Amendment 59
Birgit Sippel, Anna Hedh, Josef Weidenholzer
Proposal for a regulation
Recital 3

Text proposed by the Commission | Amendment

(3) The presence of terrorist content online has serious negative consequences for users, for citizens and society at large as well as for the online service providers hosting such content, since it undermines the trust of their users and damages their business models. In light of their central role and the technological means and capabilities associated with the services they provide, online service providers have particular societal responsibilities to protect their services from misuse by terrorists and to help tackle terrorist content disseminated through their services.

(3) The presence of terrorist content online has serious negative consequences for users, for citizens and society at large as well as for the online service providers hosting such content, since it undermines the trust of their users and damages their business models. In light of their central role and the technological means and capabilities associated with the services they provide, online service providers could support competent judicial authorities to protect their services from being used by terrorists and to help tackle terrorist content disseminated through their services.

(The change of the word “misuse/abuse” to the term “use” should apply throughout the text.)

Or. en

Amendment 60
Josef Weidenholzer
Proposal for a regulation
Recital 3

Text proposed by the Commission | Amendment

(3) The presence of terrorist content online has serious negative consequences for users, for citizens and society at large as well as for the online service providers hosting such content, since it undermines the trust of their users and damages their business models. In light of their central role and the technological means and capabilities associated with the services they provide, online service providers have particular societal responsibilities to protect their services from misuse by terrorists and to help tackle terrorist content disseminated through their services.

(3) The presence of illegal terrorist content online has serious negative consequences for users, for citizens and society at large as well as for the online service providers hosting such content, since it undermines the trust of their users and damages their business models. In light of their central role and the technological means and capabilities associated with the services they provide, online service providers could support competent authorities by protecting their services from abuse and to help tackle illegal terrorist content.

Or. en

Amendment 61
Cornelia Ernst, Barbara Spinelli, Marie-Christine Vergiat
Proposal for a regulation
Recital 4

Text proposed by the Commission | Amendment

(4) Efforts at Union level to counter terrorist content online commenced in 2015 through a framework of voluntary cooperation between Member States and hosting service providers need to be complemented by a clear legislative framework in order to further reduce accessibility to terrorist content online and adequately address a rapidly evolving problem. This legislative framework seeks to build on voluntary efforts, which were reinforced by the Commission Recommendation (EU) 2018/3347 and responds to calls made by the European Parliament to strengthen measures to tackle illegal and harmful content and by the European Council to improve the automatic detection and removal of content that incites to terrorist acts.

(4) A clear legislative framework builds on voluntary cooperation between Member States and hosting service providers, which were reinforced by the Commission Recommendation (EU) 2018/3347.

_________________
7 Commission Recommendation (EU) 2018/334 of 1 March 2018 on measures to effectively tackle illegal content online (OJ L 63, 6.3.2018, p. 50).

Or. en

Amendment 62
Josef Weidenholzer
Proposal for a regulation
Recital 4

Text proposed by the Commission | Amendment

(4) Efforts at Union level to counter terrorist content online commenced in 2015 through a framework of voluntary cooperation between Member States and hosting service providers need to be complemented by a clear legislative framework in order to further reduce accessibility to terrorist content online and adequately address a rapidly evolving problem. This legislative framework seeks to build on voluntary efforts, which were reinforced by the Commission Recommendation (EU) 2018/3347 and responds to calls made by the European Parliament to strengthen measures to tackle illegal and harmful content and by the European Council to improve the automatic detection and removal of content that incites to terrorist acts.

(4) Efforts at Union level to counter terrorist illegal content online commenced in 2015 through a framework of voluntary cooperation between Member States and hosting service providers. This legislative framework seeks to build on those efforts, which were reinforced by the Commission Recommendation (EU) 2018/3347.

_________________
7 Commission Recommendation (EU) 2018/334 of 1 March 2018 on measures to effectively tackle illegal content online (OJ L 63, 6.3.2018, p. 50).

Or. en

Amendment 63
Caterina Chinnici
Proposal for a regulation
Recital 4

Text proposed by the Commission | Amendment

(4) Efforts at Union level to counter terrorist content online commenced in 2015 through a framework of voluntary cooperation between Member States and hosting service providers need to be complemented by a clear legislative framework in order to further reduce accessibility to terrorist content online and adequately address a rapidly evolving problem. This legislative framework seeks to build on voluntary efforts, which were reinforced by the Commission Recommendation (EU) 2018/3347 and responds to calls made by the European Parliament to strengthen measures to tackle illegal and harmful content and by the European Council to improve the automatic detection and removal of content that incites to terrorist acts.

(4) As highlighted by the report of the European Parliament Special Committee on Terrorism1a, the efforts at Union level to counter terrorist content online, which commenced in 2015 through a framework of voluntary cooperation between Member States and hosting service providers, were insufficient. Therefore, the EU legal framework needs to be complemented by a clear legislative framework in order to further reduce accessibility to terrorist content online and adequately address a rapidly evolving problem. This legislative framework seeks to build on voluntary efforts, which were reinforced by the Commission Recommendation (EU) 2018/334 and responds to calls made by the European Parliament to strengthen measures to tackle illegal and harmful content and by the European Council to improve the automatic detection and removal of content that incites to terrorist acts.

_________________
1a P8_TA-PROV(2018)0512
7 Commission Recommendation (EU) 2018/334 of 1 March 2018 on measures to effectively tackle illegal content online (OJ L 63, 6.3.2018, p. 50).

Or. en

Amendment 64
Eva Joly
Proposal for a regulation
Recital 4

Text proposed by the Commission | Amendment

(4) Efforts at Union level to counter terrorist content online commenced in 2015 through a framework of voluntary cooperation between Member States and hosting service providers need to be complemented by a clear legislative framework in order to further reduce accessibility to terrorist content online and adequately address a rapidly evolving problem. This legislative framework seeks to build on voluntary efforts, which were reinforced by the Commission Recommendation (EU) 2018/3347 and responds to calls made by the European Parliament to strengthen measures to tackle illegal and harmful content and by the European Council to improve the automatic detection and removal of content that incites to terrorist acts.

(4) Efforts at Union level to counter terrorist content online commenced in 2015 through a framework of voluntary cooperation between Member States and hosting service providers need to be complemented by a clear legislative framework in order to further reduce accessibility to terrorist content online, adequately address an evolving problem, and put in place necessary safeguards to ensure the rule of law and the protection of fundamental rights. This legislative framework seeks to build on and address some shortcomings of voluntary efforts, which were reinforced by the Commission Recommendation (EU) 2018/3347 and responds to calls made by the European Parliament to strengthen measures to tackle illegal content in line with the horizontal framework established by Directive 2000/31/EC and by the European Council to improve the detection and removal of terrorist content.

_________________
7 Commission Recommendation (EU) 2018/334 of 1 March 2018 on measures to effectively tackle illegal content online (OJ L 63, 6.3.2018, p. 50).

Or. en

Amendment 65
Maria Grapini
Proposal for a regulation
Recital 4

Text proposed by the Commission | Amendment

(4) Efforts at Union level to counter terrorist content online commenced in 2015 through a framework of voluntary cooperation between Member States and hosting service providers need to be complemented by a clear legislative framework in order to further reduce accessibility to terrorist content online and adequately address a rapidly evolving problem. This legislative framework seeks to build on voluntary efforts, which were reinforced by the Commission Recommendation (EU) 2018/3347 and responds to calls made by the European Parliament to strengthen measures to tackle illegal and harmful content and by the European Council to improve the automatic detection and removal of content that incites to terrorist acts.

(4) Efforts at Union level to counter terrorist content online commenced in 2015 through a framework of voluntary cooperation between Member States and hosting service providers need to be complemented by a clear legislative framework uniformly applicable in all EU Member States in order to further reduce accessibility to terrorist content online and adequately address a rapidly evolving problem. This legislative framework seeks to build on voluntary efforts, which were reinforced by the Commission Recommendation (EU) 2018/3347 and responds to calls made by the European Parliament to strengthen measures to tackle illegal and harmful content and by the European Council to improve the automatic detection and removal of content that incites to terrorist acts.

_________________
7 Commission Recommendation (EU) 2018/334 of 1 March 2018 on measures to effectively tackle illegal content online (OJ L 63, 6.3.2018, p. 50).

Or. ro

Amendment 66
Eva Joly
Proposal for a regulation
Recital 5

Text proposed by the Commission | Amendment

(5) The application of this Regulation should not affect the application of Article 14 of Directive 2000/31/EC8 . In particular, any measures taken by the hosting service provider in compliance with this Regulation, including any proactive measures, should not in themselves lead to that service provider losing the benefit of the liability exemption provided for in that provision. This Regulation leaves unaffected the powers of national authorities and courts to establish liability of hosting service providers in specific cases where the conditions under Article 14 of Directive 2000/31/EC for liability exemption are not met.

(5) This Regulation should lay down specific obligations of certain hosting service providers, and duties of care for those hosting service providers exposed to a significant amount of illegal terrorist content. The application of this Regulation should be without prejudice to Articles 14 and 15 of Directive 2000/31/EC8. In particular, the liability exemption granted to hosting service providers should not be affected by any measures they take in compliance with this Regulation, including any additional measures, should not in themselves lead to that service provider losing the benefit of the liability exemption provided for in that provision, on the condition that they do not have actual knowledge of illegal activity or information or, upon obtaining such knowledge, they remove or disable access to that content expeditiously. As Article 15 of Directive 2000/31/EC prohibits general monitoring obligations on the information which they store as well as general obligations to actively seek facts or circumstances indicating illegal activity, this Regulation should not lead to information transmitted by competent authorities to hosting providers that is vague about the status of legality of the content notified. Where the hosting provider is not informed by the competent authority whether the content notified is considered to be illegal, it might risk facing liability for failing to act expeditiously to remove the content. Therefore, this information needs to be provided in any case by the competent authority. This Regulation leaves unaffected the powers of national authorities and courts to establish liability of hosting service providers in specific cases where the conditions under Article 14 of Directive 2000/31/EC for liability exemption are not met.

_________________
8 Directive 2000/31/EC of the European Parliament and of the Council of 8 June 2000 on certain legal aspects of information society services, in particular electronic commerce, in the Internal Market (‘Directive on electronic commerce’) (OJ L 178, 17.7.2000, p. 1).

Or. en

Amendment 67
Michał Boni, Róża Gräfin von Thun und Hohenstein
Proposal for a regulation
Recital 5

Text proposed by the Commission | Amendment

(5) The application of this Regulation should not affect the application of Article 14 of Directive 2000/31/EC8 . In particular, any measures taken by the hosting service provider in compliance with this Regulation, including any proactive measures, should not in themselves lead to that service provider losing the benefit of the liability exemption provided for in that provision. This Regulation leaves unaffected the powers of national

(5) This Regulation should apply without prejudice to Article 14 and Article 15 of Directive 2000/31/EC8 . In particular, any measures taken by the hosting service provider in compliance with this Regulation, including any proactive measures, should not in themselves lead to that service provider losing the benefit of the liability exemption provided for in that provision and should not lead to the general monitoring obligation. This

AM\1177907EN.docx 21/121 PE636.146v02-00

EN

Page 22: AM_Com_LegReport - europarl.europa.eu  · Web viewIt is questionable if the choice of legal basis (114 TFEU, internal market) is adequate in light of the stated goal: “This Regulation

authorities and courts to establish liability of hosting service providers in specific cases where the conditions under Article 14 of Directive 2000/31/EC for liability exemption are not met.

Regulation leaves unaffected the powers of national authorities and courts to establish liability of hosting service providers in specific cases where the conditions under Article 14 of Directive 2000/31/EC for liability exemption are not met.

_________________
8 Directive 2000/31/EC of the European Parliament and of the Council of 8 June 2000 on certain legal aspects of information society services, in particular electronic commerce, in the Internal Market (‘Directive on electronic commerce’) (OJ L 178, 17.7.2000, p. 1).

Or. en

Amendment 68
Andrejs Mamikins
Proposal for a regulation
Recital 5

Text proposed by the Commission | Amendment

(5) The application of this Regulation should not affect the application of Article 14 of Directive 2000/31/EC8 . In particular, any measures taken by the hosting service provider in compliance with this Regulation, including any proactive measures, should not in themselves lead to that service provider losing the benefit of the liability exemption provided for in that provision. This Regulation leaves unaffected the powers of national authorities and courts to establish liability of hosting service providers in specific cases where the conditions under Article 14 of Directive 2000/31/EC for liability exemption are not met.

(5) The application of this Regulation should not affect the application of Article 14 of Directive 2000/31/EC8. In particular, any measures taken by the hosting service provider in compliance with this Regulation, including any proactive measures, should not in themselves lead to that service provider losing the benefit of the liability exemption provided for in those provisions or being imposed with an obligation to monitor. This Regulation leaves unaffected the powers of national authorities and courts to establish liability of hosting service providers in specific cases where the conditions under Article 14 of Directive 2000/31/EC for liability exemption are not met.

_________________
8 Directive 2000/31/EC of the European Parliament and of the Council of 8 June 2000 on certain legal aspects of information society services, in particular electronic commerce, in the Internal Market (‘Directive on electronic commerce’) (OJ L 178, 17.7.2000, p. 1).

Or. en

Amendment 69
Maite Pagazaurtundúa Ruiz, Cecilia Wikström
Proposal for a regulation
Recital 5

Text proposed by the Commission | Amendment

(5) The application of this Regulation should not affect the application of Article 14 of Directive 2000/31/EC8 . In particular, any measures taken by the hosting service provider in compliance with this Regulation, including any proactive measures, should not in themselves lead to that service provider losing the benefit of the liability exemption provided for in that provision. This Regulation leaves unaffected the powers of national authorities and courts to establish liability of hosting service providers in specific cases where the conditions under Article 14 of Directive 2000/31/EC for liability exemption are not met.

(5) The application of this Regulation should not affect the application of Article 14 of Directive 2000/31/EC8 . In particular, any measures taken by the hosting service provider in compliance with this Regulation, including any voluntary proactive measures, should not in themselves lead to that service provider losing the benefit of the liability exemption provided for in that provision. This Regulation leaves unaffected the powers of national authorities and courts to establish liability of hosting service providers in specific cases where the conditions under Article 14 of Directive 2000/31/EC for liability exemption are not met.

_________________
8 Directive 2000/31/EC of the European Parliament and of the Council of 8 June 2000 on certain legal aspects of information society services, in particular electronic commerce, in the Internal Market (‘Directive on electronic commerce’) (OJ L 178, 17.7.2000, p. 1).

Or. en

Amendment 70
Marietje Schaake, Sophia in ‘t Veld, Angelika Mlinar
Proposal for a regulation
Recital 5

Text proposed by the Commission | Amendment

(5) The application of this Regulation should not affect the application of Article 14 of Directive 2000/31/EC8 . In particular, any measures taken by the hosting service provider in compliance with this Regulation, including any proactive measures, should not in themselves lead to that service provider losing the benefit of the liability exemption provided for in that provision. This Regulation leaves unaffected the powers of national authorities and courts to establish liability of hosting service providers in specific cases where the conditions under Article 14 of Directive 2000/31/EC for liability exemption are not met.

(5) The application of this Regulation should not affect the application of Article 14 and 15 of Directive 2000/31/EC8 . In particular, any measures taken by the hosting service provider in compliance with this Regulation, including any additional measures, should not in themselves lead to that service provider losing the benefit of the liability exemption provided for in that provision. This Regulation leaves unaffected the powers of national authorities and courts to establish liability of hosting service providers in specific cases where the conditions under Article 14 of Directive 2000/31/EC for liability exemption are not met.

_________________
8 Directive 2000/31/EC of the European Parliament and of the Council of 8 June 2000 on certain legal aspects of information society services, in particular electronic commerce, in the Internal Market (‘Directive on electronic commerce’) (OJ L 178, 17.7.2000, p. 1).

Or. en

Amendment 71
Cornelia Ernst, Barbara Spinelli, Marie-Christine Vergiat
Proposal for a regulation
Recital 5

Text proposed by the Commission | Amendment

(5) The application of this Regulation should not affect the application of Article 14 of Directive 2000/31/EC8 . In particular, any measures taken by the hosting service provider in compliance with this Regulation, including any proactive measures, should not in themselves lead to that service provider losing the benefit of the liability exemption provided for in that provision. This Regulation leaves unaffected the powers of national authorities and courts to establish liability of hosting service providers in specific cases where the conditions under Article 14 of Directive 2000/31/EC for liability exemption are not met.

(5) The application of this Regulation should not affect the application of Article 14 of Directive 2000/31/EC8 . In particular, any measures taken by the hosting service provider in compliance with this Regulation, should not in themselves lead to that service provider losing the benefit of the liability exemption provided for in that provision. This Regulation leaves unaffected the powers of national authorities and courts to establish liability of hosting service providers in specific cases where the conditions under Article 14 of Directive 2000/31/EC for liability exemption are not met.

_________________
8 Directive 2000/31/EC of the European Parliament and of the Council of 8 June 2000 on certain legal aspects of information society services, in particular electronic commerce, in the Internal Market (‘Directive on electronic commerce’) (OJ L 178, 17.7.2000, p. 1).

Or. en

Amendment 72
Josef Weidenholzer
Proposal for a regulation
Recital 5

Text proposed by the Commission Amendment

(5) The application of this Regulation should not affect the application of Article 14 of Directive 2000/31/EC8 . In particular, any measures taken by the hosting service provider in compliance with this Regulation, including any proactive measures, should not in themselves lead to that service provider losing the benefit of the liability exemption provided for in that provision. This Regulation leaves unaffected the powers of national authorities and courts to establish liability of hosting service providers in specific cases where the conditions under Article 14 of Directive 2000/31/EC for liability exemption are not met.

(5) The application of this Regulation should not affect the application of Articles 14 and 15 of Directive 2000/31/EC8 . In particular, any measures taken by the hosting service provider in compliance with this Regulation should not in themselves lead to that service provider losing the benefit of the liability exemption provided for in that provision. This Regulation leaves unaffected the powers of national authorities and courts to establish liability of hosting service providers in specific cases where the conditions under Article 14 and 15 of Directive 2000/31/EC for liability exemption are not met.

_________________

8 Directive 2000/31/EC of the European Parliament and of the Council of 8 June 2000 on certain legal aspects of information society services, in particular electronic commerce, in the Internal Market (‘Directive on electronic commerce’) (OJ L 178, 17.7.2000, p. 1).


Or. en

Amendment 73
Maria Grapini
Proposal for a regulation
Recital 6

Text proposed by the Commission Amendment

(6) Rules to prevent the misuse of hosting services for the dissemination of terrorist content online in order to guarantee the smooth functioning of the internal market are set out in this Regulation in full respect of the fundamental rights protected in the Union’s legal order and notably those guaranteed in the Charter of Fundamental Rights of the European Union.

(6) Rules to prevent the misuse of hosting services for the dissemination of terrorist content online in order to guarantee the smooth functioning of the internal market, providing a maximum degree of protection for users, thereby boosting their confidence, are set out in this Regulation in full respect of the fundamental rights protected in the Union’s legal order and notably those guaranteed in the Charter of Fundamental Rights of the European Union.

Or. ro

Amendment 74
Eva Joly
Proposal for a regulation
Recital 6

Text proposed by the Commission Amendment

(6) Rules to prevent the misuse of hosting services for the dissemination of terrorist content online in order to guarantee the smooth functioning of the internal market are set out in this Regulation in full respect of the fundamental rights protected in the Union’s legal order and notably those guaranteed in the Charter of Fundamental Rights of the European Union.

(6) Rules to address the misuse of hosting services for the dissemination of terrorist content online in order to guarantee the smooth functioning of the internal market are set out in this Regulation in full respect of the rule of law and the fundamental rights protected in the Union’s legal order and notably those guaranteed in the Charter of Fundamental Rights of the European Union.

Or. en

Amendment 75
Josef Weidenholzer
Proposal for a regulation
Recital 6


Text proposed by the Commission Amendment

(6) Rules to prevent the misuse of hosting services for the dissemination of terrorist content online in order to guarantee the smooth functioning of the internal market are set out in this Regulation in full respect of the fundamental rights protected in the Union’s legal order and notably those guaranteed in the Charter of Fundamental Rights of the European Union.

(6) Rules to prevent the misuse of hosting services for the dissemination of illegal terrorist content online in order to guarantee the smooth functioning of the internal market are set out in this Regulation and should fully respect the fundamental rights as outlined in the Union’s legal order and notably those guaranteed in the Charter of Fundamental Rights of the European Union.

Or. en

Amendment 76
Cornelia Ernst, Barbara Spinelli, Marie-Christine Vergiat
Proposal for a regulation
Recital 6

Text proposed by the Commission Amendment

(6) Rules to prevent the misuse of hosting services for the dissemination of terrorist content online in order to guarantee the smooth functioning of the internal market are set out in this Regulation in full respect of the fundamental rights protected in the Union’s legal order and notably those guaranteed in the Charter of Fundamental Rights of the European Union.

(6) Rules to limit the dissemination of terrorist content online in order to guarantee the smooth functioning of the internal market are set out in this Regulation in full respect of the fundamental rights protected in the Union’s legal order and notably those guaranteed in the Charter of Fundamental Rights of the European Union.

Or. en

Amendment 77
Eva Joly
Proposal for a regulation
Recital 7

Text proposed by the Commission Amendment

(7) This Regulation contributes to the protection of public security while establishing appropriate and robust safeguards to ensure protection of the fundamental rights at stake. This includes the rights to respect for private life and to the protection of personal data, the right to effective judicial protection, the right to freedom of expression, including the freedom to receive and impart information, the freedom to conduct a business, and the principle of non-discrimination. Competent authorities and hosting service providers should only adopt measures which are necessary, appropriate and proportionate within a democratic society, taking into account the particular importance accorded to the freedom of expression and information, which constitutes one of the essential foundations of a pluralist, democratic society, and is one of the values on which the Union is founded. Measures constituting interference in the freedom of expression and information should be strictly targeted, in the sense that they must serve to prevent the dissemination of terrorist content, but without thereby affecting the right to lawfully receive and impart information, taking into account the central role of hosting service providers in facilitating public debate and the distribution and receipt of facts, opinions and ideas in accordance with the law.

(7) This Regulation contributes to the protection of public security while establishing appropriate and robust safeguards to ensure protection of the rule of law and the fundamental rights at stake. This includes the rights to respect for private life and to the protection of personal data, the right to effective judicial protection, the right to freedom of expression, including the freedom to receive and impart information, the freedom to conduct a business, and the principle of non-discrimination. Competent authorities and hosting service providers should only adopt measures which are necessary, appropriate and proportionate within a democratic society, taking into account the particular importance accorded to the freedom of expression and information and the rights to privacy and the protection of personal data, which constitute one of the essential foundations of a pluralist, democratic society, and one of the values on which the Union is founded. Measures taken to remove terrorist content online constitute an interference in the freedom of expression and information, and therefore should be strictly targeted, necessary, appropriate and proportionate to help the fight against terrorism, including investigation and prosecution of terrorist offences, in the sense that they must serve to address the dissemination of terrorist content, but without thereby affecting the right to lawfully receive and impart information, taking into account the central role of hosting service providers in facilitating public debate and the distribution and receipt of facts, opinions and ideas in accordance with the law.

Or. en

Amendment 78
Cornelia Ernst, Barbara Spinelli, Marie-Christine Vergiat
Proposal for a regulation
Recital 7

Text proposed by the Commission Amendment

(7) This Regulation contributes to the protection of public security while establishing appropriate and robust safeguards to ensure protection of the fundamental rights at stake. This includes the rights to respect for private life and to the protection of personal data, the right to effective judicial protection, the right to freedom of expression, including the freedom to receive and impart information, the freedom to conduct a business, and the principle of non-discrimination. Competent authorities and hosting service providers should only adopt measures which are necessary, appropriate and proportionate within a democratic society, taking into account the particular importance accorded to the freedom of expression and information, which constitutes one of the essential foundations of a pluralist, democratic society, and is one of the values on which the Union is founded. Measures constituting interference in the freedom of expression and information should be strictly targeted, in the sense that they must serve to prevent the dissemination of terrorist content, but without thereby affecting the right to lawfully receive and impart information, taking into account the central role of hosting service providers in facilitating public debate and the distribution and receipt of facts, opinions and ideas in accordance with the law.

(7) This Regulation contributes to the protection of public security while establishing appropriate and robust safeguards to ensure protection of the fundamental rights at stake. This includes the rights to respect for private life and to the protection of personal data, the right to effective judicial protection, the right to freedom of expression, including the freedom to receive and impart information, the freedom to conduct a business, and the principle of non-discrimination. Competent authorities and hosting service providers should only adopt measures which are necessary, appropriate and proportionate within a democratic society, taking into account the particular importance accorded to the right to freedom of expression and information as well as the right to privacy, which constitute one of the essential foundations of a pluralist, democratic society, and is one of the values on which the Union is founded. Measures constituting interference in the freedom of expression and information should be strictly targeted, in the sense that they must serve to prevent the dissemination of terrorist content, but without thereby affecting the right to lawfully receive and impart information, taking into account the central role of hosting service providers in facilitating public debate and the distribution and receipt of facts, opinions and ideas in accordance with the law. This Regulation should not affect the applicable rules on the processing of personal data, notably Regulation (EU) 2016/679 and Directive (EU) 2016/680.

Or. en

Amendment 79
Caterina Chinnici
Proposal for a regulation
Recital 7

Text proposed by the Commission Amendment

(7) This Regulation contributes to the protection of public security while establishing appropriate and robust safeguards to ensure protection of the fundamental rights at stake. This includes the rights to respect for private life and to the protection of personal data, the right to effective judicial protection, the right to freedom of expression, including the freedom to receive and impart information, the freedom to conduct a business, and the principle of non-discrimination. Competent authorities and hosting service providers should only adopt measures which are necessary, appropriate and proportionate within a democratic society, taking into account the particular importance accorded to the freedom of expression and information, which constitutes one of the essential foundations of a pluralist, democratic society, and is one of the values on which the Union is founded. Measures constituting interference in the freedom of expression and information should be strictly targeted, in the sense that they must serve to prevent the dissemination of terrorist content, but without thereby affecting the right to lawfully receive and impart information, taking into account the central role of hosting service providers in facilitating public debate and the distribution and receipt of facts, opinions and ideas in accordance with the law.

(7) This Regulation contributes to the protection of public security while establishing appropriate and robust safeguards to ensure protection of the fundamental rights at stake. This includes the rights to respect for private life and to the protection of personal data, the right to effective judicial protection, the right to freedom of expression, including the freedom to receive and impart information, the freedom to conduct a business, and the principle of non-discrimination. Competent authorities and hosting service providers should only adopt measures which are necessary, appropriate and proportionate within a democratic society, taking into account the particular importance accorded to the freedom of expression and information, which constitutes one of the essential foundations of a pluralist, democratic society, and is one of the values on which the Union is founded. Measures constituting interference in the freedom of expression and information should be strictly targeted, in the sense that they must serve to prevent the dissemination of terrorist content, but without thereby affecting the right to lawfully receive and impart information, taking into account the central role of hosting service providers in facilitating public debate and the distribution and receipt of facts, opinions and ideas in accordance with the law as effective online counterterrorism measures and the protection of freedom of expression are not conflicting goals, but complementary and mutually reinforcing.

Or. en

Amendment 80
Josef Weidenholzer
Proposal for a regulation
Recital 7

Text proposed by the Commission Amendment

(7) This Regulation contributes to the protection of public security while establishing appropriate and robust safeguards to ensure protection of the fundamental rights at stake. This includes the rights to respect for private life and to the protection of personal data, the right to effective judicial protection, the right to freedom of expression, including the freedom to receive and impart information, the freedom to conduct a business, and the principle of non-discrimination. Competent authorities and hosting service providers should only adopt measures which are necessary, appropriate and proportionate within a democratic society, taking into account the particular importance accorded to the freedom of expression and information, which constitutes one of the essential foundations of a pluralist, democratic society, and is one of the values on which the Union is founded. Measures constituting interference in the freedom of expression and information should be strictly targeted, in the sense that they must serve to prevent the dissemination of terrorist content, but without thereby affecting the right to lawfully receive and impart information, taking into account the central role of hosting service providers in facilitating public debate and the distribution and receipt of facts, opinions and ideas in accordance with the law.

(7) This Regulation seeks to contribute to the protection of public security and should establish appropriate and robust safeguards to ensure protection of the fundamental rights at stake. This includes the rights to respect for private life and to the protection of personal data, the right to effective judicial protection, the right to freedom of expression, including the freedom to receive and impart information, the freedom to conduct a business, and the principle of non-discrimination. Competent authorities and hosting service providers should only adopt measures which are necessary and proportionate in a democratic society, taking into account the particular importance accorded to the freedom of expression and freedom to receive and impart information, as well as the respect for private life and the protection of personal data, which constitute the essential foundations of a pluralist, democratic society, and are among the values on which the Union is founded. Measures constituting an interference with the freedom of expression and freedom to receive and impart information should be strictly proportionate and necessary, narrowly targeted, in the sense that they must serve to prevent the dissemination of illegal terrorist content only, and without thereby affecting the right to lawfully receive and impart information, taking into account the central role of hosting service providers in facilitating public debate and the distribution and receipt of facts, opinions and ideas in accordance with the law.

Or. en

Amendment 81
Michał Boni, Róża Gräfin von Thun und Hohenstein
Proposal for a regulation
Recital 7

Text proposed by the Commission Amendment

(7) This Regulation contributes to the protection of public security while establishing appropriate and robust safeguards to ensure protection of the fundamental rights at stake. This includes the rights to respect for private life and to the protection of personal data, the right to effective judicial protection, the right to freedom of expression, including the freedom to receive and impart information, the freedom to conduct a business, and the principle of non-discrimination. Competent authorities and hosting service providers should only adopt measures which are necessary, appropriate and proportionate within a democratic society, taking into account the particular importance accorded to the freedom of expression and information, which constitutes one of the essential foundations of a pluralist, democratic society, and is one of the values on which the Union is founded. Measures constituting interference in the freedom of expression and information should be strictly targeted, in the sense that they must serve to prevent the dissemination of terrorist content, but without thereby affecting the right to lawfully receive and impart information, taking into account the central role of hosting service providers in facilitating public debate and the distribution and receipt of facts, opinions and ideas in accordance with the law.

(7) This Regulation contributes to the protection of public security while establishing appropriate and robust safeguards to ensure protection of the fundamental rights at stake. This includes the rights to respect for private life and to the protection of personal data, the right to effective judicial protection, the right to freedom of expression, including the freedom to receive and impart information, the freedom to conduct a business, and the principle of non-discrimination. Competent authorities and hosting service providers should only adopt measures which are necessary, appropriate and proportionate within a democratic society, taking into account the particular importance accorded to the freedom of expression and information, which constitutes one of the essential foundations of a pluralist, democratic society, and is one of the values on which the Union is founded. Measures constituting interference in the freedom of expression and information should be strictly targeted, in the sense that they must serve to prevent the dissemination of terrorist content, but without thereby affecting the right to lawfully receive and impart information, taking into account the central role of hosting service providers in facilitating public debate and the distribution and receipt of facts, opinions and ideas in accordance with the law. This Regulation should not affect the applicable rules on the processing of personal data, in particular as laid down in Regulation (EU) 2016/679 and Directive (EU) 2016/680.

Or. en

Amendment 82
Birgit Sippel, Anna Hedh, Josef Weidenholzer
Proposal for a regulation
Recital 7

Text proposed by the Commission Amendment

(7) This Regulation contributes to the protection of public security while establishing appropriate and robust safeguards to ensure protection of the fundamental rights at stake. This includes the rights to respect for private life and to the protection of personal data, the right to effective judicial protection, the right to freedom of expression, including the freedom to receive and impart information, the freedom to conduct a business, and the principle of non-discrimination. Competent authorities and hosting service providers should only adopt measures which are necessary, appropriate and proportionate within a democratic society, taking into account the particular importance accorded to the freedom of expression and information, which constitutes one of the essential foundations of a pluralist, democratic society, and is one of the values on which the Union is founded. Measures constituting interference in the freedom of expression and information should be strictly targeted, in the sense that they must serve to prevent the dissemination of terrorist content, but without thereby affecting the right to lawfully receive and impart information, taking into account the central role of hosting service providers in facilitating public debate and the distribution and receipt of facts, opinions and ideas in accordance with the law.

(7) This Directive contributes to the protection of public security while establishing appropriate and robust safeguards to ensure protection of the fundamental rights at stake. This includes the rights to respect for private life and to the protection of personal data, the right to effective judicial protection, the right to freedom of expression, including the freedom to receive and impart information, the freedom to conduct a business, and the principle of non-discrimination. Competent authorities and hosting service providers should only adopt measures which are necessary, appropriate and proportionate within a democratic society, taking into account the particular importance accorded to the freedom of expression and information, as well as the respect for private and family life and the protection of personal data, which constitute one of the essential foundations of a pluralist, democratic society, and are among the values on which the Union is founded. Measures constituting interference with the freedom of expression and information should be strictly targeted, in the sense that they must serve to prevent the dissemination of terrorist content, but without thereby affecting the right to lawfully receive and impart information, taking into account the central role of hosting service providers in facilitating public debate and the distribution and receipt of facts, opinions and ideas in accordance with the law.

Or. en

Amendment 83
Maite Pagazaurtundúa Ruiz, Cecilia Wikström, Nathalie Griesbeck, Gérard Deprez, Louis Michel
Proposal for a regulation
Recital 7 a (new)

Text proposed by the Commission Amendment

(7 a) This Regulation should not have the effect of modifying the obligation for Member States to respect fundamental rights and fundamental legal principles as enshrined in Article 6 of the Treaty on the European Union and Article 11 of the Charter of Fundamental Rights, nor the applicable rules on the processing of personal data, such as Regulation 2016/679 and Directive 2016/680.

Or. en

Amendment 84
Birgit Sippel, Anna Hedh, Josef Weidenholzer
Proposal for a regulation
Recital 7 a (new)

Text proposed by the Commission Amendment

(7 a) Measures taken to remove terrorist content online should be targeted, necessary, appropriate and proportionate.

Or. en

Amendment 85
Marietje Schaake, Sophia in ‘t Veld, Angelika Mlinar
Proposal for a regulation
Recital 8

Text proposed by the Commission Amendment

(8) The right to an effective remedy is enshrined in Article 19 TEU and Article 47 of the Charter of Fundamental Rights of the European Union. Each natural or legal person has the right to an effective judicial remedy before the competent national court against any of the measures taken pursuant to this Regulation, which can adversely affect the rights of that person. The right includes, in particular the possibility for hosting service providers and content providers to effectively contest the removal orders before the court of the Member State whose authorities issued the removal order.

(8) The right to an effective remedy is enshrined in Article 19 TEU and Article 47 of the Charter of Fundamental Rights of the European Union. Each natural or legal person has the right to an effective judicial remedy before the competent national court against any of the measures taken pursuant to this Regulation, which can adversely affect the rights of that person. The right includes, in particular, in the context of this Regulation, the possibility for users to contest the removal of content resulting from measures taken by the hosting service provider as foreseen in this Regulation and informed of effective means of remedies, both internal and before the court of the Member State of residence. It also includes the ability for hosting service providers and content providers to effectively contest the removal orders before the court of the Member State whose authorities issued the removal order, the court of the Member State where the hosting service provider is established or represented, or the court of the Member State of residence for the content provider.

Or. en

Amendment 86
Maite Pagazaurtundúa Ruiz, Cecilia Wikström, Nathalie Griesbeck, Gérard Deprez, Louis Michel
Proposal for a regulation
Recital 8

Text proposed by the Commission Amendment

(8) The right to an effective remedy is enshrined in Article 19 TEU and Article 47 of the Charter of Fundamental Rights of the European Union. Each natural or legal person has the right to an effective judicial remedy before the competent national court against any of the measures taken pursuant to this Regulation, which can adversely affect the rights of that person. The right includes, in particular the possibility for hosting service providers and content providers to effectively contest the removal orders before the court of the Member State whose authorities issued the removal order.

(8) The right to an effective remedy is enshrined in Article 19 TEU and Article 47 of the Charter of Fundamental Rights of the European Union. Each natural or legal person has the right to an effective judicial remedy before the competent national court against any of the measures taken pursuant to this Regulation, which can adversely affect the rights of that person. The right includes, in particular the possibility for hosting service providers and content providers to effectively contest the removal orders before the court of the Member State whose authorities issued the removal order. That right can be exercised before the court of the Member State where the hosting service provider has its main establishment or where the legal representative designated by the hosting provider pursuant to Article 16 resides or is established.

Or. en

Amendment 87
Rachida Dati, Monika Hohlmeier, Roberta Metsola
Proposal for a regulation
Recital 8


Text proposed by the Commission Amendment

(8) The right to an effective remedy is enshrined in Article 19 TEU and Article 47 of the Charter of Fundamental Rights of the European Union. Each natural or legal person has the right to an effective judicial remedy before the competent national court against any of the measures taken pursuant to this Regulation, which can adversely affect the rights of that person. The right includes, in particular the possibility for hosting service providers and content providers to effectively contest the removal orders before the court of the Member State whose authorities issued the removal order.

(8) The right to an effective remedy is enshrined in Article 19 TEU and Article 47 of the Charter of Fundamental Rights of the European Union. Each natural or legal person has the right to an effective judicial remedy before the competent national court against any of the measures taken pursuant to this Regulation, which can adversely affect the rights of that person. The right includes, in particular the possibility for hosting service providers and content providers to effectively contest the removal orders before the court of the Member State whose authorities issued the removal order. Moreover, hosting service providers should have the right to contest a decision imposing proactive measures or penalties before the court of the Member State whose authorities have adopted the decision.

Or. en

Amendment 88
Eva Joly
Proposal for a regulation
Recital 8

Text proposed by the Commission Amendment

(8) The right to an effective remedy is enshrined in Article 19 TEU and Article 47 of the Charter of Fundamental Rights of the European Union. Each natural or legal person has the right to an effective judicial remedy before the competent national court against any of the measures taken pursuant to this Regulation, which can adversely affect the rights of that person. The right includes, in particular the possibility for hosting service providers and content providers to effectively contest the removal orders before the court of the Member State whose authorities issued the removal order.

(8) The right to an effective remedy is enshrined in Article 19 TEU and Article 47 of the Charter of Fundamental Rights of the European Union. Each natural or legal person has the right to an effective judicial remedy before the competent national court against any of the measures taken pursuant to this Regulation, which can adversely affect the rights of that person. The right includes, in particular the possibility for hosting service providers and content providers to effectively contest the removal orders before the court of the Member State whose authorities issued the removal order, and the possibility for content providers to contest the results of measures taken by the hosting provider.

Or. en

Amendment 89
Axel Voss
Proposal for a regulation
Recital 9

Text proposed by the Commission Amendment

(9) In order to provide clarity about the actions that both hosting service providers and competent authorities should take to prevent the dissemination of terrorist content online, this Regulation should establish a definition of terrorist content for preventative purposes drawing on the definition of terrorist offences under Directive (EU) 2017/541 of the European Parliament and of the Council9 . Given the need to address the most harmful terrorist propaganda online, the definition should capture material and information that incites, encourages or advocates the commission or contribution to terrorist offences, provides instructions for the commission of such offences or promotes the participation in activities of a terrorist group. Such information includes in particular text, images, sound recordings and videos. When assessing whether content constitutes terrorist content within the meaning of this Regulation, competent authorities as well as hosting service providers should take into account factors such as the nature and wording of the statements, the context in which the statements were made and their potential to lead to harmful consequences, thereby affecting the security and safety of persons. The fact that the material was produced by, is attributable to or disseminated on behalf of an EU-listed terrorist organisation or person constitutes an important factor in the assessment. Content disseminated for educational, journalistic or research purposes should be adequately protected. Furthermore, the expression of radical, polemic or controversial views in the public debate on sensitive political questions should not be considered terrorist content.

(9) In order to provide clarity about the actions that both hosting service providers and competent authorities should take to prevent the dissemination of terrorist content online, this Regulation should establish a definition of terrorist content for preventative purposes drawing on the definition of terrorist offences under Directive (EU) 2017/541 of the European Parliament and of the Council. Given the need to address the most harmful terrorist propaganda online, the definition should capture material that incites, encourages or advocates the commission or contribution to terrorist offences, or promotes the participation in activities of a terrorist group. The definition should also cover content that provides guidance for the making and the use of explosives, firearms, any other weapons, noxious or hazardous substances as well as Chemical, Biological, Radiological and Nuclear (CBRN) substances and any guidance on other methods and techniques, including the selection of targets, for the purpose of committing terrorist offences. Such material includes in particular text, images, sound recordings and videos. When assessing whether content constitutes terrorist content within the meaning of this Regulation, competent authorities as well as hosting service providers should take into account factors such as the nature and wording of the statements, the context in which the statements were made and their potential to lead to harmful consequences, thereby affecting the security and safety of persons. The fact that the material was produced by, is attributable to or disseminated on behalf of an EU-listed terrorist organisation or person constitutes an important factor in the assessment. Content disseminated for educational, counter-narrative or research purposes should be adequately protected, by striking a fair balance between fundamental rights and public security needs. Especially in cases where the content provider holds an editorial responsibility, any decision as to the removal of the disseminated material should take into account the journalistic standards established by press or media regulation consistent with the law of the Union and the Charter of Fundamental Rights. Furthermore, the expression of radical, polemic or controversial views in the public debate on sensitive political questions should not be considered terrorist content.

_________________

9 Directive (EU) 2017/541 of the European Parliament and of the Council of 15 March 2017 on combating terrorism and replacing Council Framework Decision 2002/475/JHA and amending Council Decision 2005/671/JHA (OJ L 88, 31.3.2017, p. 6).

Or. en

Amendment 90
Josef Weidenholzer
Proposal for a regulation
Recital 9

Text proposed by the Commission Amendment

(9) In order to provide clarity about the actions that both hosting service providers and competent authorities should take to prevent the dissemination of terrorist content online, this Regulation should establish a definition of terrorist content for preventative purposes drawing on the definition of terrorist offences under Directive (EU) 2017/541 of the European Parliament and of the Council9 . Given the need to address the most harmful terrorist propaganda online, the definition should capture material and information that incites, encourages or advocates the commission or contribution to terrorist offences, provides instructions for the commission of such offences or promotes the participation in activities of a terrorist group. Such information includes in particular text, images, sound recordings and videos. When assessing whether content constitutes terrorist content within the meaning of this Regulation, competent authorities as well as hosting service providers should take into account factors such as the nature and wording of the statements, the context in which the statements were made and their potential to lead to harmful consequences, thereby affecting the security and safety of persons. The fact that the material was produced by, is attributable to or disseminated on behalf of an EU-listed terrorist organisation or person constitutes an important factor in the assessment. Content disseminated for educational, journalistic or research purposes should be adequately protected. Furthermore, the expression of radical, polemic or controversial views in the public debate on sensitive political questions should not be considered terrorist content.

(9) In order to provide clarity about the actions that both hosting service providers and competent authorities should take to prevent the dissemination of illegal terrorist content online, this Regulation should establish a definition of illegal terrorist content based on the definition of terrorist offences under Directive (EU) 2017/541 of the European Parliament and of the Council9. Given the need to address the illegal terrorist content online, the definition should capture material and information that intentionally incites, encourages or advocates the commission or contribution to terrorist offences, provides instructions for the commission of such offences when and insofar as they are committed with a specific terrorist aim, namely to seriously intimidate a population, to unduly compel a government or an international organisation to perform or abstain from performing any act, or to seriously destabilise or destroy the fundamental political, constitutional, economic or social structures of a country or an international organisation. Such information includes in particular text, images, sound recordings and videos. When assessing whether content constitutes terrorist content within the meaning of this Regulation, competent authorities as well as hosting service providers should take into account factors such as the nature and wording of the statements, the context in which the statements were made and their potential to lead to harmful consequences, thereby affecting the security and safety of persons or the respect for other fundamental rights, including the right to freedom of speech, and the rights to privacy and family life and to the protection of personal data. The fact that the material was produced by, is attributable to or disseminated on behalf of an EU-listed terrorist organisation or person constitutes an important factor in the assessment. Content disseminated for educational, journalistic or research purposes should be adequately protected. Furthermore, the expression of radical, polemic or controversial views in the public debate on sensitive political questions falls outside the scope of this Regulation and, in particular, of the definition of public provocation to commit terrorist offences.

_________________

9 Directive (EU) 2017/541 of the European Parliament and of the Council of 15 March 2017 on combating terrorism and replacing Council Framework Decision 2002/475/JHA and amending Council Decision 2005/671/JHA (OJ L 88, 31.3.2017, p. 6).

Or. en

Amendment 91
Birgit Sippel, Anna Hedh, Josef Weidenholzer
Proposal for a regulation
Recital 9

Text proposed by the Commission Amendment

(9) In order to provide clarity about the actions that both hosting service providers and competent authorities should take to prevent the dissemination of terrorist content online, this Regulation should establish a definition of terrorist content for preventative purposes drawing on the definition of terrorist offences under Directive (EU) 2017/541 of the European Parliament and of the Council9 . Given the need to address the most harmful terrorist propaganda online, the definition should capture material and information that incites, encourages or advocates the commission or contribution to terrorist offences, provides instructions for the commission of such offences or promotes the participation in activities of a terrorist group. Such information includes in particular text, images, sound recordings and videos. When assessing whether content constitutes terrorist content within the meaning of this Regulation, competent authorities as well as hosting service providers should take into account factors such as the nature and wording of the statements, the context in which the statements were made and their potential to lead to harmful consequences, thereby affecting the security and safety of persons. The fact that the material was produced by, is attributable to or disseminated on behalf of an EU-listed terrorist organisation or person constitutes an important factor in the assessment. Content disseminated for educational, journalistic or research purposes should be adequately protected. Furthermore, the expression of radical, polemic or controversial views in the public debate on sensitive political questions should not be considered terrorist content.

(9) In order to provide clarity about the actions that both hosting service providers and competent authorities should take to prevent the dissemination of terrorist content online, this Directive should establish a definition of terrorist content based on the definition of terrorist offences under Directive (EU) 2017/541 of the European Parliament and of the Council. Given the need to address the terrorist content online, the definition should capture material and information that intentionally incites, encourages or advocates the commission or contribution to terrorist offences, provides instructions for the commission of such offences when and insofar as they are committed with a specific terrorist aim, namely to seriously intimidate a population, to unduly compel a government or an international organisation to perform or abstain from performing any act, or to seriously destabilise or destroy the fundamental political, constitutional, economic or social structures of a country or an international organisation. Such information includes in particular text, images, sound recordings and videos. When assessing whether content constitutes terrorist content within the meaning of this Directive, competent authorities as well as hosting service providers should take into account factors such as the nature and wording of the statements, the context in which the statements were made and their potential to lead to harmful consequences, thereby affecting the security and safety of persons or the respect for other fundamental rights, including the right to freedom of speech, and the rights to private and family life and to the protection of personal data. The fact that the material was produced by, is attributable to or disseminated on behalf of an EU-listed terrorist organisation or person constitutes an important factor in the assessment. Content disseminated for educational, journalistic or research purposes should be adequately protected. Furthermore, the expression of radical, polemic or controversial views in the public debate on sensitive political questions, falls outside the scope of this Directive and, in particular, of the definition of public provocation to commit terrorist offences.

_________________

9 Directive (EU) 2017/541 of the European Parliament and of the Council of 15 March 2017 on combating terrorism and replacing Council Framework Decision 2002/475/JHA and amending Council Decision 2005/671/JHA (OJ L 88, 31.3.2017, p. 6).

Or. en

Amendment 92
Eva Joly
Proposal for a regulation
Recital 9

Text proposed by the Commission Amendment


(9) In order to provide clarity about the actions that both hosting service providers and competent authorities should take to prevent the dissemination of terrorist content online, this Regulation should establish a definition of terrorist content for preventative purposes drawing on the definition of terrorist offences under Directive (EU) 2017/541 of the European Parliament and of the Council9 . Given the need to address the most harmful terrorist propaganda online, the definition should capture material and information that incites, encourages or advocates the commission or contribution to terrorist offences, provides instructions for the commission of such offences or promotes the participation in activities of a terrorist group. Such information includes in particular text, images, sound recordings and videos. When assessing whether content constitutes terrorist content within the meaning of this Regulation, competent authorities as well as hosting service providers should take into account factors such as the nature and wording of the statements, the context in which the statements were made and their potential to lead to harmful consequences, thereby affecting the security and safety of persons. The fact that the material was produced by, is attributable to or disseminated on behalf of an EU-listed terrorist organisation or person constitutes an important factor in the assessment. Content disseminated for educational, journalistic or research purposes should be adequately protected. Furthermore, the expression of radical, polemic or controversial views in the public debate on sensitive political questions should not be considered terrorist content.

(9) In order to provide clarity about the actions that both hosting service providers and competent authorities might take to address the dissemination of terrorist content online, this Regulation should establish a definition of terrorist content that is in line with the definition of terrorist offences under Directive (EU) 2017/541 of the European Parliament and of the Council9. Such information includes in particular text, images, sound recordings and videos. When assessing whether content constitutes terrorist content within the meaning of this Regulation, competent authorities as well as hosting service providers should take into account factors such as the nature and wording of the statements, the context in which the statements were made and their potential to lead to harmful consequences, thereby affecting the security and safety of persons. The fact that the material was produced by, is attributable to or disseminated on behalf of an EU-listed terrorist organisation or person constitutes an important factor in the assessment. Content disseminated for educational, artistic, journalistic or research purposes, or for awareness raising purposes against terrorist activity, should be adequately protected. Furthermore, the expression of radical, polemic or controversial views in the public debate on sensitive political questions should not be considered terrorist content.

_________________

9 Directive (EU) 2017/541 of the European Parliament and of the Council of 15 March 2017 on combating terrorism and replacing Council Framework Decision 2002/475/JHA and amending Council Decision 2005/671/JHA (OJ L 88, 31.3.2017, p. 6).

Or. en

Amendment 93
Rachida Dati, Monika Hohlmeier, Roberta Metsola
Proposal for a regulation
Recital 9

Text proposed by the Commission Amendment

(9) In order to provide clarity about the actions that both hosting service providers and competent authorities should take to prevent the dissemination of terrorist content online, this Regulation should establish a definition of terrorist content for preventative purposes drawing on the definition of terrorist offences under Directive (EU) 2017/541 of the European Parliament and of the Council9 . Given the need to address the most harmful terrorist propaganda online, the definition should capture material and information that incites, encourages or advocates the commission or contribution to terrorist offences, provides instructions for the commission of such offences or promotes the participation in activities of a terrorist group. Such information includes in particular text, images, sound recordings and videos. When assessing whether content constitutes terrorist content within the meaning of this Regulation, competent authorities as well as hosting service providers should take into account factors such as the nature and wording of the statements, the context in which the statements were made and their potential to lead to harmful consequences, thereby affecting the security and safety of persons. The fact that the material was produced by, is attributable to or disseminated on behalf of an EU-listed terrorist organisation or person constitutes an important factor in the assessment. Content disseminated for educational, journalistic or research purposes should be adequately protected. Furthermore, the expression of radical, polemic or controversial views in the public debate on sensitive political questions should not be considered terrorist content.

(9) In order to provide clarity about the actions that both hosting service providers and competent authorities should take to prevent the dissemination of terrorist content online, this Regulation should establish a definition of terrorist content for preventative purposes drawing on the definition of terrorist offences under Directive (EU) 2017/541 of the European Parliament and of the Council9 . Given the need to address the most harmful terrorist propaganda online, the definition should capture material and information that incites, encourages or advocates the commission or contribution to terrorist offences, provides instructions for the commission of such offences, guidance on methods or techniques on the making or use of explosives, firearms or other weapons or noxious substances as well as CBRN substances, or promotes the participation in activities of a terrorist group. Such information includes in particular text, images, sound recordings and videos. When assessing whether content constitutes terrorist content within the meaning of this Regulation, competent authorities as well as hosting service providers should take into account factors such as the nature and wording of the statements, the context in which the statements were made and their potential to lead to harmful consequences, thereby affecting the security and safety of persons. The fact that the material was produced by, is attributable to or disseminated on behalf of an EU-listed terrorist organisation or person constitutes an important factor in the assessment. Content disseminated for educational, journalistic, counter-narrative or research purposes should be adequately protected. Furthermore, the expression of radical, polemic or controversial views in the public debate on sensitive political questions should not be considered terrorist content.

_________________

9 Directive (EU) 2017/541 of the European Parliament and of the Council of 15 March 2017 on combating terrorism and replacing Council Framework Decision 2002/475/JHA and amending Council Decision 2005/671/JHA (OJ L 88, 31.3.2017, p. 6).

Or. en

Amendment 94
Ana Gomes
Proposal for a regulation
Recital 9

Text proposed by the Commission

(9) In order to provide clarity about the actions that both hosting service providers and competent authorities should take to prevent the dissemination of terrorist content online, this Regulation should establish a definition of terrorist content for preventative purposes drawing on the definition of terrorist offences under Directive (EU) 2017/541 of the European Parliament and of the Council9 . Given the need to address the most harmful terrorist propaganda online, the definition should capture material and information that incites, encourages or advocates the commission or contribution to terrorist offences, provides instructions for the

commission of such offences or promotes the participation in activities of a terrorist group. Such information includes in particular text, images, sound recordings and videos. When assessing whether content constitutes terrorist content within the meaning of this Regulation, competent authorities as well as hosting service providers should take into account factors such as the nature and wording of the statements, the context in which the statements were made and their potential to lead to harmful consequences, thereby affecting the security and safety of persons. The fact that the material was produced by, is attributable to or disseminated on behalf of an EU-listed terrorist organisation or person constitutes an important factor in the assessment. Content disseminated for educational, journalistic or research purposes should be adequately protected. Furthermore, the expression of radical, polemic or controversial views in the public debate on sensitive political questions should not be considered terrorist content.

Amendment

(9) In order to provide clarity about the actions that both hosting service providers and competent authorities should take to prevent the dissemination of terrorist content online, this Regulation should establish a definition of terrorist content for preventative purposes drawing on the definition of terrorist offences under Directive (EU) 2017/541 of the European Parliament and of the Council9 . Given the need to address the most harmful terrorist propaganda online, the definition should capture material and information that incites, encourages or advocates the commission or contribution to terrorist offences, provides instructions for the

commission of such offences or promotes the participation in activities of a terrorist group. Such information includes in particular text, images, sound recordings and videos. When assessing whether content constitutes terrorist content within the meaning of this Regulation, competent authorities as well as hosting service providers should take into account factors such as the nature and wording of the statements, the context in which the statements were made and their potential to lead to harmful consequences, thereby affecting the security and safety of persons. The fact that the material was produced by, is attributable to or disseminated on behalf of an EU-listed terrorist organisation or person constitutes an important factor in the assessment. Content disseminated for educational, journalistic, research, artistic and editorial purposes should not however be considered as terrorist content, and should therefore be excluded from the scope of this Regulation. Furthermore, the expression of radical, polemic or controversial views in the public debate on sensitive political questions should not be considered terrorist content.

_________________
9 Directive (EU) 2017/541 of the European Parliament and of the Council of 15 March 2017 on combating terrorism and replacing Council Framework Decision 2002/475/JHA and amending Council Decision 2005/671/JHA (OJ L 88, 31.3.2017, p. 6).

Or. en

Amendment 95
Caterina Chinnici
Proposal for a regulation
Recital 9

Text proposed by the Commission

(9) In order to provide clarity about the actions that both hosting service providers and competent authorities should take to prevent the dissemination of terrorist content online, this Regulation should establish a definition of terrorist content for preventative purposes drawing on the definition of terrorist offences under Directive (EU) 2017/541 of the European Parliament and of the Council9 . Given the need to address the most harmful terrorist propaganda online, the definition should capture material and information that incites, encourages or advocates the commission or contribution to terrorist offences, provides instructions for the commission of such offences or promotes the participation in activities of a terrorist group. Such information includes in particular text, images, sound recordings and videos. When assessing whether content constitutes terrorist content within the meaning of this Regulation, competent authorities as well as hosting service providers should take into account factors such as the nature and wording of the statements, the context in which the statements were made and their potential to lead to harmful consequences, thereby affecting the security and safety of persons. The fact that the material was produced by, is attributable to or disseminated on behalf of an EU-listed terrorist organisation or person constitutes an important factor in the assessment. Content disseminated for educational, journalistic or research purposes should be adequately protected. Furthermore, the expression of radical, polemic or controversial views in the public debate on sensitive political questions should not be considered terrorist content.

Amendment

(9) In order to provide clarity about the
actions that both hosting service providers and competent authorities should take to prevent the dissemination of terrorist content online, this Regulation should establish a definition of terrorist content for preventative purposes drawing on the definition of terrorist offences under Directive (EU) 2017/541 of the European Parliament and of the Council9 . Given the need to address the most harmful terrorist propaganda online, the definition should capture material and information that incites, encourages or advocates the commission or contribution to terrorist offences, provides instructions for the commission of such offences or promotes the participation in activities of a terrorist group. Such information includes in particular text, images, sound recordings and videos. When assessing whether content constitutes terrorist content within the meaning of this Regulation, competent authorities as well as hosting service providers should take into account factors such as the nature and wording of the statements, the context in which the statements were made and their potential to lead to harmful consequences, thereby affecting the security and safety of persons. The fact that the material was produced by, is attributable to or disseminated on behalf of an EU-listed terrorist organisation or person constitutes an important factor in the assessment. Content disseminated for educational, journalistic or research purposes should be adequately protected. Furthermore, the expression of radical, polemic or controversial views in the public debate on sensitive political questions should not be considered terrorist content. With this regards service providers and users in question should always be able to obtain judicial redress;

_________________
9 Directive (EU) 2017/541 of the European Parliament and of the Council of 15 March 2017 on combating terrorism and replacing Council Framework Decision 2002/475/JHA and amending Council Decision 2005/671/JHA (OJ L 88, 31.3.2017, p. 6).

Or. en

Amendment 96
Nicolas Bay, Gilles Lebreton
Proposal for a regulation
Recital 9

Text proposed by the Commission

(9) In order to provide clarity about the actions that both hosting service providers and competent authorities should take to prevent the dissemination of terrorist content online, this Regulation should establish a definition of terrorist content for preventative purposes drawing on the definition of terrorist offences under Directive (EU) 2017/541 of the European Parliament and of the Council9 . Given the need to address the most harmful terrorist propaganda online, the definition should capture material and information that incites, encourages or advocates the commission or contribution to terrorist offences, provides instructions for the commission of such offences or promotes the participation in activities of a terrorist group. Such information includes in particular text, images, sound recordings and videos. When assessing whether content constitutes terrorist content within the meaning of this Regulation, competent authorities as well as hosting service providers should take into account factors such as the nature and wording of the statements, the context in which the statements were made and their potential to lead to harmful consequences, thereby affecting the security and safety of persons. The fact that the material was produced by, is attributable to or disseminated on behalf of an EU-listed terrorist organisation or person constitutes an important factor in

the assessment. Content disseminated for educational, journalistic or research purposes should be adequately protected. Furthermore, the expression of radical, polemic or controversial views in the public debate on sensitive political questions should not be considered terrorist content.

Amendment

(9) In order to provide clarity about the actions that both hosting service providers and competent authorities should take to prevent the dissemination of terrorist content online, this Regulation should establish a definition of terrorist content for preventative purposes drawing on the definition of terrorist offences under Directive (EU) 2017/541 of the European Parliament and of the Council9 . Given the need to address the most harmful terrorist propaganda online, the definition should capture material and information that incites, encourages or advocates the commission or contribution to terrorist offences, provides instructions for the commission of such offences or promotes the participation in activities of a terrorist group. Such information includes in particular text, images, sound recordings and videos. When assessing whether content constitutes terrorist content within the meaning of this Regulation, competent authorities as well as hosting service providers should take into account factors such as the nature and wording of the statements, the context in which the statements were made and their potential to lead to harmful consequences, thereby affecting the security and safety of persons. The fact that the material was produced by, is attributable to or disseminated on behalf of an EU-listed terrorist organisation or person constitutes an important factor in

the assessment. Content disseminated for educational, journalistic or research purposes should be adequately protected. Furthermore, the expression of any kind of views which can be perceived as radical, polemic or controversial in the public debate and especially on any kind of political questions should not be considered terrorist content.

_________________
9 Directive (EU) 2017/541 of the European Parliament and of the Council of 15 March 2017 on combating terrorism and replacing Council Framework Decision 2002/475/JHA and amending Council Decision 2005/671/JHA (OJ L 88, 31.3.2017, p. 6).

Or. en

Amendment 97
Cornelia Ernst, Barbara Spinelli, Marie-Christine Vergiat
Proposal for a regulation
Recital 9

Text proposed by the Commission

(9) In order to provide clarity about the actions that both hosting service providers and competent authorities should take to prevent the dissemination of terrorist content online, this Regulation should establish a definition of terrorist content for preventative purposes drawing on the definition of terrorist offences under Directive (EU) 2017/541 of the European Parliament and of the Council9 . Given the need to address the most harmful terrorist propaganda online, the definition should capture material and information that incites, encourages or advocates the commission or contribution to terrorist offences, provides instructions for the commission of such offences or promotes the participation in activities of a terrorist group. Such information includes in particular text, images, sound recordings

and videos. When assessing whether content constitutes terrorist content within the meaning of this Regulation, competent authorities as well as hosting service providers should take into account factors such as the nature and wording of the statements, the context in which the statements were made and their potential to lead to harmful consequences, thereby affecting the security and safety of persons. The fact that the material was produced by, is attributable to or disseminated on behalf of an EU-listed terrorist organisation or person constitutes an important factor in the assessment. Content disseminated for educational, journalistic or research purposes should be adequately protected. Furthermore, the expression of radical, polemic or controversial views in the public debate on sensitive political questions should not be considered terrorist content.

Amendment

(9) In order to provide clarity about the actions that both hosting service providers and competent authorities should take to prevent the dissemination of terrorist content online, this Regulation should establish a definition of terrorist content for preventative purposes drawing on the definition of terrorist offences under Directive (EU) 2017/541 of the European Parliament and of the Council9 . Given the need to address the most harmful terrorist propaganda online, the definition should capture material and information that manifestly incites the commission of terrorist offences, provides instructions for the commission of such offences or promotes the participation in activities of a terrorist group. Such information includes in particular text, images, sound recordings and videos. When assessing whether

content constitutes terrorist content within the meaning of this Regulation, competent authorities should take into account factors such as the nature and wording of the statements, the context in which the statements were made and their demonstrable potential to lead to harmful consequences, thereby affecting the security and safety of persons. The fact that the material was produced by, is attributable to or disseminated on behalf of an EU-listed terrorist organisation or person constitutes an important factor in the assessment. Content disseminated for educational, journalistic or research purposes should be adequately protected. Furthermore, the expression of radical, polemic or controversial views in the public debate on sensitive political questions should not be considered terrorist content.

_________________
9 Directive (EU) 2017/541 of the European Parliament and of the Council of 15 March 2017 on combating terrorism and replacing Council Framework Decision 2002/475/JHA and amending Council Decision 2005/671/JHA (OJ L 88, 31.3.2017, p. 6).

Or. en

Amendment 98
Michał Boni, Róża Gräfin von Thun und Hohenstein
Proposal for a regulation
Recital 9

Text proposed by the Commission

(9) In order to provide clarity about the actions that both hosting service providers and competent authorities should take to prevent the dissemination of terrorist content online, this Regulation should establish a definition of terrorist content for preventative purposes drawing on the

definition of terrorist offences under Directive (EU) 2017/541 of the European Parliament and of the Council9 . Given the need to address the most harmful terrorist propaganda online, the definition should capture material and information that incites, encourages or advocates the commission or contribution to terrorist offences, provides instructions for the commission of such offences or promotes the participation in activities of a terrorist group. Such information includes in particular text, images, sound recordings and videos. When assessing whether content constitutes terrorist content within the meaning of this Regulation, competent authorities as well as hosting service providers should take into account factors such as the nature and wording of the statements, the context in which the statements were made and their potential to lead to harmful consequences, thereby affecting the security and safety of persons. The fact that the material was produced by, is attributable to or disseminated on behalf of an EU-listed terrorist organisation or person constitutes an important factor in the assessment. Content disseminated for educational, journalistic or research purposes should be adequately protected. Furthermore, the expression of radical, polemic or controversial views in the public debate on sensitive political questions should not be considered terrorist content.

Amendment

(9) In order to provide clarity about the actions that both hosting service providers and competent authorities should take to prevent the dissemination of terrorist content online, this Regulation should establish a definition of terrorist content for preventative purposes drawing on the

definition of terrorist offences under Directive (EU) 2017/541 of the European Parliament and of the Council9 . Given the need to address the most harmful terrorist propaganda online, the definition should capture material and information that incites, encourages or advocates the commission or contribution to terrorist offences, provides instructions for the commission of such offences or promotes the participation in activities of an EU-listed terrorist group. Such information includes in particular text, images, sound recordings and videos. When assessing whether content constitutes terrorist content within the meaning of this Regulation, competent authorities as well as hosting service providers should take into account factors such as the nature and wording of the statements, the context in which the statements were made and their potential to lead to harmful consequences, thereby affecting the security and safety of persons. The fact that the material was produced by, is attributable to or disseminated on behalf of an EU-listed terrorist organisation or person constitutes an important factor in the assessment. Content disseminated for educational, journalistic or research purposes should be adequately protected. Furthermore, the expression of radical, polemic or controversial views in the public debate on sensitive political questions should not be considered terrorist content.

_________________
9 Directive (EU) 2017/541 of the European Parliament and of the Council of 15 March 2017 on combating terrorism and replacing Council Framework Decision 2002/475/JHA and amending Council Decision 2005/671/JHA (OJ L 88, 31.3.2017, p. 6).

Or. en

Amendment 99
Monika Beňová
Proposal for a regulation
Recital 9

Text proposed by the Commission

(9) In order to provide clarity about the actions that both hosting service providers and competent authorities should take to prevent the dissemination of terrorist content online, this Regulation should establish a definition of terrorist content for preventative purposes drawing on the definition of terrorist offences under Directive (EU) 2017/541 of the European Parliament and of the Council9 . Given the need to address the most harmful terrorist propaganda online, the definition should capture material and information that incites, encourages or advocates the commission or contribution to terrorist offences, provides instructions for the commission of such offences or promotes the participation in activities of a terrorist group. Such information includes in particular text, images, sound recordings and videos. When assessing whether content constitutes terrorist content within the meaning of this Regulation, competent authorities as well as hosting service providers should take into account factors such as the nature and wording of the statements, the context in which the statements were made and their potential to lead to harmful consequences, thereby affecting the security and safety of persons. The fact that the material was produced by, is attributable to or disseminated on behalf of an EU-listed terrorist organisation or person constitutes an important factor in the assessment. Content disseminated for educational, journalistic or research purposes should be adequately protected. Furthermore, the expression of radical, polemic or controversial views in the public debate on sensitive political questions should not be considered terrorist content.

Amendment

(9) In order to provide clarity about the actions that both hosting service providers and competent authorities should take to prevent the dissemination of terrorist content online, this Regulation should establish a definition of terrorist content for preventative purposes drawing on the definition of terrorist offences under Directive (EU) 2017/541 of the European Parliament and of the Council9 . Given the need to address the most harmful terrorist propaganda online, the definition should capture material and information that incites, encourages or advocates the commission or contribution to terrorist offences, provides instructions for the commission of such offences or promotes the participation in activities of a terrorist group. Such information includes, but is not limited to text, images, sound recordings and videos. When assessing whether content constitutes terrorist content within the meaning of this Regulation, competent authorities as well as hosting service providers should take into account factors such as the nature and wording of the statements, the context in which the statements were made and their potential to lead to harmful consequences, thereby affecting the security and safety of persons. The fact that the material was produced by, is attributable to or disseminated on behalf of an EU-listed terrorist organisation or person constitutes an important factor in the assessment. Content disseminated for educational, journalistic or research purposes should be adequately protected. Furthermore, the expression of radical, polemic or controversial views in the public debate on sensitive political questions should not be considered terrorist content.

_________________
9 Directive (EU) 2017/541 of the European Parliament and of the Council of 15 March 2017 on combating terrorism and replacing Council Framework Decision 2002/475/JHA and amending Council Decision 2005/671/JHA (OJ L 88, 31.3.2017, p. 6).

Or. en

Amendment 100
Marietje Schaake, Sophia in ‘t Veld, Angelika Mlinar
Proposal for a regulation
Recital 9

Text proposed by the Commission

(9) In order to provide clarity about the actions that both hosting service providers and competent authorities should take to prevent the dissemination of terrorist content online, this Regulation should establish a definition of terrorist content for preventative purposes drawing on the definition of terrorist offences under Directive (EU) 2017/541 of the European Parliament and of the Council9 . Given the need to address the most harmful terrorist propaganda online, the definition should capture material and information that incites, encourages or advocates the commission or contribution to terrorist offences, provides instructions for the commission of such offences or promotes the participation in activities of a terrorist group. Such information includes in particular text, images, sound recordings and videos. When assessing whether content constitutes terrorist content within the meaning of this Regulation, competent authorities as well as hosting service providers should take into account factors such as the nature and wording of the statements, the context in which the statements were made and their potential to lead to harmful consequences, thereby

affecting the security and safety of persons. The fact that the material was produced by, is attributable to or disseminated on behalf of an EU-listed terrorist organisation or person constitutes an important factor in the assessment. Content disseminated for educational, journalistic or research purposes should be adequately protected. Furthermore, the expression of radical, polemic or controversial views in the public debate on sensitive political questions should not be considered terrorist content.

Amendment

(9) In order to provide clarity about the actions that both hosting service providers and competent authorities should take to prevent the dissemination of terrorist content online, this Regulation should establish a definition of terrorist content based on the definition of terrorist offences under Directive (EU) 2017/541 of the European Parliament and of the Council. Given the need to address the illegal terrorist content online, the definition should capture material and information that incites, encourages or advocates the commission or contribution to terrorist offences, provides training for the commission of such offences or recruits for a terrorist group. Such information includes in particular text, images, sound recordings and videos. When assessing whether content constitutes illegal terrorist content within the meaning of this Regulation, competent authorities as well as hosting service providers must base their assessment on factors such as the notion of intention, the nature and wording of the statements, the context in which the statements were made and their demonstrable risk of provoking actions with harmful consequences, thereby

affecting the security and safety of persons. The fact that the material was produced by, is attributable to or disseminated on behalf of an EU-listed terrorist organisation or person constitutes an important factor in the assessment. Content disseminated for educational, journalistic or research purposes should be adequately protected. Furthermore, the expression of radical, polemic or controversial views in the public debate on sensitive political questions should not be considered terrorist content.

_________________
9 Directive (EU) 2017/541 of the European Parliament and of the Council of 15 March 2017 on combating terrorism and replacing Council Framework Decision 2002/475/JHA and amending Council Decision 2005/671/JHA (OJ L 88, 31.3.2017, p. 6).

Or. en

Amendment 101
Cornelia Ernst, Barbara Spinelli, Marie-Christine Vergiat
Proposal for a regulation
Recital 9 a (new)

Text proposed by the Commission

Amendment

(9 a) Where the disseminated material is published under the editorial responsibility of a content provider, any decision as to the removal of such content can only be made based on a judicial order. This is necessary in order to fully respect the law of the Union and the right to freedom of expression and the right to freedom and pluralism of the media as enshrined in Article 11 of the Charter of Fundamental Rights.

Or. en

Amendment 102
Eva Joly
Proposal for a regulation
Recital 10

Text proposed by the Commission

(10) In order to cover those online hosting services where terrorist content is disseminated, this Regulation should apply to information society services which store information provided by a recipient of the service at his or her request and in making the information stored available to third parties, irrespective of whether this activity is of a mere technical, automatic and passive nature. By way of example such providers of information society services include social media platforms, video streaming services, video, image and audio sharing services, file sharing and other cloud services to the extent they make the information available to third parties and websites where users can make comments or post reviews. The Regulation should also apply to hosting service providers established outside the Union but offering services within the Union, since a significant proportion of hosting service providers exposed to terrorist content on their services are established in third countries. This should ensure that all companies operating in the Digital Single Market comply with the same requirements, irrespective of their country of establishment. The determination as to whether a service provider offers services in the Union requires an assessment whether the service provider enables legal or natural persons in one or more Member States to use its services. However, the mere accessibility of a service provider’s website or of an email address and of other contact details in one or more Member States taken in isolation should not be a sufficient condition for the application of this Regulation.

Amendment

(10) In order to cover those online hosting services where terrorist content is disseminated, this Regulation should apply to information society services whose main or one of the main purposes is to offer the storage of information provided by a recipient of the service at his or her request and in making the information stored available to the public, and who has general control of and access to the content data stored and processed, irrespective of whether this activity is of a mere technical, automatic and passive nature, and irrespective of whether a payment of the user is required. By way of example such providers of information society services include social media platforms, video streaming services, video, image and audio sharing services, file sharing and other online services to the extent they make the information available to the public. The Regulation should also apply to hosting service providers established outside the Union but offering services to users who are within the Union, since a significant proportion of hosting service providers exposed to terrorist content on their services are established in third countries. This should ensure that all companies operating in the Digital Single Market comply with the same requirements, irrespective of their country of establishment. The determination as to whether a service provider offers services in the Union requires an assessment whether the service provider enables legal or natural persons in one or more Member States to use its services, and whether these services are specifically targeted at users in the Union. However, the mere accessibility of a service provider’s website or of an email address and of other contact details in one

or more Member States taken in isolation should not be a sufficient condition for the application of this Regulation. By contrast, services which consist of providing mere technical facility such as ‘cloud services’ which consist in the provision of on demand physical or virtual resources that provide computing and storage infrastructure capabilities on which the service provider has no contractual rights as to what content is stored or how it is processed or made publicly available by its customers or by the end-users of such customers, and where the service provider has no technical capability to remove specific content stored by their customers or the end-users of their customers, or services that consist of selling goods on-line, delivery of goods as such, or the provision of services off-line, or private websites, including blogs, should not be considered as hosting service providers within the scope of this Regulation. Mere conduits and other electronic communication services within the meaning of Directive xxx/2019 of the European Parliament and of the Council [European Electronic Communication Code] or providers of caching services, or other services provided in other layers of the Internet infrastructure, such as registries and registrars, DNS (domain name system) or adjacent services, such as payment services or DDoS (distributed denial of service) protection services should also not be understood as hosting service providers. The same is the case for Interpersonal communication services that enable direct interpersonal and interactive exchange of information between a finite number of persons, whereby the persons initiating or participating in the communication determine its recipient(s).

Or. en

Amendment 103
Rachida Dati, Monika Hohlmeier, Roberta Metsola
Proposal for a regulation
Recital 10

Text proposed by the Commission

(10) In order to cover those online hosting services where terrorist content is disseminated, this Regulation should apply to information society services which store information provided by a recipient of the service at his or her request and in making the information stored available to third parties, irrespective of whether this activity is of a mere technical, automatic and passive nature. By way of example such providers of information society services include social media platforms, video streaming services, video, image and audio sharing services, file sharing and other cloud services to the extent they make the information available to third parties and websites where users can make comments or post reviews. The Regulation should also apply to hosting service providers established outside the Union but offering services within the Union, since a significant proportion of hosting service providers exposed to terrorist content on their services are established in third countries. This should ensure that all companies operating in the Digital Single Market comply with the same requirements, irrespective of their country of establishment. The determination as to whether a service provider offers services in the Union requires an assessment whether the service provider enables legal or natural persons in one or more Member States to use its services. However, the mere accessibility of a service provider’s website or of an email address and of other contact details in one or more Member States taken in isolation should not be a sufficient condition for the application of this Regulation.

Amendment

(10) In order to cover those online hosting services where terrorist content is disseminated, this Regulation should apply to information society services which store information provided by a recipient of the service at his or her request and in making the information and material stored available to the public, irrespective of whether this activity is of a mere technical, automatic and passive nature. By way of example such providers of information society services include social media platforms, video streaming services, video, image and audio sharing services, file sharing and other cloud services, to the extent they make the information available to the public and websites where users can make comments or post reviews. Cloud infrastructure services which comprise the provision of on demand physical or virtual resources that provide computing and storage infrastructure capabilities on which the service provider has no contractual rights as to what content is stored or how it is processed or made publicly available by its customers or by the end-users of such customers, and where the service provider has no technical capability to remove specific content stored by their customers or the end-users of their customers, should not be considered to fall within the scope of this Regulation. The Regulation should also apply to hosting service providers established outside the Union but offering services within the Union, since a significant proportion of hosting service providers exposed to terrorist content on their services are established in third countries. This should ensure that all companies operating in the Digital Single Market comply with the same

requirements, irrespective of their country of establishment. The determination as to whether a service provider offers services in the Union requires an assessment whether the service provider enables legal or natural persons in one or more Member States to use its services. However, the mere accessibility of a service provider’s website or of an email address and of other contact details in one or more Member States taken in isolation should not be a sufficient condition for the application of this Regulation.

Or. en

Amendment 104
Michał Boni
Proposal for a regulation
Recital 10

Text proposed by the Commission

(10) In order to cover those online hosting services where terrorist content is disseminated, this Regulation should apply to information society services which store information provided by a recipient of the service at his or her request and in making the information stored available to third parties, irrespective of whether this activity is of a mere technical, automatic and passive nature. By way of example such providers of information society services include social media platforms, video streaming services, video, image and audio sharing services, file sharing and other cloud services to the extent they make the information available to third parties and websites where users can make comments or post reviews. The Regulation should also apply to hosting service providers established outside the Union but offering services within the Union, since a significant proportion of hosting service providers exposed to terrorist content on their services are established in third countries. This should ensure that all

companies operating in the Digital Single Market comply with the same requirements, irrespective of their country of establishment. The determination as to whether a service provider offers services in the Union requires an assessment whether the service provider enables legal or natural persons in one or more Member States to use its services. However, the mere accessibility of a service provider’s website or of an email address and of other contact details in one or more Member States taken in isolation should not be a sufficient condition for the application of this Regulation.

Amendment

(10) In order to cover those online hosting services where terrorist content is disseminated, this Regulation should apply to information society services which store information provided by a recipient of the service at his or her request and in making the information stored available to the public, irrespective of whether this activity is of a mere technical, automatic and passive nature. By way of example such providers of information society services include social media platforms, video streaming services, video, image and audio sharing services, file sharing and other cloud services to the extent they make the information available to the public. For the purpose of this Regulation cloud infrastructure services providers and services at other layers of the Internet infrastructure than application layer as well as webhosting service providers that provide the technical infrastructure to websites operators should not be considered as ‘hosting service providers’. The Regulation should apply to hosting

service providers established outside the Union but offering services within the Union, since a significant proportion of hosting service providers exposed to terrorist content on their services are established in third countries. This should ensure that all companies operating in the Digital Single Market comply with the same requirements, irrespective of their country of establishment. The determination as to whether a service provider offers services in the Union requires an assessment whether the service provider enables legal or natural persons in one or more Member States to use its services. However, the mere accessibility of a service provider’s website or of an email address and of other contact details in one or more Member States taken in isolation should not be a sufficient condition for the application of this Regulation.

Or. en

Amendment 105
Birgit Sippel, Anna Hedh, Josef Weidenholzer
Proposal for a regulation
Recital 10

Text proposed by the Commission

(10) In order to cover those online hosting services where terrorist content is disseminated, this Regulation should apply to information society services which store information provided by a recipient of the service at his or her request and in making the information stored available to third parties, irrespective of whether this activity is of a mere technical, automatic and passive nature. By way of example such providers of information society services include social media platforms, video streaming services, video, image and audio sharing services, file sharing and other cloud services to the extent they make the information available to third

parties and websites where users can make comments or post reviews. The Regulation should also apply to hosting service providers established outside the Union but offering services within the Union, since a significant proportion of hosting service providers exposed to terrorist content on their services are established in third countries. This should ensure that all companies operating in the Digital Single Market comply with the same requirements, irrespective of their country of establishment. The determination as to whether a service provider offers services in the Union requires an assessment whether the service provider enables legal or natural persons in one or more Member States to use its services. However, the mere accessibility of a service provider’s website or of an email address and of other contact details in one or more Member States taken in isolation should not be a sufficient condition for the application of this Regulation.

Amendment

(10) In order to cover those online hosting services where terrorist content is disseminated, this Directive should apply to information society services which store information provided by a recipient of the service at his or her request and in making the information stored available to the public. By way of example such providers of information society services include social media platforms, video streaming services, video, image and audio sharing services, to the extent they make the information available to the public. The Directive should also apply to hosting service providers established outside the Union but offering services within the

Union, since a significant proportion of hosting service providers exposed to terrorist content on their services are established in third countries. This should ensure that all companies operating in the Digital Single Market comply with the same requirements, irrespective of their country of establishment. The determination as to whether a service provider offers services in the Union requires an assessment whether the service provider enables legal or natural persons in one or more Member States to use its services. However, the mere accessibility of a service provider’s website or of an email address and of other contact details in one or more Member States taken in isolation should not be a sufficient condition for the application of this Directive.

Or. en

Amendment 106
Josef Weidenholzer
Proposal for a regulation
Recital 10

Text proposed by the Commission

(10) In order to cover those online hosting services where terrorist content is disseminated, this Regulation should apply to information society services which store information provided by a recipient of the service at his or her request and in making the information stored available to third parties, irrespective of whether this activity is of a mere technical, automatic and passive nature. By way of example such providers of information society services include social media platforms, video streaming services, video, image and

audio sharing services, file sharing and other cloud services to the extent they make the information available to third parties and websites where users can make comments or post reviews. The Regulation should also apply to hosting service providers established outside the Union but offering services within the Union, since a significant proportion of hosting service providers exposed to terrorist content on their services are established in third countries. This should ensure that all companies operating in the Digital Single Market comply with the same requirements, irrespective of their country of establishment. The determination as to whether a service provider offers services in the Union requires an assessment whether the service provider enables legal or natural persons in one or more Member States to use its services. However, the mere accessibility of a service provider’s website or of an email address and of other contact details in one or more Member States taken in isolation should not be a sufficient condition for the application of this Regulation.

Amendment

(10) In order to cover those online hosting services where illegal terrorist content is disseminated, this Regulation should apply to information society services which store information provided by a recipient of the service at his or her request and in making the information stored available to the public, By way of example such providers of information society services include social media platforms, video streaming services, video, image and audio sharing services, to the extent they make the information available

to the public. The Regulation should also apply to hosting service providers established outside the Union but offering services within the Union, since a significant proportion of hosting service providers exposed to illegal terrorist content on their services are established in third countries. This should ensure that all companies operating in the Digital Single Market comply with the same requirements, irrespective of their country of establishment. The determination as to whether a service provider offers services in the Union requires an assessment whether the service provider enables legal or natural persons in one or more Member States to use its services. However, the mere accessibility of a service provider’s website or of an email address and of other contact details in one or more Member States taken in isolation should not be a sufficient condition for the application of this Regulation.

Or. en

Amendment 107
Cornelia Ernst, Barbara Spinelli, Marie-Christine Vergiat
Proposal for a regulation
Recital 10

Text proposed by the Commission

(10) In order to cover those online hosting services where terrorist content is disseminated, this Regulation should apply to information society services which store information provided by a recipient of the service at his or her request and in making the information stored available to third parties, irrespective of whether this activity is of a mere technical, automatic and passive nature. By way of example

such providers of information society services include social media platforms, video streaming services, video, image and audio sharing services, file sharing and other cloud services to the extent they make the information available to third parties and websites where users can make comments or post reviews. The Regulation should also apply to hosting service providers established outside the Union but offering services within the Union, since a significant proportion of hosting service providers exposed to terrorist content on their services are established in third countries. This should ensure that all companies operating in the Digital Single Market comply with the same requirements, irrespective of their country of establishment. The determination as to whether a service provider offers services in the Union requires an assessment whether the service provider enables legal or natural persons in one or more Member States to use its services. However, the mere accessibility of a service provider’s website or of an email address and of other contact details in one or more Member States taken in isolation should not be a sufficient condition for the application of this Regulation.

Amendment

(10) In order to cover those online hosting services where terrorist content is disseminated, this Regulation should apply to information society services which store information provided by a recipient of the service at his or her request and in making the information stored available to the public. By way of example such providers of information society services include social media platforms, video streaming

services, video, image and audio sharing services, file sharing and other cloud services to the extent they make the information available to the public. The Regulation should also apply to hosting service providers established outside the Union but offering services within the Union, since a significant proportion of hosting service providers exposed to terrorist content on their services are established in third countries. This should ensure that all companies operating in the Digital Single Market comply with the same requirements, irrespective of their country of establishment. The determination as to whether a service provider offers services in the Union requires an assessment whether the service provider enables legal or natural persons in one or more Member States to use its services. However, the mere accessibility of a service provider’s website or of an email address and of other contact details in one or more Member States taken in isolation should not be a sufficient condition for the application of this Regulation.

Or. en

Amendment 108
Maite Pagazaurtundúa Ruiz, Cecilia Wikström
Proposal for a regulation
Recital 10

Text proposed by the Commission

(10) In order to cover those online hosting services where terrorist content is disseminated, this Regulation should apply to information society services which store information provided by a recipient of the service at his or her request and in making the information stored available to third

parties, irrespective of whether this activity is of a mere technical, automatic and passive nature. By way of example such providers of information society services include social media platforms, video streaming services, video, image and audio sharing services, file sharing and other cloud services to the extent they make the information available to third parties and websites where users can make comments or post reviews. The Regulation should also apply to hosting service providers established outside the Union but offering services within the Union, since a significant proportion of hosting service providers exposed to terrorist content on their services are established in third countries. This should ensure that all companies operating in the Digital Single Market comply with the same requirements, irrespective of their country of establishment. The determination as to whether a service provider offers services in the Union requires an assessment whether the service provider enables legal or natural persons in one or more Member States to use its services. However, the mere accessibility of a service provider’s website or of an email address and of other contact details in one or more Member States taken in isolation should not be a sufficient condition for the application of this Regulation.

Amendment

(10) In order to cover those online hosting services where terrorist content is disseminated, this Regulation should apply to information society services which store information provided by a recipient of the service at his or her request and in making the information stored available to the

general public, irrespective of whether this activity is of a mere technical, automatic and passive nature. By way of example such providers of information society services include social media platforms, video streaming services, video, image and audio sharing services, file sharing and other cloud services to the extent they make the information available to the general public. The Regulation should also apply to hosting service providers established outside the Union but offering services within the Union, since a significant proportion of hosting service providers exposed to terrorist content on their services are established in third countries. This should ensure that all companies operating in the Digital Single Market comply with the same requirements, irrespective of their country of establishment. The determination as to whether a service provider offers services in the Union requires an assessment whether the service provider enables legal or natural persons in one or more Member States to use its services. However, the mere accessibility of a service provider’s website or of an email address and of other contact details in one or more Member States taken in isolation should not be a sufficient condition for the application of this Regulation.

Or. en

Amendment 109
Anna Maria Corazza Bildt
Proposal for a regulation
Recital 10

Text proposed by the Commission Amendment

(10) In order to cover those online hosting services where terrorist content is disseminated, this Regulation should apply to information society services which store information provided by a recipient of the

(10) In order to cover those online hosting services where terrorist content is disseminated, this Regulation should apply to information society services which store information provided by a recipient of the


service at his or her request and in making the information stored available to third parties, irrespective of whether this activity is of a mere technical, automatic and passive nature. By way of example such providers of information society services include social media platforms, video streaming services, video, image and audio sharing services, file sharing and other cloud services to the extent they make the information available to third parties and websites where users can make comments or post reviews. The Regulation should also apply to hosting service providers established outside the Union but offering services within the Union, since a significant proportion of hosting service providers exposed to terrorist content on their services are established in third countries. This should ensure that all companies operating in the Digital Single Market comply with the same requirements, irrespective of their country of establishment. The determination as to whether a service provider offers services in the Union requires an assessment whether the service provider enables legal or natural persons in one or more Member States to use its services. However, the mere accessibility of a service provider’s website or of an email address and of other contact details in one or more Member States taken in isolation should not be a sufficient condition for the application of this Regulation.

service at his or her request and in making the information and material stored available to the public. By way of example such providers of information society services include social media platforms, video streaming services, video, image and audio sharing services, file sharing and other cloud services excluding cloud IT infrastructure service providers, to the extent they make the information available to the public and websites where users can make comments or post reviews. The Regulation should also apply to hosting service providers established outside the Union but offering services within the Union, since a significant proportion of hosting service providers exposed to terrorist content on their services are established in third countries. This should ensure that all companies operating in the Digital Single Market comply with the same requirements, irrespective of their country of establishment. The determination as to whether a service provider offers services in the Union requires an assessment whether the service provider enables legal or natural persons in one or more Member States to use its services. However, the mere accessibility of a service provider’s website or of an email address and of other contact details in one or more Member States taken in isolation should not be a sufficient condition for the application of this Regulation.

Or. en

Amendment 110
Ana Gomes
Proposal for a regulation
Recital 10

Text proposed by the Commission Amendment

(10) In order to cover those online hosting services where terrorist content is disseminated, this Regulation should apply

(10) In order to cover those online hosting services where terrorist content is disseminated, this Regulation should apply


to information society services which store information provided by a recipient of the service at his or her request and in making the information stored available to third parties, irrespective of whether this activity is of a mere technical, automatic and passive nature. By way of example such providers of information society services include social media platforms, video streaming services, video, image and audio sharing services, file sharing and other cloud services to the extent they make the information available to third parties and websites where users can make comments or post reviews. The Regulation should also apply to hosting service providers established outside the Union but offering services within the Union, since a significant proportion of hosting service providers exposed to terrorist content on their services are established in third countries. This should ensure that all companies operating in the Digital Single Market comply with the same requirements, irrespective of their country of establishment. The determination as to whether a service provider offers services in the Union requires an assessment whether the service provider enables legal or natural persons in one or more Member States to use its services. However, the mere accessibility of a service provider’s website or of an email address and of other contact details in one or more Member States taken in isolation should not be a sufficient condition for the application of this Regulation.

to information society services which store information provided by a recipient of the service at his or her request and in making the information stored available to third parties, irrespective of whether this activity is of a mere technical, automatic and passive nature. By way of example such providers of information society services include social media platforms, video streaming services, video, image and audio sharing services, file sharing and other cloud services to the extent they make the information available to the public and websites where users can make comments or post reviews. The Regulation should also apply to hosting service providers established outside the Union but offering services within the Union, since a significant proportion of hosting service providers exposed to terrorist content on their services are established in third countries. This should ensure that all companies operating in the Digital Single Market comply with the same requirements, irrespective of their country of establishment. The determination as to whether a service provider offers services in the Union requires an assessment whether the service provider enables legal or natural persons in one or more Member States to use its services. However, the mere accessibility of a service provider’s website or of an email address and of other contact details in one or more Member States taken in isolation should not be a sufficient condition for the application of this Regulation.

Or. en

Amendment 111
Marietje Schaake, Sophia in ’t Veld, Angelika Mlinar
Proposal for a regulation
Recital 10

Text proposed by the Commission Amendment

(10) In order to cover those online (10) In order to cover those online


hosting services where terrorist content is disseminated, this Regulation should apply to information society services which store information provided by a recipient of the service at his or her request and in making the information stored available to third parties, irrespective of whether this activity is of a mere technical, automatic and passive nature. By way of example such providers of information society services include social media platforms, video streaming services, video, image and audio sharing services, file sharing and other cloud services to the extent they make the information available to third parties and websites where users can make comments or post reviews. The Regulation should also apply to hosting service providers established outside the Union but offering services within the Union, since a significant proportion of hosting service providers exposed to terrorist content on their services are established in third countries. This should ensure that all companies operating in the Digital Single Market comply with the same requirements, irrespective of their country of establishment. The determination as to whether a service provider offers services in the Union requires an assessment whether the service provider enables legal or natural persons in one or more Member States to use its services. However, the mere accessibility of a service provider’s website or of an email address and of other contact details in one or more Member States taken in isolation should not be a sufficient condition for the application of this Regulation.

hosting services where terrorist content is disseminated, this Regulation should apply to information society services whose main business activity consists in the storage information provided by a recipient of the service at his or her request and in making the information stored directly available to the public. By way of example such providers of information society services include social media platforms, video streaming services, video, image and audio sharing services, file sharing and other cloud services, to the extent they make the information directly available to the public third parties and websites where users can make comments or post reviews. The Regulation should also apply to hosting service providers established outside the Union but offering services within the Union, since a significant proportion of hosting service providers hosting illegal content on their services are established in third countries. This should ensure that all companies operating in the Digital Single Market comply with the same requirements, irrespective of their country of establishment. The determination as to whether a service provider offers services in the Union requires an assessment whether the service provider enables legal or natural persons in one or more Member States to use its services. However, the mere accessibility of a service provider’s website or of an email address and of other contact details in one or more Member States taken in isolation should not be a sufficient condition for the application of this Regulation.

Or. en

Justification

See also the comments by the Fundamental Rights Agency on the need to remove the reference to “third parties”.

Amendment 112


Josef Weidenholzer
Proposal for a regulation
Recital 11

Text proposed by the Commission Amendment

(11) A substantial connection to the Union should be relevant to determine the scope of this Regulation. Such a substantial connection to the Union should be considered to exist where the service provider has an establishment in the Union or, in its absence, on the basis of the existence of a significant number of users in one or more Member States, or the targeting of activities towards one or more Member States. The targeting of activities towards one or more Member States can be determined on the basis of all relevant circumstances, including factors such as the use of a language or a currency generally used in that Member State, or the possibility of ordering goods or services. The targeting of activities towards a Member State could also be derived from the availability of an application in the relevant national application store, from providing local advertising or advertising in the language used in that Member State, or from the handling of customer relations such as by providing customer service in the language generally used in that Member State. A substantial connection should also be assumed where a service provider directs its activities towards one or more Member State as set out in Article 17(1)(c) of Regulation 1215/2012 of the European Parliament and of the Council10 . On the other hand, provision of the service in view of mere compliance with the prohibition to discriminate laid down in Regulation (EU) 2018/302 of the European Parliament and of the Council11 cannot, on that ground alone, be considered as directing or targeting activities towards a given territory within the Union.

(11) A substantial connection to the Union should be relevant to determine the scope of this Regulation. Such a substantial connection to the Union should be considered to exist where the service provider has an establishment in the Union or, in its absence, on the basis of the existence of a significant number of users in one or more Member States, or the targeting of activities towards one or more Member States. The targeting of activities towards one or more Member States can be determined on the basis of all relevant circumstances, including factors such as the use of a language or a currency generally used in that Member State. The targeting of activities towards a Member State could also be derived from the availability of an application in the relevant national application store, from providing local advertising or advertising in the language used in that Member State, or from the handling of customer relations such as by providing customer service in the language generally used in that Member State. A substantial connection should also be assumed where a service provider directs its activities towards one or more Member State as set out in Article 17(1)(c) of Regulation 1215/2012 of the European Parliament and of the Council10 . On the other hand, provision of the service in view of mere compliance with the prohibition to discriminate laid down in Regulation (EU) 2018/302 of the European Parliament and of the Council11 cannot, on that ground alone, be considered as directing or targeting activities towards a given territory within the Union.

_________________ _________________


10 Regulation (EU) 1215/2012 of the European Parliament and of the Council of 12 December 2012 on jurisdiction and the recognition and enforcement of judgments in civil and commercial matters (OJ L 351, 20.12.2012, p. 1).

10 Regulation (EU) 1215/2012 of the European Parliament and of the Council of 12 December 2012 on jurisdiction and the recognition and enforcement of judgments in civil and commercial matters (OJ L 351, 20.12.2012, p. 1).

11 Regulation (EU) 2018/302 of the European Parliament and of the Council of 28 February 2018 on addressing unjustified geo-blocking and other forms of discrimination based on customers’ nationality, place of residence or place of establishment within the internal market and amending Regulations (EC) No 2006/2004 and (EU) 2017/2394 and Directive 2009/22/EC (OJ L 601, 2.3.2018, p. 1).

11 Regulation (EU) 2018/302 of the European Parliament and of the Council of 28 February 2018 on addressing unjustified geo-blocking and other forms of discrimination based on customers’ nationality, place of residence or place of establishment within the internal market and amending Regulations (EC) No 2006/2004 and (EU) 2017/2394 and Directive 2009/22/EC (OJ L 601, 2.3.2018, p. 1).

Or. en

Amendment 113
Monika Beňová
Proposal for a regulation
Recital 11

Text proposed by the Commission Amendment

(11) A substantial connection to the Union should be relevant to determine the scope of this Regulation. Such a substantial connection to the Union should be considered to exist where the service provider has an establishment in the Union or, in its absence, on the basis of the existence of a significant number of users in one or more Member States, or the targeting of activities towards one or more Member States. The targeting of activities towards one or more Member States can be determined on the basis of all relevant circumstances, including factors such as the use of a language or a currency generally used in that Member State, or the possibility of ordering goods or services. The targeting of activities towards a Member State could also be derived from the availability of an application in the

(11) A substantial connection to the Union should be relevant to determine the scope of this Regulation. Such a substantial connection to the Union should be considered to exist where the service provider has an establishment in the Union or, in its absence, on the basis of the existence of substantial number of users in one or more Member States, or the targeting of activities towards one or more Member States. The targeting of activities towards one or more Member States can be determined on the basis of all relevant circumstances, including factors such as the use of a language or a currency generally used in that Member State, or the possibility of ordering goods or services. The targeting of activities towards a Member State could also be derived from the availability of an application in the


relevant national application store, from providing local advertising or advertising in the language used in that Member State, or from the handling of customer relations such as by providing customer service in the language generally used in that Member State. A substantial connection should also be assumed where a service provider directs its activities towards one or more Member State as set out in Article 17(1)(c) of Regulation 1215/2012 of the European Parliament and of the Council10 . On the other hand, provision of the service in view of mere compliance with the prohibition to discriminate laid down in Regulation (EU) 2018/302 of the European Parliament and of the Council11 cannot, on that ground alone, be considered as directing or targeting activities towards a given territory within the Union.

relevant national application store, from providing local advertising or advertising in the language used in that Member State, or from the handling of customer relations such as by providing customer service in the language generally used in that Member State. A substantial connection should also be assumed where a service provider directs its activities towards one or more Member State as set out in Article 17(1)(c) of Regulation 1215/2012 of the European Parliament and of the Council10 . On the other hand, provision of the service in view of mere compliance with the prohibition to discriminate laid down in Regulation (EU) 2018/302 of the European Parliament and of the Council11 cannot, on that ground alone, be considered as directing or targeting activities towards a given territory within the Union.

_________________ _________________
10 Regulation (EU) 1215/2012 of the European Parliament and of the Council of 12 December 2012 on jurisdiction and the recognition and enforcement of judgments in civil and commercial matters (OJ L 351, 20.12.2012, p. 1).

10 Regulation (EU) 1215/2012 of the European Parliament and of the Council of 12 December 2012 on jurisdiction and the recognition and enforcement of judgments in civil and commercial matters (OJ L 351, 20.12.2012, p. 1).

11 Regulation (EU) 2018/302 of the European Parliament and of the Council of 28 February 2018 on addressing unjustified geo-blocking and other forms of discrimination based on customers’ nationality, place of residence or place of establishment within the internal market and amending Regulations (EC) No 2006/2004 and (EU) 2017/2394 and Directive 2009/22/EC (OJ L 601, 2.3.2018, p. 1).

11 Regulation (EU) 2018/302 of the European Parliament and of the Council of 28 February 2018 on addressing unjustified geo-blocking and other forms of discrimination based on customers’ nationality, place of residence or place of establishment within the internal market and amending Regulations (EC) No 2006/2004 and (EU) 2017/2394 and Directive 2009/22/EC (OJ L 601, 2.3.2018, p. 1).

Or. en

Amendment 114
Eva Joly
Proposal for a regulation
Recital 12


Text proposed by the Commission Amendment

(12) Hosting service providers should apply certain duties of care, in order to prevent the dissemination of terrorist content on their services. These duties of care should not amount to a general monitoring obligation. Duties of care should include that, when applying this Regulation, hosting services providers act in a diligent, proportionate and non-discriminatory manner in respect of content that they store, in particular when implementing their own terms and conditions, with a view to avoiding removal of content which is not terrorist. The removal or disabling of access has to be undertaken in the observance of freedom of expression and information.

(12) Without prejudice to Article 15 of Directive 2000/31/EC, hosting service providers that have been exposed to a significant number of uncontested removal orders should apply certain duties of care, in order to address the dissemination of terrorist content on their services. These duties of care should not amount to a general monitoring obligation. Duties of care should include that, when applying this Regulation, hosting services providers act in a diligent, proportionate and non-discriminatory manner in respect of content that they store, in particular when implementing their own terms and conditions, with a view to avoiding removal of content which is not terrorist. The removal or disabling of access has to be undertaken in the observance of freedom of expression and information and freedom and pluralism of the media.

Or. en

Amendment 115
Nicolas Bay, Gilles Lebreton
Proposal for a regulation
Recital 12

Text proposed by the Commission Amendment

(12) Hosting service providers should apply certain duties of care, in order to prevent the dissemination of terrorist content on their services. These duties of care should not amount to a general monitoring obligation. Duties of care should include that, when applying this Regulation, hosting services providers act in a diligent, proportionate and non-discriminatory manner in respect of content that they store, in particular when implementing their own terms and conditions, with a view to avoiding removal of content which is not terrorist. The removal or disabling of access has to

(12) Hosting service providers should apply certain duties of care, in order to prevent the dissemination of terrorist content on their services. These duties of care should not amount to a general monitoring obligation. Duties of care should include that, when applying this Regulation, hosting services providers act in a diligent, proportionate and non-discriminatory manner in respect of content that they store, in particular when implementing their own terms and conditions, with a view to avoiding removal of content which is not terrorist. The removal or disabling of content has to


be undertaken in the observance of freedom of expression and information.

be undertaken in the absolute observance of freedom of expression and information and should always involve human verification for the first occurrence of removing or disabling said content.

Or. en

Amendment 116
Josef Weidenholzer
Proposal for a regulation
Recital 12

Text proposed by the Commission Amendment

(12) Hosting service providers should apply certain duties of care, in order to prevent the dissemination of terrorist content on their services. These duties of care should not amount to a general monitoring obligation. Duties of care should include that, when applying this Regulation, hosting services providers act in a diligent, proportionate and non-discriminatory manner in respect of content that they store, in particular when implementing their own terms and conditions, with a view to avoiding removal of content which is not terrorist. The removal or disabling of access has to be undertaken in the observance of freedom of expression and information.

(12) Hosting service providers should apply certain duties of care, in order to address the abuse of their services for the purposes of dissemination of illegal terrorist content by content providers. These duties of care should not amount to a general monitoring obligation. Duties of care should include that, when applying this Regulation, hosting services providers act in a diligent, proportionate and non-discriminatory manner in respect of content that they store, in particular when implementing their own terms and conditions, with a view to avoiding removal of content which is not illegal. The removal has to be undertaken in the observance of freedom of expression and freedom to receive and impart information

Or. en

Amendment 117
Cornelia Ernst, Barbara Spinelli, Marie-Christine Vergiat
Proposal for a regulation
Recital 12

Text proposed by the Commission Amendment

(12) Hosting service providers should apply certain duties of care, in order to prevent the dissemination of terrorist content on their services. These duties of care should not amount to a general

(12) Hosting service providers should apply certain duties of care, in order to prevent the dissemination of terrorist content on their services. These duties of care should not amount to a general


monitoring obligation. Duties of care should include that, when applying this Regulation, hosting services providers act in a diligent, proportionate and non-discriminatory manner in respect of content that they store, in particular when implementing their own terms and conditions, with a view to avoiding removal of content which is not terrorist. The removal or disabling of access has to be undertaken in the observance of freedom of expression and information.

monitoring obligation. Duties of care should include that hosting services providers act in a diligent, proportionate and non-discriminatory manner in respect of content that they store, in particular when implementing their own terms and conditions, with a view to avoiding removal of content which is not terrorist. The removal or disabling of access has to be undertaken in the observance of freedom of expression and information.

Or. en

Amendment 118
Josef Weidenholzer
Proposal for a regulation
Recital 13

Text proposed by the Commission Amendment

(13) The procedure and obligations resulting from legal orders requesting hosting service providers to remove terrorist content or disable access to it, following an assessment by the competent authorities, should be harmonised. Member States should remain free as to the choice of the competent authorities allowing them to designate administrative, law enforcement or judicial authorities with that task. Given the speed at which terrorist content is disseminated across online services, this provision imposes obligations on hosting service providers to ensure that terrorist content identified in the removal order is removed or access to it is disabled within one hour from receiving the removal order. It is for the hosting service providers to decide whether to remove the content in question or disable access to the content for users in the Union.

(13) The procedure and obligations resulting from legal removal orders requesting hosting service providers to remove illegal terrorist content, following an assessment by the competent authorities, should be harmonised. Given the speed at which illegal terrorist content is disseminated across online services, this provision imposes obligations on hosting service providers to ensure that illegal terrorist content identified in the removal order is removed without undue delay after receiving the removal order.

Or. en

Amendment 119


Jeroen Lenaers
Proposal for a regulation
Recital 13

Text proposed by the Commission Amendment

(13) The procedure and obligations resulting from legal orders requesting hosting service providers to remove terrorist content or disable access to it, following an assessment by the competent authorities, should be harmonised. Member States should remain free as to the choice of the competent authorities allowing them to designate administrative, law enforcement or judicial authorities with that task. Given the speed at which terrorist content is disseminated across online services, this provision imposes obligations on hosting service providers to ensure that terrorist content identified in the removal order is removed or access to it is disabled within one hour from receiving the removal order. It is for the hosting service providers to decide whether to remove the content in question or disable access to the content for users in the Union.

(13) The procedure and obligations resulting from legal orders requesting hosting service providers to remove terrorist content or disable access to it, following an assessment by the competent authorities, should be harmonised. Given the various practices in the Union, Member States should remain free as to the choice of the competent authorities allowing them to designate administrative, law enforcement or judicial authorities with that task. Given the speed at which terrorist content is disseminated across online services the effectiveness of removal of terrorist content depends on the swiftness of its removal. Therefore, this provision imposes obligations on hosting service providers following a legal removal order to ensure that terrorist content identified is removed or access to it is disabled immediately and in any event within one hour from receiving the removal order. It is for the hosting service providers to decide whether to remove the content in question or disable access to the content for users in the Union.

Or. en

Amendment 120
Marietje Schaake, Sophia in ’t Veld, Angelika Mlinar
Proposal for a regulation
Recital 13

Text proposed by the Commission Amendment

(13) The procedure and obligations resulting from legal orders requesting hosting service providers to remove terrorist content or disable access to it, following an assessment by the competent

(13) The procedure and obligations resulting from legal orders requesting hosting service providers to remove terrorist content or disable access to it, following a legal assessment by the


authorities, should be harmonised. Member States should remain free as to the choice of the competent authorities allowing them to designate administrative, law enforcement or judicial authorities with that task. Given the speed at which terrorist content is disseminated across online services, this provision imposes obligations on hosting service providers to ensure that terrorist content identified in the removal order is removed or access to it is disabled within one hour from receiving the removal order. It is for the hosting service providers to decide whether to remove the content in question or disable access to the content for users in the Union.

competent authorities, should be harmonised. Member States should designate the competent authorities among their independent administrative and judicial authorities with that task. Given the speed at which illegal terrorist content is disseminated across online services, this provision imposes obligations on hosting service providers to ensure that illegal terrorist content identified in the removal order is removed or access to it is disabled without undue delay. It is for the hosting service providers to decide whether to remove the content in question or disable access to the content for users in the Union based on the definition of illegal terrorist content, the implementation of effective redress mechanisms and generally taking as a basis of their decision any other applicable provisions from this Regulation.

Or. en

Amendment 121
Maria Grapini
Proposal for a regulation
Recital 13

Text proposed by the Commission Amendment

(13) The procedure and obligations resulting from legal orders requesting hosting service providers to remove terrorist content or disable access to it, following an assessment by the competent authorities, should be harmonised. Member States should remain free as to the choice of the competent authorities allowing them to designate administrative, law enforcement or judicial authorities with that task. Given the speed at which terrorist content is disseminated across online services, this provision imposes obligations on hosting service providers to ensure that terrorist content identified in the removal order is removed or access to it is disabled within one hour from receiving the

(13) The procedure and obligations resulting from legal orders requesting hosting service providers to remove terrorist content or disable access to it, following an assessment by the competent authorities, should be harmonised and uniformly implemented at European level. Member States should remain free as to the choice of the competent authorities allowing them to designate administrative, law enforcement or judicial authorities with that task. Given the speed at which terrorist content is disseminated across online services, this provision imposes obligations on hosting service providers to ensure that terrorist content identified in the removal order is removed or access to


removal order. It is for the hosting service providers to decide whether to remove the content in question or disable access to the content for users in the Union.

it is disabled within one hour from receiving the removal order. It is for the hosting service providers to decide whether to remove the content in question or disable access to the content for users in the Union and inform the authorities of such incidents.

Or. ro

Amendment 122
Cornelia Ernst, Barbara Spinelli, Marie-Christine Vergiat
Proposal for a regulation
Recital 13

Text proposed by the Commission Amendment

(13) The procedure and obligations resulting from legal orders requesting hosting service providers to remove terrorist content or disable access to it, following an assessment by the competent authorities, should be harmonised. Member States should remain free as to the choice of the competent authorities allowing them to designate administrative, law enforcement or judicial authorities with that task. Given the speed at which terrorist content is disseminated across online services, this provision imposes obligations on hosting service providers to ensure that terrorist content identified in the removal order is removed or access to it is disabled within one hour from receiving the removal order. It is for the hosting service providers to decide whether to remove the content in question or disable access to the content for users in the Union.

(13) The procedure and obligations resulting from legal orders requesting hosting service providers to remove terrorist content or disable access to it, following an assessment by the competent authorities, should be harmonised. Member States should remain free as to the choice of the competent authorities allowing them to designate administrative or judicial authorities with that task, as long as they are independent and impartial public authorities. Given the speed at which terrorist content is disseminated across online services, this provision imposes obligations on hosting service providers to ensure that the access to the terrorist content identified in the removal order is disabled without undue delay.

Or. en

Amendment 123
Rachida Dati, Monika Hohlmeier, Anna Maria Corazza Bildt, Roberta Metsola
Proposal for a regulation
Recital 13

Text proposed by the Commission Amendment


(13) The procedure and obligations resulting from legal orders requesting hosting service providers to remove terrorist content or disable access to it, following an assessment by the competent authorities, should be harmonised. Member States should remain free as to the choice of the competent authorities allowing them to designate administrative, law enforcement or judicial authorities with that task. Given the speed at which terrorist content is disseminated across online services, this provision imposes obligations on hosting service providers to ensure that terrorist content identified in the removal order is removed or access to it is disabled within one hour from receiving the removal order. It is for the hosting service providers to decide whether to remove the content in question or disable access to the content for users in the Union.

(13) The procedure and obligations resulting from legal orders requesting hosting service providers to remove terrorist content or disable access to it, following an assessment by the competent authorities, should be harmonised. Member States should remain free as to the choice of the competent authorities allowing them to designate administrative, law enforcement or judicial authorities with that task. Given the speed at which terrorist content is disseminated across online services, this provision imposes obligations on hosting service providers to ensure that terrorist content identified in the removal order is removed or access to it is disabled within one hour from receiving the removal order. Respecting this one hour delay is crucial to avoid a widespread dissemination of such content. It is for the hosting service providers to decide whether to remove the content in question or disable access to the content for users in the Union.

Or. en

Amendment 124
Maite Pagazaurtundúa Ruiz, Cecilia Wikström, Nathalie Griesbeck
Proposal for a regulation
Recital 13

Text proposed by the Commission Amendment

(13) The procedure and obligations resulting from legal orders requesting hosting service providers to remove terrorist content or disable access to it, following an assessment by the competent authorities, should be harmonised. Member States should remain free as to the choice of the competent authorities allowing them to designate administrative, law enforcement or judicial authorities with that task. Given the speed at which terrorist content is disseminated across online services, this provision imposes obligations on hosting service providers to

(13) The procedure and obligations resulting from legal orders requesting hosting service providers to remove terrorist content or disable access to it, following an assessment by the competent authorities, should be harmonised. Member States should freely designate their competent authority. Given the speed at which terrorist content is disseminated across online services, this provision imposes obligations on hosting service providers to ensure that terrorist content identified in the removal order is removed or access to it is disabled within one hour


ensure that terrorist content identified in the removal order is removed or access to it is disabled within one hour from receiving the removal order. It is for the hosting service providers to decide whether to remove the content in question or disable access to the content for users in the Union.

from receiving the removal order. It is for the hosting service providers to decide whether to remove the content in question or disable access to the content for users in the Union.

Or. en

Amendment 125
Eva Joly
Proposal for a regulation
Recital 13

Text proposed by the Commission Amendment

(13) The procedure and obligations resulting from legal orders requesting hosting service providers to remove terrorist content or disable access to it, following an assessment by the competent authorities, should be harmonised. Member States should remain free as to the choice of the competent authorities allowing them to designate administrative, law enforcement or judicial authorities with that task. Given the speed at which terrorist content is disseminated across online services, this provision imposes obligations on hosting service providers to ensure that terrorist content identified in the removal order is removed or access to it is disabled within one hour from receiving the removal order. It is for the hosting service providers to decide whether to remove the content in question or disable access to the content for users in the Union.

(13) Competent judicial authorities of the Member States should assess whether content is terrorist content, and whether to issue legal orders requesting hosting service providers to remove terrorist content or disable access to it. Member States should designate judicial authorities with that task. Given the speed at which terrorist content is disseminated across online services, this provision imposes obligations on hosting service providers to ensure that terrorist content identified in the removal order is removed or access to it is disabled expeditiously after receiving the removal order. It is for the hosting service providers to decide whether to remove the content in question or disable access to the content for users in the Union.

Or. en

Amendment 126
Anna Maria Corazza Bildt
Proposal for a regulation
Recital 13


Text proposed by the Commission Amendment

(13) The procedure and obligations resulting from legal orders requesting hosting service providers to remove terrorist content or disable access to it, following an assessment by the competent authorities, should be harmonised. Member States should remain free as to the choice of the competent authorities allowing them to designate administrative, law enforcement or judicial authorities with that task. Given the speed at which terrorist content is disseminated across online services, this provision imposes obligations on hosting service providers to ensure that terrorist content identified in the removal order is removed or access to it is disabled within one hour from receiving the removal order. It is for the hosting service providers to decide whether to remove the content in question or disable access to the content for users in the Union.

(13) The procedure and obligations resulting from legal orders requesting hosting service providers to remove terrorist content or disable access to it, following an assessment by the competent authorities, should be harmonised. Member States should designate a single competent authority with that task. Given the speed at which terrorist content is disseminated across online services, this provision imposes obligations on hosting service providers to ensure that terrorist content identified in the removal order is removed or access to it is disabled within one hour from receiving the removal order. It is for the hosting service providers to decide whether to remove the content in question or disable access to the content for users in the Union.

Or. en

Amendment 127
Ana Gomes
Proposal for a regulation
Recital 13

Text proposed by the Commission Amendment

(13) The procedure and obligations resulting from legal orders requesting hosting service providers to remove terrorist content or disable access to it, following an assessment by the competent authorities, should be harmonised. Member States should remain free as to the choice of the competent authorities allowing them to designate administrative, law enforcement or judicial authorities with that task. Given the speed at which terrorist content is disseminated across online services, this provision imposes obligations on hosting service providers to ensure that terrorist content identified in the removal

(13) The procedure and obligations resulting from legal orders requesting hosting service providers to remove terrorist content or disable access to it, following an assessment by the competent authorities, should be harmonised. Member States should remain free as to the choice of the competent authorities allowing them to designate judicial authorities with that task that could be supported by administrative and law enforcement bodies. Given the speed at which terrorist content is disseminated across online services, this provision imposes obligations on hosting service providers to ensure that


order is removed or access to it is disabled within one hour from receiving the removal order. It is for the hosting service providers to decide whether to remove the content in question or disable access to the content for users in the Union.

terrorist content identified in the removal order is removed or access to it is disabled within one hour from receiving the removal order. It is for the hosting service providers to decide whether to remove the content in question or disable access to the content for users in the Union.

Or. en

Amendment 128
Michał Boni, Róża Gräfin von Thun und Hohenstein
Proposal for a regulation
Recital 13

Text proposed by the Commission Amendment

(13) The procedure and obligations resulting from legal orders requesting hosting service providers to remove terrorist content or disable access to it, following an assessment by the competent authorities, should be harmonised. Member States should remain free as to the choice of the competent authorities allowing them to designate administrative, law enforcement or judicial authorities with that task. Given the speed at which terrorist content is disseminated across online services, this provision imposes obligations on hosting service providers to ensure that terrorist content identified in the removal order is removed or access to it is disabled within one hour from receiving the removal order. It is for the hosting service providers to decide whether to remove the content in question or disable access to the content for users in the Union.

(13) The procedure and obligations resulting from legal orders requesting hosting service providers to remove terrorist content or disable access to it, following an assessment by the competent authorities, should be harmonised. Member States should designate their competent judicial authority responsible for that task. Given the speed at which terrorist content is disseminated across online services, this provision imposes obligations on hosting service providers to ensure that terrorist content identified in the removal order is removed or access to it is disabled expeditiously depending on the capacities of the service provider in question. It is for the hosting service providers to decide whether to remove the content in question or disable access to the content for users in the Union.

Or. en

Amendment 129
Rachida Dati, Monika Hohlmeier, Roberta Metsola
Proposal for a regulation
Recital 13 a (new)

Text proposed by the Commission Amendment


(13 a) Where a hosting service provider has been unable to succeed in removing or disabling access to the relevant content within one hour, the hosting service providers should take all the necessary measures to execute the removal order without undue delay. The hosting service provider should report to the issuing authority on the reasons which made it impossible to execute the order. In case of delays, the nature and size of the hosting service providers should be taken into account, particularly in the case of micro enterprises or small-sized enterprises within the meaning of the Commission recommendation 2003/361/EC1a.

_________________
1a Commission Recommendation of 6 May 2003 concerning the definition of micro, small and medium-sized enterprises, OJ L 124, 20.5.2003, p. 36.

Or. en

Amendment 130
Maite Pagazaurtundúa Ruiz, Cecilia Wikström
Proposal for a regulation
Recital 14

Text proposed by the Commission Amendment

(14) The competent authority should transmit the removal order directly to the addressee and point of contact by any electronic means capable of producing a written record under conditions that allow the service provider to establish authenticity, including the accuracy of the date and the time of sending and receipt of the order, such as by secured email and platforms or other secured channels, including those made available by the service provider, in line with the rules protecting personal data. This requirement may notably be met by the use of qualified electronic registered delivery services as provided for by Regulation (EU) 910/2014

(14) The competent authority should transmit the removal order directly to the addressee and point of contact by any electronic means capable of producing a written record under conditions that allow the service provider to establish authenticity, including the accuracy of the date and the time of sending and receipt of the order, in line with the rules protecting personal data. Competent authorities should therefore use qualified electronic registered delivery services as provided for by Regulation (EU) 910/2014 of the European Parliament and of the Council12 .


of the European Parliament and of the Council12 .

_________________ _________________
12 Regulation (EU) No 910/2014 of the European Parliament and of the Council of 23 July 2014 on electronic identification and trust services for electronic transactions in the internal market and repealing Directive 1999/93/EC (OJ L 257, 28.8.2014, p. 73).

12 Regulation (EU) No 910/2014 of the European Parliament and of the Council of 23 July 2014 on electronic identification and trust services for electronic transactions in the internal market and repealing Directive 1999/93/EC (OJ L 257, 28.8.2014, p. 73).

Or. en

Amendment 131
Eva Joly
Proposal for a regulation
Recital 14

Text proposed by the Commission Amendment

(14) The competent authority should transmit the removal order directly to the addressee and point of contact by any electronic means capable of producing a written record under conditions that allow the service provider to establish authenticity, including the accuracy of the date and the time of sending and receipt of the order, such as by secured email and platforms or other secured channels, including those made available by the service provider, in line with the rules protecting personal data. This requirement may notably be met by the use of qualified electronic registered delivery services as provided for by Regulation (EU) 910/2014 of the European Parliament and of the Council12 .

(14) The competent authority should transmit the removal order directly to the addressee and point of contact by any electronic means capable of producing a written record under conditions that allow the service provider to establish authenticity, including the identity of the sender, the accuracy of the date and the time of sending and receipt of the order, such as by electronically signed email or other secured channels, including those made available by the service provider, in line with the rules protecting personal data. This requirement may notably be met by the use of qualified electronic registered delivery services with qualified electronic signatures as provided for by Regulation (EU) 910/2014 of the European Parliament and of the Council12.

_________________ _________________
12 Regulation (EU) No 910/2014 of the European Parliament and of the Council of 23 July 2014 on electronic identification and trust services for electronic transactions in the internal market and repealing Directive 1999/93/EC (OJ L 257,

12 Regulation (EU) No 910/2014 of the European Parliament and of the Council of 23 July 2014 on electronic identification and trust services for electronic transactions in the internal market and repealing Directive 1999/93/EC (OJ L 257,


28.8.2014, p. 73). 28.8.2014, p. 73).

Or. en

Amendment 132
Cornelia Ernst, Barbara Spinelli, Marie-Christine Vergiat
Proposal for a regulation
Recital 14

Text proposed by the Commission Amendment

(14) The competent authority should transmit the removal order directly to the addressee and point of contact by any electronic means capable of producing a written record under conditions that allow the service provider to establish authenticity, including the accuracy of the date and the time of sending and receipt of the order, such as by secured email and platforms or other secured channels, including those made available by the service provider, in line with the rules protecting personal data. This requirement may notably be met by the use of qualified electronic registered delivery services as provided for by Regulation (EU) 910/2014 of the European Parliament and of the Council12 .

(14) The competent authority should transmit the removal order directly to the addressee in the same Member State and point of contact by any electronic means capable of producing a written record under conditions that allow the service provider to establish authenticity, including the accuracy of the date and the time of sending and receipt of the order, such as by secured email and platforms or other secured channels, including those made available by the service provider, in line with the rules protecting personal data. This requirement may notably be met by the use of qualified electronic registered delivery services as provided for by Regulation (EU) 910/2014 of the European Parliament and of the Council12 .

_________________

12 Regulation (EU) No 910/2014 of the European Parliament and of the Council of 23 July 2014 on electronic identification and trust services for electronic transactions in the internal market and repealing Directive 1999/93/EC (OJ L 257, 28.8.2014, p. 73).

Or. en

Amendment 133
Josef Weidenholzer

Proposal for a regulation
Recital 14

Text proposed by the Commission Amendment

(14) The competent authority should transmit the removal order directly to the addressee and point of contact by any electronic means capable of producing a written record under conditions that allow the service provider to establish authenticity, including the accuracy of the date and the time of sending and receipt of the order, such as by secured email and platforms or other secured channels, including those made available by the service provider, in line with the rules protecting personal data. This requirement may notably be met by the use of qualified electronic registered delivery services as provided for by Regulation (EU) 910/2014 of the European Parliament and of the Council12 .

(14) The competent authority should transmit the removal order directly to the addressee and point of contact by any electronic means capable of producing a written record under conditions that allow the service provider to establish authenticity, including the accuracy of the date and the time of sending and receipt of the order, such as by secured email and platforms or other secured channels, including those made available by the service provider, in line with the rules regarding protection of personal data. This requirement may notably be met by the use of qualified electronic registered delivery services as provided for by Regulation (EU) 910/2014 of the European Parliament and of the Council12 .

_________________

12 Regulation (EU) No 910/2014 of the European Parliament and of the Council of 23 July 2014 on electronic identification and trust services for electronic transactions in the internal market and repealing Directive 1999/93/EC (OJ L 257, 28.8.2014, p. 73).

Or. en

Amendment 134
Cornelia Ernst, Barbara Spinelli, Marie-Christine Vergiat

Proposal for a regulation
Recital 15

Text proposed by the Commission Amendment

(15) Referrals by the competent authorities or Europol constitute an effective and swift means of making hosting service providers aware of specific content on their services. This mechanism of alerting hosting service providers to information that may be considered terrorist content, for the provider’s voluntary consideration of the compatibility its own terms and conditions, should remain available in addition to removal orders. It is important that hosting service providers assess such referrals as a matter of priority and provide swift feedback about action taken. The ultimate decision about whether or not to remove the content because it is not compatible with their terms and conditions remains with the hosting service provider. In implementing this Regulation related to referrals, Europol’s mandate as laid down in Regulation (EU) 2016/79413 remains unaffected.

deleted

_________________

13 Regulation (EU) 2016/794 of the European Parliament and of the Council of 11 May 2016 on the European Union Agency for Law Enforcement Cooperation (Europol) and replacing and repealing Council Decisions 2009/371/JHA, 2009/934/JHA, 2009/935/JHA, 2009/936/JHA and 2009/968/JHA (OJ L 135, 24.5.2016, p. 53).

Or. en

Amendment 135
Josef Weidenholzer

Proposal for a regulation
Recital 15

Text proposed by the Commission Amendment

(15) Referrals by the competent authorities or Europol constitute an effective and swift means of making hosting service providers aware of specific content on their services. This mechanism of alerting hosting service providers to information that may be considered terrorist content, for the provider’s voluntary consideration of the compatibility its own terms and conditions, should remain available in addition to removal orders. It is important that hosting service providers assess such referrals as a matter of priority and provide swift feedback about action taken. The ultimate decision about whether or not to remove the content because it is not compatible with their terms and conditions remains with the hosting service provider. In implementing this Regulation related to referrals, Europol’s mandate as laid down in Regulation (EU) 2016/79413 remains unaffected.

deleted

_________________

13 Regulation (EU) 2016/794 of the European Parliament and of the Council of 11 May 2016 on the European Union Agency for Law Enforcement Cooperation (Europol) and replacing and repealing Council Decisions 2009/371/JHA, 2009/934/JHA, 2009/935/JHA, 2009/936/JHA and 2009/968/JHA (OJ L 135, 24.5.2016, p. 53).

Or. en

Amendment 136
Eva Joly

Proposal for a regulation
Recital 15

Text proposed by the Commission Amendment

(15) Referrals by the competent authorities or Europol constitute an effective and swift means of making hosting service providers aware of specific content on their services. This mechanism of alerting hosting service providers to information that may be considered terrorist content, for the provider’s voluntary consideration of the compatibility its own terms and conditions, should remain available in addition to removal orders. It is important that hosting service providers assess such referrals as a matter of priority and provide swift feedback about action taken. The ultimate decision about whether or not to remove the content because it is not compatible with their terms and conditions remains with the hosting service provider. In implementing this Regulation related to referrals, Europol’s mandate as laid down in Regulation (EU) 2016/79413 remains unaffected.

(15) Referrals by Europol as laid down in Article 4(1)(m) of Regulation (EU) 2016/79413 remain unaffected.

_________________

13 Regulation (EU) 2016/794 of the European Parliament and of the Council of 11 May 2016 on the European Union Agency for Law Enforcement Cooperation (Europol) and replacing and repealing Council Decisions 2009/371/JHA, 2009/934/JHA, 2009/935/JHA, 2009/936/JHA and 2009/968/JHA (OJ L 135, 24.5.2016, p. 53).

Or. en

Amendment 137
Rachida Dati, Monika Hohlmeier, Anna Maria Corazza Bildt, Roberta Metsola

Proposal for a regulation
Recital 15

Text proposed by the Commission Amendment

(15) Referrals by the competent authorities or Europol constitute an effective and swift means of making hosting service providers aware of specific content on their services. This mechanism of alerting hosting service providers to information that may be considered terrorist content, for the provider’s voluntary consideration of the compatibility its own terms and conditions, should remain available in addition to removal orders. It is important that hosting service providers assess such referrals as a matter of priority and provide swift feedback about action taken. The ultimate decision about whether or not to remove the content because it is not compatible with their terms and conditions remains with the hosting service provider. In implementing this Regulation related to referrals, Europol’s mandate as laid down in Regulation (EU) 2016/79413 remains unaffected.

(15) Referrals by the competent authorities or Europol constitute an effective and swift means of making hosting service providers aware of specific content on their services. This mechanism of alerting hosting service providers to information that may be considered terrorist content, for the provider’s voluntary consideration of the compatibility its own terms and conditions, should remain available and continue to be developed, in addition to removal orders. It is important that hosting service providers assess such referrals as a matter of priority and provide swift feedback about action taken. The ultimate decision about whether or not to remove the content because it is not compatible with their terms and conditions remains with the hosting service provider. In implementing this Regulation related to referrals, Europol’s mandate as laid down in Regulation (EU) 2016/79413 remains unaffected.

_________________

13 Regulation (EU) 2016/794 of the European Parliament and of the Council of 11 May 2016 on the European Union Agency for Law Enforcement Cooperation (Europol) and replacing and repealing Council Decisions 2009/371/JHA, 2009/934/JHA, 2009/935/JHA, 2009/936/JHA and 2009/968/JHA (OJ L 135, 24.5.2016, p. 53).

Or. en

Amendment 138
Birgit Sippel, Anna Hedh, Josef Weidenholzer

Proposal for a regulation
Recital 16

Text proposed by the Commission Amendment

(16) Given the scale and speed necessary for effectively identifying and removing terrorist content, proportionate proactive measures, including by using automated means in certain cases, are an essential element in tackling terrorist content online. With a view to reducing the accessibility of terrorist content on their services, hosting service providers should assess whether it is appropriate to take proactive measures depending on the risks and level of exposure to terrorist content as well as to the effects on the rights of third parties and the public interest of information. Consequently, hosting service providers should determine what appropriate, effective and proportionate proactive measure should be put in place. This requirement should not imply a general monitoring obligation. In the context of this assessment, the absence of removal orders and referrals addressed to a hosting provider, is an indication of a low level of exposure to terrorist content.

deleted

Or. en

Amendment 139
Josef Weidenholzer

Proposal for a regulation
Recital 16

Text proposed by the Commission Amendment

(16) Given the scale and speed necessary for effectively identifying and removing terrorist content, proportionate proactive measures, including by using automated means in certain cases, are an essential element in tackling terrorist content online. With a view to reducing the accessibility of terrorist content on their services, hosting service providers should assess whether it is appropriate to take proactive measures depending on the risks and level of exposure to terrorist content as well as to the effects on the rights of third parties and the public interest of information. Consequently, hosting service providers should determine what appropriate, effective and proportionate proactive measure should be put in place. This requirement should not imply a general monitoring obligation. In the context of this assessment, the absence of removal orders and referrals addressed to a hosting provider, is an indication of a low level of exposure to terrorist content.

deleted

Or. en

Amendment 140
Marietje Schaake, Sophia in ‘t Veld, Angelika Mlinar

Proposal for a regulation
Recital 16

Text proposed by the Commission Amendment

(16) Given the scale and speed necessary for effectively identifying and removing terrorist content, proportionate proactive measures, including by using automated means in certain cases, are an essential element in tackling terrorist content online. With a view to reducing the accessibility of terrorist content on their services, hosting service providers should assess whether it is appropriate to take proactive measures depending on the risks and level of exposure to terrorist content as well as to the effects on the rights of third parties and the public interest of information. Consequently, hosting service providers should determine what appropriate, effective and proportionate proactive measure should be put in place. This requirement should not imply a general monitoring obligation. In the context of this assessment, the absence of removal orders and referrals addressed to a hosting provider, is an indication of a low level of exposure to terrorist content.

(16) Hosting service providers may, where appropriate, take voluntary additional measures to protect their services against the dissemination of terrorist content. The measures should be effective and proportionate, taking into account the risk and level of exposure to terrorist content, paying particular regard to the fundamental rights of the users, and the fundamental importance of the freedom of expression and information in an open and democratic society. This requirement should not imply a general monitoring obligation. In the context of this assessment, the absence of removal orders and referrals addressed to a hosting provider, is an indication of a low level of exposure to terrorist content.

Or. en

Amendment 141
Maite Pagazaurtundúa Ruiz, Cecilia Wikström

Proposal for a regulation
Recital 16

Text proposed by the Commission Amendment

(16) Given the scale and speed necessary for effectively identifying and removing terrorist content, proportionate proactive measures, including by using automated means in certain cases, are an essential element in tackling terrorist content online. With a view to reducing the accessibility of terrorist content on their services, hosting service providers should assess whether it is appropriate to take proactive measures depending on the risks and level of exposure to terrorist content as well as to the effects on the rights of third parties and the public interest of information. Consequently, hosting service providers should determine what appropriate, effective and proportionate proactive measure should be put in place. This requirement should not imply a general monitoring obligation. In the context of this assessment, the absence of removal orders and referrals addressed to a hosting provider, is an indication of a low level of exposure to terrorist content.

(16) Given the scale and speed necessary for effectively identifying and removing terrorist content, proportionate proactive measures, including by using automated means in certain cases, are an essential element in tackling terrorist content online. With a view to reducing the accessibility of terrorist content on their services, hosting service providers should assess on a voluntary basis whether it is appropriate to take proactive measures depending on the risks and level of exposure to terrorist content as well as to the effects on the rights of third parties and the public interest of information. Consequently, hosting service providers should determine, with the help of the Member States, what appropriate, effective and proportionate proactive measure should be put in place. This requirement should not imply a general monitoring obligation. In the context of this assessment, the absence of removal orders and referrals addressed to a hosting provider, is an indication of a low level of exposure to terrorist content and therefore should not be covered by Article 6.

Or. en

Amendment 142
Caterina Chinnici

Proposal for a regulation
Recital 16

Text proposed by the Commission Amendment

(16) Given the scale and speed necessary for effectively identifying and removing terrorist content, proportionate proactive measures, including by using automated means in certain cases, are an essential element in tackling terrorist content online. With a view to reducing the accessibility of terrorist content on their services, hosting service providers should assess whether it is appropriate to take proactive measures depending on the risks and level of exposure to terrorist content as well as to the effects on the rights of third parties and the public interest of information. Consequently, hosting service providers should determine what appropriate, effective and proportionate proactive measure should be put in place. This requirement should not imply a general monitoring obligation. In the context of this assessment, the absence of removal orders and referrals addressed to a hosting provider, is an indication of a low level of exposure to terrorist content.

(16) Given the scale and speed necessary for effectively identifying and removing terrorist content, proportionate proactive measures, including by using automated means in certain cases, are an essential element in tackling terrorist content online. With a view to reducing the accessibility of terrorist content on their services, hosting service providers should assess whether it is appropriate to take proactive measures depending on the risks and level of exposure to terrorist content as well as to the effects on the rights of third parties and the public interest of information. Consequently, hosting service providers should determine what appropriate, effective and proportionate proactive measure should be put in place. This requirement should not imply a general monitoring obligation. In the context of this assessment, the absence of removal orders and referrals addressed to a hosting provider, is an indication of a low level of exposure to terrorist content. Proactive measures should empower users to flag online terrorist content, and encourage service providers to mobilize adequate capacity to receive, review, process and respond to flagged content.

Or. en

Amendment 143
Eva Joly

Proposal for a regulation
Recital 16

Text proposed by the Commission Amendment

(16) Given the scale and speed necessary for effectively identifying and removing terrorist content, proportionate proactive measures, including by using automated means in certain cases, are an essential element in tackling terrorist content online. With a view to reducing the accessibility of terrorist content on their services, hosting service providers should assess whether it is appropriate to take proactive measures depending on the risks and level of exposure to terrorist content as well as to the effects on the rights of third parties and the public interest of information. Consequently, hosting service providers should determine what appropriate, effective and proportionate proactive measure should be put in place. This requirement should not imply a general monitoring obligation. In the context of this assessment, the absence of removal orders and referrals addressed to a hosting provider, is an indication of a low level of exposure to terrorist content.

(16) The vast majority of hosting service providers are never exposed to illegal terrorist content. Given the complexity of effectively identifying and removing terrorist content at scale, and the potential impact on fundamental rights, duties of care that go beyond the mere removal of terrorist content online following removal orders from competent authorities could be taken by those hosting service providers that have been subject to a significant number of uncontested removal orders. Those hosting service providers should assess whether it is appropriate to take additional measures depending on the risks and level of exposure to terrorist content as well as to the effects on the rights of third parties and the public interest of information. Consequently, those hosting service providers should only put appropriate, effective, necessary and proportionate additional measures in place. This requirement should not imply an obligation of general monitoring. In the context of this assessment, the absence of removal orders and referrals by Europol addressed to a hosting provider, is an indication of a low level of exposure to terrorist content.

Or. en

Amendment 144
Rachida Dati, Monika Hohlmeier, Anna Maria Corazza Bildt, Roberta Metsola

Proposal for a regulation
Recital 16

Text proposed by the Commission Amendment

(16) Given the scale and speed necessary for effectively identifying and removing terrorist content, proportionate proactive measures, including by using automated means in certain cases, are an essential element in tackling terrorist content online. With a view to reducing the accessibility of terrorist content on their services, hosting service providers should assess whether it is appropriate to take proactive measures depending on the risks and level of exposure to terrorist content as well as to the effects on the rights of third parties and the public interest of information. Consequently, hosting service providers should determine what appropriate, effective and proportionate proactive measure should be put in place. This requirement should not imply a general monitoring obligation. In the context of this assessment, the absence of removal orders and referrals addressed to a hosting provider, is an indication of a low level of exposure to terrorist content.

(16) Given the scale and speed necessary for effectively identifying and removing terrorist content, proportionate proactive measures, including by using automated means in certain cases, are an essential element in tackling terrorist content online. With a view to reducing the accessibility of terrorist content on their services, hosting service providers should assess whether it is appropriate to take proactive measures depending on the risks and level of exposure to terrorist content as well as to the effects on the rights of third parties and the public interest of information. Consequently, based on cooperation and exchange of best practice, hosting service providers should determine what appropriate, effective and proportionate proactive measure should be put in place. This requirement should not imply a general monitoring obligation. In the context of this assessment, the absence of removal orders and referrals addressed to a hosting provider, is an indication of a low level of exposure to terrorist content.

Or. en

Amendment 145
Cornelia Ernst, Barbara Spinelli, Marie-Christine Vergiat

Proposal for a regulation
Recital 16

Text proposed by the Commission Amendment

(16) Given the scale and speed necessary for effectively identifying and removing terrorist content, proportionate proactive measures, including by using automated means in certain cases, are an essential element in tackling terrorist content online. With a view to reducing the accessibility of terrorist content on their services, hosting service providers should assess whether it is appropriate to take proactive measures depending on the risks and level of exposure to terrorist content as well as to the effects on the rights of third parties and the public interest of information. Consequently, hosting service providers should determine what appropriate, effective and proportionate proactive measure should be put in place. This requirement should not imply a general monitoring obligation. In the context of this assessment, the absence of removal orders and referrals addressed to a hosting provider, is an indication of a low level of exposure to terrorist content.

(16) Given the scale and speed necessary for effectively identifying and removing terrorist content, proportionate proactive measures may be used to tackle terrorist content online. With a view to reducing the accessibility of terrorist content on their services, hosting service providers should assess whether it is appropriate to take proactive measures depending on the risks and level of exposure to terrorist content as well as to the effects on the rights of third parties and the public interest of information. Consequently, hosting service providers should determine what appropriate, effective and proportionate proactive measure should be put in place. This requirement should not imply a general monitoring obligation. In the context of this assessment, the absence of removal orders addressed to a hosting provider, is an indication of a low level of exposure to terrorist content. This Regulation does not create any obligation on hosting service providers to put in place proactive measures.

Or. en

Amendment 146
Marietje Schaake, Sophia in ‘t Veld, Angelika Mlinar

Proposal for a regulation
Recital 17

Text proposed by the Commission Amendment

(17) When putting in place proactive measures, hosting service providers should ensure that users’ right to freedom of expression and information - including to freely receive and impart information - is preserved. In addition to any requirement laid down in the law, including the legislation on protection of personal data, hosting service providers should act with due diligence and implement safeguards, including notably human oversight and verifications, where appropriate, to avoid any unintended and erroneous decision leading to removal of content that is not terrorist content. This is of particular relevance when hosting service providers use automated means to detect terrorist content. Any decision to use automated means, whether taken by the hosting service provider itself or pursuant to a request by the competent authority, should be assessed with regard to the reliability of the underlying technology and the ensuing impact on fundamental rights.

(17) When putting in place additional measures, hosting service providers should ensure that users’ right to freedom of expression and information - including to freely receive and impart information - is preserved. In addition to any requirement laid down in the law, including the legislation on protection of personal data, hosting service providers should act with due diligence and implement safeguards, including notably human oversight and verifications, where appropriate, to avoid any unintended and erroneous decision leading to removal of content that is not illegal terrorist content. This is of particular relevance when hosting service providers use automated means to detect terrorist content. Any decision to use automated means, whether taken by the hosting service provider itself or pursuant to a request by the competent authority, should be assessed with regard to the reliability of the underlying technology and the ensuing impact on fundamental rights. Private databases that contain hash fingerprints of audio and video of terrorist content, as defined by the Community Guidelines or the Terms of Services of these companies should be accessible to independent third parties in order to avoid any unintended and erroneous decision leading to removal of content that is not illegal terrorist content.

Or. en

Justification

Proactive automated measures require a database of allegedly terrorist content. The most important database today of such content is owned by Facebook and used by all major tech companies. Content is labelled as terrorist content on the basis of the Terms of Service of a company, not on the basis of a legal standard. Currently, no independent external party is able to scrutinize the content of this database, which is highly problematic in order to assess the quality of the content of the database.
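The justification above describes matching uploads against a database of hash fingerprints. Purely as an illustration of the general technique, and not of how the shared industry database actually operates, the sketch below compares an upload's SHA-256 digest with a locally held list of digests of previously removed files; real hash-sharing systems rely on perceptual fingerprints that survive re-encoding, and the file layout and helper names here are invented.

    # Minimal sketch of hash matching at upload time, assuming a locally held
    # text file with one hex digest of previously removed content per line.
    # Plain SHA-256 only catches byte-identical copies; shown for illustration.
    import hashlib

    def sha256_of_file(path, chunk_size=1 << 20):
        """Hash a file in chunks so large uploads need not fit in memory."""
        digest = hashlib.sha256()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(chunk_size), b""):
                digest.update(chunk)
        return digest.hexdigest()

    def load_known_hashes(list_path):
        """Read one lowercase hex digest per line, ignoring blank lines."""
        with open(list_path, "r", encoding="utf-8") as f:
            return {line.strip().lower() for line in f if line.strip()}

    def matches_known_content(upload_path, known_hashes):
        """Flag the upload for human review if its digest is in the shared list."""
        return sha256_of_file(upload_path) in known_hashes

Even in this simplified form the point of the amendment is visible: the outcome depends entirely on what has been placed in the list, which is why the amendment asks that such databases be open to scrutiny by independent third parties.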

Amendment 147
Cornelia Ernst, Barbara Spinelli, Marie-Christine Vergiat

Proposal for a regulation
Recital 17

Text proposed by the Commission Amendment

(17) When putting in place proactive measures, hosting service providers should ensure that users’ right to freedom of expression and information - including to freely receive and impart information - is preserved. In addition to any requirement laid down in the law, including the legislation on protection of personal data, hosting service providers should act with due diligence and implement safeguards, including notably human oversight and verifications, where appropriate, to avoid any unintended and erroneous decision leading to removal of content that is not terrorist content. This is of particular relevance when hosting service providers use automated means to detect terrorist content. Any decision to use automated means, whether taken by the hosting service provider itself or pursuant to a request by the competent authority, should be assessed with regard to the reliability of the underlying technology and the ensuing impact on fundamental rights.

(17) If they decide to put in place proactive measures, hosting service providers should ensure that users’ right to freedom of expression and information - including to freely receive and impart information - is guaranteed. They should therefore perform and make public a risk assessment on the level of exposure to terrorism content that is also based on the number of removal orders and referrals received, as well as draw up a remedial action plan to tackle terrorist content proportionate to the level of risk identified. In addition to any requirement laid down in the law, including the legislation on protection of personal data, hosting service providers should act with due diligence and implement safeguards, including notably human oversight and verifications to avoid any unintended and erroneous decision leading to removal of content that is not terrorist content. This is of particular relevance when hosting service providers use automated means to detect terrorist content. Any decision to use automated means taken by the hosting service provider should be assessed with regard to the reliability of the underlying technology and the ensuing impact on fundamental rights.

Or. en

Amendment 148
Nicolas Bay, Gilles Lebreton

Proposal for a regulation
Recital 17

Text proposed by the Commission Amendment

(17) When putting in place proactive measures, hosting service providers should ensure that users’ right to freedom of expression and information - including to freely receive and impart information - is preserved. In addition to any requirement laid down in the law, including the legislation on protection of personal data, hosting service providers should act with due diligence and implement safeguards, including notably human oversight and verifications, where appropriate, to avoid any unintended and erroneous decision leading to removal of content that is not terrorist content. This is of particular relevance when hosting service providers use automated means to detect terrorist content. Any decision to use automated means, whether taken by the hosting service provider itself or pursuant to a request by the competent authority, should be assessed with regard to the reliability of the underlying technology and the ensuing impact on fundamental rights.

(17) When putting in place proactive measures, hosting service providers should ensure that users’ right to freedom of expression and information - including to freely receive and impart information - is preserved. In addition to any requirement laid down in the law, including the legislation on protection of personal data, hosting service providers should act with due diligence and implement safeguards, including notably human oversight and verifications at least for the first removal of a content, to avoid any unintended and erroneous decision leading to removal of content that is not terrorist content. This is of particular relevance when hosting service providers use automated means to detect or remove terrorist content. Any decision to use automated means, whether taken by the hosting service provider itself or pursuant to a request by the competent authority, should be assessed with regard to the reliability of the underlying technology and the ensuing impact on fundamental rights. The use of automated means to remove terrorist content should only be limited to the replication of content that has already been at least once verified and deleted by a human.

Or. en

Amendment 149
Eva Joly

Proposal for a regulation
Recital 17

Text proposed by the Commission Amendment

(17) When putting in place proactive measures, hosting service providers should ensure that users’ right to freedom of expression and information - including to freely receive and impart information - is preserved. In addition to any requirement laid down in the law, including the legislation on protection of personal data, hosting service providers should act with due diligence and implement safeguards, including notably human oversight and verifications, where appropriate, to avoid any unintended and erroneous decision leading to removal of content that is not terrorist content. This is of particular relevance when hosting service providers use automated means to detect terrorist content. Any decision to use automated means, whether taken by the hosting service provider itself or pursuant to a request by the competent authority, should be assessed with regard to the reliability of the underlying technology and the ensuing impact on fundamental rights.

(17) When putting in place additional measures, hosting service providers should ensure that users’ right to freedom of expression and information - including to freely receive and impart information - as well as the rights to privacy and protection of personal data, are preserved. In addition to any requirement laid down in the law, including the legislation on protection of personal data, hosting service providers should act with due diligence and implement safeguards, including notably human oversight and verifications, where appropriate, to avoid any unintended and erroneous decision leading to removal of content that is not terrorist content. This is of particular relevance when hosting service providers use automated means to detect terrorist content. A final decision to remove or disable access to content should always be taken by a natural person. Any decision to use automated means should be assessed with regard to the reliability of the underlying technology and the ensuing impact on fundamental rights. In any case, hosting service providers should undertake a fundamental rights audit for any automated means for detecting terrorist content they use.

Or. en

Amendment 150
Josef Weidenholzer

Proposal for a regulation
Recital 17

Text proposed by the Commission Amendment

(17) When putting in place proactive measures, hosting service providers should ensure that users’ right to freedom of expression and information - including to freely receive and impart information - is preserved. In addition to any requirement laid down in the law, including the legislation on protection of personal data, hosting service providers should act with due diligence and implement safeguards, including notably human oversight and verifications, where appropriate, to avoid any unintended and erroneous decision leading to removal of content that is not terrorist content. This is of particular relevance when hosting service providers use automated means to detect terrorist content. Any decision to use automated means, whether taken by the hosting service provider itself or pursuant to a request by the competent authority, should be assessed with regard to the reliability of the underlying technology and the ensuing impact on fundamental rights.

(17) When exercising duty of care, hosting service providers should ensure that users’ right to freedom of expression and freedom to receive and impart information is preserved. In addition to any requirement laid down in the law, including the legislation on protection of personal data and protection of privacy, the hosting service providers should act with due diligence and implement safeguards, including notably human oversight and verifications, as well as periodic review of actions taken to avoid any unintended and erroneous practices leading to removal of content that is not illegal terrorist content. The hosting service providers should also ensure that their exercising due care does not produce disproportionate, discriminatory, untargeted, unspecific or unjustified effects.

Or. en

Amendment 151
Maite Pagazaurtundúa Ruiz, Cecilia Wikström

Proposal for a regulation
Recital 17

Text proposed by the Commission Amendment

(17) When putting in place proactive measures, hosting service providers should ensure that users’ right to freedom of expression and information - including to freely receive and impart information - is preserved. In addition to any requirement laid down in the law, including the legislation on protection of personal data, hosting service providers should act with due diligence and implement safeguards, including notably human oversight and verifications, where appropriate, to avoid any unintended and erroneous decision leading to removal of content that is not terrorist content. This is of particular relevance when hosting service providers use automated means to detect terrorist content. Any decision to use automated means, whether taken by the hosting service provider itself or pursuant to a request by the competent authority, should be assessed with regard to the reliability of the underlying technology and the ensuing impact on fundamental rights.

(17) When putting in place voluntary proactive measures, hosting service providers should ensure that users’ right to freedom of expression and information - including to freely receive and impart information - is preserved. In addition to any requirement laid down in the law, including the legislation on protection of personal data, hosting service providers should act with due diligence and implement safeguards, including notably human oversight and verifications, where appropriate, to avoid any unintended and erroneous decision leading to removal of content that is not terrorist content. This is of particular relevance when hosting service providers use automated means to detect terrorist content. Any decision to use automated means, whether taken by the hosting service provider itself or pursuant to a request by the competent authority, should be assessed with regard to the reliability of the underlying technology and the ensuing impact on fundamental rights.

Or. en

Amendment 152
Birgit Sippel, Anna Hedh, Josef Weidenholzer

Proposal for a regulation
Recital 18

Text proposed by the Commission Amendment

(18) In order to ensure that hosting service providers exposed to terrorist content take appropriate measures to prevent the misuse of their services, the competent authorities should request hosting service providers having received a removal order, which has become final, to report on the proactive measures taken. These could consist of measures to prevent the re-upload of terrorist content, removed or access to it disabled as a result of a removal order or referrals they received, checking against publicly or privately-held tools containing known terrorist content. They may also employ the use of reliable technical tools to identify new terrorist content, either using those available on the market or those developed by the hosting service provider. The service provider should report on the specific proactive measures in place in order to allow the competent authority to judge whether the measures are effective and proportionate and whether, if automated means are used, the hosting service provider has the necessary abilities for human oversight and verification. In assessing the effectiveness and proportionality of the measures, competent authorities should take into account relevant parameters including the number of removal orders and referrals issued to the provider, their economic capacity and the impact of its service in disseminating terrorist content (for example, taking into account the number of users in the Union).

deleted

Or. en

Amendment 153
Josef Weidenholzer

Proposal for a regulation
Recital 18

Text proposed by the Commission Amendment

(18) In order to ensure that hosting service providers exposed to terrorist content take appropriate measures to prevent the misuse of their services, the competent authorities should request hosting service providers having received a removal order, which has become final, to report on the proactive measures taken. These could consist of measures to prevent the re-upload of terrorist content, removed or access to it disabled as a result of a removal order or referrals they received, checking against publicly or privately-held tools containing known terrorist content. They may also employ the use of reliable technical tools to identify new terrorist content, either using those available on the market or those developed by the hosting service provider. The service provider should report on the specific proactive measures in place in order to allow the competent authority to judge whether the measures are effective and proportionate and whether, if automated means are used, the hosting service provider has the necessary abilities for human oversight and verification. In assessing the effectiveness and proportionality of the measures, competent authorities should take into account relevant parameters including the number of removal orders and referrals issued to the provider, their economic capacity and the impact of its service in disseminating terrorist content (for example, taking into account the number of users in the Union).

(18) In order to ensure that hosting service providers exposed to illegal terrorist content take appropriate measures to prevent the misuse of their services by content providers, the competent authorities should request hosting service providers having received a removal order, which has become final, to report on the measures taken.

Or. en

Amendment 154
Cornelia Ernst, Barbara Spinelli, Marie-Christine Vergiat

Proposal for a regulation
Recital 18

Text proposed by the Commission Amendment

(18) In order to ensure that hosting service providers exposed to terrorist content take appropriate measures to prevent the misuse of their services, the competent authorities should request hosting service providers having received a removal order, which has become final, to report on the proactive measures taken. These could consist of measures to prevent the re-upload of terrorist content, removed or access to it disabled as a result of a removal order or referrals they received, checking against publicly or privately-held tools containing known terrorist content. They may also employ the use of reliable technical tools to identify new terrorist content, either using those available on the market or those developed by the hosting service provider. The service provider should report on the specific proactive measures in place in order to allow the competent authority to judge whether the measures are effective and proportionate and whether, if automated means are used, the hosting service provider has the necessary abilities for human oversight and verification. In assessing the effectiveness and proportionality of the measures, competent authorities should take into account relevant parameters including the number of removal orders and referrals issued to the provider, their economic capacity and the impact of its service in disseminating terrorist content (for example, taking into account the number of users in the Union).

(18) The service provider should report on the specific proactive measures in place in order to allow the competent authority to judge whether the measures are effective and proportionate and whether, if automated means are used, the hosting service provider has the necessary abilities for human oversight and verification. In particular, the hosting service provider shall provide the competent authorities with all necessary information about the automated tools used to allow a thorough public oversight on the effectiveness of the tools and to ensure that the latter do not produce discriminatory, untargeted, unspecific or unjustified results.

Or. en

Amendment 155
Eva Joly

Proposal for a regulation
Recital 18

Text proposed by the Commission Amendment

(18) In order to ensure that hosting service providers exposed to terrorist content take appropriate measures to prevent the misuse of their services, the competent authorities should request hosting service providers having received a removal order, which has become final, to report on the proactive measures taken. These could consist of measures to prevent the re-upload of terrorist content, removed or access to it disabled as a result of a removal order or referrals they received, checking against publicly or privately-held tools containing known terrorist content. They may also employ the use of reliable technical tools to identify new terrorist content, either using those available on the market or those developed by the hosting service provider. The service provider should report on the specific proactive measures in place in order to allow the competent authority to judge whether the measures are effective and proportionate and whether, if automated means are used, the hosting service provider has the necessary abilities for human oversight and verification. In assessing the effectiveness and proportionality of the measures, competent authorities should take into account relevant parameters including the number of removal orders and referrals issued to the provider, their economic capacity and the impact of its service in disseminating terrorist content (for example, taking into account the number of users in the Union).

(18) In order to ensure that hosting service providers exposed to terrorist content take appropriate measures to address the misuse of their services, the competent authorities may recommend to hosting service providers having received a significant amount of removal orders, which have become final, additional measures to be taken. These could consist of measures to that employ the use of reliable technical tools to detect and identify new terrorist content, but that should leave the final decision to removal or disabling access to a decision by a natural person. In assessing the effectiveness and proportionality of the measures, competent authorities should take into account relevant parameters including the number of removal orders issued to the provider, their economic capacity and the impact of its service in disseminating terrorist content (for example, taking into account the number of users in the Union).

Or. en
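Amendment 155 would allow automated tools to detect and flag material while reserving the final removal decision to a natural person. The fragment below is a minimal sketch of such a human-in-the-loop workflow; the data structures, the score threshold and the decision labels are invented for illustration and are not drawn from the proposal or the amendment.

    # Sketch of a human-in-the-loop review queue: the automated step only
    # flags items above a threshold, and only a named human reviewer can
    # record the removal decision. Thresholds and labels are illustrative.
    from dataclasses import dataclass, field
    from datetime import datetime, timezone
    from typing import Optional

    @dataclass
    class FlaggedItem:
        content_id: str
        classifier_score: float          # output of an automated detection tool
        flagged_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))
        reviewer: Optional[str] = None
        decision: Optional[str] = None   # "remove", "keep", or None while pending

    def flag_if_suspicious(content_id, classifier_score, threshold=0.9):
        """The automated step only enqueues items; it never removes anything."""
        if classifier_score >= threshold:
            return FlaggedItem(content_id=content_id, classifier_score=classifier_score)
        return None

    def record_human_decision(item, reviewer, remove):
        """Only this step, taken by a natural person, results in removal or retention."""
        item.reviewer = reviewer
        item.decision = "remove" if remove else "keep"
        return item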

Amendment 156
Caterina Chinnici

Proposal for a regulation
Recital 18

Text proposed by the Commission Amendment

(18) In order to ensure that hosting service providers exposed to terrorist content take appropriate measures to prevent the misuse of their services, the competent authorities should request hosting service providers having received a removal order, which has become final, to report on the proactive measures taken. These could consist of measures to prevent the re-upload of terrorist content, removed or access to it disabled as a result of a removal order or referrals they received, checking against publicly or privately-held tools containing known terrorist content. They may also employ the use of reliable technical tools to identify new terrorist content, either using those available on the market or those developed by the hosting service provider. The service provider should report on the specific proactive measures in place in order to allow the competent authority to judge whether the measures are effective and proportionate and whether, if automated means are used, the hosting service provider has the necessary abilities for human oversight and verification. In assessing the effectiveness and proportionality of the measures, competent authorities should take into account relevant parameters including the number of removal orders and referrals issued to the provider, their economic capacity and the impact of its service in disseminating terrorist content (for example, taking into account the number of users in the Union).

(18) In order to ensure that hosting service providers exposed to terrorist content take appropriate measures to prevent the misuse of their services, the competent authorities should request hosting service providers having received a removal order, which has become final, to report on the proactive measures taken. These could consist of measures to prevent the re-upload of terrorist content, removed or access to it disabled as a result of a removal order or referrals they received, checking against publicly or privately-held tools containing known terrorist content. They may also employ the use of reliable technical tools to identify new terrorist content, either using those available on the market or those developed by the hosting service provider. The service provider should report on the specific proactive measures in place in order to allow the competent authority to judge whether the measures are effective and proportionate and whether, if automated means are used, the hosting service provider has the necessary abilities for human oversight and verification. In assessing the effectiveness and proportionality of the measures, competent authorities should take into account relevant parameters including the number of removal orders and referrals issued to the provider, their economic capacity and the impact of its service in disseminating terrorist content (for example, taking into account the number of users in the Union). Long term proactive measures may include the integration of media and information literacy and internet use into national education systems in order to empower young citizens with the tools to learn how to use the internet responsibly, with a view to avoiding possible risks of radicalisation.

Or. en

Amendment 157
Maite Pagazaurtundúa Ruiz, Cecilia Wikström

Proposal for a regulation
Recital 18

Text proposed by the Commission Amendment

(18) In order to ensure that hosting service providers exposed to terrorist content take appropriate measures to prevent the misuse of their services, the competent authorities should request hosting service providers having received a removal order, which has become final, to report on the proactive measures taken. These could consist of measures to prevent the re-upload of terrorist content, removed or access to it disabled as a result of a removal order or referrals they received, checking against publicly or privately-held tools containing known terrorist content. They may also employ the use of reliable technical tools to identify new terrorist content, either using those available on the market or those developed by the hosting service provider. The service provider should report on the specific proactive measures in place in order to allow the competent authority to judge whether the measures are effective and proportionate and whether, if automated means are used, the hosting service provider has the necessary abilities for human oversight and verification. In assessing the effectiveness and proportionality of the measures, competent authorities should take into account relevant parameters including the number of removal orders and referrals issued to the provider, their economic capacity and the impact of its service in disseminating terrorist content (for example, taking into account the number of users in the Union).

(18) In order to ensure that hosting service providers exposed to terrorist content due to the nature of the services they provide take appropriate measures to prevent the misuse of their services, the competent authorities should request these hosting service providers having received a removal order, which has become final, to report on the proactive measures taken. These could consist of measures to prevent the re-appearance of terrorist content, removed or access to it disabled as a result of a removal order or referrals they received, checking against publicly or privately-held tools containing known terrorist content. They may also employ the use of reliable technical tools to identify new terrorist content, either using those available on the market or those developed by the hosting service provider. In assessing the effectiveness and proportionality of the measures, competent authorities should take into account relevant parameters including the number of removal orders and referrals issued to the provider, their economic capacity and the impact of its service in disseminating terrorist content (for example, taking into account the number of users in the Union).

Or. en

Amendment 158
Ana Gomes
Proposal for a regulation
Recital 18

Text proposed by the Commission Amendment

(18) In order to ensure that hosting service providers exposed to terrorist content take appropriate measures to prevent the misuse of their services, the competent authorities should request hosting service providers having received a removal order, which has become final, to report on the proactive measures taken. These could consist of measures to prevent the re-upload of terrorist content, removed or access to it disabled as a result of a removal order or referrals they received, checking against publicly or privately-held tools containing known terrorist content. They may also employ the use of reliable technical tools to identify new terrorist content, either using those available on the market or those developed by the hosting service provider. The service provider should report on the specific proactive measures in place in order to allow the competent authority to judge whether the measures are effective and proportionate and whether, if automated means are used, the hosting service provider has the necessary abilities for human oversight and verification. In assessing the effectiveness and proportionality of the measures, competent authorities should take into account relevant parameters including the number of removal orders and referrals issued to the provider, their economic capacity and the impact of its service in disseminating terrorist content (for example, taking into account the number of users in the Union).

(18) In order to ensure that hosting service providers exposed to terrorist content take appropriate measures to prevent the misuse of their services, the competent authorities should request hosting service providers having received a removal order, which has become final, to report on the proactive measures taken. These could consist of measures to prevent the re-upload of terrorist content, removed or access to it disabled as a result of a removal order or referrals they received, checking against publicly or privately-held tools containing known terrorist content. They may also employ the use of reliable technical tools to identify new terrorist content, either using those available on the market or those developed by the hosting service provider. The service provider should report on the specific proactive measures in place in order to allow the competent authority to judge whether the measures are effective and proportionate and whether, if automated means are used, the hosting service provider has the necessary abilities for human oversight and verification. In assessing the effectiveness and proportionality of the measures, competent authorities should take into account relevant parameters including the number of removal orders and referrals issued to the provider, their economic capacity and the impact of its service in disseminating terrorist content (for example, taking into account the number of users in the Union) as well as the level of exposure of the host service provider to terrorist content.

Or. en

Amendment 159
Nicolas Bay, Gilles Lebreton
Proposal for a regulation
Recital 18

Text proposed by the Commission Amendment

(18) In order to ensure that hosting service providers exposed to terrorist content take appropriate measures to prevent the misuse of their services, the competent authorities should request hosting service providers having received a removal order, which has become final, to report on the proactive measures taken. These could consist of measures to prevent the re-upload of terrorist content, removed or access to it disabled as a result of a removal order or referrals they received, checking against publicly or privately-held tools containing known terrorist content. They may also employ the use of reliable technical tools to identify new terrorist content, either using those available on the market or those developed by the hosting service provider. The service provider should report on the specific proactive measures in place in order to allow the competent authority to judge whether the measures are effective and proportionate and whether, if automated means are used, the hosting service provider has the necessary abilities for human oversight and verification. In assessing the effectiveness and proportionality of the measures, competent authorities should take into account relevant parameters including the number of removal orders and referrals issued to the provider, their economic capacity and the impact of its service in disseminating terrorist content (for example, taking into account the number of users in the Union).

(18) In order to ensure that hosting service providers exposed to terrorist content take appropriate measures to prevent the misuse of their services, the competent authorities should request hosting service providers having received a removal order, which has become final, to report on the proactive measures taken. These could consist of measures to prevent the re-upload of terrorist content, removed or access to it disabled as a result of a removal order or referrals they received, checking against publicly or privately-held tools containing known terrorist content. They may also employ the use of reliable technical tools to identify new terrorist content, either using those available on the market or those developed by the hosting service provider. The service provider should report on the specific proactive measures in place in order to allow the competent authority to judge whether the measures are effective and proportionate and whether, if automated means are used, the hosting service provider has the necessary abilities for systematic human oversight and verification for the first removal of a content. In assessing the effectiveness and proportionality of the measures, competent authorities should take into account relevant parameters including the number of removal orders and referrals issued to the provider, their economic capacity and the impact of its service in disseminating terrorist content (for example, taking into account the number of users in the Union).

Or. en

Amendment 160
Rachida Dati, Monika Hohlmeier, Anna Maria Corazza Bildt, Roberta Metsola
Proposal for a regulation
Recital 18

Text proposed by the Commission Amendment

(18) In order to ensure that hosting service providers exposed to terrorist content take appropriate measures to prevent the misuse of their services, the competent authorities should request hosting service providers having received a removal order, which has become final, to report on the proactive measures taken. These could consist of measures to prevent the re-upload of terrorist content, removed or access to it disabled as a result of a removal order or referrals they received, checking against publicly or privately-held tools containing known terrorist content. They may also employ the use of reliable technical tools to identify new terrorist content, either using those available on the market or those developed by the hosting service provider. The service provider should report on the specific proactive measures in place in order to allow the competent authority to judge whether the measures are effective and proportionate and whether, if automated means are used, the hosting service provider has the necessary abilities for human oversight and verification. In assessing the effectiveness and proportionality of the measures, competent authorities should take into account relevant parameters including the number of removal orders and referrals issued to the provider, their economic capacity and the impact of its service in disseminating terrorist content (for example, taking into account the number of users in the Union).

(18) In order to ensure that hosting service providers exposed to terrorist content take appropriate measures to prevent the misuse of their services, the competent authorities should request hosting service providers having received a removal order, which has become final, to report on the proactive measures taken. These could consist of measures to prevent the re-upload of terrorist content, removed or access to it disabled as a result of a removal order or referrals they received, checking against publicly or privately-held tools containing known terrorist content. They may also employ the use of reliable technical tools to identify new terrorist content, either using those available on the market or those developed by the hosting service provider. The service provider should report on the specific proactive measures in place in order to allow the competent authority to judge whether the measures are effective and proportionate and whether, if automated means are used, the hosting service provider has the necessary abilities for human oversight and verification. In assessing the effectiveness and proportionality of the measures, competent authorities should take into account relevant parameters including the number of removal orders and referrals issued to the provider, their size and economic capacity and the impact of its service in disseminating terrorist content (for example, taking into account the number of users in the Union).

Or. en

Amendment 161
Birgit Sippel, Anna Hedh, Josef Weidenholzer
Proposal for a regulation
Recital 19

Text proposed by the Commission Amendment

(19) Following the request, the competent authority should enter into a dialogue with the hosting service provider about the necessary proactive measures to be put in place. If necessary, the competent authority should impose the adoption of appropriate, effective and proportionate proactive measures where it considers that the measures taken are insufficient to meet the risks. A decision to impose such specific proactive measures should not, in principle, lead to the imposition of a general obligation to monitor, as provided in Article 15(1) of Directive 2000/31/EC. Considering the particularly grave risks associated with the dissemination of terrorist content, the decisions adopted by the competent authorities on the basis of this Regulation could derogate from the approach established in Article 15(1) of Directive 2000/31/EC, as regards certain specific, targeted measures, the adoption of which is necessary for overriding public security reasons. Before adopting such decisions, the competent authority should strike a fair balance between the public interest objectives and the fundamental rights involved, in particular, the freedom of expression and information and the freedom to conduct a business, and provide appropriate justification.

deleted

Or. en

Justification

Recital 19 as well as Art. 6 of this proposal explicitly break with the limited liability principles set out in Directive (EC) 2000/31 (the ‘e-commerce directive’): Art. 14 stipulates that companies can host users’ content without assuming liability for the legality of that content (host only has to take action once notified about illegal content); Art. 15 prohibits a general monitoring obligation for companies, i.e. Member states cannot oblige ISPs to systematically scan and filter uploaded content. The proposed regulation does not provide sufficient evidence and assessment against objective criteria that would justify derogating from this well-established principle in EU law.

Amendment 162
Josef Weidenholzer
Proposal for a regulation
Recital 19

Text proposed by the Commission Amendment

(19) Following the request, the competent authority should enter into a dialogue with the hosting service provider about the necessary proactive measures to be put in place. If necessary, the competent authority should impose the adoption of appropriate, effective and proportionate proactive measures where it considers that the measures taken are insufficient to meet the risks. A decision to impose such specific proactive measures should not, in principle, lead to the imposition of a general obligation to monitor, as provided in Article 15(1) of Directive 2000/31/EC. Considering the particularly grave risks associated with the dissemination of terrorist content, the decisions adopted by the competent authorities on the basis of this Regulation could derogate from the approach established in Article 15(1) of Directive 2000/31/EC, as regards certain specific, targeted measures, the adoption of which is necessary for overriding public security reasons. Before adopting such decisions, the competent authority should strike a fair balance between the public interest objectives and the fundamental rights involved, in particular, the freedom of expression and information and the freedom to conduct a business, and provide appropriate justification.

deleted

Or. en

Amendment 163
Marietje Schaake, Sophia in ‘t Veld, Angelika Mlinar
Proposal for a regulation
Recital 19

Text proposed by the Commission Amendment

(19) Following the request, the competent authority should enter into a dialogue with the hosting service provider about the necessary proactive measures to be put in place. If necessary, the competent authority should impose the adoption of appropriate, effective and proportionate proactive measures where it considers that the measures taken are insufficient to meet the risks. A decision to impose such specific proactive measures should not, in principle, lead to the imposition of a general obligation to monitor, as provided in Article 15(1) of Directive 2000/31/EC. Considering the particularly grave risks associated with the dissemination of terrorist content, the decisions adopted by the competent authorities on the basis of this Regulation could derogate from the approach established in Article 15(1) of Directive 2000/31/EC, as regards certain specific, targeted measures, the adoption of which is necessary for overriding public security reasons. Before adopting such decisions, the competent authority should strike a fair balance between the public interest objectives and the fundamental rights involved, in particular, the freedom of expression and information and the freedom to conduct a business, and provide appropriate justification.

(19) The measures taken by the hosting service provider should not lead to the imposition of a general monitoring, as provided in Article 15(1) of Directive 2000/31/EC. Before adopting such decisions, the competent authority should strike a fair balance between the public interest objectives and the fundamental rights involved, in particular, the freedom of expression and information and the freedom to conduct a business, and provide appropriate justification.

Or. en

Amendment 164
Cornelia Ernst, Barbara Spinelli, Marie-Christine Vergiat
Proposal for a regulation
Recital 19

Text proposed by the Commission Amendment

(19) Following the request, the competent authority should enter into a dialogue with the hosting service provider about the necessary proactive measures to be put in place. If necessary, the competent authority should impose the adoption of appropriate, effective and proportionate proactive measures where it considers that the measures taken are insufficient to meet the risks. A decision to impose such specific proactive measures should not, in principle, lead to the imposition of a general obligation to monitor, as provided in Article 15(1) of Directive 2000/31/EC. Considering the particularly grave risks associated with the dissemination of terrorist content, the decisions adopted by the competent authorities on the basis of this Regulation could derogate from the approach established in Article 15(1) of Directive 2000/31/EC, as regards certain specific, targeted measures, the adoption of which is necessary for overriding public security reasons. Before adopting such decisions, the competent authority should strike a fair balance between the public interest objectives and the fundamental rights involved, in particular, the freedom of expression and information and the freedom to conduct a business, and provide appropriate justification.

(19) The competent authority should enter into a dialogue with the hosting service provider about the proactive measures adopted by the hosting service provider. Such dialogue should not lead to the imposition of a general obligation to monitor, as provided in Article 15(1) of Directive 2000/31/EC. The competent authority and the hosting service provider should strike a fair balance between the public interest objectives and the fundamental rights involved, in particular, the right to freedom of expression and information, the right to privacy and the freedom to conduct a business, and provide appropriate justification.

Or. en


Amendment 165
Eva Joly
Proposal for a regulation
Recital 19

Text proposed by the Commission Amendment

(19) Following the request, the competent authority should enter into a dialogue with the hosting service provider about the necessary proactive measures to be put in place. If necessary, the competent authority should impose the adoption of appropriate, effective and proportionate proactive measures where it considers that the measures taken are insufficient to meet the risks. A decision to impose such specific proactive measures should not, in principle, lead to the imposition of a general obligation to monitor, as provided in Article 15(1) of Directive 2000/31/EC. Considering the particularly grave risks associated with the dissemination of terrorist content, the decisions adopted by the competent authorities on the basis of this Regulation could derogate from the approach established in Article 15(1) of Directive 2000/31/EC, as regards certain specific, targeted measures, the adoption of which is necessary for overriding public security reasons. Before adopting such decisions, the competent authority should strike a fair balance between the public interest objectives and the fundamental rights involved, in particular, the freedom of expression and information and the freedom to conduct a business, and provide appropriate justification.

(19) Following the recommendation, the competent authority should enter into a dialogue with the hosting service provider about the necessary additional measures to be put in place. Such measures should not lead to the imposition of a general obligation to monitor uploaded content, as provided in Article 15(1) of Directive 2000/31/EC. Before adopting such recommendations, the competent authority should strike a fair balance between the public interest objectives and the fundamental rights involved, in particular, the freedom of expression and information, the freedom of the media, the rights to privacy and to the protection of personal data, and the freedom to conduct a business, and provide appropriate justification.

Or. en

Amendment 166
Maite Pagazaurtundúa Ruiz, Cecilia Wikström
Proposal for a regulation
Recital 19

Text proposed by the Commission Amendment


(19) Following the request, the competent authority should enter into a dialogue with the hosting service provider about the necessary proactive measures to be put in place. If necessary, the competent authority should impose the adoption of appropriate, effective and proportionate proactive measures where it considers that the measures taken are insufficient to meet the risks. A decision to impose such specific proactive measures should not, in principle, lead to the imposition of a general obligation to monitor, as provided in Article 15(1) of Directive 2000/31/EC. Considering the particularly grave risks associated with the dissemination of terrorist content, the decisions adopted by the competent authorities on the basis of this Regulation could derogate from the approach established in Article 15(1) of Directive 2000/31/EC, as regards certain specific, targeted measures, the adoption of which is necessary for overriding public security reasons. Before adopting such decisions, the competent authority should strike a fair balance between the public interest objectives and the fundamental rights involved, in particular, the freedom of expression and information and the freedom to conduct a business, and provide appropriate justification.

(19) Following the request, the competent authority should enter into a constructive dialogue with the hosting service provider about the necessary proactive measures to be put in place. If necessary, the competent authority should impose the adoption of appropriate, effective and proportionate proactive measures where there is a manifest lack of cooperation coming from the hosting service provider. A decision to impose such specific proactive measures should not lead to the imposition of a general obligation to monitor, as provided in Article 15(1) of Directive 2000/31/EC. Before adopting such decisions, the competent authority should strike a fair balance between the public interest objectives and the fundamental rights involved, in particular, the freedom of expression and information and the freedom to conduct a business, the economic capacity of the hosting service provider, and provide appropriate justification.

Or. en

Amendment 167
Andrejs Mamikins
Proposal for a regulation
Recital 19

Text proposed by the Commission Amendment

(19) Following the request, the competent authority should enter into a dialogue with the hosting service provider about the necessary proactive measures to be put in place. If necessary, the competent authority should impose the adoption of appropriate, effective and proportionate proactive measures where it considers that the measures taken are insufficient to meet the risks. A decision to impose such specific proactive measures should not, in principle, lead to the imposition of a general obligation to monitor, as provided in Article 15(1) of Directive 2000/31/EC. Considering the particularly grave risks associated with the dissemination of terrorist content, the decisions adopted by the competent authorities on the basis of this Regulation could derogate from the approach established in Article 15(1) of Directive 2000/31/EC, as regards certain specific, targeted measures, the adoption of which is necessary for overriding public security reasons. Before adopting such decisions, the competent authority should strike a fair balance between the public interest objectives and the fundamental rights involved, in particular, the freedom of expression and information and the freedom to conduct a business, and provide appropriate justification.

(19) Following the request, the competent authority should enter into a dialogue with the hosting service provider about the necessary proactive measures to be put in place. If necessary, the competent authority should impose the adoption of appropriate, effective and proportionate proactive measures where it considers that the measures taken are insufficient to meet the risks. The competent authority should only impose proactive measures that the hosting service provider can reasonably be expected to implement, taking into account, among other factors, the hosting service providers’ available human resources. A decision to impose such specific proactive measures should not, in principle, lead to the imposition of a general obligation to monitor, as provided in Article 15(1) of Directive 2000/31/EC. Considering the particularly grave risks associated with the dissemination of terrorist content, the decisions adopted by the competent authorities on the basis of this Regulation could derogate from the approach established in Article 15(1) of Directive 2000/31/EC, as regards certain specific, targeted measures, the adoption of which is necessary for overriding public security reasons. Before adopting such decisions, the competent authority should strike a fair balance between the public interest objectives and the fundamental rights involved, in particular, the freedom of expression and information and the freedom to conduct a business, and provide appropriate justification.

Or. en

Amendment 168
Ana Gomes
Proposal for a regulation
Recital 19

Text proposed by the Commission Amendment

(19) Following the request, the competent authority should enter into a dialogue with the hosting service provider about the necessary proactive measures to be put in place. If necessary, the competent authority should impose the adoption of appropriate, effective and proportionate proactive measures where it considers that the measures taken are insufficient to meet the risks. A decision to impose such specific proactive measures should not, in principle, lead to the imposition of a general obligation to monitor, as provided in Article 15(1) of Directive 2000/31/EC. Considering the particularly grave risks associated with the dissemination of terrorist content, the decisions adopted by the competent authorities on the basis of this Regulation could derogate from the approach established in Article 15(1) of Directive 2000/31/EC, as regards certain specific, targeted measures, the adoption of which is necessary for overriding public security reasons. Before adopting such decisions, the competent authority should strike a fair balance between the public interest objectives and the fundamental rights involved, in particular, the freedom of expression and information and the freedom to conduct a business, and provide appropriate justification.

(19) Following the request, the competent authority should enter into a dialogue with the hosting service provider about the necessary proactive measures to be put in place. If necessary, the competent authority should impose the adoption of appropriate, effective and proportionate proactive measures where it considers that the measures taken are insufficient to meet the risks. A decision to impose such specific proactive measures should not lead to the imposition of a general obligation to monitor, as provided in Article 15(1) of Directive 2000/31/EC, and has to guarantee that it does not conflict with existing EU law, in particular the latter E-Commerce Directive and its Article 15(1). Considering the particularly grave risks associated with the dissemination of terrorist content, the decisions adopted by the competent judicial authorities on the basis of this Regulation could derogate from the approach established in Article 15(1) of Directive 2000/31/EC, as regards certain specific, targeted measures, the adoption of which is necessary for overriding public security reasons. Before adopting such decisions, the competent judicial authority should strike a fair balance between the public interest objectives and the fundamental rights involved, in particular, the freedom of expression and information and the freedom to conduct a business, and provide appropriate justification.

Or. en

Amendment 169
Rachida Dati, Monika Hohlmeier, Anna Maria Corazza Bildt, Roberta Metsola
Proposal for a regulation
Recital 19

Text proposed by the Commission Amendment

(19) Following the request, the competent authority should enter into a dialogue with the hosting service provider about the necessary proactive measures to be put in place. If necessary, the competent authority should impose the adoption of appropriate, effective and proportionate proactive measures where it considers that the measures taken are insufficient to meet the risks. A decision to impose such specific proactive measures should not, in principle, lead to the imposition of a general obligation to monitor, as provided in Article 15(1) of Directive 2000/31/EC. Considering the particularly grave risks associated with the dissemination of terrorist content, the decisions adopted by the competent authorities on the basis of this Regulation could derogate from the approach established in Article 15(1) of Directive 2000/31/EC, as regards certain specific, targeted measures, the adoption of which is necessary for overriding public security reasons. Before adopting such decisions, the competent authority should strike a fair balance between the public interest objectives and the fundamental rights involved, in particular, the freedom of expression and information and the freedom to conduct a business, and provide appropriate justification.

(19) Following the request, the competent authority should enter into a dialogue with the hosting service provider about the necessary proactive measures to be put in place. If necessary, the competent authority should impose the adoption of appropriate, effective and proportionate proactive measures where it considers that the measures taken are insufficient to meet the risks taking into consideration the economic and technical capacity of the platform. A decision to impose such specific proactive measures should not, in principle, lead to the imposition of a general obligation to monitor, as provided in Article 15(1) of Directive 2000/31/EC. Considering the particularly grave risks associated with the dissemination of terrorist content, the decisions adopted by the competent authorities on the basis of this Regulation could derogate from the approach established in Article 15(1) of Directive 2000/31/EC, as regards certain specific, targeted measures, the adoption of which is necessary for overriding public security reasons. Before adopting such decisions, the competent authority should strike a fair balance between the public interest objectives and the fundamental rights involved, in particular, the freedom of expression and information and the freedom to conduct a business, and provide appropriate justification.

Or. en

Amendment 170
Cornelia Ernst, Barbara Spinelli, Marie-Christine Vergiat
Proposal for a regulation
Recital 20

Text proposed by the Commission Amendment

(20) The obligation on hosting service providers to preserve removed content and related data, should be laid down for specific purposes and limited in time to what is necessary. There is need to extend the preservation requirement to related data to the extent that any such data would otherwise be lost as a consequence of the removal of the content in question. Related data can include data such as ‘subscriber data’, including in particular data pertaining to the identity of the content provider as well as ‘access data’, including for instance data about the date and time of use by the content provider, or the log-in to and log-off from the service, together with the IP address allocated by the internet access service provider to the content provider.

deleted

Or. en

Amendment 171
Birgit Sippel, Anna Hedh, Josef Weidenholzer
Proposal for a regulation
Recital 20

Text proposed by the Commission Amendment

(20) The obligation on hosting service providers to preserve removed content and related data, should be laid down for specific purposes and limited in time to what is necessary. There is need to extend the preservation requirement to related data to the extent that any such data would otherwise be lost as a consequence of the removal of the content in question. Related data can include data such as ‘subscriber data’, including in particular data pertaining to the identity of the content provider as well as ‘access data’, including for instance data about the date and time of use by the content provider, or the log-in to and log-off from the service, together with the IP address allocated by the internet access service provider to the content provider.

(20) The obligation on hosting service providers to preserve removed content and related data, should be for the sole purpose of proceedings of administrative or judicial review and remedy and limited in time to what is strictly necessary, but generally not exceeding six months. Where there is legitimate need to extend the preservation requirement to related data, it can only be preserved to the extent that any such data would otherwise be lost as a consequence of the removal of the content in question. Related data should be limited to ‘subscriber data’, i.e. data pertaining to the identity of the content provider as well as ‘access data’, i.e. data about the date and time of use by the content provider, or the log-in to and log-off from the service, together with the IP address allocated by the internet access service provider to the content provider.

Or. en

Amendment 172
Josef Weidenholzer
Proposal for a regulation
Recital 20

Text proposed by the Commission Amendment

(20) The obligation on hosting service providers to preserve removed content and related data, should be laid down for specific purposes and limited in time to what is necessary. There is need to extend the preservation requirement to related data to the extent that any such data would otherwise be lost as a consequence of the removal of the content in question. Related data can include data such as ‘subscriber data’, including in particular data pertaining to the identity of the content provider as well as ‘access data’, including for instance data about the date and time of use by the content provider, or the log-in to and log-off from the service, together with the IP address allocated by the internet access service provider to the content provider.

(20) The obligation on hosting service providers to preserve removed content and related data, should be for the sole purpose of proceedings of administrative or judicial review and remedy and limited in time to what is strictly necessary, but generally not exceeding six months. Where there is legitimate need to extend the preservation requirement to related data, it can only be preserved to the extent that any such data would otherwise be lost as a consequence of the removal of the content in question. Related data should be limited to ‘subscriber data’, i.e. data pertaining to the identity of the content provider as well as ‘access data’, i.e. data about the date and time of use by the content provider, or the log-in to and log-off from the service, together with the IP address allocated by the internet access service provider to the content provider.

Or. en

Amendment 173
Josef Weidenholzer
Proposal for a regulation
Recital 21

Text proposed by the Commission Amendment

(21) The obligation to preserve the content for proceedings of administrative or judicial review is necessary and justified in view of ensuring the effective measures of redress for the content provider whose content was removed or access to it disabled as well as for ensuring the reinstatement of that content as it was prior to its removal depending on the outcome of the review procedure. The obligation to preserve content for investigative and prosecutorial purposes is justified and necessary in view of the value this material could bring for the purpose of disrupting or preventing terrorist activity. Where companies remove material or disable access to it, in particular through their own proactive measures, and do not inform the relevant authority because they assess that it does not fall in the scope of Article 13(4) of this Regulation, law enforcement may be unaware of the existence of the content. Therefore, the preservation of content for purposes of prevention, detection, investigation and prosecution of terrorist offences is also justified. For these purposes, the required preservation of data is limited to data that is likely to have a link with terrorist offences, and can therefore contribute to prosecuting terrorist offences or to preventing serious risks to public security.

(21) The obligation to preserve the content for proceedings of administrative or judicial review is necessary and justified in view of ensuring the effective measures of redress for the content provider whose content was removed as well as for ensuring the reinstatement of that content as it was prior to its removal depending on the outcome of the review procedure.

Or. en

Amendment 174
Cornelia Ernst, Barbara Spinelli, Marie-Christine Vergiat
Proposal for a regulation
Recital 21

Text proposed by the Commission Amendment

(21) The obligation to preserve the content for proceedings of administrative or judicial review is necessary and justified in view of ensuring the effective measures of redress for the content provider whose content was removed or access to it disabled as well as for ensuring the reinstatement of that content as it was prior to its removal depending on the outcome of the review procedure. The obligation to preserve content for investigative and prosecutorial purposes is justified and necessary in view of the value this material could bring for the purpose of disrupting or preventing terrorist activity. Where companies remove material or disable access to it, in particular through their own proactive measures, and do not inform the relevant authority because they assess that it does not fall in the scope of Article 13(4) of this Regulation, law enforcement may be unaware of the existence of the content. Therefore, the preservation of content for purposes of prevention, detection, investigation and prosecution of terrorist offences is also justified. For these purposes, the required preservation of data is limited to data that is likely to have a link with terrorist offences, and can therefore contribute to prosecuting terrorist offences or to preventing serious risks to public security.

(21) The obligation to preserve the content for proceedings of administrative or judicial review is necessary and justified in view of ensuring the effective measures of redress for the content provider whose content was removed or access to it disabled as well as for ensuring the reinstatement of that content as it was prior to its removal depending on the outcome of the review procedure.

Or. en

Amendment 175
Marietje Schaake, Sophia in ‘t Veld, Angelika Mlinar
Proposal for a regulation
Recital 21

Text proposed by the Commission Amendment

(21) The obligation to preserve the content for proceedings of administrative or judicial review is necessary and justified in view of ensuring the effective measures of redress for the content provider whose content was removed or access to it disabled as well as for ensuring the reinstatement of that content as it was prior to its removal depending on the outcome of the review procedure. The obligation to preserve content for investigative and prosecutorial purposes is justified and necessary in view of the value this material could bring for the purpose of disrupting or preventing terrorist activity. Where companies remove material or disable access to it, in particular through their own proactive measures, and do not inform the relevant authority because they assess that it does not fall in the scope of Article 13(4) of this Regulation, law enforcement may be unaware of the existence of the content. Therefore, the preservation of content for purposes of prevention, detection, investigation and prosecution of terrorist offences is also justified. For these purposes, the required preservation of data is limited to data that is likely to have a link with terrorist offences, and can therefore contribute to prosecuting terrorist offences or to preventing serious risks to public security.

(21) The obligation to preserve the content for proceedings of independent administrative or judicial review is necessary and justified in view of ensuring the effective measures of redress for the content provider whose content was removed or access to it disabled as well as for ensuring the reinstatement of that content as it was prior to its removal depending on the outcome of the review procedure. The obligation to preserve content for investigative and prosecutorial purposes is justified and necessary in view of the value this material could bring for the purpose of disrupting or preventing terrorist activity. Where companies remove material or disable access to it, in particular through their own proactive measures, and do not inform the independent administrative or judicial authority because they assess that it does not fall in the scope of Article 13(4) of this Regulation, law enforcement may be unaware of the existence of the content. Therefore, the preservation of content for purposes of prevention, detection, investigation and prosecution of terrorist offences is also justified. For these purposes, the required preservation of data is limited to data that is likely to have a link with terrorist offences, and can therefore contribute to prosecuting terrorist offences or to preventing serious risks to public security.

Or. en

Amendment 176
Eva Joly
Proposal for a regulation
Recital 21

Text proposed by the Commission Amendment

(21) The obligation to preserve the content for proceedings of administrative or judicial review is necessary and justified in view of ensuring the effective measures of redress for the content provider whose content was removed or access to it disabled as well as for ensuring the reinstatement of that content as it was prior to its removal depending on the outcome of the review procedure. The obligation to preserve content for investigative and prosecutorial purposes is justified and necessary in view of the value this material could bring for the purpose of disrupting or preventing terrorist activity. Where companies remove material or disable access to it, in particular through their own proactive measures, and do not inform the relevant authority because they assess that it does not fall in the scope of Article 13(4) of this Regulation, law enforcement may be unaware of the existence of the content. Therefore, the preservation of content for purposes of prevention, detection, investigation and prosecution of terrorist offences is also justified. For these purposes, the required preservation of data is limited to data that is likely to have a link with terrorist offences, and can therefore contribute to prosecuting terrorist offences or to preventing serious risks to public security.

(21) The obligation to preserve the content for proceedings of administrative or judicial review is necessary and justified in view of ensuring the effective measures of redress for the content provider whose content was removed or access to it disabled as well as for ensuring the reinstatement of that content as it was prior to its removal depending on the outcome of the review procedure. The obligation to preserve content for investigative and prosecutorial purposes is justified and necessary in view of the value this material could bring for the purpose of disrupting or preventing terrorist activity, and for prosecuting and convicting terrorists. Where companies remove material or disable access to it, in particular through their own additional measures, law enforcement may be unaware of the existence of the content. Therefore, the preservation of content for purposes of prevention, detection, investigation and prosecution of terrorist offences, which should be initiated after the authorities have been informed pursuant to Article 13(4) of this Regulation, is also justified. For these purposes, the required preservation of data is limited to data that is likely to have a link with terrorist offences, and can therefore contribute to prosecuting terrorist offences or to preventing serious risks to public security.

Or. en


Amendment 177
Josef Weidenholzer
Proposal for a regulation
Recital 22

Text proposed by the Commission Amendment

(22) To ensure proportionality, the period of preservation should be limited to six months to allow the content providers sufficient time to initiate the review process and to enable law enforcement access to relevant data for the investigation and prosecution of terrorist offences. However, this period may be prolonged for the period that is necessary in case the review proceedings are initiated but not finalised within the six months period upon request by the authority carrying out the review. This duration should be sufficient to allow law enforcement authorities to preserve the necessary evidence in relation to investigations, while ensuring the balance with the fundamental rights concerned.

(22) To ensure proportionality, the period of preservation should be limited to six months to allow the content providers sufficient time to initiate the review process. However, this period may be prolonged for the period that is necessary for the sole case that the review proceedings are initiated but not finalised within the six months period upon request by the authority carrying out the review.

Or. en

Amendment 178
Jeroen Lenaers
Proposal for a regulation
Recital 22

Text proposed by the Commission Amendment

(22) To ensure proportionality, the period of preservation should be limited to six months to allow the content providers sufficient time to initiate the review process and to enable law enforcement access to relevant data for the investigation and prosecution of terrorist offences. However, this period may be prolonged for the period that is necessary in case the review proceedings are initiated but not finalised within the six months period upon request by the authority carrying out the review. This duration should be sufficient to allow law enforcement authorities to preserve the necessary evidence in relation to investigations, while ensuring the balance with the fundamental rights concerned.

(22) To ensure proportionality, the period of preservation should be limited to one year to allow the content providers sufficient time to initiate the review process and to enable law enforcement access to relevant data for the investigation and prosecution of terrorist offences. However, this period may be prolonged for the period that is necessary in case the proceedings are initiated but not finalised within the two years period upon request by the authority carrying out the review process or the investigation and prosecution of terrorist offences. This duration should take into consideration the special nature of the removed content, which has already been deemed terrorist content after an assessment by the competent authority, and should therefore be sufficient to allow law enforcement authorities to preserve the necessary evidence in order to effectively investigate and prosecute, while ensuring the balance with the fundamental rights concerned.

Or. en
