EDRi-gram 15.7, 5 April 2017

Read online: https://edri.org/edri-gram/15-7/


1. Social media companies launch upload filter to combat extremism
2. Denmark: Weakening the oversight of intelligence services
3. UK government attacks encryption … again
4. Reckless social media law threatens freedom of expression in Germany
5. What to do with the online platforms – the academics' point of view
6. ENDitorial: Transparency and law-making – mutually exclusive?
7. Recommended Action
8. Recommended Reading
9. Agenda
10. About

1. Social media companies launch upload filter to combat extremism

A database set up jointly by Facebook, Microsoft, Twitter and YouTube
aims to identify “terrorist and radicalising” content automatically and
to remove it from these platforms.

The prototype of a mechanism to prevent the publication of violent
terrorist content on platforms such as Facebook and Twitter commenced
operations last week. This was announced by European Commissioner for
Migration, Home Affairs and Citizenship, Dimitris Avramopoulos, who met
representatives from Facebook, Twitter and YouTube on 10 March in order
to discuss the progress made so far with regard to the “removal of
terrorist content online”.

Support our work with a one-off donation!

It appears that no research whatsoever has been done on the likely
impact of this initiative: there are no review mechanisms to assess its
effects and no way of establishing whether the initiative is
counter-productive.

This prototype is a database operated jointly by Facebook, YouTube,
Twitter and Microsoft that gathers “digital fingerprints” (hashes) of
content marked as “terrorist” or “extremist”. Once designated as such,
photos or videos can, in theory, no longer be uploaded to these
platforms. The upload filters are intended to ensure that undesirable
content is identified and removed more swiftly. The role of judicial and
law enforcement authorities in this process has, unsurprisingly, not
been mentioned.
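In essence, the mechanism described above is a shared blocklist of content fingerprints that each platform consults before publishing an upload. The following is a minimal sketch of that flow, assuming simple exact-match SHA-256 hashing for illustration; the actual database reportedly uses perceptual “digital fingerprints” that can also match re-encoded copies, and all names below are hypothetical:

```python
import hashlib

class HashDatabase:
    """Shared blocklist of fingerprints of content flagged as
    "terrorist" or "extremist" (illustrative, exact-match only)."""

    def __init__(self):
        self._hashes = set()

    def flag(self, content: bytes) -> None:
        # Store only the fingerprint, not the content itself.
        self._hashes.add(hashlib.sha256(content).hexdigest())

    def is_flagged(self, content: bytes) -> bool:
        return hashlib.sha256(content).hexdigest() in self._hashes

def accept_upload(db: HashDatabase, content: bytes) -> bool:
    # Each participating platform checks the shared database
    # before publishing an upload.
    return not db.is_flagged(content)

db = HashDatabase()
db.flag(b"flagged-video-bytes")
print(accept_upload(db, b"flagged-video-bytes"))  # False
print(accept_upload(db, b"harmless-cat-video"))   # True
```

Note that an exact hash match fails as soon as a file is re-encoded or cropped, which is why such systems tend to rely on perceptual hashing; either way, the matching decision happens entirely inside the companies' infrastructure, with no judicial step.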

The participating companies are part of what is known as the EU Internet
Forum. With this initiative, the European Commission intends to
encourage internet companies to, among other things, monitor content on
their platforms more intensively, outside of an accountable, law-based
framework.

Alongside the removal of content online, the EU Internet Forum discusses
further measures in the area of cyber security and the production of
electronic evidence. The ministers of the interior of the EU member
states are calling for greater numbers of direct inquiries to be
submitted to companies in the future, thereby circumventing the often
laborious route that is international judicial assistance.

This primarily applies to the operators of cloud servers in the US. The
Commission is currently assessing whether US companies could fall under
the remit of the European Investigation Order. This directive could be
extended to include operators that, while headquartered in a third
country, offer their services in the European Union.

Since the establishment of the EU Internet Forum in December 2015,
access by investigative authorities to encrypted telecommunication has
been on the agenda. According to the German Federal Ministry of the
Interior, the European Commission had initially kept a low profile in
this area. According to a Commission press release, the issue of
encryption was, however, discussed at the last meeting of the Forum.

The EU’s Counter-terrorism Coordinator, Gilles de Kerchove, who has
called for assistance with decryption by companies in a number of papers
over the last two years, was also in attendance. His post was
established in order to present new priority areas for action with
respect to fighting terrorism and extremism on a biannual basis.

Likewise, under the umbrella of the EU Internet Forum, the Commission is
currently launching an EU Civil Society Empowerment Programme (CSEP).
This is overseen by the European Commission’s Radicalisation Awareness
Network (RAN), which became fully operational as a “Centre of Excellence” in 2016.

In previous press releases, the Commission announced that the programme
would receive financial support to the tune of ten million euros. It is
intended to help “civil society, grassroots groups and credible voices”
to fill the internet with “alternative narratives”. A particular focus
is on “capacity and/or resources” for disseminating messages to achieve
this end. The aim here is for participants to develop campaigns in
cooperation with internet companies.

Little information is available regarding the EU Civil Society
Empowerment Programme. On the Commission’s website, it appears that an
opening event was to take place at the beginning of March 2017, attended
by internet companies, “marketing experts” and “civil society”.
Following this event, campaigns were launched, but no details have been
disclosed. In 2016, it was announced that Twitter could accord
“counter-narratives” greater visibility without charging the usual fee
for this service.

Update: Statewatch published the 2017-2019 Joint Activity Plan (JAP) for
the Civil Society Empowerment Programme at

The article was originally published at

EDRi: The tale of the fight for transparency in the EU Internet Forum

Council conclusions on improving criminal justice in cyberspace

Progress Report following the Conclusions of the Council of the European
Union on Improving Criminal Justice in Cyberspace

(Contribution by Matthias Monroy, Bürgerrechte & Polizei/CILIP, Germany)

2. Denmark: Weakening the oversight of intelligence services

A draft law to amend the data protection provisions of the law on the
oversight of the Danish Security and Intelligence Service (PET) was
submitted for public consultation in September 2016. In their
consultation responses, several NGOs including EDRi member IT-Pol
Denmark, as well as the Danish Intelligence Oversight Board (TET)
criticised the proposal. The amendments would legalise PET’s existing
data processing practices, removing any obligation to regularly assess
whether the information collected on citizens is still necessary, as
well as the obligation to delete personal data, in some circumstances.

The Danish Security and Intelligence Service (PET) is part of the Danish
National Police. The main responsibility of PET is prevention and
prosecution of offences under chapters 12 and 13 of the Danish Penal
Code, which cover national security and terrorism. Compared to the rest
of the Danish National Police service, PET is subject to much weaker
data protection standards. For data collection, the main rule is that
PET can collect information on citizens, unless it can be ruled out
beforehand that the information is relevant. Upon request, all Danish
public authorities are required to provide information on citizens to
PET without a court order, if PET believes that the information can be
assumed to be relevant for PET’s tasks in connection with chapters 12
and 13 of the Penal Code. Furthermore, most of the provisions of the
Data Protection Act do not apply to PET. Denmark is currently
transposing the Law Enforcement Data Protection (LEDP) Directive
2016/680 into national law. In the draft law, PET is completely exempted
based on the national security exemption in Article 2(3)(a) and recital
14 of the LEDP Directive, even though PET regularly exchanges
information with police authorities in other EU Member States.

Since 2014, independent oversight of PET has been provided by the Danish
Intelligence Oversight Board (TET). The oversight of PET covers the
provisions in the special PET law on data collection and internal
information processing, including the rules for deletion of personal
data when it is no longer necessary or when the statutory retention
period of 10-15 years is exceeded. All citizens can ask TET to
investigate whether PET processes information about them unlawfully. If
the investigation shows that information is processed unlawfully, TET
can order PET to delete the information, but the citizen will not be
notified of this decision. TET can also investigate the data processing
practices of PET on its own initiative. Last but not least, TET
publishes an annual report about its oversight of PET.

The annual TET reports for 2014 and 2015 contained substantial criticism
of PET. Even though the legal standards for processing personal data on
citizens are very weak, PET apparently has severe problems living up to
these standards. For the 2014 report, TET looked at a sample of persons
registered by PET and found that information about roughly half of them
should have been deleted because retention periods were exceeded or
because the information was no longer necessary. For the 2015 report,
TET conducted a more detailed investigation of PET’s data processing
practices which confirmed the conclusions of the 2014 report. The
databases of PET contained a substantial amount of personal data which
should have been deleted, at least under the interpretation of the PET
law used by TET.

The TET report for the year 2015 also revealed that TET and PET did not
agree on the interpretation of the law governing the operations of PET.
The main controversy was related to personal data which was part of
another document. TET interpreted the PET law as saying that if
information about a citizen was no longer necessary, the information
should be deleted, irrespective of whether the personal data in question
was a full document or part of another document. PET interpreted the law
differently and refused to delete the personal data if it was part of
another document which was still necessary for PET’s tasks.

For oversight investigations that are not linked to complaints from
citizens, TET can only make recommendations to PET and the Ministry of
Justice. In May 2015, TET informed the Ministry of Justice of the
disagreement with PET, but despite several requests to the Ministry of
Justice for a reply to the letter, TET had not received a reply by May
2016. Shortly after the TET report for 2015 with the substantial
criticism of PET was published in May 2016, the Minister of Justice
announced that he would propose amendments to the PET law in the next
parliamentary year to clarify the legal issues raised by TET.

A draft law amending the PET law was submitted for public consultation
in September 2016. Two specific amendments “clarified” the legal
situation for PET by simply removing the two specific data protection
obligations which had given rise to the criticism in the 2014 and 2015
annual reports from TET. The first amendment removes any obligation on
PET to regularly assess whether the information collected on citizens is
still necessary. Under the amendment, PET is only required to delete
documents and cases that are no longer necessary, if PET discovers this
during other information processing tasks. The second amendment provides
that PET has no obligation to delete personal data which is no longer
necessary if this personal data is part of another document which is
still necessary for PET’s tasks. Only full documents and cases must be
deleted if they are no longer necessary, not partial elements of documents.

In essence, the two amendments legalise the existing data processing
practices of PET which TET had concluded were unlawful in the annual
report for 2015. The Danish government justified the amendments on
grounds that following the interpretation by TET of the existing law
would require too many resources and reduce PET’s counterterrorism
capabilities. Apparently, the IT systems used by PET do not support
partial removal of information from documents unless it is done as a
manual, time-consuming task. By the end of December 2016, the amendments
of the PET law were passed with an overwhelming majority in the Danish
Parliament, and there was almost no mention of the political debate (or
rather, lack thereof) in Danish media.

Consultation responses from several NGOs including EDRi member IT-Pol
Denmark were quite critical of the government’s proposal. However, TET
provided by far the most serious criticism in its consultation response.
First, TET pointed out that the concept of a “document” in the systems
used by PET increasingly meant whole databases or electronic files with
considerable amounts of information, rather than single documents in the
traditional sense. This would severely limit the number
of situations where personal data that was no longer necessary for PET
would actually be deleted. Secondly, TET stated somewhat cryptically
that the existing oversight activities of TET would have limited
relevance in the future since the only task left for TET will be to
assess whether full documents and cases are deleted when they are no
longer necessary for PET.

The oversight of the Danish intelligence services was further weakened
in February 2017 when the Minister of Defence proposed the same data
protection amendments for the law governing the Danish Defence
Intelligence Service (DDIS). TET is also responsible for the oversight
of DDIS, but the annual reports on data processing by DDIS for 2014 and
2015 do not contain any noticeable critical remarks. Nonetheless, the
Minister of Defence proposed to weaken the data protection provisions
and, indirectly, the oversight of DDIS. The amendments of the DDIS law
have not yet been passed by the Danish Parliament, but there was no real
opposition to the proposal during the initial public debate.

Homepage of the Danish Intelligence Oversight Board, annual reports
(only in Danish)

Law to amend the data protection provisions of the PET law (only in
Danish, 09.11.2016)

IT-Pol consultation response on law to amend the data protection
provisions of the PET law (only in Danish, 21.10.2016)

IT-Pol consultation response on law to amend the data protection
provisions of the DDIS law (only in Danish, 30.01.2017)

(Contribution by Jesper Lund, EDRi member IT-Pol, Denmark)

3. UK government attacks encryption … again

In the aftermath of the attack in London in March 2017, the UK
government has, again, indicated that it wants to force companies to
weaken encryption. The government wants to be able to access messages
sent via services that use end-to-end encryption.

The Home Secretary Amber Rudd stated on BBC One’s Andrew Marr Show that
it was “completely unacceptable” that the government is not able to see
messages sent via services such as WhatsApp. She also said that
companies should do more to prevent the sharing of extremist materials
online.

The Investigatory Powers Act, which became law in December 2016, gives
the UK government the power to compel companies to re-engineer their
products if this is reasonable and technically feasible. This can be
ordered through secret Technical Capability Notices (TCNs). Although
WhatsApp is based in the US, its parent company Facebook has offices and
assets in the UK and the government may use this to apply pressure on
the company.

It is likely that companies will push back on attempts to weaken
encryption. However, they may be persuaded to take more action against
extremist content. This has implications for freedom of expression and
there should be a legal and transparent process for government take-down
requests.

Rudd met with tech companies on 30 March. EDRi member Open Rights Group
(ORG) and other NGOs, including EDRi member Privacy International (PI),
have called for transparency of the discussions. There should be no
secret arrangements, and civil society should be included in these

WhatsApp must not be a “place for terrorists to hide” (26.03.2017)

EDRi: UK Draft Investigatory Powers Bill: Missed opportunity (18.11.2015)

Letter from NGOs re: Encryption and censorship measures (29.03.2017)

(Contribution by Pam Cowburn, EDRi member Open Rights Group, the United
Kingdom)

4. Reckless social media law threatens freedom of expression in Germany

At the end of March 2017, with Federal elections on the horizon, the
German Justice Minister Heiko Maas proposed a law on ill-defined “social
networks”.

Minister Maas has proposed the law which places a variety of obligations
on the companies, in the apparent hope that this will lead
profit-motivated companies to take over private censorship measures.
Following years of deletions of perfectly legal content by, for example,
Facebook, Minister Maas seems to believe that this will lead to outcomes
that are appropriate in a democratic society based on the rule of law.

In short, in the absence of any real evidence to support it, the law
appears to be based on a vague hope that a number of coincidences will
happen as a result of its adoption. In particular, the hope is that:
– the responses of the private (often foreign) companies will not be
counter-productive (by allowing extremists to present themselves as
victims of censorship and thereby gain support, for example);
– the hoped-for responses of the private companies will not lead to
unpredictable restrictions of freedom of expression, contrary to the
Charter of Fundamental Rights of the European Union (Article 52.1), the
European Convention on Human Rights (Article 10.2), the International
Covenant on Civil and Political Rights (Article 19), etc;
– the hoped-for responses of the private companies will achieve the
above results in a durable way, despite the ever-changing nature of
online communications;
– the companies will make appropriate choices with regard to the data
protection rights of individuals whose content they delete and whose
personal data they share with complainants;
– that, by extraordinary coincidence, it is (as required by
international law) necessary and proportionate to address 24 very
diverse offences (civil and criminal) in the proposed measures, in the
hope that the impact will be identical.

The law would require social networks to offer users “an easily
recognisable, immediately accessible and always available process for
registering complaints about illegal content”. Platforms could be
heavily fined if they fail to remove notified content from their sites
within 24 hours – or up to seven days for less clear-cut cases.
Obviously, the companies will use their terms of service to delete
content and avoid risk. Minister Maas nonetheless appears to hope that
they will not simply take the easiest way out (by deleting legal
content), but will assess each of the complaints on its merits.

The draft network enforcement law (“Netzwerkdurchsetzungsgesetz”),
initially presented as a measure to fight hate speech and “fake news”,
has now been broadened to include pornographic content and many other
offences. What is worse, the new draft law also contains a clause
requiring social networks to retain and make individuals’ personal data
available. There is, of course, no obligation on the German state to
take any action whatsoever or use that data, even in cases of dangerous
criminal activity, despite Minister Maas’ assertions that the problem(s)
being addressed are so serious that the law needs to be rushed through
before the elections, generating positive headlines for himself.

A provision on upload filters for the prevention of the re-uploading of
notified content has been removed from the initial draft, probably
because this is contrary to existing EU law (E-Commerce Directive,
Article 15). However, the proposal still contains the possibility to
establish content filters to identify and delete already existing content.

EDRi-member Digitale Gesellschaft harshly criticised the fact that the
draft law turns social networks into an unpredictable, profit-motivated
police force of the internet and privatises law enforcement. The
expansion of the criminal catalogue and the inclusion of the possibility
to request personal data of private individuals further intensifies the
damage to freedom of expression. Moreover, the organisation warned
against the introduction of “real-name policies” (or more active
enforcement of such policies) through the legislative backdoor.

In Germany, every request to disclose the identity of a user requires
two legal bases – one granting the right to information and one
authorising the handing over of the information. While the German
Telemedia Act (§ 14, TMG) regulates disclosure, the Federal Court
deduces the right to information from the principle of good faith in
case of violations of personality rights. Critics fear that if the new
law introduces a binding clause in the Telemedia Act, the path for
information inquiries is open to everybody. German citizens have
suffered for years from excessive access to personal data under the
transposition of the EU’s Intellectual Property Rights Enforcement
Directive (IPRED), which has been used to coerce individuals that are
accused of copyright infringements to either pay or face lengthy and
costly court cases.

The proposed new law means that anybody – under the pretext of violation
of personal rights and without the intervention of a judge – could
potentially make inquiries about the identity of internet users. This
would lead to a chilling effect and thus to considerable restrictions of
the freedom of expression and communication as well as increased threats
to whistleblowers. In addition, access to personal data of individuals
would make it very easy to abuse the law for other forms of hate crimes.
Users could, for example, request access to home addresses. Since no
court nor prosecutor has to check whether an infringement on personal
rights has been reported, the online service is made responsible for the
assessment whether to hand the information over or not.

Another addition to the draft law is a procedure to prohibit the
distribution of pornography. Depending on the scope of the law, group
chats, such as those on WhatsApp, might also be affected; partially
public exchanges of legal content such as pornography would then
suddenly become the focus of deletions.

In total, 24 criminal offences have been added to the latest draft,
including treasonous counterfeiting and fake news, defamation of the
state and its symbols, as well as insults to the Federal President.

Finally, the draft was criticised for its bad definition of “social
networks”, which in case of doubt could also include e-mail platforms
and other services. The latest draft mentions a user threshold of two
million, which is open to interpretation. The question arises whether
many other platforms would fall under the new regulation if unregistered
users who simply visit a site are taken as a basis for the definition.

It is also unusual that, in this legislative process, the opinions of
relevant organisations have not been sought. Finally, the process is
now being fast-tracked, ignoring the public consultation phase: the
Ministry of Justice has already hastily notified the EU Commission of
the draft without waiting for the submission deadline to pass.

German hate speech law: The scope is broadened before its adoption (only
in German, 29.03.2017)

Terrorism, Pornography, treacherous counterfeiting: German Ministry of
Justice broadens the draft network enforcement law drastically (only in
German, 28.03.2017)

Draft network enforcement law (only in German)

Parliamentary question on how the German state deals with hate speech
(only in German, 14.10.2016)
(Translation is part of a blogpost, available at

(Contribution by Kirsten Fiedler and Joe McNamee, EDRi)

5. What to do with the online platforms – the academics' point of view

The rise of the “platform economy”, with the rapid growth of online
intermediary platforms such as Airbnb, Uber, Amazon Marketplace, and
dating, gaming and other services, is bringing new challenges not only for
existing business models, but also for European legislation. In a
meeting of Internal Market and Consumer Protection (IMCO) Committee
Working Group on the Digital Single Market on 22 March, the question
raised was how to adapt the regulatory framework to the digital age. The
aim of the meeting was to engage in a discussion with the academic
community on the topic.

Professor Aneta Wiewiórowska presented the work of researchers from the
European Legal Studies Institute, who drafted a paper titled Discussion
Draft of a Directive on Online Intermediary Platforms. She emphasised
the importance of online platforms as a necessary structural part of
the economies currently occupying the market, such as the sharing
economy, collaborative economy, peer-to-peer economy, trust economy and
data economy.

Online platforms are a marketplace of exchanges between the supply and
demand side. At the EU level, aspects of these exchanges through online
platforms are included in several legal frameworks: E-Commerce
Directive, Services Directive, Consumer Rights Directive, Unfair Terms
in Consumer Contracts Directive, Unfair Commercial Practices Directive,
and Comparative Advertising Directive. However, the academics from the
European Legal Studies Institute argue that online platforms are not
comprehensively covered by European legislation. The regulatory
framework should be adjusted in order to move from the chain model (producer –
supplier – consumer) to a triangular model, which considers all
relationships between platforms, suppliers, and customers. In these
relationships, the terms and conditions of the online platforms enable
exclusion of liability of the platform for contracts between the
supplier and customer.

The aim of the European Legal Studies Institute’s paper was to open a
discussion and inspire regulatory solutions on national and European
level, which would consider the new triangular model of relations
between online platforms, suppliers and customers, the duties of the
online intermediary platforms, reputational feedback systems and
platforms’ liabilities. The paper proposes a “light touch” regulation,
which includes three main points:

– Firstly, there is a need for the clarification of the function of the
platform operator. This means answering the question on whether the
platform is a marketplace or just an intermediary – is it just a
platform or also providing services, is it taking part in performing the
contract or not.

– Secondly, a differentiation between consumers, traders, professionals,
private parties and prosumers (people who are consumers and producers of
the content) should be clarified. It has to be made clear whether the
service is delivered by non-professionals, examined if the suppliers are
in fact traders acting as consumers, and what is the platform’s
involvement and liability.

– Finally, there is a need to establish reputational systems, which
enable the evaluation, essential for trust economy. These systems must
be objective and transparent and represent real consumers’ experience,
and the possibilities for standardisation should be explored.

In response to the presented paper, the representative of the European
Commission expressed reservation towards more legislation that deals
with online platforms on top of the existing framework. Instead, she
stated, what is needed is more clarity and better enforcement.
Interpretations of the provisions in the E-Commerce Directive and Unfair
Commercial Practices Directive, together with judicial decisions, could
bring more clarity on the issues of online platforms. This would mean
that existing regulation could be sufficient, with some necessary
adaptations to the reality of the digital environment. There is a need for
interpretation of existing rules, their application and legal guidance,
the representative argued.

Faced with the European Commission’s drive-by shooting of the E-Commerce
Directive in the proposed Copyright and Audiovisual Media Services
(AVMS) Directives, as well as its implausible consumer “redress”
proposals in both instruments, the Commission appears to be keen to both
uphold and destroy the existing legal framework.

Discussion Draft of a Directive on Online Intermediary Platforms

(Contribution by Zarja Protner, EDRi intern)

6. ENDitorial: Transparency and law-making – mutually exclusive?

Transparency should be a core principle for an open democracy. According
to the European Union (EU) founding treaties, in order to have a
democratic decision-making process, the EU institutions “shall maintain
an open, transparent and regular dialogue with representative
associations and civil society”. However, by following the legislative
process on the copyright directive, one can draw the conclusion that the
European Commission (EC) understands transparency and law-making as
mutually exclusive concepts.

The current copyright system is undoubtedly broken. Instead of fixing
it, the EC decided to undermine the legitimacy of the decision-making
process as a whole. When issuing the new copyright directive, the
Commission failed to respect its own better regulation agenda and its
obligation to be transparent in relation to the legislative process of
the copyright directive proposal.

Before proposing legislative acts, the Commission must consult widely
and “take into account the regional and local dimension of the action
envisaged”. Despite the high participation in the last copyright
consultations, the EC decided to ignore the results and legislate in
favour of a minority of stakeholders instead. This has been done without
taking into consideration the big picture of the internet as a complex
ecosystem. For example, in the last public consultation on ancillary copyright,
the majority of the individual responses were not in favour of this
measure. However, the EC decided to put this measure in Article 11 of
the draft legislation, ignore citizens’ opinion and transform the
copyright reform into a “patchwork of concessions to lobbyists’
demands”. As if that was not enough, the EC also proposed the
implementation of an upload filter (aka “censorship machine”) in Article
13 and recitals 38 and 39. That one article proposes, in addition to
upload filtering, primary and secondary liability for internet hosting
companies, content recognition technologies, a meaningless “redress”
scheme for people whose uploads have been deleted, and mandatory
cooperation between internet companies and rightsholders on the deletion
of uploads.

After the period of consultations, the EC has also managed to limit
transparency, a key element in EU decision making, by unreasonably
limiting access to public documents. EDRi filed a request to access the
correspondence between Commissioners, cabinets and services on the
proposal for a copyright Directive. In its first reply, the EC responded
that it had only one email that met the criteria. That email could not be
revealed because “the disclosure of the document would seriously
undermine the institution’s decision-making process, unless there is an
overriding public interest in disclosure”. EDRi then filed confirmatory
applications to review the handling of the request. In response, the
EC sent some partially redacted documents, but refused to reveal the
most important parts of these documents because there was no overriding
public interest.

There are serious doubts about the legality and adequacy of some of the
measures proposed in the Copyright Directive proposal, especially
regarding Articles 11 and 13. Therefore, the possibility for citizens to
find out how the provisions on the new right for publishers or the
upload filter ended up in the proposed directive is a precondition for
the effective exercise of their democratic rights.

Under Regulation (EC) No 1049/2001, the right of access to documents is
not an absolute fundamental right, so exceptions may be imposed.
However, those exceptions must be interpreted and applied strictly and
must be proportionate. In this case, the EC invoked the exception in
Article 4(3), which relates to situations where the institution has not
yet taken a decision. For this provision to apply, the EC must show that
the risk of the protected interest being undermined is reasonably
foreseeable and not purely hypothetical. The only explanation given is
that “the release of the document at this stage would prejudice the
position of the Commission during the current interinstitutional
negotiations that have not yet resulted in the adoption of the
legislative proposal concerned.” It is difficult not to raise an eyebrow
and argue that there is an overriding public interest that defeats the
application of the Article 4(3) exception invoked by the EC.

The EU institutions need to do much more if they want to gain the trust
of EU citizens. The fact that the EC is restricting access to key
documents on some of the most harmful aspects of the Copyright Directive
proposal undermines the foundations of the entire proposal. If the EC
believes that installing a censorship machine for all uploads to the
internet in Europe is a valid proposal, it should let people know why.
The EC needs to demonstrate that such a proposal is well-founded on the
basic legal pillars of the EU, including the Charter of Fundamental
Rights of the European Union, and that there were solid arguments for
such a measure. So far, it has failed to do so.

Access to documents requests by EDRi on the correspondence between
Commissioners, cabinets and services on the proposal for a copyright
Directive

Access to documents request on Ancillary Copyright law (09.03.2017)

Communia’s article on the request for the EC’s legal opinion on
copyright (14.02.2017)

Copyright document pool (12.06.2016)

The copyright reform: A guide for the perplexed (02.11.2016)

Opposition against link tax gets big ally from Spain (24.03.2017)

(Contribution by Romina Lupseneanu, EDRi intern)

7. Recommended Action

Help make copyright right right now!
Sign the petition and support the campaign to ensure the freedom to
teach without breaking the law! This campaign is part of COMMUNIA’s
project Copyright Reform for Education, and it aims to raise awareness
about copyright in education and the need to reform the copyright system
among educational institutions and educators.

8. Recommended Reading

The Future of Free Speech, Trolls, Anonymity and Fake News Online

How To Talk About the Right to Privacy at the UN: A Brief Guide

Big brother getting bigger? The privacy issues surrounding Aadhaar are

Fake News and Fake Solutions: How Do We Build a Civics of Trust?

Restricted Eurojust report highlights use of intelligence in terrorism
court cases across the EU

Famed Hacker Kevin Mitnick Shows You How to Go Invisible Online

Did Reddit’s April Fool’s gag solve the issue of online hate speech?

9. Agenda

22.04.2017, Paris, France
Quadrature Communication Camp 2017

24.04.2017, Sarajevo, Bosnia and Herzegovina
Digitised security: How to read the surveillance discourse and fight it!

05.05.2017, Bielefeld, Germany
Big Brother Awards Germany 2017

08.05.2017, Berlin, Germany
re:publica 2017

20.05.2017, Belgrade, Serbia
European Journalism in the Digital Age: Rights and Risks

06.06.2017, Tallinn, Estonia
EuroDIG 2017

04.08.2017, Zeewolde, the Netherlands

13.10.2017, Brussels, Belgium
Big Brother Awards Belgium 2017

10. About

EDRi-gram is a fortnightly newsletter about digital civil rights by
European Digital Rights (EDRi), an association of civil and human rights
organisations from across Europe. EDRi takes an active interest in
developments in the EU accession countries and wants to share knowledge
and awareness through the EDRi-gram.

All contributions, suggestions for content, corrections or agenda-tips
are most welcome. Errors are corrected as soon as possible and are
visible on the EDRi website.

Except where otherwise noted, this newsletter is licensed under the
Creative Commons Attribution 3.0 License. See the full text at

Newsletter editor: Heini Jarvinen – edrigram@edri.org

Information about EDRi and its members: http://www.edri.org/

European Digital Rights needs your help in upholding digital rights in
the EU. If you wish to help us promote digital rights, please consider
making a private donation.
