Tutorial “The EU Digital Services Act – Overview and Central Features”
General DSA Architecture and Approach
Martin Senftleben, University of Amsterdam
Formally, the safe harbor system is still in place for mere
conduit, caching and hosting services for third-party information they transmit
and store. For ©, voluntary own-initiative investigations done in good faith shouldn’t lead to more liability. More intermediary services are covered under the DSA. Points of contact for authorities and legal representatives are existing obligations; new are disclosure requirements for terms of service/content moderation policies and tools, including algorithmic decision-making and human review, as well as rules of procedure for internal complaint handling. How much information must be disclosed? That must be balanced against trade secrets. Another new obligation: diligent, objective and proportionate application of terms to safeguard fundamental rights. The obligation is on the service provider.
Transparency: reports on content moderation, including judicial orders, notices, the number and type of content moderation actions, the use of automated means and related training, and the number of complaints.
Special rules apply to hosting providers, including online platforms; to online platforms, including those allowing distance contracts; and to very large online platforms and very large online search engines.
What about the copyright interface w/ the DSA? Art. 2(4):
this is all w/o prejudice to rules laid down by © law. So on top of all those distinctions, there are the OCSSP obligations from Art. 17: online content-sharing service providers. Where the © regime is silent, use the DSA. And that’s where the safe harbors have been removed: under the Art. 17 rules of required licensing, there’s direct liability if you’re unlicensed, but you can decrease the risks by using filtering to police the borders of your licensing deals.
OCSSPs are a controversial category in themselves. YouTube is an OCSSP, but beyond that much is unclear—are all the different types of social media OCSSPs? We don’t have definitive caselaw.
DSA requires VLOPs and VLOSEs to have reasonable, proportionate and effective mitigation measures tailored to specific systemic risks, w/particular consideration to the impacts on fundamental rights. Art. 17
is more specific and thus a further addition to DSA requirements.
“Sufficiently substantiated notice from the rightsholders,” and rightsholders should “duly justify the reasons” for their requests—an opening for more specific requirements like Art. 16 of the DSA, which says that notices have to be sufficiently precise/adequately substantiated; so too statements of reasons must be “clear and specific.”
DSA Art. 23: suspend, for a reasonable period of time and after warning, users who have violated the rules; so too the processing of notices and complaints from complainants that frequently submit notices or complaints that are manifestly unfounded—not present in the © framework, so additive to it. Also the Art. 22 requirement that notices submitted by trusted flaggers w/in their designated area of expertise be processed w/o undue delay. Must comply w/quality standards—status awarded by the Digital Services Coordinator of a Member State if the flagger has particular expertise in detecting/notifying illegal content; is
independent of any provider of online platforms (but does not need to be
independent of © rightsholders); and carries out its activities diligently,
accurately and objectively.
DSA also requires VLOPs and VLOSEs to get annual independent audit opinions, with reporting to the Commission and the potential for fines, so they can’t just wait for court cases to identify whether they’re doing it right.
Fred von Lohmann: For enforcement: other than the
intervention of the Commission through the audit process—how will enforcement
be done? Private rights of action?
A: DSA enforcement is very much by the Commission itself in
his understanding. Drafter of DSA, Irene Roche-Laguna: it depends on the kind
of platform—hosting service, platform, or VLOP. EC has exclusive powers for
enforcement of obligations that apply only to VLOPs/VLOSEs—auditing/risk
assessment. But content moderation requirements for platforms in general are
for country of establishment (if in Europe) and where the legal representative
is (if not). If a platform neglects notices it becomes a systemic issue; the
Commission can jump in. [I wasn’t sure whether she meant that was true even for
non-VLOPs.]
Q: can a recipient of the service receive money damages for
violation of these obligations? Is that governed by Member State law?
A: In DSA Art. 54, there is an avenue for private actors to bring
damage claims to complement audit-based or other executive activities.
Pam Samuelson: will audits be publicly available?
Roche-Laguna: Audit reports will be published w/transparency
reports every 6 months, and audit implementation reports have to be published.
They will have a bit of time to respond.
A: given the transparency requirements, the audit report
should be able to provide more insights, but we don’t know for sure yet.
Q: what makes an online platform?
A: the central aspect is dissemination of information to the public. [Husovec’s piece
says Blogger and Wordpress aren’t platforms, which seems bizarre to me.] What
is a public? It may not have to be a large public. In © we accept that
dissemination to a circle like 12 or 50 people can already be a public, so
would that also apply under the DSA? The Court of Justice could set a
relatively low threshold.
Roche-Laguna: there is a gray area. A hosting service that disseminates to the public as an ancillary feature, like a comments section on a website, wouldn’t be a platform. But we have questions about messenger
services with open chats/groups—100,000 people in chat. Or music service that
has licensed content and user-uploaded content: what is the platform part? [That
doesn’t seem to answer questions like, is email dissemination to the public?]
Q: do platform rules have to be observed for the ancillary
service? It’s a lawyer’s paradise. [Roche-Laguna is laughing but I’m not
finding it funny that they actually can’t seem to answer those questions.]
Rules for Hosting Providers, Online Platforms and Very Large
Online Platforms
Martin Husovec, London School of Economics
DSA is about content moderation, construed very broadly. Not
just removal of content as w/© infringement but decisions about violations that
aren’t based on legality but are purely contractual—FB and breastfeeding images;
suspending individual users. Also about institutions lower in the stack like app
stores. Hiding/demonetizing content is also covered. DSA covers almost
everyone, though advertising providers and payment providers are touched only
tangentially. Infrastructure providers like access providers, transit services,
DNS/VPN services, domain registrars and registries, generally have very few obligations
other than transparency. But distribution and content layer have more
obligations.
Part of the second generation of regulation, replacing the first generation (like Section 230) that was designed to create breathing space for speech and industries; the focus is on regulation of risks posed by services.
For content moderation: fairness in design ex ante, due
process in implementation, transparency, and risk management. Due process
constrains the moderation of both illegal content and contractual breaches.
Risk management for very large players: required to think about product design
and operations; w/some risk management obligations for smaller platforms.
Online platforms are covered if they have 50+ employees or EU turnover of
10 million euros. VLOPs/VLOSEs: Alphabet, Microsoft, Meta, Bytedance, Snap,
Pinterest, Twitter, Amazon, Wikipedia as the only nonprofit.
Decide what rules are; open to notification from third
parties about content that might be illegal or violate terms & conditions;
make decisions (hosting services and any-size companies); allow appeals/internal
contestation (midsize and above); allow external contestation by users (VLOP/VLOSE);
transparency. Any content restriction has to be codified in terms and
conditions, and decisions must be made on the basis of the rules. But the rules
must be diligent, objective and proportionate, including in their design
according to the recitals. What does it mean for design of rules? His take:
only extreme outlier policies would have an issue.
Core of DSA: tries to discipline providers in how they make
decisions. They must issue a statement of reasons, including for visibility,
monetization, etc. Specific explanation of reasons required, useful enough to
allow user to argue against them. They can be automated, but DSA says you need
to provide a free opportunity to appeal where a human has to be present in some
step of the process. Internal appeal must be easy to access and user-friendly;
not solely automated; and must occur in a timely, diligent, and objective manner.
Who can complain or appeal? External players: trusted
flagger, regular notifier, content creator, NGOs.
Possibility of external dispute resolution/ADR: Regulators
certify entities independent of platforms and users; FB’s Oversight Board is
not independent. Content creators and notifiers and reps can use the option.
ADR provider is complainants’ choice; no need to exhaust appeals. ADR decisions
are nonbinding; platforms must engage in good faith. The platform compensates complainants who win (fees and possibly costs). Complainants that lose pay
their own fees and costs. Trying to improve quality of decisionmaking within
company.
The proceduralist approach constrains implementation more
than rule formulation. Consider: for 5 EUR a month, you can say whatever you
want on the service as long as it’s legal in your country—disinformation,
nudity, etc. all ok. Everyone else is moderated on ToS violations and illegality.
Is this a violation of Art. 14(1)? Not if disclosed. Art. 14(4): probably no—paying
5 EUR to be unmoderated doesn’t seem disproportionate. What if you have a list
of VIPs whose content is not moderated at all b/c they are leaders of
countries? Again, 14(1) is ok if properly described. But 14(4) would be a
problem in his view due to the impact of illegal content.
Open issues: the big guys can easily automate statements of reasons; what about small providers? Licensed content moderation solutions from third parties? What about users of services, like a newspaper’s FB page—are they considered a separate entity for purposes of the people whose content they moderate? How will there be a standard transparency database if different providers send their reports in nonstandard formats and with anonymization?
Risk management for VLOPs/VLOSEs will likely not be subject
to private enforcement—mostly extended or intensified reporting obligations;
researcher data access; unique obligations for profiling-free choice on
recommender systems (whether organic or advertising), or advertising archives. Subject
to regulatory dialogue w/whole community, including national regulator, NGOs,
researchers, where main thing is to figure out what companies are doing w/r/t
certain types of risks on their platforms. Dialogue b/c of opacity of system
and info asymmetry: regulators can’t instruct on what to do w/out knowing what’s
going on. Must assess systemic risks stemming from design or functioning of
services including algorithms and use made of their services, as well as back
end governance, taking into account the risks’ severity and probability. Risks
include: illegal content; fundamental rights; public security and elections;
health and well-being (including gender-based violence, public health, minors,
physical well-being, and mental well-being). Nothing of concern to civil
society is left out. If a social network adopts ChatGPT into its design, that
becomes regulated along with anything else the platform uses.
Metaphor: authorities can partly restrict how and when
protest activities take place (streets, hours, use of amplification tools) and
take measures to prevent harm to protestors or others (e.g. by boosting police
presence) but can’t select speakers or dictate content.
Q: why is business to business included here? Why can’t
Amazon say “one strike and you’re out” to a business?
A: b/c the design was about risks of hosting services, not
just harms to individuals or individuals as consumers. Advertising marketplaces
are available to the public.
Q: what about risk mitigation for humans conducting content
moderation/labor harms?
A: sure.
Interplay with OCSSP Rules in the Directive on Copyright in
the Digital Single Market
João Quintais, University of Amsterdam
Background of lots of attempts to interpret previous rules under the
InfoSoc Directive. Primary liability is harmonized. But secondary liability was
mostly unharmonized; ended up with mostly a notice and takedown regime. Court of
Justice expanded right of communication to the public to the point where there
was a Q of whether YT’s own services triggered direct liability. Poster child was
YouTube/claims of “value gap” (sigh).
Six Member States haven’t met the implementation deadline; still
waiting to figure out what the rules actually should be. Poland decision: CJEU said
Art. 17 was ok but had to give due regard to uploaders’ interests. OCSSPs cover
UGC platforms with large amounts of works that organize and promote them and
have a commercial/competitive effect. Exclusions: encyclopedias, ecommerce,
B2B/cloud hosting: Wikipedia and GitHub/ArXiv.org, Skype, Dropbox, and eBay are
excluded—partly down to who had good lobbyists. But you are still covered by the InfoSoc Directive plus the DSA. Startup provisions: services under 3 years old with turnover under 10 million euros face only notice & takedown, but if above 5 million monthly visitors, also notice and staydown. Not much there. Could be excluded from DSA but covered by Art.
17.
A bipolar copyright system/an employment program for EU
lawyers. You might have to look service by service to figure out whether you
are an OCSSP. Etsy? Wordpress?
Non-OCSSP: default no direct liability/hosting safe harbor
and moderated duties of care, based on YT case and national law. OCSSP: default
is direct liability w/exemptions tied to best efforts licensing and filtering.
Most online platforms for DSA will be OCSSPs so you have to
figure out what applies—DSA may cover things only partially, and it’s not very
clear/depends on your normative preferences. Bonanza for lawyers! If you’re a
non-OCSSP, DSA will probably apply to almost everything.
Von Lohmann: Secondary liability seems to have disappeared.
It’s not just OCSSP/non-OCSSP—there’s no harmonization for secondary liability,
and safe harbors are interpreted as not precluding injunctive relief in most
jurisdictions—so don’t you also have an entire category of secondary liability
injunctions that are outside of both of these regimes, or are they displaced by DSA/Art.
17?
A: you are right, there might be that category. The YouTube case was designed for YouTube, but is now covered by Art. 17: we need to apply that to other platforms, and how to do so is unclear. German courts are deciding how to modulate duties of care and will continue to do so unless the ECJ tells them to
stop. In all cases, you assess liability also on the basis of compliance
w/duties of care, but DSA duties are different: obligations regarding
user-uploaded content are about your role as platform.
Comment by Senftleben: may not be as bad as that—the CJEU
has extended primary liability so far that there’s not much room left for
platform secondary liability. The crucial question, and an open one, is what
can be expected from a reasonably diligent operator in this situation—will this
be influenced by the new DSA duties?
A: court will likely look at whether this is a good faith
player—will reason backwards to find no liability if so. The “good guy” theory
of EU copyright law: the court finds a way around liability for a good faith player, and a way to direct liability if it’s not one. You can’t really get
from the text to the outcomes; the court just makes up a bunch of conditions.
Comment: primary and secondary liability are no longer
distinct categories. Now all under the umbrella of primary, but if you can’t
get authorization you have to filter, so the duty of care is now embedded in
primary liability.
Beyond Copyright Infringement: DSA Review, Moderation and
Liability Rules Compared to Previous National Review and Takedown Approaches
for Illegal Content
Matthias Leistner, LMU Munich Faculty of Law
Need to know whether national rules are preempted, and they
also provide evidence of experience that might be useful for implementing DSA.
There were differences in applying and enforcing rules; national authorities
might have leeway in applying DSA rules, which might lead to forum shopping
about where to have a company seat.
German NetzDG regulates allegedly criminal content on social
media (obvious v. non-obvious are treated differently). Austria has a similar
law; France’s law was partly invalidated b/c of extremely short blocking
periods for certain content, but it also has a law against fake news in the 3
months before a national election; Italy has a law against “cybermobbing” and
dealing with parents/minors; Baltic states have their own.
NetzDG has limited scope: social networks, video sharing
platforms v. DSA’s comprehensive/four-tiered approach.
NetzDG: catalogue of substantial/hard-core criminal
offenses, continuously extended v. DSA’s all illegal content including minor violations
(including consumer protection law), plus special notification duties for criminal
acts etc.
NetzDG: persons affected in their rights can notify; DSA:
everyone can notify.
NetzDG: expeditious blocking: 7 days or 24 hours in obvious
cases; DSA: expeditious/without undue delay.
NetzDG: redress for both sides; another human in the loop;
DSA: role of complainant surprisingly unclear, no standing for the other party
in the complaint.
NetzDG: No further prioritization b/c of limited scope; DSA:
system of trusted flaggers given priority; dynamic adaptation through
provisions on misuse and suspension.
NetzDG: limited and crisp; DSA: wide and wobbly.
Provisions on transparency of user contracts are similar to
German case law; reasonable and cost-efficient b/c contracts exist anyway. But
may not be easy to implement b/c some transparency obligations relate to what
the algorithms actually do. [I think this may undersell the
difficulty of figuring out what is, for example, abusive or hateful speech,
especially for non-dominant communities.] Protection of trade secrets is also a
huge issue.
Most probably, NetzDG will be repealed in 2023, but does the
DSA entirely preempt Member State laws? Depends on the harmonized subject
matter—access for research purposes? Research access for other public policy
purposes?
Was there overblocking b/c of NetzDG? No empirical evidence
under the procedural mechanism, but a backlash to unjustified blocking under
different user policies. Was preventive blocking under user policies/community
standards perhaps indirectly due to NetzDG? Avoiding cumbersome statutory
mechanism by overcompliance? Hard to assess w/o access for research.
Indirect regulatory effects have probably worked quite well
w/o overblocking; proposed self-regulation boards for notice and takedown don’t
seem to have worked. Platforms didn’t want to cooperate and platforms have
different policies w/regard to user communities.
CJEU has responsibility for hard questions, along with
national coordination; resources and coordination are issues and GDPR
experience is disheartening though antitrust experience is better.
Q: regulations require clear and understandable rules in
terms that are neither clear nor understandable: how are platforms going to
establish such rules for billions of pieces of content?
A: it’s not new that the legislator can do things that a
private entity can’t. German FB case established certain basic standards for
typical user policies that would be easy to comply with. There must be certain
reasonable rules on what is allowed and what isn’t. The DSA requires examples in the rules of what is prohibited and what is not. It’s pretty much about procedures. [But
people will always be able to argue that their slur is closer to the not-prohibited
side than the prohibited side, if their slur isn’t in the list of examples.
This reassurance might work for legal/illegal but it is not helpful at all for
lawful but awful.] It’s about the bad guys with self-contradictory community policies who block unreasonably.
Midsize platforms can game the system by choosing a friendly
jurisdiction like Ireland. Carveouts don’t protect enough small providers and will create problems for Baltic startups.
Q: it’s nice to say that Germany doesn’t require too much
detail but this is a new regime, and France is going to be different.
A: agree to a certain extent; also important that some
standards are new—certainty will take 10 years and that’s not great. But the
transparency obligation doesn’t come out of a vacuum—it comes from consumer protection law, transposing
concepts of standard terms to the platforms [where the subject matter of the
transaction, the user’s speech, is pretty different!]. While we were discussing
DSA, it made sense to object and discuss our concerns, but now we have it, so
it’s time to think about ways to make it workable to the extent possible. The only
way forward is to implement it.
Daphne Keller: share concerns about private litigation in
different member states, lack of harmonization. How many states might be
interested in allowing this?
A: We had this problem before, w/27 different standards—how many
of them really matter? Here we have to distinguish b/t public and private
enforcement. For public enforcement, the Commission will matter/establish standards that might trickle down from VLOPs to smaller platforms. Germany, France, the Netherlands, and the Scandinavian countries will matter.
Keller: DSA is silent on the question of what injunctions can
say.
A: private enforcement: new provision on damages and
injunctions might be possible; would have to look at different states’ laws.
The largest dangers are German and French national unfair competition laws, where competitors might be able to sue—a real problem w/GDPR that we are still grappling with. Practically limited to Germany, France, maybe Austria and some others. So far, Austria has legislated specifically that compliance obligations don’t qualify for injunctions, period; that caused quite a number of problems in the market.
Free Speech Challenges and Potential Risk Reduction in the
DSA
Eleonora Rosati, Stockholm University
Balancing freedom to conduct a business and the need to
protect recipients of services through transparency. Enhanced obligations for
VLOPs and special provisions for micro/small enterprises. Art. 17 tries to
protect users with certain exceptions.
Copyright and free speech: EU Charter of Fundamental Rights
is a primary source of EU law, and recognizes IP within the right to property; also
recognizes free speech. Also recognizes freedom of artistic expression, freedom
to conduct a business, respect for private and family life, and protection of
personal data—not just integration of markets but integration of morals.
Balancing framework, though how the balancing is to be done is the key. Level
of protection for © needs to be high—what does that mean? High level of
protection for whom and of what? InfoSoc Directive refers to authors,
performers, producers, and the culture industry alike, as well as consumers and the public at large. There are other kinds of rightsholders. The ECJ has been clear
that high level of protection does not mean highest level of protection; the
goal is a fair balance of different interests. Increasingly characterized by a
fundamental rights discourse.
Many concerns about free speech from Art. 17, resulting in
action by Poland. The resulting judgment: the ECJ didn’t say Poland was wrong that Art. 17 restricted free speech—free speech is not absolute and can be restricted under certain conditions. The ECJ said there were enough safeguards: filtering tools must
be capable of adequately distinguishing b/t unlawful and lawful content; users
must have rights; rightsholders must provide relevant and necessary
information; no general monitoring obligation; procedural safeguards must
exist; and there should be stakeholder dialogue and fair balance.
Inconsistent approaches at the national transposition level,
so Art. 17 won’t be the stopping point. Also, lex specialis/lex generalis
relationship is likely not to be frictionless. CJEU will have to smooth things
out.
Comment: if you overblock, all you face is user complaints.
If you underblock, you get sued and it’s costly. That’s where the balance is.
A: mostly the remedy for overblocking is to put the content back up, though national law might have some other remedies.
Comment: German law has a sort of must-carry obligation if a
user asserts the applicability of an exception. But it’s the minority approach.
A: Italy does the opposite—if there’s a complaint, content
must be disabled during the dispute. From the platform’s point of view, you may have
to divide treatment of Germany and Italy/geoblock.
Q: is Italy compliant with the Poland CJEU decision?
A: neither Italy nor Germany is—you can’t have a blanket
approach. [I don’t understand what that means—I would appreciate hearing what a
provider is supposed to do.]
Exceptions and limitations might mean something different in
terms of whether they establish user rights—or maybe not! The new data mining
provisions are characterized differently from each other, but not clear whether
that makes a difference to whether, for example, one is entitled to remuneration
and one isn’t.
Comment: if you take auditing seriously, maybe there can be
changes/real examination of the practices.
Von Lohmann: but who selects and pays the auditors?
Comment: they’re supposed to be independent!
VL: that’s not the question.
Q: will this get rid of small platforms b/c all rights have
to be equally honored?
A: it depends. ECJ is adamant that fair balance depends on
circumstances of the case. Some rights might take precedence on a particular
platform; not all content is created alike. There might be situations in which
certain concerns override other types of rights. France would maybe answer your
question in the affirmative, but not all would.
Comment: situating different categorization options under the DSA, we might see cases where the more you diversify the types of content you host, the more obligations you have.
Q: for US lawyers, we look at the Texas and Florida laws
that are clearly politically motivated attempts to suppress content moderation
in favor of a political agenda. We have a hard time understanding how that’s going to be
handled in Europe, but the language of nondiscrimination suggests that it can
and will happen in Europe too, which also has culture wars [and backsliding democracies].
A: indeed, the concerns around © are also in line with those
concerns. The discourse is broader than ©—see the Afghanistan Papers case, where the German gov’t attempted to suppress publication of military reports about operations of the German army in Afghanistan, using © as a tool. The Advocate General
and ECJ emphasized that © can’t be used to suppress free speech and one should be
careful to limit © to its purposes.