Session 3 | Marianne Dahlén (Uppsala University,
Sweden), Moderator
Design and Copyright: An Open Question?
Stina Teilmann-Lock (University of Southern Denmark)
Commentator | Jessica Silbey (Northeastern University)
Openness in the law for fair followers? Design law in Denmark from 1908: fair
followers can copy, despite existence of copyright protection—a balance b/t
copying to develop and proliferate design and control/remuneration for designers. Balance is very hard to achieve; pendulum
swings between 1908 and 1960 due to categorical shifts in legal reasoning. These shifts rely on foundational assumptions about art,
purposes of ©, and the people © is supposed to benefit.
One approach: deny © to industrial design of any kind. Platonic idea of art. The other trend is to protect original
ornamentation wherever found, but make protection thin/identical copying only. Similar
trend in US/Bleistein’s anti-discrimination principle. Danish SCt, 1908, ruled that coffee pot was
uncopyrightable—it was meant to proliferate in the market despite its
decorations/artistic qualities. It had a
prototype.
Modernist art: fashion and painting, font and furniture, art
and politics were all linked. When Denmark amends copyright act in 1908 to
cover prototypes for industrial art and handicrafts, whether or not
reproduction takes place w/purely artistic purpose or industrial purpose/practical
use: a kind of leveling, bringing more artists under the copyright tent.
Similar to US: the copy is an individual reaction to something in nature;
virtually any personal reaction will be copyrightable. Also comports
w/political movements of modernism/industrial capitalism.
Kantian influence over Danish © was strong and resisted this
impulse. Talent or genius, used to reinforce hierarchy of fine art. Not clear whether paper celebrates this
exclusion, though it calls it a chance for “openness.” Clearly based on an
elitism/certain objects are only marginally protected, while jewelry and other
ornamental features were protected.
Bauhaus chairs from 1930s: experts found them worthy of
protection, but Danish courts didn’t—unique features came from style/particular
materials and thus not artistically worthy of protection. Functionalist furniture + rhetoric of
modernist movement made applied art even more problematic in © terms—the binary
of fine art/craft is based on this fiction of individuality/common practice
also being a binary. Mythology of
originality/genius that denies/suppresses evolutionary practices while
simultaneously denying to everyday creators the benefit of legal protection.
So broadening the categories of protection to add applied
art in 1961 didn’t flatten the hierarchy.
What does this mean for industrial designs? Simple/common forms could be copyrighted when
they existed in harmonious unity, selected by the designer. Where then is the openness? Disappearance of
shared community, mutual following. Savior of openness in Denmark isn’t an
antidote to romance of individuality, but rather a doctrine of thin
copyright. Low originality requirement
must have as its concomitant a narrow scope of protection. But then does that perpetuate hierarchy of
copyright genres?
Should always ask who benefits from regimes and where the
harm is. Openness allowed: benefits of
IP equality/leveling down extend beyond traditional authorship to audiences and
fair followers. Progress comes through
copying; fair followers need as much openness as possible. Category of design generally is a problem
across IP disciplines. Whether design is
protected in © is also imported into TM and patent. It is a special category in
almost every statutory IP regime and its specialness is confusing—a sui generis
category w/o justification where there is so much overlap.
Teilmann-Lock: Double status of design comes from its basis
in engineering as well. We mean different things when we say design—technical and
artistic. See it too in the Berne Convention, where it’s left to individual
nations to deal. Conceptions of design
have been very different across time/place.
Actors w/the most to say in defining design in Denmark have been
graduates of Academy of Fine Arts—furniture architects, a very loaded term in
Danish b/c its connotation is a particular generation of designers who made
Danish Modern design globally known.
Ornament is a naughty word in the design world, which is
trained that form follows function. It
means bad taste. Illusion that object can be stripped down to its
function.
Farley: in terms of history, Danish design’s heyday was post
WWII to late 1960s. Change in law of
1961, if consequential, comes at an interesting time. Is there a consequence for design? Danish courts became good at seeing the art
in functional design; not all courts did.
American/individualistic approach—we don’t ask, can I see art in that
chair? We say instead: what were the alternative designs? Were artistic choices
made? Danish ideal: reduce to its essence;
that would take the US choice approach off the table. Danish court says artistic design is “naturally”
motivated where US court would see that as a reason not to grant protection.
When the Q is alternatives, there are almost always alternatives; but if ct is
forced to assess art, it may deny protection from fear of having to make
artistic judgments.
A: Court-appointed experts play a role in the Danish
cases. Experts had managed to persuade
courts of their understanding of aesthetics.
Almost lecturing the courts for the first half of the century about
modernist aesthetics and finally courts took it in. (Similar thing arguably w/appropriation art
and the last 50 years of fair use in the US courts.)
Danish designers always have their names on products: a French press is called a Bonum (sp?) because of the name of the designer. A TM too, of course.
Challenging The Black Box: On the Accountability of
Algorithmic Law Enforcement
Maayan Perel and Niva Elkin-Koren (University of Haifa)
Commentator | Maria Lillà Montagnani (Bocconi University,
Italy)
Algorithmic © enforcement by online intermediaries. How/are
they held accountable for what they do?
Framework against which we can judge them. Tech has always assisted legal enforcement;
not aware of it most of the time.
Speed cameras. In traditional
enforcement situations, decisionmakers make a decision and the tech just helps
implement it. But online, private
entities translate the law itself into an algorithm. Functions that were formerly discrete are now
carried out by the same entity: law enforcement & adjudication. And the algorithm is an unknown, a black box. We
can’t know if the law is actually being complied with, esp. in situations of
fair use. Dangerous effects on public sphere.
Proposal: public scrutiny, not judicial scrutiny. (Could in the alternative have an ASCAP-style
antitrust control.) How to distinguish
from content management decisions made by intermediaries as part of their
business operations? Does the public
have a sufficient opportunity to challenge decisions? Can they correct
erroneous decisions?
Proxies for this: public literacy through transparency. Due process—ability to challenge decision/have
a voice. Public oversight: ability to
contest removal/restore content.
Regulated algorithmic © enforcement, as distinguished from
unregulated. The DMCA doesn’t achieve
these goals. DMCA: uploader is not
informed when link is removed.
Counternotice doesn’t preserve due process b/c the content is
immediately removed: an extrajudicial TRO based only on © owner’s allegations.
If this is true of statutory © enforcement, it’s much worse with completely
unregulated/private/voluntary regimes—e.g., filtering that prevents publication
in the first place—no transparency; no notice; no due process; no
counternotice.
Google: ex post and ex ante measures, filtering as a
business model. Enables © owners to monetize other parties’ uploading. But that
doesn’t meet the framework for judging accountability.
Barriers to oversight: Technical barriers linked to
nontransparent algorithmic mechanisms as such.
Legal barriers: bars on reverse engineering, research: anticircumvention
laws; requirement that you aver ownership in good faith (so you can’t test the
operation of the system); user-generated barriers b/c users tend not to send
the counternotices b/c the wording of a notice is so scary.
We know that these systems don’t work from an accountability
perspective. Public oversight would be
better than nothing, but why not change the approach: if © and tech don’t work
together, start from scratch, and think of something that works. Doesn’t make sense to try to make the tech
fit ©; make © that fits tech, like privacy by design. More collaborative approach b/t regulators
and intermediaries.
Elkin-Koren: purpose was to map issues surrounding
algorithmic enforcement, and offer theoretical framework for thinking about it.
Not necessarily providing ready-made solutions. Context of a larger effort on
algorithmic enforcement. We have data for 3 years about enforcement by
algorithm and enforcement in court—Israel is a small environment that allows a
population study. Software for example
operates only by notice and takedown, not in court; other types of works are in
court and not by takedown. Most
interesting finding: algorithmic enforcement is 7000-8000 notices compared to 100
lawsuits over the same 6 months. We are also looking at the notices. 50% of notices are actually related to the
right to be forgotten, though filed under DMCA.
Google’s Transparency Report doesn’t give this info—requires a lot of
analysis. History can help understand
where we are.
Perel: Empirical study that spurred this paper: we tried to
learn systematically about how online platforms in Israel enforce ©; whether
they verify rights claims; whether they correctly remove only infringing
content. Tried to upload different materials, some clearly noninfringing and
some clearly infringing (like an episode of House),
and some fair use (baby dancing in Lenz-style
video). Recorded results of sending
takedown notice. Algorithmic enforcement
is chaotic in Israel: most platforms did nothing to verify rights; some took
down noninfringing content and some didn’t take down infringing content. Couldn’t
do this experiment in the US b/c of the DMCA legal barriers (anticircumvention
and oath requirement in takedown; CFAA). But anecdotal evidence of same errors
in the US.
Elkin-Koren: skeptical both about privacy by design and © by
design: doesn’t help us avoid the challenge. Don’t give priority to people who
design the wires. The community should decide, using an appropriate
decisionmaking process. Legislators would have difficulty designing the system
too. Challenge: how to design legal
interventions that would be more appropriate for this dynamic environment. We
could set standards, but those are problematic as well—can create
distortions. We are dealing with
continuously changing tech. Ongoing
legal intervention is therefore required.
Platform behavior is constantly changing. Some of our data we’ve shown to Google; they
had no clue.
RT: I’m going to do the lawyer thing and ask for specific solutions. So frustrating; we’ve been saying these
systems don’t work for years and now, instead of any proposals for improvement,
from © owners we get the Frank Luntz-style phrase “notice and staydown” as their
new euphemism for filtering. What if
anything is the most effective way of making these concerns persuasive to
non-IP scholars? Multistakeholder in my
experience means: we are going to wear you down with procedure and time, b/c
you aren’t getting paid by the hour to represent the public interest. Experience of recent
DMCA best practices statement “multistakeholder process” hosted by
PTO/NTIA: most anodyne results possible (only anodyne results were possible). Multistakeholder process only works where
there is ground for compromise: what is that ground?
Elkin-Koren: Multistakeholder regimes are not what we’re
trying to advocate—there are indeed many problems. © is just a test case for online
disputes/algorithms. Maybe some
solutions can be achieved more broadly across regimes.
Q: FB algorithms that vote down stories that don’t help FB
financially—serve no purpose other than as a mask for unaccountable power/to
encourage brands to pay FB for access.
Similar issues.
Q: YT’s US terms of use don’t cover reverse engineering of
takedown algorithm, but Israeli terms of use specifically say you agree not to
interfere w/© security or inspection mechanism. Another type of legal barrier.
Difference b/t jurisdictions is interesting.
User-generated barriers: we have been approached by musicians who received
takedowns after trying to monetize their channels. They are very scared. If you counternotify on grounds of fair use,
the algorithm shuts down; maybe that’s another way to challenge (this is not
universally true! I know people whose fair use counternotifications have simply
been ignored).
Jaszi: at least two black boxes. One is the algorithm. Another
Q is why the counternotice provisions are so dramatically underutilized. You categorize some possibilities in the paper:
fear, risk of exposure, simple lack of information, lack of solidarity (people
feel like exposed isolates rather than part of a group), indifference (it was
one of many videos; it’s already up on other platforms). There is a practical
point of intervention: we need information about why people don’t counternotify;
we have wonderful hypotheses but need real empirical work. If enough made use
of it would throw system into disarray.
Perel: it’s impossible to send 1 million counternotices a
day.
Jaszi: but why are we sure that resistance would require
numerical proportion? 1000 might be
enough.
Elkin-Koren: there is no one you can deal with if you have a
FB problem in Israel.
Q: responses to EU consultation on ©: a lot of responses
from intermediaries on increasing responsibility for blocking infringing
content. Adopted discourse of economic transaction costs—burdensome introduction
of new requirements. Intermediaries are
reticent to handle interactions w/public. Won’t intermediaries reject
transparency on grounds of transaction/operating costs?
Q: © as test case for other areas of law—surveillance machinery
is the same.
Elkin-Koren: transparency is insufficient; comes with costs.
I wouldn’t require more reports/more information: intermediaries should allow
their systems to be more transparent, open to inspection/monitoring by
outsiders as we are being monitored by them. Not necessarily the same type of
cost. Maybe the reason © disputes turned
into political disputes is the fact that © enforcement infrastructure is the
infrastructure for other types of surveillance and control. When you want to remove a documentary on rape
in India, you convince BBC to file a takedown.
That was the only way to remove it.
It’s not b/c © fits but b/c © is perceived as neutral.