Friday, April 03, 2026

Commemorating 50 Years of the Copyright Act, part 2

The 1976 Copyright Act: Mostly Evolutionary, Not Revolutionary Tyler Ochoa

1790 Act adopted the Statute of Anne—not radical even though it was the first for the US. Similar here—major changes, but not radical. Expanded subject matter; protection from creation/fixation instead of from publication, plus preemption of state law; automatic protection without preconditions (by amendment); reduced formalities; codified idea/expression.

But 1909 Act covered “all the writings of an author.” “Original works of authorship” aren’t that different. Likewise, lawyers didn’t necessarily take common-law copyright as seriously as federal copyright, but there was common-law copyright before the Act preempted it for fixed works. Moving the federal protection back to the date of fixation isn’t a huge change.

Formalities: some change.

Codifying idea/expression: It’s a good thing that we wrote it down because of the textualist turn, but it was an existing principle. Not a radical break.

Fair use: same thing.

Exclusive rights: again, evolutionary.

WFH: a reasonably big change in defining what can be WFH, rejecting instance & expense test (which he thinks was a mistake by the courts in the first place).

Duration/renewal: Motion pictures and musical works were heavily renewed, and most other things weren’t. Life plus 50 isn’t just an extension, it’s a fundamental change in measurement. And it’s a change that we haven’t felt the full impact of yet b/c we’re only 50 years out. None of the © granted in 1978+ have yet entered the PD (w/exception of some unpublished works by deceased authors).

Terminations: same idea as renewal term, but new opportunity b/c of unitary term.

Compared to what happened since the 76 Act: rise of computer programs, though the definition of literary works was written to allow them. BCIA: elimination of formalities; elimination of registration requirement for foreign works (subject to limited remedies). Architectural works; VARA; automatic renewal for works 1964-1977—works no longer go into public domain for failure to renew. NAFTA; TRIPS; Uruguay Round restoration—a radical change. Public performance rights for sound recordings. Term extension; DMCA adding 512, 1201, and 1202. Cumulatively this is a radical break with the past, but mostly after 1976. The 1976 act did change from incentives to publish towards natural right of author, but that was a change in principle with fewer radical changes than came later.

Moral Rights, 50 Years Later Xiyin Tang

Poised for a resurgence amid concerns for human creators in a world of AI slop. Entrepreneurs are using TM, even trade secret, to protect rights far beyond VARA. Monty Python case is a classic example using TM—but the cause of action was a stand-in for moral rights, by the court’s own admission.

VARA has had limited success—limiting to fine arts likely fell well short of Berne requirements. Also US rights are waivable, though nonassignable. Only 84 adjudications of VARA claims, 71.4% of which dismiss the claims, often on procedural grounds. More concerning are cases that effectively chip away at the scope altogether, further neutering it. First Circuit held that VARA didn’t protect site-specific works of art at all—where the location is integral to meaning—based on the public presentation exception. But this was the type of art that formed the basis of the public justification for VARA in the first place—Richard Serra’s Tilted Arc. SDNY denying injunctive relief in Five Pointz; money is not a good remedy. And the 7th Circuit found that site-specific art wasn’t just ineligible for VARA, but ineligible for © generally. But site-specific art is the exemplar of postmodern artistic practice.

Nonetheless, predicts that moral rights are poised for a resurgence. AI has led to a new insistence on human creation. Private deals w/AI: authors don’t just get a per-generation payment for summarizing their work, but a link to purchase—attribution. Decision holding that Wu-Tang Clan’s album was a trade secret—independent economic value comes from the ability to exploit exclusivity to create an experience that competitors can’t. Secret art could therefore be trade secret art.

Terry Fisher: what about fan fiction and other interventions?

A: Artists care about originals. Maybe expansion of moral-esque rights could impinge on what people do w/a copy of the original, but the Wu-Tang example is about works that exist in an edition of one, and that’s what she’s concerned about.

Keynote Address: Authorship in the Shadow of the 1976 Act Paul Goldstein

AI challenges: believes that transaction costs are the key and that private or public deals can overcome them. But copyright dilution—competition with AI-generated works—should not be actionable, because the ideal is that works should be priced as low as they can be without destroying incentives.

Points out that Jane Ginsburg showed that French copyright was initially utilitarian/incentive concerned; moral interests became important later. US: the author-protective provisions that Congress introduced in 1976 are important—a shift in the philosophical base of ©, according to Barbara Ringer, to make the primary beneficiaries of © individual authors.

What do authors want? To be recognized as the authors of their works. Consider Creative Commons, which found that attribution was so commonly desired that attribution became the default. Audiences also want this, which is why they go to concerts instead of watching from the comfort of home. They also want community—shared passion—but the main desire is authenticity, the knowledge that the tiny well-lit figure on the stage is their favorite performer, not a hologram. Authenticity is the consumer-side counterpart of the desire for attribution. Author & audiences meet and form a bond. But aura can be attached to multiple copies of identical objects—ubiquitous reproduction hasn’t led to the withering of aura, but strengthened our desire for it and created new strategies to produce aura.

But TM may overprotect attribution at the expense of popular culture—missing limited term and public interest exceptions. It’s not enough to exclude protection for generic elements; its exceptions for parody and the like aren’t vigorous enough. Presumably this was a concern in Dastar.

So now we need attribution in ©. 106A (VARA) should be replaced w/an exclusive right for all authors to claim authorship and object to distortion/mutilation/modification of the work, taken from Berne. Generative AI: a user who asks for a story in the style of Raymond Chandler justifies an attribution right. [Hunh? Doesn’t the user, by definition, know?] Style is not copyrightable, but we could draw the line more generously if there were only an attribution right, not a control right; the limited term would avoid the parade of horrors and parody would be allowed.

Lemley: there are lots of circumstances in which creativity requires that you not keep integrity. Tang’s answer was focused on individual copies/single copy works. But your moral right is broader. What do you do about fan fiction?

A: look to what other countries do. They valorize commentary and individual creation as much as the US does. French law has a robust exception for parody and pastiche.

RT: I don’t think that would work. Litigation culture is a big deal.

A: the motion picture studios are why we don’t have a moral right in this country. France, Germany, Canada etc. have moral rights and moviemaking—but the answer was the litigation culture in the US. Would love to see some empirical work on that. On the attribution right, would love to see a full-fledged empirical study of asking © litigants why they sued; integrity touches on vital interests. Turns in part on what the remedies are. One could fiddle with remedies like injunctive relief rather than monetary relief, though then you lose the contingent fee bar.

Barton Beebe: Scalia dismissed tracing the source of the Nile and all its tributaries—what if we recognized moral rights of the author and all of those who preceded her and put a burden on her to state her sources? Religious traditions might support this.

Sprigman: other attribution systems exist, but Earth is for the living: part of art’s responsibility is to free itself from the past despite its influence. You are taking a side on what art should do; our perspective is from writers, but may not serve readers very well.

Panel 1: The Copyright Act and Technological Change

R. Anthony Reese: legislative response to digital technologies was the key in the midpoint period; the last couple of decades saw less of that other than the Music Modernization Act. Focusing on civil, not criminal, amendments. In 1980 we added a definition of computer program; not a big change. We’ve only added one exclusive tech-based right—digital audio public performance for sound recordings. Not a response to tech but to political failure of the 1976 Act.

Some expansions to limitations—110 for distance education (not very helpful in the pandemic); the 110(11) Family Movie Act allowance for skipping naughty scenes hasn’t created a flourishing business model. But 114’s limits on nonsubscription broadcast transmissions enabled HD over-the-air radio. Also bars on record rental, etc.

We’ve done a bunch of compulsory licensing, mostly tinkering to provisions already there like adding low-power TV to the cable ones already there. DAT licensing; some compulsory license for the digital sound recording public performance right; satellite retransmission licenses; modernization of mechanical licenses to include a blanket license for streaming & download. Copyright Office was not a fan of compulsory licenses but Congress is enamored of them in specific circumstances.

Remedies: 512 limits have been incredibly important. 408 preregistration, maybe has some effect (about 700 works/year); 504(c) allows willfulness to be presumed if you provide false domain name contact info in connection with © infringement.

Sui generis provisions: AHRA (desuetude/written so narrowly as to exclude general purpose computers just as they were about to become the way music was enjoyed), 1201/1202.

Amendments motivated as much by political/legal/market/social developments as by the new technologies themselves. Record Rental Act passed not b/c of CDs but because of shops that rented out tapes and encouraged copying; the worry was that it would get worse w/CDs but the business model already existed. Similarly, low-power TV was added not b/c it was new but b/c the FCC hadn’t previously authorized its deployment.

Jessica Litman: substantive approach is shaped by drafting method—to invite many of the stakeholders who know they’re interested to work out their differences and embody their compromises in statutory language. Overreliance on compulsory license is b/c it’s easier to reach a compromise. Even in the 1909 Act, courts had devised separate rules for different kinds of works; they wanted to incorporate exclusive rights shaped to works, but then that turned out to work really badly when new tech like movies came around. So 1976 Act tried for one size fits all rights, but then everyone needed bespoke exceptions to continue doing the legitimate things they’d done every day. They tried to make exceptions as narrow as possible so they couldn’t be used for anything else (e.g., jukebox exceptions). Insiders make the rules that they and © outsiders will have to follow—unfriendly to outsiders.

How does this method work for insiders? Their efforts make a lot of assumptions that may not pan out. DMCA is a good example. Implicitly incorporate promises about how insiders will treat one another. But none of the promises are legally or morally enforceable, so many get broken. Promises of publishers & distributors to creators turned out to be particularly vulnerable to breakage. Also: If you exclude outsiders from negotiations for as long as you can, you’ll miss important issues on the horizon that aren’t central to anyone sitting around the table.

Result: insiders came out believing that they’d significantly fortified themselves against scary new technologies, and as those have failed, we’ve seen © insiders adopting more combative negotiating postures & developing deep resentment about the interests they believe are eating more of the pie than they should be permitted to eat. Legacy © owners are earning more money than they ever have. But they are nonetheless looking at © with grievance and resentment b/c technical services delivering material are also earning a ton of money—instead of saying “big pie, wow, awesome!” they want that whole pie. That’s led some to believe they’re entitled to hoard rights, money, control, and market share by any means they can manage.

When you call © insiders together to write a statute to fix things, you get the MMA. Has some good things and some bad things buried in the middle—but most of this hoarding is currently coming at the expense of creators. They’re earning less, unlike the legacy industries. When we tell them to compromise on AI, this is also what we may expect.

Pam Samuelson: A happy story on the scope of software copyrights. Wasn’t initially clear whether machine executable code was copyrightable. Initial attacks on copyrightability had to be overcome, but scope was unclear—Paul Goldstein wrote early article expressing concerns about risks of monopolizing functionality. Suggested borrowing patent misuse doctrine.

Whelan v. Jaslow then gave super-broad © protection—early expectations about thin © to avoid protecting functionality were totally ignored. All of the “structure, sequence, and organization” was protectable if there was a modicum of creativity; everything should be protectable unless there was only one way to achieve it.

6 years of effort followed (Samuelson in the lead!) to get the results right. David Nimmer also wrote an article suggesting that more filtration was required. 2d Circuit adopted this in Altai requiring abstraction, filtration, and comparison. Filtering out unprotectable elements was a really significant advance and got us back to a relatively thin scope. Compatibility is unprotectable.

Fed. Cir. agreed w/the district court that we should have a © hearing, like a Markman hearing, to do claim construction—another important development. So © turns out to accommodate software well, after struggle.

Mark Lemley: Imagine a collection of model weights that gives a possibility, but not a certainty, of generating an infringing copy: is that a copy? The statute’s answer is incoherent. The right to reproduce the work in copies; copies are copies when they’re fixed; fixed means the copyright owner authorized it. That doesn’t make any sense so nobody pays any attention. We use the same definitions for protectability and infringement, causing the problem. But we do use the part that says fixation means it has to be perceived, reproduced, or otherwise communicated.

The parties in AI cases take some extreme positions—models don’t include content; output is just a collage of inputs—neither of these things is true. Can extract Harry Potter verbatim w/a four-word prompt from Llama 3.1. Extraction is possible for some works/parts and some models but not others. New work shows you can expand extraction if you go beyond verbatim extraction. But again it depends.

Jane Ginsburg shows that you can retrain models to force them to regurgitate a work, and that makes it more likely that they’ll regurgitate other works, suggesting other works are latent.

Is a work latent in a model a tangible copy? It’s complicated. Models don’t store works directly, but encode weights reflecting relationships b/t words or syllables. You can make a copy of a picture using ones and zeros even though the ones and zeros aren’t the picture—deterministically, these are copies. But Microsoft Word doesn’t encode War and Peace even though all the necessary parts are in Word waiting to be put in the right order.

Information theory: can the work be extracted with less than the same information you already have? Compression algorithms: sometimes we can store less than all the info & use it to generate a work. Lossless compression is clearly a copy; lossy compression probably creates non-identical copies that are nonetheless still copies. Compression algos are still deterministic, though—same imperfect copy every time.
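The compression point can be made concrete: a lossless codec stores fewer bytes than the work, yet regenerates it exactly and deterministically (the same bytes every run), which is why its output is uncontroversially a copy. A minimal illustrative sketch, not from the talk, using Python’s standard-library zlib:

```python
import zlib

# A repetitive "work" compresses well below its original size.
work = b"It was the best of times, it was the worst of times. " * 200
stored = zlib.compress(work)
assert len(stored) < len(work)       # less information stored than the work itself

# Lossless: the exact work is recoverable from the smaller representation.
assert zlib.decompress(stored) == work

# Deterministic: compressing the same input always yields the same bytes,
# unlike sampling from a generative model with the same prompt.
assert zlib.compress(work) == stored
```

Lossy codecs (e.g., JPEG) drop the exact round-trip guarantee but stay deterministic: the same imperfect copy every time, as the notes observe.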

AI extraction is rarely deterministic. We can get a result 10% of the time; are the other 90% also stored in the model? That would mean more works are in the model than there are atoms in the universe. © hasn’t dealt much with nondeterministic copies. Kelley v. Chicago Park District comes closest—a garden is not ©able b/c it’s not deterministically fixed. Compare the video game output cases: you infringe by making a map for use w/an existing video game that causes it to be played as a new level, even though the images aren’t contained in the map.

Kelley is probably wrong: most people would say a garden could be sufficiently replicable to be fixed. Game cases: predictable and replicable—the same map and characters would show up. Though now there are procedurally generated games where the map changes on the fly, but that still involves output. Not all the possible maps exist already in the game.

Challenge: sometimes it’s easy to get out—high degree of predictability and replicability. Sometimes it’s not really possible. Sometimes it requires a lot of work but can be done.

We should say that if it’s easy to get a work out, it’s probably worth saying it’s in the model. But if you have to know what you’re looking for, and keep trying until you get it, then we probably should not call it a copy.

Why this matters: if a copy is in the model, then making a copy of the model infringes the copyright in the underlying work. Meta is distributing the model weights to lots of people.

Maybe those models are fair use—probably the right result, but harder to reach, especially after Warhol. Fact of intermediate copying might be important (even though he doesn’t think it should be).

In response to Tony Reese: maybe we could say they’re unfixed but derivative works, which don’t require fixation. Reese points out the RAM copy cases about duration of presence in the computer—not a great way to do it, but we did do it. Also compare Google Books: how much of a book you can assemble by doing searches was relevant to the fair use analysis.

A: maybe that functional approach is practically the best way to go. If I can prove that it’s in there somewhere but it costs $10,000s to do that each time, that’s not a real way to get a “copy.” It is also really hard to resolve such questions as a matter of class actions—courts would like to have an answer, and an answer depending on a work by work, model by model analysis is not going to be desirable to them. Even if it’s the truth.

Tang: incidental copies in the course of streaming music, for example, raise similar questions.

Samuelson: a new round of cases about using YouTube performances—allegations of 1201 violations based on scraping the videos to use that data for training. Even if the stream itself wasn’t a copy, if you can make a copy from it, what to do? Should there be a meaningful distinction b/t access controls and copy controls? Reese said yes! (And he was right.)

Gellis: is vibe-coded software copyrightable?

Lemley: depends on how vibey you were! Maybe a selection and arrangement ©, though for code that will be less broad.

Samuelson: divinely authored works cases are also relevant.

RT: For Samuelson: courts often seem to divide over whether there are 2 ways to have a thin ©: the first way is that anyone who does the same thing can make a nearly identical work, but they can’t engage in reproduction; but the second and more controversial one is that anyone can copy chunks of the work as long as they don’t copy nearly the whole thing: Sedlik concurrences and Thomas’s dissent in GvO. Do you think that these are both examples of thin ©?

Samuelson: we’re close to a thin © but not as thin as Goldstein suggested (which was an exact copying standard). Dennis Karjala suggested a rule of exact copying being needed for infringement; emulating functionality is really important for freedom. Lotus v. Borland—that’s actually a case allowing copying the interface (that is, a chunk) as a method of operation. You still have to write/code the functionality independently. Fed Cir ©ability ruling in GvO was a real step backwards—wrong as a matter of law.

Lemley: Thin ©: if the test is virtual identity, one way to test is for really close identity (99% or close), but © in general protects protectable expression. One approach would be to say that identical copying of even a small fragment is infringing under this virtual identity standard; the other approach would require copying the whole work verbatim or close to verbatim. That will matter a lot where most of what I copied was not protectable.

Reese: part of the problem is that most courts dealing with this look to Feist, but Feist isn’t about either of those things. In these kinds of cases, not very much of what you put into your work is ©able, b/c you’re using facts/all you get is selection, coordination, and arrangement. Infringement means parsing carefully which thin part of your work is protected by ©, but that doesn’t say anything about whether partial copying infringes a thin ©. There’s no original understanding to look back to!

