Fair Use on the Internet: Non-Expressive Uses and the Fairness of Opt-Outs
Sag categorizes various theories of fair use and concludes they all ultimately depend on normative concepts of the good, even market failure theory (as shown by Wendy Gordon's treatment of anti-dissemination motives as producing market failures). Someone needs to make decisions about how markets are going to be structured – spectrum auctions, the problem of anticommons in Russia, etc. Entitlements should encourage private ordering, facilitating individual autonomy, which should be a fundamental part of a copyright/property system.
He likes opt-out devices on the internet and for other mass aggregation projects for creating the advantages of property rights – people get to make their own decisions about who gets to enter or link to a site – without the enormous waste in transaction costs. He'd be fine with a blanket rule that scanning books is fair use, but conditioning it on an easy opt-out lets us have our cake and eat it too. The people who really object get to opt out, but can't ruin it for everyone or strategically hold systems for ransom in order to extract rents under the pretext of opposing something that, if they were honest, most authors would favor.
My Qs: Any international issues? Isn’t this just another word for formalities?
A: This exception would do fine at the WTO because it's a limited exception covering only certain uses. And international fears shouldn't drive our policy decisions when the policy is the right thing to do.
Formalities are bad; they benefit large corporations, and force you to make choices without knowing the future – I can modify robot exclusion headers as things change and exclude images both going forward and remove my images from the cache. That’s different from clumsy formalities. Private development of opt-outs is preferable to legislation.
My response: but robot exclusion headers also require knowledge and resources, and opt-outs inherently benefit large corporations that can send big lists of works to Google (and monitor new services as they develop). Robot exclusion headers are okay, but they really don’t help with scanned books. Though I agree with you on the merits, I don’t see that the critique of formalities differs from the critique of opt-outs.
A: There is a difference in the magnitude of the burden.
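For concreteness, the opt-out mechanism Sag invokes is the widely implemented Robots Exclusion Protocol. A minimal sketch (the paths and crawler targets here are hypothetical, not drawn from the paper):

```
# Hypothetical robots.txt sketch of an opt-out.
# Directives govern future crawls; removing already-cached copies
# generally goes through the search engine's own removal tools.

User-agent: Googlebot-Image   # opt images out of Google's image index
Disallow: /photos/

User-agent: *                 # opt one directory out for all crawlers
Disallow: /drafts/

# Everything not disallowed stays crawlable, and deleting these lines
# restores crawling going forward – the reversibility Sag contrasts
# with a one-shot formality.
```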
Julie Cohen: Wants to disentangle various issues – formalities are opt-in, but Sag advocates opt-out. Given that, it would be very helpful to discuss the literature on default rules generally and how to compare them. For example, Sag suggests no formalities for the beginning of protection, then opt-out later, which may have specific effects.
Implicit in all our work are inchoate theories about how the law evolved; he wants to pay more attention to them because they can help direct our work, which is after all usually directed to shaping or at least understanding the law. Should we address courts? Legislatures? Something else? There are simplistic stories: e.g., legislatures just follow the money, so we should appeal to the judiciary and a conception of the law. We understate the importance of the legislative process and of convincing individual legislators on particular points.
We are not really living in a time of all-pervasive copyright where subject matter has expanded to everything. The fashion designers, for example, want a sort of auxiliary copyright (comment: I think that's just an opening bid). And music is a counterexample to the theory that copyright expands with tech; we've been making music since we've been human, but copyright rights didn't include music for a while. Why in 1831 were the words "musical composition" added to the Act? One legislator's conviction that such rights were important was key to their establishment in the US. He was not a musician, but was friends with artists.
Public choice theory is important, but it’s not the only explanation for how policy is made.
Sag: What would we learn by looking at instances in which people lobbied for extra rights and got rejected? We need to look at who rejected such calls as well as who accepted them.
Carroll: Good point, though here he’s found no evidence that anyone was even asking for new music rights at that time.
By codification Burk means conversion of tacit knowledge into explicit knowledge. Codification can be costly; it requires language/code; recording/inscription; and reading/decoding. It’s the difference between a faculty member who knows all about how “we” do things and a faculty handbook known by and accessible to all of us that fully specifies things we care about (an impossible ideal, of course, just as it’s impossible for the faculty member to avoid injecting preferences of one kind or another). When knowledge is codified, that may tip the balance towards commodifying it – but there’s always tacit knowledge surrounding the code, at a minimum how to read it.
Patent encourages codification, which speeds commodification and may make it easier to do outside-the-firm deals. Internally, codification affects employee motivation and mobility (which is related to tacit knowledge and which firms attempt to control with trade secrecy and noncompete agreements). It has implications in open source, whose projects are usually organized more loosely than in firm structures, but the licenses are attempts to be rigorous in codifying rights.
Julie Cohen: Coded v. tacit – not quite clear on the distinctions. The formula for Coca-Cola is not tacit; it has precision, as does the code for CSS. Trade secrets are perhaps easier to protect legally when they're written down. On the copyright side, scènes à faire is a doctrine about universally/culturally shared tacit knowledge (e.g. a scene in a German beer hall is a standard feature of a plot with Germans), mediating between what can be owned and not owned. Likewise with the PHOSITA, who has a lot of tacit knowledge.
A: There’s no such thing as entirely codified knowledge; there’s always a penumbra of tacit knowledge. Often in licensing a party wants not just the patent but the knowhow. Even in trade secret, there’s often a lot of codification, and even in patent, there’s a lot of penumbral knowhow.
My thoughts: codification can also be problematic by forcing the resolution of disputes, which is why laws so often punt, or refer to a topic while leaving much more undecided than decided. (You see this in Creative Commons licenses and debates over what counts as noncommercial use for CC purposes.) And I can see how that translates in patent into incentives to write claims broadly or hold back best methods of using the patent. Speculating wildly: GIs as a way of avoiding or refusing codification of what it is that’s special about a product, even as they codify certain production standards?
A: TM is a slightly different animal. TM reputation might be tacit knowledge; that’s more work for a later paper.
Comment: Many things are both codified and not: linguists codify grammar, but no one learns it that way.