Friday, February 27, 2026

WIPIP Panel 4: Emerging Technologies

The European Accent of U.S. Digital Platform Speech (Brian Downing)

We are often told that self-governance by corporate platforms is better than government control, but his experience was that platforms' freedom of action wasn't free. The US gov't defers to platforms, but platforms in turn defer to the EU. Thus, deregulation doesn't promote freedom of action by platforms, who instead subordinate speech and security values. The US is abdicating values it might want to inculcate into platforms.

Many things in EU regulation are good and should provide a US model—DSA rights of appeal are desperately needed. But codes of conduct etc. have no bargaining dynamic with US free speech principles. We should craft regulation to bargain with the EU vision. Our involvement in laws helps shape future regulation in ways that are better than our absence.

US approach, Moody v. NetChoice: few greater dangers to free expression than allowing gov't to "change the speech of private actors in order to achieve its own conception of speech nirvana." Corporate speech is protectable. Content moderation: we will have a light touch so we won't break the internet.

EU: GDPR, DSA, DMA, AI Act—all implemented by many platforms globally. Even if you could segregate markets as a legal matter, as a practical matter you can't build out a separate compliance system just for Europe. You build it once and implement it worldwide.

Also, technolibertarians have turned into compliance departments that just give a bottom line, and the company implements that globally.

Where the US stands back and doesn’t want to mess with US companies, what has actually happened is the DSA codes of conduct have removal policies that have a pseudo balance of privacy and speech, but the real way it works is that speech is subordinated to privacy or other interests. Companies are told they need to “voluntarily” agree.

If we want different rules, we have to bargain with this dynamic. We should have privacy regulation as a bargaining chip for alternatives to the GDPR. You may hear "the EU is stepping back b/c they see their lack of competitiveness," but the actual changes are pretty modest: they don't touch the DMA or DSA, just change some consent definitions in the GDPR and timelines for the AI Act, without fundamentally changing the balance of power. They changed the definition of PII: anonymization is helpful, but it doesn't cause a company that was setting its privacy standards based on the GDPR to ignore the right to be forgotten.

Q: what about economic dimension of Data Act? Applicable to internet of things, wider scope—does this affect platform economy?

A: lots of confusion about Internet of Things rules applying to mobile phone OSes. Platforms think the definitions were broadened to future-proof them, but that means the rules don't seem to match mobile OSes. His guess: this will be handled in private-room meetings and nonenforcement agreements (e.g., a rule says only imports of data are allowed, not exports; that would frustrate users of Android phones—do you really want that? Hence a quiet nonenforcement agreement).

Right posture: enter regulatory game to influence it. AI: product liability approach w/safe harbors, not mandates to stifle user questions on sensitive topics.

Interoperability requirements threaten security and we need pressure on that. Maybe there should just be competition, not forced openness.

RT: in the abstract, very persuasive, but hard to agree when you see what the current US gov’t is doing: deliberately boosting right wing content outside the US. Also: bargaining requires a reliable partner, which we may not be capable of right now. Idea of passing federal legislation is a bit of a stretch.

Separately: current regime has differential effects on SMEs versus Meta, Apple and Alphabet (including differential requirements imposed by European regulators). [That is, the SMEs may not be building the worldwide systems that those big companies are.] Mismatch of understanding: The quiet nonenforcement agreement is understood as the operation of the rule of law in Europe but corruption in the US. That may make it hard to speak in the same register.

Bargaining chip implies that we’d want to continue to influence/control global rules. Compare mandating geofencing. (Blake Reid’s Jawbreaking and counterboning).

A: might be talking about a world that doesn’t exist any more. It is unsettling to confront regulators where quiet backrooms are the way things get done, and when there’s regulator turnover things can change fast. [My view is that this is correct but that Europeans would neither draft laws the way we do nor see compliance with law the way US courts (or administrators) would even if the law’s wording is the same.]

Q: what about the states? California might be able to regulate. Could be a reliable partner even! One difficulty is that there’s an attempt to stifle state regulations.

A: harder to operate on the corporate side where Illinois has one biometric law and California has another—harmonization is important. But there is a ton of action on the state side. Maybe we'd want these state actors as our representatives to the world—almost any actor negotiating would be better than the actors we have!

Venture Capital (Michael Burstein)  

VCs funded electric cars and mRNA vaccines, but recently VCs have concentrated investments in social media, crypto, and AI. Conventional wisdom is VCs pick and choose from promising pitches. It’s the ideas that drive the funding. We argue that’s exactly wrong: it’s the funding that drives ideas. VCs send signals to market about what they’ll fund, which induces entrepreneurs to found startups that will be funded.

VC preferences are shaped by social norms, need for power law returns, and short time horizons. Result: narrowing of innovation. We look at how VCs shape founder behavior—ethnographer’s dataset. Corporate law, contract law, and public policy could expand the possibilities for innovation.

2024: $215 billion from VCs, a little less than half of the pre-Trump science funding. So who makes investments in early stage companies with significant risk and when? Key features of the VC model: large equity stakes for founders, standard 10 year term of VC fund, 2/20 compensation structure. The goal is efficient allocation of capital w/in the funding realm.

But the outside perspective judges policies by innovation outcomes, not allocative efficiency. Here, the big problem of VC is clustering: $100 billion in AI, then $50 billion in healthcare—67% of VC funding, leaving 30 other industry categories like climate tech and hardware mostly unfunded. This carries through across demographic lines: female founders received 1% of funding, Black and Latinx got 1% and 1.5% of funding; 55% of funding was in California.

Innovation scholars shouldn't treat VC as a black box: we look at entrepreneurs, the ones who decide what companies to found and what innovations to pursue. Entrepreneurs aren't monolithic in preferences; they are partially motivated by finance, but that is usually not the only or even the primary motivation for what they want to do. Some entrepreneurs will trade off financial considerations for solving intellectually hard problems, or for doing good in the world, or something else. Preferences then interact w/market signals: from investors (VC sends signals about what will yield the highest returns 6-7 years post-funding, scalable with $ to increase the likelihood of a power law return); from the product market (the measure of success is positive unit economics and profitability); and from the social world (divergence b/t private and social value of innovation).

Put preferences together w/signals & you get the marginal entrepreneur: the one who doesn't eschew VC funding and isn't motivated solely by money, but has to decide where to invest innovative efforts.

VCs put a thumb on scale in pre- and post-funding environments. Often built around fads & bubbles; often very explicit in what they’re requesting. Repetitive & exhausting hype cycle.

Distorts investments: (1) misallocating resources, (2) distributive consequences. Wasteful duplication—similar to the literature on patent racing. Pre-funding, the perception of a limited pool of capital induces overinvestment in these kinds of favored tech. Post-funding, VCs favor winner-take-all markets—you see wasteful investment in trying to capture consumers. VC returns may reflect anticompetitive behavior; unit economics diverge from ROI, and externalities are not fully internalized. This dynamic favors founders who resemble what VCs expect and can tailor their behavior to what VCs like, leaving out women, minorities, and noncoastal and rural populations. Consumers are also left out—the tech is disproportionately aimed at the problems of the communities from which VC comes.

What can law do? Change content of VC signals: why is the standard term 10 years? Change the strength of the VC signals: modulate corporate law; promote countervailing signals/amplify other sources of funding.

The Innovation Paradox: How New Forms of Media Confound Copyright (Zachary Cooper)

The more innovation we have, the more © gets confused about dealing with new media. Use of Gen AI doesn’t reveal anything about creative relationship to work—it’s as helpful as saying “used software.” No means of auditing or evaluating. Authorship thresholds won’t work but that leads to problems of scale—too much content out there. We also have a problem of form: everyone can turn everything into everything else.

Copyright often does not recognize innovative new modes of creative expression, or allow innovation if built from other people’s works; innovation increasingly lowers costs of production, and innovation undermines fixedness—the notion that a work will stay itself.

New instruments since the late 70s have not been protected (synths). Synths defined the sound of EDM for the next 50 years—but no one thought the result was a protected composition b/c it was just turning dials on a machine. We mined all the latent space in that composition—but the only thing that © protects is the melody. © is protecting the old part of Donna Summer's I Feel Love, but not the innovative part. Meanwhile music services aren't paying artists anything for anything they generate.

In 5 years no one will care if you used AI but it will be too late: we’ll have set up a surveillance apparatus that prevents you from creating unobserved.
