Thursday, March 13, 2014

House DMCA hearing part 2

This is the question period.  I note just how many people in this conversation assumed that the technology behind Content ID could magically and easily be rolled out (costlessly?) for every form of content and every site online.  Echoes of SOPA/PIPA: just tell the geeks what to do! 

Coble: O’Connor suggests changes to §512—should they be specific or broad?

O’Connor: Always nervous about getting too detailed; tech will change. First stage of proposal is voluntary stakeholder process, and only move on to legislation if that doesn’t work. PTO/Copyright Office are trying to work through this; Congress could change the statute to allow Copyright Office to regulate.

Coble for Doda: should there be numerical thresholds for requiring websites to take action?

Doda: we don’t think there should be limits on the number of notices, so long as the notices are issued in good faith.  (I think he misunderstood this softball question.)

Coble for Sieminski: doesn’t the provision for attorneys’ fees and damages suffice in §512(f)?

Sieminski: we don’t really know because there are too few cases, because of the great imbalance of power between those sending the notices and those receiving them—big corporations v. individual users.

Coble for Bridy: should Congress create incentives for voluntary systems to address infringement, and if so what?

Bridy: the market has created sufficient incentives, as evidenced by current voluntary agreements.  IP Enforcement Coordinator has encouraged them, such as the Copyright Alert system; voluntary Best Practices agreements with ad networks.  Payment processors have too. 

Coble for Schneider: are there things other than technical measures that Congress could do to reduce infringement?

Schneider: her 3 points are her ideas.  Lawyers might be better equipped.

Rep. Nadler for Oyama: we’ve heard a lot about the whack-a-mole problem.  Prof. O’Connor suggested notice and staydown. What’s your comment?

Oyama: all service providers are sensitive to this, because they haven’t done anything wrong and are working hard to rid their systems of infringing material. Understands the appeal.  But you have to look across products. Congress didn’t impose prefiltering and monitoring requirements, so that Facebook, Twitter, and Google can allow posts in real time without filtering every post and tweet. Think scale: 60 trillion URLs.

Nadler: But Ms. Schneider writes a song and sends you a notice. Someone else reposts the same song. Is there the tech so that, having received a takedown notice for that song, a repost can be automatically taken down again?

Oyama: Notice and takedown is the best for that, b/c copyright owners know what they own and they know where they’ve authorized it. 

Nadler: but you already know it’s unauthorized.  Is the tech available to automatically take down/prevent from going up? 

Oyama: depends on the platform. (I wish this were coming through more strongly; a takedown notice is not a digital fingerprint.) Not practical for ISPs. If someone says “not allowed,” that doesn’t account for fair use. Intermediaries don’t know who the rights owners are or where the content is allowed to be.  Great models in private sector on YouTube, because we host and we have copies of all files/reference files from rightsholders. Businesses can build on that. 

Nadler for O’Connor: Congress allowed for red flag obligations. At what point should repeat notices trigger red flags? 

O’Connor: the issue with red flags is the way courts have addressed them through willful blindness, but that’s not in the statute. Congress should decide what willful blindness means and put it in the statute.  Identifying content: I had videos demonstrating copyright for songwriters, using my own guitar. Posted to YT, and very quickly it was taken down. Fair use is a different category. Tech is strong enough to recognize “here is the entire song,” and if it’s been already noticed it should stay down.
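
A minimal sketch of why “staydown” is harder than it sounds (this code is my illustration, not anything presented at the hearing): an exact-hash blocklist catches identical re-uploads, but any re-encoding or trimming defeats it, which is part of why systems like Content ID rely on perceptual fingerprints of hosted reference files rather than on takedown notices alone.

```python
import hashlib

# Hypothetical store of fingerprints from prior takedowns.
noticed_fingerprints = set()

def fingerprint(data: bytes) -> str:
    # Stand-in for a perceptual fingerprint. An exact hash like SHA-256
    # changes completely if the file is re-encoded or trimmed, so this
    # only catches bit-identical copies.
    return hashlib.sha256(data).hexdigest()

def record_takedown(data: bytes) -> None:
    # After a takedown notice, remember the file's fingerprint.
    noticed_fingerprints.add(fingerprint(data))

def allow_upload(data: bytes) -> bool:
    # Block re-uploads whose fingerprint matches a noticed file.
    return fingerprint(data) not in noticed_fingerprints

record_takedown(b"song master recording")
print(allow_upload(b"song master recording"))               # False: exact copy blocked
print(allow_upload(b"song master recording, re-encoded"))   # True: altered copy slips through
```

The second call is the whack-a-mole problem in miniature: a notice identifies one copy, not the work in all its encodings.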

Goodlatte: How does one measure success? Is it number of notices sent? Amount of infringing content that stays down?  Some other measure?  (Um, the rise of innovative online services?)

O’Connor: Whether it’s balanced between the parties—whether artists can get takedown and staydown.

Bridy: numbers about the growth of internet and industries that distribute content online, and those are good news on both sides.

Goodlatte for Oyama: should ISPs be required to respond differently when it’s the 50th notice?

Oyama: There should be a consistent set of obligations. YT: over 100 hours/minute. We need to know each time whether the use is appropriate. Quantity wouldn’t be enough.

Doda: one size does not fit all; we agree with Google there. Where sufficient matching can occur, staydown is appropriate. Collaboration and coordination.

Goodlatte: are there appropriate penalties for abuse?

Sieminski: no, because of the volume of abusive notices.  Statutory damages!

Schneider: if we control improper uploading, and immediate removal, we don’t have to worry about punishing people. 

Oyama: incentivize transparency: Google’s Transparency Report has helped everyone figure out the best vendors and the bad actors, which improves the system.

Doda: has to be placed into context—the abuses/mistaken notices are rare.  Counternotification: if you counternotify, the content can (eventually) be put back up. We support a level playing field for abuse.

Bridy: current remedies aren’t enough—enhanced damages might be appropriate.

O’Connor: are enough.

Rep. Conyers: big corporations can take care of themselves. What about the individual artist?

Schneider: Shows the internet form for uploading content. Says nothing about accountability for the uploader. When she has to send a DMCA notice, it’s in bigger print, showing many hoops to jump through to get a takedown. (Her name, the location of the infringing material, what else?) YT’s takedown procedure is now so much better, but now it says “this video is no longer available due to a copyright claim by Maria Schneider,” plus a sadface; that’s designed to direct animosity at her, while the uploader can hide behind a username.  Upload controls need to be better.

Doda: we accept that the onus is on the rightsholder at the outset.  The individual creator’s burden is different—filtering process.  Would endorse contribution to a referential database. If a notice is sent and there are no counternotices, her work should stay down.  Stakeholders should debate the details like the sadface.

Conyers: are the smaller artists in a worse position?

O’Connor: yes.  Tools for staydown would avoid need for compliance staff of that magnitude.

Rep. from Ohio (sorry, bad eyes): Congrats to Google for Content ID as a private sector move.  Only a collaborative effort between content, service providers, payment providers, and advertisers will work. Best solutions won’t come from gov’t but from free market collaboration.  YT couldn’t have launched as a startup if it had been required to start w/Content ID—but how has infringement affected other startups? 

Oyama: All providers face challenges here; we invest millions to root out bad actors. Overall picture is extremely positive because of the DMCA’s foundation of legal certainty. Incentive to innovate. Licenses from all major labels/studios on YT.  Rightsholders mostly choose to leave the content up and get the majority of the revenue.  Users, platforms, rightsholders benefit—incentivizing the collaboration to grow the pie together is the right way to go.

Q: Rogue sites ejected from your network—how many were from outside the US?

Oyama: a large number are internationally based, but it’s a mixture.

Q for Schneider: what would you like to say to college students who think you’re driving a limo?

Schneider: my 3-time Grammy winning album should long ago have paid for itself if it wasn’t pirated; I’m still paying off $100,000.  We are diluted by being splashed over the internet, and musicians are coming to the conclusion that this exposure isn’t resulting in money, but just diluting us—if they see a dozen different performances on YT, they won’t buy our music. Young musicians are very scared. 

Rep. Chu: Editorial in the Hill today: system isn’t working for small creators who are victims of theft.  Safe harbor isn’t a loophole: could do things today to make notice and takedown work better for small creators. MPAA did a study showing that search engines are the main means by which people get pirated content.  Google changed its algorithm in 2012 to take into account notices received. Should’ve resulted in lower rankings for pirate sites and higher rankings for quality sources. But several months later, studies show that sites for which Google received 100s of thousands of notices are still top of the list.  I tried to watch 12 Years a Slave for free—I only got to “watch 12” before I got an autosuggest to “watch 12 Years a Slave for free.”  Same with Frozen: When I type in “watch Frozen online for free” I get an offer to watch it for free. Why isn’t this fixed? (Um, because you searched “watch Frozen for free”?)

Oyama: there’s been improvement.  Users search for movie titles, song names, etc.  You can type in terms and see relatively how popular queries are. 12 Years a Slave is a popular query and the results are clean. Trailers, info, links to purchase. You can also add “free” but you should be very clear that there are still conversations about queries that end in free/download/stream—this is a very technical issue, and requires working with retailers to make sure that they have those words too. But if you look for what people actually search for, they are clean. You are talking about a relatively small set of queries. And for those, we need something legit to surface. If the movie is not online, it’s hard for us to return results for that query.

Chu: I didn’t type in free.  (But she seems to have used autocomplete to get it.)

You say that takedowns must exceed 5% of the content on a site before changing an algorithm.  But then 20,000 notices may not qualify a site. That’s a lot.
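
For what it’s worth, the arithmetic behind Chu’s point is simple (the 5% figure is from her question, and Oyama disputes that any such threshold exists; the site sizes below are invented for illustration): a fixed number of notices can be a large share of a small site and a negligible share of a huge one.

```python
# Hypothetical percentage-based demotion threshold: would N takedown
# notices exceed pct percent of a site's indexed URLs?
def meets_threshold(notices: int, total_urls: int, pct: float = 5.0) -> bool:
    return notices > total_urls * pct / 100

# 20,000 notices is 20% of a 100,000-URL site: well over 5%.
print(meets_threshold(20_000, 100_000))  # True
# But the same 20,000 notices is only 4% of a 500,000-URL site.
print(meets_threshold(20_000, 500_000))  # False
```

So under a percentage rule, the biggest sites would be the hardest to demote, which is exactly the objection.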

Oyama: there’s no minimum threshold for the algorithm.  Smaller set of queries is still an issue, but if we’re talking about truly bad actors, we should target them at their source—follow the money, don’t tell us to screw with our algorithm.  (I am paraphrasing more than usual.)

Q from another Rep. Farenthold: How easy is it for me to get a license to put music under my cat video? How many hoops?

Schneider: all you have to do is ask me for permission, and that’s up to me to give it.

Q: so I have to find you the songwriter, and then the performer.

Schneider: I’m at

Q: Isn’t there an opportunity to make it easier for innovators/creators of derivative works to license your content legally?

Schneider: it violates my copyright to use my music without my permission.

Q: I want to respect that, but I also want music on my cat video.

Schneider: public domain.

Q for Oyama: search engines enabling infringement—when Congress was threatening to regulate movie content, the MPAA voluntarily created the ratings system. (Not what I’d call a great model.) Shouldn’t you be better corporate citizens on this whack-a-mole?  I can get Shazam to identify a song in a large room. You ought to have the tech to do that, voluntarily.  (Yes, for music/video they have fingerprinted, if they’re the hosts.)

Oyama: we are working on that. Using automation to help rightsholders.  We process millions of notices per month. We try to direct users to one-click purchases of legit content.

Q: expecting something from Google is different from a small website owner or from a small ISP. If I have a bulletin board, I don’t have the resources to screen every photo.

Rep. Deutch: Keep in mind independent artists.  DMCA enabled growth of digital services; need to ensure the current balance continues to work for both sides. DMCA was designed to protect good faith, not people benefiting financially from pirated content.  This has been obscured.  Google has intervened as a friend of the court to press the view that the DMCA is available for those who are actively inducing copyright infringement.

Oyama: Not aware of briefs that say that.  Critical purpose of DMCA is legal certainty, and we see tremendous boost to creative industries from these platforms. Case law distinguishes bad actors like IsoHunt from legit services like YT and Google; we’ve had lawsuits targeting us too.  There are bad sites that don’t operate within the DMCA.

Deutch: if purpose of site is to induce copyright infringement, there should be no safe harbor whether they technically comply with it or not.

Oyama: while that sounds reasonable, amicus briefs have many issues.

O’Connor: there are legitimate licensing mechanisms. We shouldn’t use safe harbor to shield people who put up clearly infringing material.

Rep. Marino (PA): Creators have horror stories.  For Sieminski: how do you interact with some user who’s received hundreds of notices?

Sieminski: yes, as required by the law, we have a repeat infringer policy, and if a user does receive over a certain number of notices, their account is suspended permanently.

Marino for Oyama: Can you implement a voluntary system moving authorized legit results to the top of the page? Red light/green light system.

Oyama: we always want authorized legit results to appear. We’ve worked with rightsholders to make sure vast majority of queries relating to media are legit.  DMCA applies to all 68,000 service providers.

Marino: I like states’ rights/no fed gov’t, but here we need more to be done.  (Side note: pretty sure that’s what many people say about their own particular issue that he doesn’t want the feds involved in.)  Can we not return results when someone types in “Frozen free”?

Oyama: striking “free” from search results would be a bad idea. There is a lot of legit content available for free.  Surface legit content: increase the availability of legit content.

Marino: there’s got to be a way to flag “free” searches.

Oyama: if you want legit results, your legit pages should use the word “free.”

Marino: mobile apps?

Oyama: Google Play: we really hope that will grow opportunities for independent artists/big companies alike. We have notice and takedown for mobile apps too, and kicked out 25,000 under notice and takedown.

Rep. Richmond: Sometimes we’re forced to act even though we aren’t the best to act on tech issues. Suggests stakeholders get together and figure it out.  Can you manipulate/manage autocomplete so that it doesn’t suggest “free”? You’re pushing them to that space even if they didn’t want to go there.

Oyama: you can see what real users are actually typing in. You can see it’s the movies and the artists, and the results there are clear. Our policy on autocomplete is that we accept (I think she means “for removal”) terms associated with piracy—but there are many legitimate uses for “free.” The conversation with rights owners remains ongoing.

Richmond: what advice would you give small artists about protecting their copyrights?

Oyama: the advice I get from other small creators: use new distribution models. Five years ago, people in the industry were very focused on takedown, but now we see tremendous opportunities when users get excited about music; artists can monetize that.  Stay focused on enforcement, but also think about other ways of enabling internet enforcement. This morning, read an op-ed from a country artist: how the internet saved my career—she used user analytics to figure out where she was popular and add those places to her tour.  Run advertising around content.

Richmond: but how would you advise them to protect their copyright and make sure other people aren’t making money from it? Look at it from the other person’s side.

Rep. Smith (MO): What should voluntary agreements look like?

Doda: Can be cumbersome.  User-upload sites aren’t really well-controlled by the system.

Oyama: any system has to recognize how startups work.  They start small.  Google added over 50,000 hours of engineering to YT after acquisition. This resulted in a fingerprint system scanning more than 15 million fingerprints, which then resulted in over $1 billion to music industry; new artists can make over six figures on their YT channels.

Rep. DelBene (WA): what is going on internationally?

Oyama: when we know sites are based in foreign countries, sometimes we do have good diplomatic relations—coordinating diplomatic pressure would be a good place. Follow the money to dry up US ties/incentives is also good.  Fair use and safe harbors: we rely on them to exist, and if we see them threatened in foreign countries we can’t deliver them there, which takes revenue away from American companies. So we should press DMCA/safe harbors in our international agreements.

Schneider: We should set the bar/be an example to the world of protecting artists.  A company making billions on its patents and artists’ IP versus my community, which is hemorrhaging on its IP. There has to be something that makes it sustainable. Finding one person on YT who makes money is like going into a poor neighborhood and finding one person who won the lottery.  (Of course, making music was a reliably lucrative career for most people who wanted to be musicians before the internet, right?)

O’Connor: we need artists to have space/tools—if we don’t make it hard to infringe, sites will have to copy the sites that make it easy to infringe.  (I feel we have strayed from the international question.)

Bridy: remember to think about startups that don’t have money to invest in huge burdens felt disproportionately by small companies.

Rep. Collins: Google was once a small startup; now it’s big.

Oyama: Making this process as simple and automated and low-cost as possible can be a big piece. There is a thriving vendor market—specialized in sending notices. Many different people can use those services.  Specialists are getting smarter and faster about it; bring them into the conversation.

Rep. Jeffries: Ultimate aim of copyright law is to stimulate artistic creativity for the general good.

Bridy: absolutely. DMCA has helped that balance.

On red-flag knowledge, we’re going to get some guidance from Viacom v. YouTube: most of the courts have said that red flag knowledge is knowledge of facts or circumstances from which specific infringing activity is apparent, not just generalized knowledge that infringement is occurring on the system.

Jeffries: has the definition been sufficient?

O’Connor: No.  It’s too limited to actual knowledge of a particular work, even if you have a sense that lots of infringement is going on. Congress should set a policy: if you’re aware that infringement is going on, you have an obligation to do something.  (What?  Get the geeks on that.)

Jeffries: constructive knowledge. But is there an argument that the internet is different? The DMCA did not impose an inquiry requirement.

Oyama: the internet context makes the specificity requirement more important because of the diversity of the ecosystem and the different ways artists are engaging.  Artists have very different stances on what they do/use.  We need them to tell us which uses are okay.  Don’t destroy innovation by making it more risky for providers to build new services/filters.  No one understands a vague standard and that’s a deterrent.

Rep. Poe: I don’t like stealing. What we’re dealing with is internet thievery/piracy.  (Sigh.) Trying to solve it through the private sector, not the police.  (Why?  If this analogy is correct, prosecuting crimes is literally the only thing the nightwatchman state should be doing (and crimes against property are the core crimes in the libertarian vision of crimes).  I’m not saying I like the dominant “liberal” position here either, but I dislike special pleading too.)

I found streams of House of Cards on the first page (or maybe he thinks he can; not clear). Isn’t there a way to get rid of bad results through your algorithms?

Oyama: notice and takedown is well suited to that. As soon as we’re alerted that links are bad, we take them down (6 hours).

If you google House of Cards it will be legit results. 

Poe: can’t we stop those results from appearing in the first place?

Oyama: go after the people posting the pirate content.

Poe: 73,000 takedowns in ad system, 200 million in past two years—how much does that cost?

Oyama: it’s a huge burden, tens of millions of dollars; hundreds of people working on it—can’t estimate totals.  Because of engineering effort and bulk submission tools for use by trusted rightsholders, we now have a good system, but we still have engineers and lawyers working on policies.  We are building licensed models like Google Play; we share revenue if creators are getting revenue. No one makes money if someone clicks on a pirate site.

Poe: I hope you can solve this without gov’t involvement because sometimes gov’t makes things worse and not better.

Rep. Collins: Safe harbors often require an actor to try something in good faith.  Why not require good faith/reasonable effort to prevent infringement? We wanted to protect certainty/growth of internet, for good reasons, but now we see this reposting problem making notice and takedown a bit of a mockery.  If it can be instantly reposted, the DMCA isn’t having the intended effect.  Could require ISP to remove/disable access and prevent its reposting.  It would be better for industry to figure this out itself, but if the tech exists when notice is provided we ought to have the ability to have it taken down and to prevent it from recurring.  Don’t put all the burden on the victim of the crime. Once you ID and notify, honor the request by not requiring it to be redone.

Schneider: educate the uploader.  How? That has to be worked out. Streamlined takedowns are good, but Content ID needs to stop it before it goes up across the internet.  Imagine if Content ID worked for everybody. 

Doda: uploaders are often repeat infringers, so stronger repeat infringer policies would help. Clearer parameters around tracking them and keeping them out.  (How do you know?)

O’Connor: education could be done by directing people to the CCC, Harry Fox, other licensing mechanisms.  Many mechanisms to do this legally.  (Don’t anybody tell the judge in the Georgia State case about the ease of using the CCC ... oh, wait.)

Oyama: education is something we work on in YT Copyright School.  We have strong repeat infringer policies.  While preventing reposting sounds attractive, we don’t know when a post is authorized.  Links, comments, tweets, would all have to be screened before posting.  That would chill these services and they’re driving content sales. Be careful for innovation.

Rep. Lofgren: Without safe harbors, there wouldn’t be an internet. First do no harm.

Oyama: note we can’t match content for a filter on search. We have text, not embedded video etc.

Lofgren: could an uploader encrypt and defeat the ban on reuploading?

Oyama: Yes, which is why we need more cooperation.

Lofgren: SOPA, one lesson: don’t suggest things that are impossible or that would break the internet.

How would you scrutinize each site on the web the way YT scrutinizes video?

Sieminski: YT did great work developing Content ID with its $50 million investment, which is many times bigger than our company, and we’re bigger than many.  Tech also can’t answer the question of whether use is authorized or fair.

Lofgren: ISP doesn’t have incentive to stand up for speech. Should there be financial disincentive for DMCA abuse?

Sieminski: Yes: statutory damages exist for infringement. It’s not the majority of notices, but even a little censorship is not ok. We don’t see millions of notices because we’re not a filesharing platform; we do see abuses.

Rep. Issa: How is this different from teachers sharing Xeroxes in classrooms? 

O’Connor: interesting questions of fair use/classroom use—he was never handed most of a book.

Issa: but today, if the equivalent is occurring the copyright owner can now find it in the open?

Oyama: Vibrant market now exists for vendors.

Issa for Schneider: 3-4 decades ago, you wouldn’t have known about people copying on 8-tracks, which were good enough to listen to—isn’t it easier to track today?  Takedowns involve trial and conviction by accuser.

Schneider: cassettes weren’t very good quality, and tracking is no good because so much of my music is out there.  Vast majority of artists now pay for their own records.

Oyama: we are seeing increased takedowns because the internet is expanding, but time to takedown is going down because of automation.

Rep. Lee (TX): Affirmative duty to monitor: how would that work?

O’Connor: content can be quickly reproduced. For entire work reposted, use Content ID.  (I’m sure the photographers will be reassured to hear that Content ID can identify all their works.)

Lee: who would be liable?

O’Connor: people posting, plus we’d target a culture of copyright contempt where startups decide to turn a blind eye.

Oyama: the sky is the limit on monetizing globally as markets expand. The question is how do you direct legitimate content to users? By increasing the availability of legit offerings. Spotify’s rollout decreased piracy 25% in an area. Think broadly about stimulating licensed services. The DMCA is critical to that. Countries without internet safe harbors don’t have our internet industry.

Lee for Sieminski: how many counternotices did you receive last month?

Sieminski: 4 out of 825. Counternotice system has many problems—procedures are tough for average users; there’s a 10-day period of silencing.

Lee: §512(f) does have a fee provision. Can courts craft damages?

O’Connor: we should make sure we can deal with abuses.

Schneider: Monetization won’t help me: the money is so small per view.  (RT: Of course, all those people would have paid full freight without YouTube, right?)

Content is being used to draw eyeballs for ads. The solution is more robust staydown and Content ID for every company so everything is filtered.  (RT: This, by contrast, is not more paraphrased than usual.)
