counternotice – Techdirt

from the designed-to-screw-over-everyone-but-the-big-companies dept

Here are two separate stories regarding the mess that is modern copyright law, which is now mostly “mediated” by companies that handle these disputes haphazardly, if they handle them at all. While this is, perhaps, a better setup than stupidly suing kids for daring to like or share a song, it still suggests that the entire copyright framework of today is broken. Both stories involve experts in copyright law finding that the system sucks, even when you know the system.

Story One: The Extra-legal Takedown Game is Rigged

The first story comes from Michael Weinberg, the Executive Director at the Engelberg Center on Innovation Law & Policy at NYU. The center is home to multiple copyright law professors and experts who do some amazing work.

Way back in 2019, the center held a symposium on “Proving IP,” featuring the two musicologist experts who were on opposing sides of the infamous “Blurred Lines” case, talking about the musicology aspects of that litigation. As the Engelberg Center explains:

The primary purpose of the panel was to have these two musical experts explain to the largely legal audience how they analyze and explain songs in copyright litigation. The panel opened with each expert giving a presentation about how they approach song analysis. These presentations included short clips of songs, both in their popular recorded version and versions stripped down to focus on specific musical elements.

After the symposium, the center (reasonably) posted the video to YouTube and the audio on its podcast feed. Back in 2019, soon after it was posted, Universal Music filed a strike on the YouTube video, because that’s just how Universal Music rolls (even if the company should know better). Eventually, the Center was able to get the video reinstated, but only after Universal Music stood its ground saying the video was infringing, and the Center resorted to using backchannels to reach people at YouTube to say “hey, what the fuck…?”

However, never underestimate the fuckery that Universal Music can get up to with a copyright takedown system. Last fall, four years after the event had been posted to the Center’s podcast feed, Spotify removed the podcast episode, again claiming copyright infringement of Universal’s work. The Center responded to the initial copyright claim by noting that the use was clearly fair use, and received a confirmation that its response had gone through (smartly, the Center took a screenshot). Spotify took the podcast down anyway, (falsely) claiming that the Center had not responded to the original notification.

Now, in some ways, the fact that Spotify asked for a response from the Center before taking it down is a baby step of progress. Many services immediately take stuff down and ask questions later. A “notice and notice” system is much more reasonable, but I guess it only works if the company actually reads the response notices.

When they complained, Spotify told them they had to sort things out directly with Universal Music. The Center — again full of copyright experts — realized it needed to see the details of whatever notice UMG had provided Spotify. If it was an actual DMCA 512 takedown notice, they could try to file a 512(f) claim back against Universal (again, Universal has been sued over this before and, in theory, knows it’s supposed to take fair use into account).

But, Spotify wouldn’t even tell them the nature of the notice. Was it a DMCA? Was it Spotify’s own audio matching service? No one knows. No one’s willing to say.

On October 12th, Spotify told us that in order to have our podcast episode reinstated we would need to work things out with UMG directly. That same day, we asked for UMG’s actual takedown notice so we could do just that.

We did not hear anything back. So we asked again on October 23rd.

And on October 26th.

And on October 31st.

On November 7th — 26 days after our episode was removed from the service — we asked again. This time, we sent our email to the same infringement-claim-response@ email address we had been attempting to correspond with the entire time, and added legal@. On November 9th, we finally received a response.

Apparently, cc’ing “legal” gets results.

Spotify’s email stated that our episode was “not yet subject to a legal claim,” and that if we wanted to reinstate our episode we needed to reply with:

This second element [a statement that the material was removed or disabled as a result of a mistake] is noteworthy because it matches the language in Section 512(f) mentioned above.

We responded with a detailed explanation of the nature of the episode and the use of the clips, asserting that the material in question is protected by fair use and was removed or disabled as a result of a mistake (describing the removal as a “mistake” is fairly generous to UMG, but we decided to use the options Spotify presented to us).

Our response ended with another request for more information about the nature of the takedown notice itself. That request specifically asked if the notice was a formal notice under the DMCA, and explained that we were asking because we were considering our options under 512(f).

Spotify quickly replied that the episode would be eligible for reinstatement. In response to our question about the notice, they repeated that “no legal claim has been made by any third-party against your podcast.” “No legal claim” felt a bit vague, so we responded once again with a request for clarification about the nature of the complaint. The next day we finally received a straightforward answer to our question: “The rightsholder did not file a formal DMCA complaint.”

In other words, it wasn’t a DMCA notice at all; it was just Spotify’s own system (its version of ContentID or whatever) that made a match. The Center’s key takeaway was that under this kind of system, in which online service providers (under tremendous pressure from rightsholders) set up outside-the-legal-process tools to take down content, users are pretty much left with no recourse, and they’re likely to get screwed:

What did we learn from this process?

First, that Spotify has set up an extra-legal system that allows rightsholders to remove podcast episodes. This system does a very bad job of evaluating possible fair uses of songs, which probably means it removes episodes that make legitimate use of third party content. We are not aware of any penalties for rightsholders who target fair uses for removal, and the system does not provide us with a way to pursue penalties ourselves.

Second, like our experience with YouTube, it highlights how challenging it can be for regular users to dispute allegations of infringement by large rightsholders. Spotify lost our original response to the takedown request, and then ignored multiple emails over multiple weeks attempting to resolve the situation. During this time, our episode was not available on their platform. The Engelberg Center had an extraordinarily high level of interest in pursuing this issue, and legal confidence in our position that would have cost an average podcaster tens of thousands of dollars to develop. That cannot be what is required to challenge the removal of a podcast episode.

Third, it highlights the weakness of what may be an automated content matching system. These systems can only determine if an episode includes a clip from a song in their database. They cannot determine if the use requires permission from a rightsholder. If a platform is going to deploy these types of systems at scale, they should have an obligation to support a non-automated process of challenging their assessment when they incorrectly identify a use as infringing.
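That third point is worth making concrete. Below is a minimal, purely hypothetical sketch of the kind of matching pipeline the Center is describing; Spotify has not published how its system works, so every name, threshold, and data structure here is an assumption. The point is structural: the only question such a system can answer is “does this audio match a reference recording, and for how long?” Nothing in its inputs represents commentary, scholarship, or any other fair use factor.

```python
# Hypothetical sketch of an automated audio-matching takedown pipeline.
# Not Spotify's actual system; names, thresholds, and the "fingerprinting"
# below are stand-ins for illustration only.

from dataclasses import dataclass

@dataclass
class Match:
    reference_track: str     # e.g. a track in a rightsholder's catalog
    matched_seconds: float   # how much of the reference appears in the episode

def fingerprint_match(episode_audio: bytes, reference_db: dict[str, bytes]) -> list[Match]:
    """Stand-in for a real acoustic fingerprinting engine: here, a naive
    byte-substring check. Real systems compare fingerprints, but the output
    has the same shape: which reference recordings were detected, and for how long."""
    bytes_per_second = 176_400  # assuming 16-bit stereo PCM at 44.1 kHz
    return [
        Match(reference_track=name, matched_seconds=len(clip) / bytes_per_second)
        for name, clip in reference_db.items()
        if clip and clip in episode_audio
    ]

def flag_for_takedown(matches: list[Match], min_seconds: float = 5.0) -> bool:
    # The only available signal is match length. There is no input that could
    # represent context, commentary, or criticism -- i.e., nothing that could
    # support a fair use determination.
    return any(m.matched_seconds >= min_seconds for m in matches)
```

Any real evaluation of fair use has to happen outside that loop, which is exactly the Center’s point about requiring a non-automated challenge process.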

Story 2: The system is just as broken for actual victims of infringement

The second story also comes via an internet law / copyright law professor, David Post (whom we’ve quoted many times on Techdirt). It turns out that, on the side, Post plays folk music as a hobby with another lawyer and friend, Craig Blackwell, in a band called Bad Dog. They’re really good! You can listen to their music on SoundCloud (there are a few other bands out there called Bad Dog, but that link takes you to the right one).

They recorded an album and uploaded the tracks to SoundCloud last year. Then they decided to get a small run of CDs made through Disc Makers. But Disc Makers told them it wouldn’t press the CD, because a computerized check found the music all over the internet, claimed by other artists.

The NY Times eventually wrote an article about the whole mess. It appears that once they uploaded their songs to SoundCloud, someone came in, grabbed all the tracks, and uploaded them everywhere else (YouTube, Spotify, Apple Music, etc.) but put it under totally different names.

But not long after “The Jukebox of Regret” was finished in July and posted on SoundCloud, nearly every song on it somehow turned up on Spotify, Apple Music, YouTube and at least a dozen other streaming platforms. This might have counted as a pleasant surprise, except for a bizarre twist: Each song had a new title, attached to the name of a different artist.

This mysterious switcheroo might have gone unnoticed. But by happenstance, it was discovered when the guy who produced the album posted one of the songs on his studio’s Instagram account. To his astonishment, Instagram automatically tagged the song “Preston” by Bad Dog as a song called “Drunk the Wine” by Vinay Jonge — a “musician” with no previous songs and zero profile on the internet. He didn’t seem to exist.

The full extent of this heist soon became clear. “Pop Song” by Bad Dog had become “With Me Tonight” by someone named Kyro Schellen. “The Misfit” had become “Outlier” by Arend Grootveld. “Verona” had become “I Told You” by Ferdinand Eising. And so on. Same music, different track names and credited to different artists, none of whom had any other songs or any profile on the web.

Now, in theory, this seems like exactly the sort of scenario where copyright is supposed to help the artist. This is a case where someone was not just making a copy of someone’s work, but then uploading it to various other music platforms under totally different names, and pretending it was theirs.

So Post & Blackwell, again, both of whom are (in some ways) deeply familiar with the copyright system as lawyers, tried to make use of these tools and automated systems that are supposed to help protect artists. How do you think it worked out?

To retrieve their songs, Mr. Post and Mr. Blackwell sent out what are called takedown notices, or formal requests to remove pirated music, to a bunch of different sites. The band members used their SoundCloud page to demonstrate that their recordings predated all the uploads on the streaming platforms.

Two sites responded fairly quickly. Amazon Music removed the songs in about a week. YouTube soon followed.

Other platforms offered little more than canned emails. (“Your claim will be processed by our team,” Spotify replied.) Apple Music sent a form letter, too, though it included a tantalizing clue: the name of the company that had uploaded the songs.

And who do you think was the company that had actually uploaded these songs, violating the copyright of Bad Dog?

Warner Music, of course! The same Warner Music which, at one time, was one of the loudest voices screaming about how copyright law needs to be more draconian, and how sites like YouTube were evil. Now it’s Warner Music uploading infringing tracks to help some scammers profit off of musicians’ work.

The only thing that got Warner to back down? A call from the NY Times reporter.

On Dec. 5, this reporter emailed the public relations department at Warner. A spokeswoman there looked into the matter and soon after said the songs had been uploaded via a subsidiary called Level, a music distributor catering to independent artists. (“Your release, streamlined,” the company says on its website.) For a $20 annual fee, Level uploads audio to a long list of digital streaming platforms. It asks only for customers to tick a box and agree to terms of service, which include a promise not to post any audio owned or created by someone else.

Warner moved quickly. On Dec. 6, the company removed all the pirated versions of Bad Dog’s songs from all of the sites. (The company would not discuss how.) Soon after, anyone typing “Vinay Jonge” into Deezer, the French online music platform, got an error page that read, “Oops … It did it again.”

By then, Bad Dog’s songs had collectively been played more than 60,000 times on Spotify. The number suggests that the fraudster found a way to generate listens for the song, but not at numbers that would arouse suspicion. At Spotify’s rates, all those listens would translate into just over $250.
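A quick back-of-the-envelope check on those figures (the Times doesn’t show the math, so the per-stream number below is just the implied rate, not a published Spotify figure):

```python
# Rough arithmetic implied by the numbers quoted above.
streams = 60_000
total_payout = 250.00               # "just over $250"
per_stream = total_payout / streams
print(f"~${per_stream:.4f} per stream")  # ~$0.0042, in the range commonly cited for Spotify payouts
```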

So, yes, Warner was just acting as a middleman here, but it’s kind of incredible how, in just 15 years or so, Warner went from hating YouTube and claiming the whole site was an infringing mess to being a major conduit through which scammers upload infringing tracks for profit.

As the article notes, this scam is likely happening a lot, and people just aren’t noticing it. It’s basically a fluke that Post even realized it was happening with his music, and then the problem was only solved because he was able to get a NY Times reporter to pick up the case.

In both of these stories, though, you quickly see just how broken the copyright system is on both sides of the equation unless you’re a giant company. In all of this mess, the record labels (Universal Music and Warner Music) and the streaming companies (Spotify, YouTube, etc.) make out like bandits, and the actual creators get screwed.

But, what are the chances of that ever getting fixed?

Filed Under: automated takedowns, copyright, counternotice, david post, fair use, michael weinberg, scammers
Companies: spotify, universal music group, warner music

DidYouKnowGaming Gets Video Nintendo DMCA’d Restored

from the for-once dept

Back in December we discussed how Nintendo got a video on the DidYouKnowGaming YouTube channel taken down via a DMCA notice. While Nintendo is notorious for being an intellectual property bully and enforcing what it thinks are its rights in as draconian a manner as possible, what stood out about this particular story is that the video in question was a journalistic effort to document a game pitched to Nintendo that never came out, included no gameplay footage, and therefore didn’t reproduce any actual game assets. It appears for all the world that Nintendo used the DMCA system to take down a video comprised of pure gaming journalism, which is not how any of this is supposed to work.

DYKG, for its part, both promised to fight the takedown and implored its Twitter followers to reach out to Nintendo and tell the company what they thought of this sort of censorship of facts.

And now it appears that the channel made good on its promise to fight back, or at least not give in. YouTube restored the video this week after DYKG filed a counternotice asserting that there was no copyright infringement in the video.

“We won,” YouTube channel DidYouKnowGaming tweeted on December 28. “The Heroes of Hyrule video is back up.” It added that YouTube confirmed the original copyright takedown notice was indeed from Nintendo and not an imposter, and that the video has received over 20,000 views in its first day back.

“When you counter a DMCA on YouTube, the company who DMCA’d you has 10 working days to show that they’ve taken legal action against you, or the video is restored,” tweeted Shane Gill, the owner of DidYouKnowGaming. “So I spent the past two weeks checking my email to see if Nintendo was suing [sic] me.”

I suppose that lawsuit could still be forthcoming, but man would it be a stupid move on Nintendo’s part if that happened. Again, the important bit is that there is no actual Nintendo content being copied here or anything in the video that would fall outside of obvious fair use provisions.

The takeaway here is that it would be really nice if stories like this became more common, particularly when they concern the more notorious IP bullies out there.

Filed Under: copyright, counternotice, didyouknowgaming, dmca
Companies: nintendo

Facebook Is So Sure Its Erroneous Blocking Of Music Is Right, There’s No Option To Say It’s Wrong

It’s hardly a secret that upload filters don’t work well. Back in 2017, Felix Reda, then Shadow Rapporteur on the EU Copyright Directive in the European Parliament, put together a representative sample of the many different ways in which filters fail. A recent series of tweets by Markus Pössel, Senior Outreach Scientist at the Max Planck Institute for Astronomy, exposes rather well the key issues, which have not improved since then.

Facebook muted 41 seconds of a video he uploaded to Facebook because Universal Music Group (UMG) claimed to own the copyright for some of the audio that was played. Since the music in question came from Bach’s Well-Tempered Clavier, and Bach died in 1750, there’s obviously no copyright claim on the music itself, which is definitely in the public domain. Instead, it seems, the claim was for the performance of this public domain music, which UMG says was played by Keith Jarrett, a jazz and classical pianist, and noted interpreter of Bach. Except that it wasn’t, as Pössel explains:

Either I am flattered that a Bach piece that I recorded with my own ten fingers on my digital keyboard sounds just like when Keith Jarrett is playing it. Or be annoyed by the fact that @UMG is *again* falsely claiming music on Facebook that they definitely do not own the copyright to.

This underlines the fact that upload filters may recognize the music – that’s not hard – but they are terrible at recognizing the performer of that music. It gets worse:

OK, I’ll go with “very annoyed” because if I then continue, Facebook @Meta DOES NOT EVEN GIVE ME THE OPTION TO COMPLAIN. They have grayed out the option to dispute the claim. They are dead wrong, but so sure of themselves that they do not even offer the option of disputing the claim, even though their system, in principle, provides such an option. And that, in a nutshell, is what’s wrong with companies like these today. Algorithms that make mistakes, biased towards big companies like @UMG.

This absurd situation is a foretaste of what is almost certainly going to happen all the time once major platforms are forced to use upload filters in the EU to comply with Article 17 of the Copyright Directive. Not only will they block legal material, but there will probably be a presumption that the algorithms must be right, so why bother complaining, when legislation tips the balance in favor of Big Content from the outset?

Follow me @glynmoody on Twitter, Diaspora, or Mastodon. Originally posted to WalledCulture.

Filed Under: copyright, copyright filters, counterclaim, counternotice, mistakes, public domain
Companies: facebook, umg, universal music group

GitHub, EFF Push Back Against RIAA, Reinstate Youtube-dl Repository

from the DEAR-RIAA-YOU-ARE-CORDIALLY-INVITED-TO-GFY dept

A few weeks ago, the RIAA hurled a DMCA takedown notice at an unlikely target: GitHub. The code site was ordered to take down its repositories of youtube-dl, software that allowed users to download local copies of video and audio hosted at YouTube and other sites.

The RIAA made some noise about copyright infringement (citing notes in the code pointing to Vevo videos uploaded by major labels) before getting down to business. This was a Section 1201 complaint — one that claimed the software illegally circumvented copyright protection schemes applied to videos by YouTube.

The takedown notice demanded removal of the code, ignoring the fact that there are plenty of non-infringing uses for a tool like this. It ignored Supreme Court precedent stating that tools with significant non-infringing uses cannot be considered de facto tools of infringement. It also ignored the reality of the internet: targeting one code repository wouldn’t erase anything from the dozens of other sites hosting the same code, and engaging in an overblown, unjustified takedown demand would only increase demand for (and use of) the software.

Youtube-dl is a tool used by plenty of non-infringers. It isn’t just for downloading Taylor Swift videos (to use one of the RIAA’s examples). As Parker Higgins pointed out, plenty of journalists and accountability activists use the software to create local copies of videos so they can be examined in far more detail than YouTube’s rudimentary tools allow.

John Bolger, a software developer and systems administrator who does freelance and data journalism, recounted the experience of reporting an award-winning investigation as the News Editor of the college paper the Hunter Envoy in 2012. In that story, the Envoy used video evidence to contradict official reports denying a police presence at an on-campus Occupy Wall Street protest.

“In order to reach my conclusions about the NYPD’s involvement… I had to watch this video hundreds of times—in slow motion, zoomed in, and looping over critical moments—in order to analyze the video I had to watch and manipulate it in ways that are just not possible” using the web interface. YouTube-dl is one effective method for downloading the video at the maximum possible resolution.

At the time, GitHub remained silent on the issue, suggesting it was beyond its control. Developers who’d worked on the youtube-dl project reported being hit with legal threats of their own from the RIAA.

There’s finally some good news to report. The EFF has taken up GitHub/youtube-dl’s case and is pushing back. A letter [PDF] from the EFF to GitHub’s DMCA agent gets into the tech weeds to contradict the RIAA’s baseless “circumvention” claims and the haphazard copyright infringement claims it threw in to muddy the waters.

First, youtube-dl does not infringe or encourage the infringement of any copyrighted works, and its references to copyrighted songs in its unit tests are a fair use. Nevertheless, youtube-dl’s maintainers are replacing these references.

Second, youtube-dl does not violate Section 1201 of the DMCA because it does not “circumvent” any technical protection measures on YouTube videos. Similarly, the “signature” or “rolling cipher” mechanism employed by YouTube does not prevent copying of videos.

There’s far more in the letter, but this explains it pretty succinctly in layman’s terms:

youtube-dl works the same way as a browser when it encounters the signature mechanism: it reads and interprets the JavaScript program sent by YouTube, derives the “signature” value, and sends that value back to YouTube to initiate the video stream. youtube-dl contains no password, key, or other secret knowledge that is required to access YouTube videos. It simply uses the same mechanism that YouTube presents to each and every user who views a video.

We presume that this “signature” code is what RIAA refers to as a “rolling cipher,” although YouTube’s JavaScript code does not contain this phrase. Regardless of what this mechanism is called, youtube-dl does not “circumvent” it as that term is defined in Section 1201(a) of the Digital Millennium Copyright Act, because YouTube provides the means of accessing these video streams to anyone who requests them. As a federal appeals court recently ruled, one does not “circumvent” an access control by using a publicly available password. Digital Drilling Data Systems, L.L.C. v. Petrolink Services, 965 F.3d 365, 372 (5th Cir. 2020). Circumvention is limited to actions that “descramble, decrypt, avoid, bypass, remove, deactivate or impair a technological measure,” without the authority of the copyright owner… Because youtube-dl simply uses the “signature” code provided by YouTube in the same manner as any browser, rather than bypassing or avoiding it, it does not circumvent, and any alleged lack of authorization from YouTube or the RIAA is irrelevant.
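For readers who want an even more concrete picture of what the letter is describing, here is a heavily simplified, hypothetical sketch. This is not youtube-dl’s actual code, and the URLs, endpoint layout, and “unscrambling” step are invented for illustration; the point it mirrors from the letter is that the client only uses material the server already sends to every viewer, with no secret key anywhere.

```python
# Hypothetical illustration of the "signature" flow described in the EFF letter.
# Not youtube-dl's implementation and not YouTube's real protocol: the URLs and
# the unscrambling step are made up for illustration.

import urllib.request

PLAYER_JS_URL = "https://video.example.com/player.js"                 # hypothetical
STREAM_URL = "https://video.example.com/stream?id={vid}&sig={sig}"    # hypothetical

def fetch(url: str) -> str:
    with urllib.request.urlopen(url) as resp:
        return resp.read().decode("utf-8")

def derive_signature(player_js: str, scrambled_sig: str) -> str:
    """A browser runs the player's JavaScript to unscramble the signature.
    A downloader reads the same publicly served script and applies the same
    steps. Here the 'steps' are a stand-in (reverse the string)."""
    _ = player_js  # a real client would derive the transform from this script
    return scrambled_sig[::-1]

def get_stream_url(video_id: str, scrambled_sig: str) -> str:
    player_js = fetch(PLAYER_JS_URL)      # the same script every viewer receives
    sig = derive_signature(player_js, scrambled_sig)
    # No password, key, or other secret knowledge is used: every input above
    # was handed to the client by the server, which is the EFF's point about
    # why this isn't "circumvention" under Section 1201.
    return STREAM_URL.format(vid=video_id, sig=sig)
```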

GitHub’s post on the subject explains the situation more fully, breaking down what the site’s obligations are under the DMCA and what it does to protect users from abuse of this law. It also states that it’s overhauling its response process for Section 1201 circumvention claims to provide even more protection for coders using the site. Going forward, takedown notices will be forwarded to GitHub’s legal team, and if there’s any question about a notice’s legitimacy, GitHub will err on the side of USERS and leave the targeted repositories up until more facts are in. This puts it at odds with almost every other major platform hosting third-party content, nearly all of which err on the side of the complainant.

And the cherry on top is the establishment of a $1 million legal defense fund for developers by GitHub. This will assist developers in fighting back against bogus claims and give them access to legal advice and possible representation from the EFF and the Software Freedom Law Center.

Youtube-dl is back up. And the RIAA is now the one having to play defense. It will have to do better than its slapdash, precedent-ignoring, deliberately-confusing takedown notice to kill a tool that can be used as much for good as for infringement.

Filed Under: circumvention, copyright, copyright 1201, copyright 512, counternotice, recording software, takedowns, youtube-dl
Companies: eff, github, riaa, youtube

Twitch's Freak Out Response To RIAA Takedown Demands Raises Even More DMCA Questions

from the not-how-any-of-this-is-supposed-to-work dept

As many of you probably saw last week, Twitch decided to delete a ton of videos in response to DMCA takedown claims (which most people believe came from the RIAA). As we pointed out earlier this year, the RIAA had started flooding Twitch with DMCA takedowns over background music used in various streams. The whole thing seemed kind of silly, and now it appears that Twitch (despite being owned by Amazon and having some pretty good lawyers) was caught without a plan.

And that manifested itself in the way it handled these takedowns. Rather than the standard process — taking the content down, letting the user counternotice, and then potentially putting it back up 10 days later if no lawsuit was filed — Twitch decided to just totally wipe those files out and not even leave an option open to users to counternotice.

The key bit:

We are writing to inform you that your channel was subject to one or more of these DMCA takedown notifications, and that the content identified has been deleted. We recognize that by deleting this content, we are not giving you the option to file a counter-notification or seek a retraction from the rights holder. In consideration of this, we have processed these notifications and are issuing you a one-time warning to give you the chance to learn about copyright law and the tools available to manage the content on your channel.

We know that copyright law and the DMCA are confusing. Over the past few months, we’ve been improving the tools available to help you manage music use in your live and recorded content. These include the ability to delete all of your Clips at once and control who can create Clips on your channel, scanning new Clips with Audible Magic and launching a free way to stream high quality music on your channel, Soundtrack by Twitch. Now that these tools have been released to all creators, we will resume the normal processing of DMCA takedown notifications received after 12 noon PST on Friday, October 23, 2020.

So… reading between the lines here, it sounds like someone (likely the RIAA or some similar organization) ratcheted up the threats for not being responsive enough on takedowns — and someone up top just said “nuke ’em all, so we can claim we got rid of everything.” But that’s incredibly stupid on multiple levels.

That first paragraph above is completely nonsensical, and I’ve read it over multiple times trying to parse out what the hell it means. The company admits that it isn’t giving anyone any chance to counternotice, or to get takedown demands retracted. In fact, it sounds like Twitch can’t even bring back any of this content. It’s just gone. That could really suck for users who have no other copy of that content.

But then it follows up with “In consideration of this…”, which sounds like it’s apologizing and about to give users back some sort of benefit… but instead, it says it’s giving those users “a warning” and telling them to “learn about copyright law.” What? What sort of absolute nonsense is that? If someone’s video was misidentified, or the video was fair use, why should they get a warning and have to “learn about copyright law”? It feels a lot more like Twitch should learn about copyright law.

Indeed, there’s a potential argument that by deleting the videos and not allowing for a counternotice, Twitch gave up the DMCA’s safe harbor. If you read Section 512(g) of the DMCA, about the replacement of removed or disabled material, it says that there is no liability for the removal of content (which makes sense), subject to an exception in paragraph (2) that applies unless the service provider takes three specific steps. Here’s the relevant part of 512(g):

(1) No liability for taking down generally.—Subject to paragraph (2), a service provider shall not be liable to any person for any claim based on the service provider’s good faith disabling of access to, or removal of, material or activity claimed to be infringing or based on facts or circumstances from which infringing activity is apparent, regardless of whether the material or activity is ultimately determined to be infringing.

(2) Exception.—Paragraph (1) shall not apply with respect to material residing at the direction of a subscriber of the service provider on a system or network controlled or operated by or for the service provider that is removed, or to which access is disabled by the service provider, pursuant to a notice provided under subsection (c)(1)(C), unless the service provider—

(A) takes reasonable steps promptly to notify the subscriber that it has removed or disabled access to the material;

(B) upon receipt of a counter notification described in paragraph (3), promptly provides the person who provided the notification under subsection (c)(1)(C) with a copy of the counter notification, and informs that person that it will replace the removed material or cease disabling access to it in 10 business days; and

(C) replaces the removed material and ceases disabling access to it not less than 10, nor more than 14, business days following receipt of the counter notice, unless its designated agent first receives notice from the person who submitted the notification under subsection (c)(1)(C) that such person has filed an action seeking a court order to restrain the subscriber from engaging in infringing activity relating to the material on the service provider’s system or network.

So it certainly appears that you can read this to say that Paragraph (1) (the no liability bit) “shall not apply… unless the service provider… replaces the removed material and ceases disabling access to it not less than 10, nor more than 14, business days following receipt of the counter notice….” And thus, there is a reading of this that says that not replacing the material after receiving a counternotice (that is not followed by an actual lawsuit) could remove the safe harbor protections.
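To make the structure of that reading explicit, here is a toy restatement of the 512(g)(2) conditions as a decision function. The field names and the example facts are mine, and this is an illustration of the argument above, not legal advice or a real compliance test.

```python
# Toy restatement of the 512(g)(2) conditions quoted above. Field names and the
# example are mine; this illustrates the argument, it isn't legal advice.

from dataclasses import dataclass
from typing import Optional

@dataclass
class TakedownHandling:
    notified_subscriber: bool            # 512(g)(2)(A)
    forwarded_counternotice: bool        # 512(g)(2)(B)
    days_until_restored: Optional[int]   # business days after the counternotice; None = never restored
    lawsuit_notice_received: bool        # claimant told the provider it sued the uploader

def retains_512g1_protection(h: TakedownHandling) -> bool:
    """On the reading above: 512(g)(1) protects the takedown only if the
    provider satisfies (A), (B), and (C) -- with (C) excused when the claimant
    files suit against the uploader."""
    restored_on_time = (h.days_until_restored is not None
                        and 10 <= h.days_until_restored <= 14)
    condition_c = restored_on_time or h.lawsuit_notice_received
    return h.notified_subscriber and h.forwarded_counternotice and condition_c

# Twitch's "delete everything, no counternotice possible" approach, as described
# in its email, would fail (B) and (C) on this reading:
print(retains_512g1_protection(TakedownHandling(
    notified_subscriber=True,
    forwarded_counternotice=False,
    days_until_restored=None,
    lawsuit_notice_received=False,
)))  # -> False
```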

Of course, this may not really matter for a variety of reasons. First off, who would actually sue here? The safe harbor protects against claims from the copyright holder, but here the copyright holder is likely to be happy that Twitch went overboard in deleting all of the content. Second, this aspect of the safe harbor, which can be read to require the replacement of content following a counternotice, has some other problems, in that it can be read as forcing a company to host content. And a website should have the freedom to not host any content it doesn’t wish to host, even if that content is not infringing.

I vaguely recall a lawsuit a few years ago on this point, though in searching for it I can no longer find it. From my (apparently faulty) memory, I recall that someone challenged a website’s unwillingness to restore content after a counternotice, and the court found that a website has every right to keep a work it disabled offline. If anyone can remember which case this is, let me know.

Still, it remains somewhat perplexing that this is how Twitch handled all of this, and that it then claimed it would go back to “normal” DMCA processing as of the end of last week. Why couldn’t it have just done that this whole time?

Filed Under: counternotice, dmca, takedowns
Companies: riaa, twitch

NYU Sues YouTube For Reposting Video After Video Poster Sent DMCA Counternotice

from the not-how-the-law-works dept

You would think that a large university like New York University — better known as NYU — would have decently competent lawyers. Especially since NYU has its very own law school that is frequently one of the top ranked law schools in the country. So it’s a bit surprising to see NYU file a copyright infringement lawsuit against YouTube that seems to, pretty clearly, go against the DMCA’s rules (found via Eric Goldman).

The lawsuit is mainly focused on someone named Jesse Flores, who apparently runs a YouTube account called Atheists Exposed. I think it would be fair to say that Flores does not like atheists, and that the account might be subtitled “videos of atheists behaving badly.” I’m not sure which video in particular upset NYU, but the university submitted a DMCA takedown notice to YouTube, claiming that some of his footage infringed upon its copyright. As I type this, the video in question remains down.

Having suffered through watching at least some of the other videos Flores has on the account, I’m assuming it’s a clip of an atheist somehow associated with NYU, but (you guessed it) behaving badly. This might be embarrassing for NYU, but given the commentary around the videos, there’s at least an argument of fair use here (though, without knowing more, perhaps the fair use claim isn’t strong). Still… it does seem that basically all of the videos on the account are for the purpose of comment and criticism. NYU insists that the video is not fair use, stating that Flores had no license to the work “nor does the Defendant’s use of the Work fall into fair use or any other limitations on exclusive rights of copyright.”

Flores appears to have filed a counternotice on May 3rd — in which he doesn’t even claim fair use. Rather, he says that he received permission to post the video from the Veritas Forum at NYU, which he says had the rights to the video in question.

NYU then reached out to Flores directly asking him to withdraw the counternotice. Flores, perhaps not too surprisingly, has refused to withdraw it. YouTube has made it clear that it will follow the rules in the DMCA — specifically Section 512(g)(2)(B) and (C) — which say that upon receipt of a valid counternotice (and the lack of a lawsuit filed against the uploader) the site is to replace the removed material in 10 to 14 business days after the counternotice. There are a few conditions around this, but none are that important in this case. There also is some question as to whether or not the site has to replace the content. The statute can be read that way, though some have argued that that’s a weird result, since sites should have free control to refuse to repost any video if they so choose.

Either way, NYU noticed, YouTube took it down, Flores counternoticed, and YouTube has said it’s putting the video back up. Under the very, very, very, very clear language of the DMCA, this means that YouTube is protected from liability if there is any infringement in the video. That’s the whole basis of the safe harbors. They say “if you take down upon notice and follow these other rules, such as reposting it after 10 business days upon counternotice,” then you’re protected from a copyright lawsuit.

So, what did NYU do? It sued YouTube (and Flores). The suit against Flores, obviously, will turn on whether (1) he actually received a license, as he implies, or (2) if not, whether the use was actually fair (and again, we don’t know enough to say at this point). But including claims against YouTube? That’s not just a non-starter, it’s ridiculous, and it makes you wonder if NYU’s lawyers spoke to any of the copyright experts at NYU’s own law school, who could have explained that this is not how the DMCA notice-and-takedown process works. The most generous interpretation is that NYU’s lawyers completely misinterpreted the DMCA (and YouTube’s notification to NYU about the counternotice) and read the requirement of a lawsuit to include YouTube, so that the site would know not to put the video back up. What YouTube and the law actually require is merely evidence that a lawsuit was filed against the uploader. Including YouTube in the suit is an interesting way to provide “evidence,” but it also goes completely against the DMCA’s safe harbors.

I’m guessing that NYU will file an amended complaint fairly soon that drops YouTube from the complaint. But, still…

Filed Under: 512, athiests exposed, copyright, counternotice, dmca, jesse flores, safe harbors
Companies: nyu, youtube

Network Solutions Confused About The DMCA

from the that's-not-how-it-works dept

Last week we wrote about how Microsoft abused the DMCA to force Cryptome offline via Network Solutions. Since then, there’s been some interesting scrambling by all parties involved. Microsoft claimed that it never meant to take Cryptome down entirely, just the one document (though it is no longer asking for that document to be taken down either). But that doesn’t make much sense, because Network Solutions only had the ability to take down the whole site, not individual pieces of content. Either way, what really confused us was Network Solutions’ response to the DMCA takedown, which was to wait until Cryptome filed a counternotice and then take the site down. That’s not how the DMCA works.

Yet, in a blog post sent over by Achura, Network Solutions tries to provide a “layperson’s guide” to the DMCA. The only problem is that they get it wrong.

First, Network Solutions seems to think that the DMCA provides for a “notice-and-notice” system of dealing with takedowns, whereby it needs to first notify the user and wait for them to respond. Unfortunately, the DMCA does not follow such a procedure. It would be much better if it did. However, the DMCA is a “notice-and-takedown” setup, whereby the service provider who receives the notice needs to first take down the content, if it wishes to retain its safe harbor protections. It can choose not to take the content down (though, that rarely happens), but it risks losing the safe harbor protections. As the law itself clearly states:

upon notification of claimed infringement as described in paragraph (3), responds expeditiously to remove, or disable access to, the material that is claimed to be infringing or to be the subject of infringing activity.

So NetSol is wrong to claim that they first need to notify the user and wait for the response.

Second, NetSol is then wrong in how it responds to a counternotice from the user. It claims:

If the customer challenges the Notice by submitting a Counter Notification that complies with the DMCA, the Host is required to disable access to the allegedly infringing site for a period of “not less than 10 business days, nor [sic] more than 14 business days” (the “Challenge Time Period”).

Again, this appears to be incorrect. If it had been following the DMCA, it should have already taken the content down in order to retain its safe harbors. It makes no sense to say that you take the content down once the counternotice is sent. Instead, the “not less than 10 business days, nor more than 14 business days” window refers to how long the service provider is supposed to wait before putting back the content that it already took down. Of course, given that NetSol was confused about the notice-and-takedown process, you can see why it felt the need to take the content down after the counternotice: that’s the point at which it realized it was legally supposed to have taken the content down earlier.

Filed Under: copyright, counternotice, dmca, takedowns
Companies: cryptome, microsoft, network solutions