
It began with an invitation to present at a TEDx event in Grand Rapids, MI. I wanted to share with the TED audience the complex relationship between “creationism” and “curationism,” or, more simply, the way creators are also organizers of ideas and content. It seemed pretty simple. On stage in Grand Rapids, I told the story of being both a filmmaker and a curator. And I showed a short clip from a film, about 30 seconds out of a 15-minute presentation. I talked about the perils and complexity of a world of bandwidth and content abundance. A world in which we’re overwhelmed with data, tweets, blogs, check-ins, and media.
I explained that we used to surf the Web. Now the waves are just too big. And I offered a solution. A cure, if you will: curation.
“Curation is the new magic that makes the Web work,” I said, “bringing the Web back to human scale with human filters you trust and love. A powerful mix of passion and context turns noise back into signal.”
The talk was well received, and TEDx posted it to YouTube, where it received almost 2,000 views.
And that was that, until a year later, when the organizers of TEDx contacted me, somewhat concerned. It seemed that YouTube had taken down the video of my talk, due to a copyright violation claimed by Starz Media.
Unspoken, but hanging in the air, was the concern that I had somehow presented something I didn’t have the rights to. Somehow, I must have broken copyright law. The takedown notice was there in black and white. And others had linked to that video, including WNET’s MetroFocus blog. There, too, the video left an ugly hole. I was labeled a copyright crook.
I racked my brain. What could I have included in my talk? A photograph? I’d been very careful to use only Creative Commons images. Music? There was none. Then I remembered that I’d used a clip from a film. Just 30 seconds, and certainly within an editorial context. It should have been fair use. And, as the anger began to rise inside me, something more serious: it was my film.
In 2001, I’d made a film about the days after 9/11 in New York. The film was called 7 Days in September, and to make it I’d reached out to the world of both amateur and professional filmmakers who’d recorded their POV on 9/11 and the days that followed. Hundreds of people responded to an ad I placed in The Village Voice, and from those stories and footage I put together a film from 28 individual perspectives, including mine.
In 2004, I’d licensed the DVD rights to Anchor Bay Entertainment. Almost eight years ago. It was a seven-year deal, and thinking back to where the world was in 2004, there was no mention of “streaming rights” or “Web rights.” But when Anchor Bay was sold to Starz, Starz determined that it controlled the rights on Netflix, YouTube, Hulu, and other Web video distributors. Putting aside for a moment the question of whether it did or didn’t have those rights, my contract with Starz had ended more than a year earlier.
So what had happened? I presume that Starz had provided YouTube with a “digital fingerprint” of my film, along with all the films it had previously held the rights to. And since an automated system has no way to make a judgment about fair use, the match against the segment of film inside my talk triggered a takedown.
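To see why such a system can’t help but err, consider a toy sketch of fingerprint matching. This is hypothetical and deliberately naive; real services (YouTube’s Content ID among them) use robust perceptual fingerprints that survive re-encoding, not the exact chunk hashes used here. The point is what the matching logic can and cannot see.

```python
# Toy sketch of automated fingerprint matching (hypothetical; not how any
# real service is actually implemented). Exact SHA-256 chunk hashes stand
# in for the robust perceptual fingerprints a production system would use.
import hashlib
import os

CHUNK = 1024  # bytes per "segment" in this toy model


def fingerprint(media: bytes) -> set[str]:
    """Hash each fixed-size chunk of a media stream."""
    return {
        hashlib.sha256(media[i:i + CHUNK]).hexdigest()
        for i in range(0, len(media) - CHUNK + 1, CHUNK)
    }


def triggers_takedown(reference: bytes, upload: bytes, threshold: int = 3) -> bool:
    """Flag the upload if enough of its chunks match the reference.

    Note what is absent: clip length relative to the whole work, editorial
    context, and who actually holds the rights -- exactly the facts a
    fair-use judgment would turn on. A match is the machine's only verdict.
    """
    return len(fingerprint(reference) & fingerprint(upload)) >= threshold


# A long talk containing a short, chunk-aligned excerpt of the film.
film = os.urandom(8 * CHUNK)                   # stand-in for the reference film
talk = bytes(5 * CHUNK) + film[:4 * CHUNK] + bytes(5 * CHUNK)
print(triggers_takedown(film, talk))           # True: takedown issued
```

Nowhere in that logic is there a place to encode “this is 30 seconds inside a 15-minute talk, used editorially, by the film’s own author.” The machine can only answer one question: does the content match?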
To its credit, once I reached out to Starz, it immediately lifted the block, and the video now plays without restriction.
But what if the film clip I used to make my point hadn’t been mine? Would Starz have lifted the block? What if my talk had included brief clips from a number of other distributors? Who would have fought for fair use? Just knowing whom to email at Starz was critical to getting my film back live on the Web.

The problem with automation is that it makes people, who should be the sole decision makers, lazy; they let a machine make decisions for them. The injustice and harm this does to people, because of that lazy overreliance on automation, can be enormous. Perhaps it's a good idea for our culture to get back to humans being completely responsible for all decisions. Machines cannot exercise responsibility.
#1 Posted by Arnold Kirschner, CJR on Thu 26 Jul 2012 at 09:36 AM
This is a very important story, and I hope you will be talking about it next week in NYC.
#2 Posted by Kirsten Lambertsen, CJR on Thu 26 Jul 2012 at 03:26 PM
The irony here is delicious: a talk about curating the web to bring it back to human scale is thwarted by automation. But in all fairness, human decisions are, if anything, more boneheaded than the robots'. It's not a binary "humans good, robots bad" question. Arguably, a robot with thorough instructions will be more objective and make a more rational decision.
#3 Posted by Stephan Hokanson, CJR on Tue 31 Jul 2012 at 11:51 AM
A bigger deal needs to be made about YouTube's takedown processes. YouTube is one of the only UGC clearinghouses actually creating useful tools for DMCA compliance and CC licensing, and for responding to takedowns on fair-use grounds.
And yet...
Your example is just one of the areas where YouTube's attempts fail. As the person in charge of posting video from lectures at a major university, I've had to respond to dozens of takedowns in recent years by citing fair use, only to have the copyright holders simply reject our claim. With fingerprinting and automation, the concept of fair use might as well not exist at YouTube.
#4 Posted by Dan Jones, CJR on Thu 2 Aug 2012 at 11:44 AM