In the still-very-new world of the internet, the democratization of content has been both a boon and a challenge for rights holders.

One of the most challenging aspects to come to terms with is the sheer volume of content out there. Every minute, 72 hours of video are uploaded to YouTube.

That is just one of many content hosting channels. People upload music videos, mashups, and original works every second of every day. Some percentage of those uploads will include works not owned by (or licensed to) the uploader.

Currently, rights holders address this issue through third-party technology. Many of the largest media hosting companies (YouTube, SoundCloud, Facebook, etc.) use a solution created by a company called Audible Magic, which matches uploaded content in real time against a massive database of music and rights holders, and issues automated “takedown” notices.
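Audible Magic's actual algorithm is proprietary, but the general idea behind this kind of matching can be sketched in a few lines. The following is a toy illustration (the `fingerprint` scheme, the database contents, and the 0.5 threshold are all invented for the example): each reference track is reduced to a set of hashes, and an upload is matched by how many of its hashes hit a known track.

```python
# Toy sketch of acoustic-fingerprint matching. This is NOT Audible
# Magic's algorithm; it only illustrates the hash-and-lookup idea.

def fingerprint(samples, window=4):
    """Hash overlapping windows of audio samples into a set."""
    return {hash(tuple(samples[i:i + window]))
            for i in range(len(samples) - window + 1)}

# Hypothetical rights-holder database: track name -> fingerprint set.
database = {
    "tears_for_fears_original": fingerprint([3, 1, 4, 1, 5, 9, 2, 6, 5, 3]),
    "some_other_track": fingerprint([2, 7, 1, 8, 2, 8, 1, 8, 2, 8]),
}

def best_match(upload_samples, threshold=0.5):
    """Return the reference track whose fingerprint overlaps the
    upload the most, or None if no overlap clears the threshold."""
    fp = fingerprint(upload_samples)
    scored = {name: len(fp & ref) / len(ref)
              for name, ref in database.items()}
    name, score = max(scored.items(), key=lambda kv: kv[1])
    return name if score >= threshold else None
```

With this scheme, an upload containing the reference audio matches, while unrelated audio produces no hit — and, as discussed below, everything in between is where the trouble starts.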

On YouTube, content owners get a choice: either have the matches removed with a warning, or ‘claim’ them so that revenue from the uploaded videos gets correctly directed to them.

There are a few overlapping challenges here.

There is a difference between a user uploading a music video created by an artist and making a tribute video with friends that uses a snippet of a Tears for Fears song. Both of these are different from someone posting their remix of the same song to SoundCloud, a platform popular with remixers that uses similar matching technology.

There are also technical limitations to the Audible Magic platform, as well as the fact that their database does not necessarily contain the latest in sample clearances.  Many artists will post samples or advance versions of remixes that may be incorrectly fingerprinted as the original song.  Unfortunately, the software cannot always identify these different versions and therefore issues a takedown notice automatically.  There does not seem to be a way for the software to flag any content as “questionable”.
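The all-or-nothing behavior described above — automatic takedown or nothing, with no "questionable" middle ground — could in principle be softened with a two-threshold policy. The sketch below is purely hypothetical (the thresholds and action labels are invented, and this is not how Audible Magic works): confident matches are handled automatically, while borderline scores are routed to human review instead of triggering a takedown.

```python
# Hypothetical two-threshold routing for fingerprint-match scores.
# Thresholds are invented for illustration; a real system would tune
# them against labeled examples of true and false positives.

def route_match(similarity, high=0.90, low=0.60):
    """Map a match-similarity score in [0, 1] to an action."""
    if similarity >= high:
        return "automatic takedown / claim"   # confident match
    if similarity >= low:
        return "flag as questionable"         # send to human review
    return "no action"                        # likely not a match
```

The design point is simple: a second threshold creates exactly the "questionable" bucket the current software lacks, at the cost of some human review time.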

There are too many new pieces of content being uploaded for anyone to reasonably expect that this process could or should be handled manually. Still, let’s take a look at the risk of false positives, especially when popular users are affected.

[Screenshot: Peter Kruder’s SoundCloud takedown issue]

False positives, while inevitable, put rights holders in an awkward position. In a time when everyone has a voice, more focus may be needed on how takedowns are communicated. The tone in which these errors are addressed should be closely examined, in order to avoid a PR issue stemming from what was, at its core, a computer error.

Is there a way we can communicate the options more clearly? Can we avoid the potential for false positives to lead to a negative impression of the rights holders? Can we optimize the appeal process to provide exemplary customer service?

This is a matter of communication and tone. What can we do to avoid the kinds of incidents that stir outrage over interactions that need not be so charged?

How can we continue to improve the relationship between rights holders and an ever-more-empowered consumer/creator-base?

How can we protect everyone’s rights and still encourage creativity?

Maybe we could all make a concerted effort to improve how we communicate takedown notices, keeping in mind that simplifying the process for identifying false positives will minimize the potential fallout.
