As an attorney who spends every day helping clients protect themselves online, imagine my surprise when I received an email from YouTube with the subject line, "Your video has been removed from YouTube." And the email was intended for me, not for one of my clients. Amazingly, the video that YouTube removed was one I created -- about how to protect yourself online! It was a recording of a webinar I had recently presented, titled "Domain Name Disputes: What Happened in 2015 (and How to Protect Yourself in 2016 and Beyond)." The video had been published on YouTube for several days before I included a link to it on my GigaLaw blog. Within a few hours, YouTube removed it.
Why? Good question.
What the 'Community Guidelines' Forbid
YouTube's email simply said:
The YouTube community flagged one or more of your videos as inappropriate. After reviewing the content, we've determined that the videos violate our Community Guidelines.
Naturally, I "Googled" "youtube community guidelines" (since YouTube's email perplexingly did not contain a link to them), which led me to a page warning users, "Don't cross the line." The page identifies the following types of taboo videos:
- Nudity or sexual content
- Harmful or dangerous content
- Violent or graphic content
- Copyright
- Hateful content
- Threats
- Spam, misleading metadata, and scams
Of course, these categories are just shorthand for more complete descriptions. Reading the headings alone isn't very informative. For example, most videos are protected by copyright law -- the question is whether the user who posts the video has appropriate rights to do so.
In any event, it was impossible even to imagine which of these categories might have been implicated in the decision to remove my video. After all, I (and my co-presenter) created all of the content in the video, which consisted solely of PowerPoint slides and our narration. No music, no movie clips, no photographs. Certainly no nudity (though there is one slide that discusses the impact of adult-related domain names -- .adult, .porn, .sex and .sexy -- on trademark owners). Nothing that could possibly be considered harmful, dangerous, hateful (I've served as a member of the ADL Anti-Cyberhate Working Group) or threatening. And nothing related to spam, scams or the like. After all, these are the issues I counsel my clients to avoid.
My Successful Appeal to YouTube
So, embarrassed that my webinar video had been taken down so soon after I told my blog readers it had been posted, I immediately (within 17 minutes) responded to YouTube via its "appeal" process, which consisted of little more than a short form. I had room for perhaps a sentence or two -- nothing that allowed for much explanation or legal argument.
And then, about 15 hours later, I received another email from YouTube:
After further review, we've determined that your video doesn't violate our Community Guidelines. Your video has been reinstated and your account is in good standing.
Good news, of course. Though, in the maddening interim, I had replaced the video link in the blog post with a new copy of my webinar video, this time hosted on Vimeo instead of YouTube. (There are plenty of arguments out there about which service is better, but I won't digress.) I had become impatient and did not want the entire business day to pass with the video offline.
Although I was happy that YouTube had restored the video, I remained perplexed. Why was it taken down in the first place? Why wasn't I given an opportunity to respond to any complaints before it was taken down (as is common in copyright-related situations under the Digital Millennium Copyright Act)? And why was it reinstated?
So, I asked YouTube. Via email. Twice. But, of course, no answer.
YouTube's lack of response is frustrating but understandable. After all, Google reported last year that it receives 2.2 million takedown requests every day.
Lessons About Online Publishing
While, in a sense, all's well that ends well, the experience has certainly left me frustrated -- and more empathetic to website publishers who have to deal with issues like this all of the time.
So, what did I learn? At least three important lessons:
- Using a free service (such as YouTube) often means that customer service will be lacking or absent altogether. While I was frustrated that YouTube's decision seemed arbitrary, I had few grounds for complaint, and certainly no real person to whom I could complain.
- Web hosting services typically give themselves broad discretion to take down their users' content. Even Vimeo, which offers paid services, reserves for itself the right to "suspend, disable, or delete" a user's account "if Vimeo determines" that a user has violated its agreement (which, like YouTube, has its own guidelines).
- Always have a backup plan. Fortunately, YouTube quickly (and rightly) restored my video, but not before I had reposted it on Vimeo. Although I acted as quickly as I could, there was still some real "downtime" during which users couldn't view the video. With better planning, I could have replaced the video even faster.