The ad exodus from YouTube has died down since its peak in March, but YouTube continues to update its guidelines to reassure advertisers and, in some ways, its creators. In a blog post, YouTube outlined more specific definitions of hate speech and what kinds of incendiary content wouldn’t be eligible for monetization.
The post lays out three categories of content that won’t be eligible for ads, the broadest being “hateful content.” YouTube defines this as anything that “promotes discrimination or disparages or humiliates an individual or group of people on the basis of the individual’s or group’s race, ethnicity, or ethnic origin, nationality, religion, disability, age, veteran status, sexual orientation, gender identity, or other characteristic associated with systematic discrimination or marginalization.”
The second category is “inappropriate use of family entertainment characters,” which covers content showing kid-friendly characters engaged in “violent, sexual, vile, or otherwise inappropriate behavior,” even if the content is satirical or a parody. The final category, “incendiary and demeaning content,” is somewhat broader: anything “gratuitously” demeaning or shameful toward an individual or group won’t be eligible for monetization.