Google's facing an advertiser revolt this week after it came to light that several big brands' ads were served on YouTube videos promoting terrorism, extremism, and other violent forms of hate.
More than 250 brands have split from YouTube after various investigations in the Times of London outed the ad placements. Needless to say, brands don't want their ads being served on YouTube's most hateful content. And how often are their ads served in these cesspools?
Well: It took us about five minutes on YouTube to confirm how big a problem this is. That's it.
To test it, we searched for a random offensive term—the Holocaust denial slogan "Holohoax"—then clicked on whichever related link looked the most unpleasant over and over again.
A lot of these videos are obviously, grossly hateful, and Google has rightfully cordoned them off from its ad service.
But after a dozen or so shoddy clips of bug-eyed conspiracy theorists and antisemitic imagery, the next click revealed something a bit different: A scene from the DreamWorks animated movie The Boss Baby.
The trailer was attached to a clip called "The Jewish Question. International Jewry vs Individual Jews: The Difference," in which a Swedish guy with a mullet who calls himself "The Golden One" delivers a monologue full of antisemitism, at one point even defending Adolf Hitler.
An ad for DreamWorks' 'The Boss Baby' ahead of an antisemitic video. Credit: screenshot
'The Golden One' uses a grossly antisemitic caricature to distinguish the 'international Jewry' from other Jewish people. Credit: screenshot
Search the news for "The Golden One"—who boasts nearly 50,000 YouTube followers—and the first hit you'll likely find is a video in which he endorses David Duke's failed Senate campaign last year. The former KKK leader in turn shared it on his Twitter.
That video sometimes features an ad for the Japanese luxury watchmaker Orient.
Orient Watch Company gets a plug in 'The Golden One''s David Duke endorsement. Credit: screenshot
The advertiser exodus was triggered by a Times of London report last week that spotlighted questionable placements like these. It's gained steam as more and more brands have piled on, and it hit American shores on Wednesday when AT&T, Johnson & Johnson, and others followed suit.
One might reasonably wonder, though, why brands are just discovering this problem now.
One reason: Most of the buying and selling of their ads is handled by software, through what's known as programmatic advertising.
Like all of Google's ads, YouTube ads are placed through automated exchanges that sort them based on what's known about a given user's demographics.
Advertisers can choose to exclude their promotions from appearing on certain topics or categories of video, as a way of further controlling where their ads are seen.
The company also announced more tools this week that will allow advertisers to blacklist specific channels or categories of video.
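To make the mechanics concrete, here's a minimal, purely hypothetical sketch of how that kind of exclusion check might work inside an ad exchange. The type names and the eligibleCampaigns function are illustrative assumptions, not Google's actual systems; the point is that the filter can only act on whatever labels have been attached to a video.

```typescript
// Hypothetical sketch of a programmatic exclusion check, not Google's real API.

interface VideoContext {
  channelId: string;
  categories: string[];             // labels assigned to the video, e.g. "Music"
}

interface AdCampaign {
  advertiser: string;
  excludedCategories: Set<string>;  // topics the brand has opted out of
  blacklistedChannels: Set<string>; // specific channels the brand has blocked
}

// Keep only the campaigns allowed to serve against this video.
function eligibleCampaigns(video: VideoContext, campaigns: AdCampaign[]): AdCampaign[] {
  return campaigns.filter(campaign =>
    !campaign.blacklistedChannels.has(video.channelId) &&
    !video.categories.some(label => campaign.excludedCategories.has(label))
  );
}

// A hateful video that was never labeled as hate speech sails straight through,
// which is the gap advertisers are complaining about.
const video: VideoContext = { channelId: "UC123", categories: ["Music"] };
const campaigns: AdCampaign[] = [{
  advertiser: "ExampleBrand",
  excludedCategories: new Set(["Hate speech"]),
  blacklistedChannels: new Set(),
}];
console.log(eligibleCampaigns(video, campaigns).map(c => c.advertiser)); // ["ExampleBrand"]
```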
Here's the problem: No matter what settings a brand picks, Google claims it never places ads on videos "with hate speech, gory, or offensive content."
Naturally, given the billions of videos on the site, it's nearly impossible to enforce this to a tee.
But there seem to be some gaping holes.
For instance, on a channel where the notoriously awful hate group Westboro Baptist Church posts strange, homophobic pop song parodies, an ad for Google's own computer science student program popped up on a video called "Fat Bottomed Whores."
The song is a reimagining of Queen's "Fat Bottomed Girls" that includes the lyrics: "Oh, will God destroy the whore tonight? Oh, when will we see that fiery sight?"
Charming, right?
Other ads attached to that video, on subsequent replays, included a pre-roll from job-hunting site Monster.com, and mid-roll banners for Columbia College Chicago, Minecraft, and the shopping site Shein.
A spokesperson for Monster.com said the placement was "entirely unacceptable" and that the company will be "investigating" why this happened.
"We would like to thank Mashable for bringing this matter to our attention, which we will be investigating," the spokesperson said by email. "It is entirely unacceptable that our advertising has been allowed to appear next to this content. Monster does not endorse the beliefs and messages of this group. We are committed to supporting individuals from all backgrounds in both finding employment and pursuing successful careers."
In another instance, a locally targeted ad for a lecture by the first black female astronaut, Mae Jemison, appeared within a video entitled "'F*** all n****rs' song."
The clip seems to be meant as an ironic joke from a black creator, but if the comment section is any indication, the more than 40,000 people who viewed it don't necessarily see it that way.
Either way, it's strange that Google wouldn't have a default advertising ban on videos involving the hard-R N-word.
An ad for Lyft appeared in a video posted by the racist skinhead group Keystone United. Credit: screenshot
Lyft said it pulled all ads from YouTube in response to a request for comment.
"This is beyond offensive," a Lyft spokesperson said in an email. "As soon as we learned of it, we pulled our advertising on YouTube."
Other brands didn't respond to requests for comment. Neither did Google.
We found these examples and plenty of others—including a few blue-chip brands—within the span of a couple disturbing hours on the site's skeevier reaches.
The problem here isn't just that brands would rather not be seen next to Nazis or terrorists. They obviously wouldn't, though savvy consumers understand that an ad isn't necessarily linked to the content it happens to appear next to.
The issue lies in how the money is distributed. A portion of the money made from ads placed on YouTube videos goes to the creators themselves. So when a brand is unwittingly paying for space on a Westboro Baptist Church video, it's also funding the group itself, which is a queasy thought for everyone involved.
Google insisted in a blog post last Friday that ads placed on videos that violate its advertising rules constitute "a very small percentage of cases."
In our brief trip down the YouTube hate rabbit hole, we clicked on several times as many offensive videos without advertising as those that had ads. And we deliberately searched out the most vile content we could find.
And yet, videos that did carry ads still weren't hard to find. Based on that experience, it's pretty clear that Google has a widespread problem on its hands. Whether it has a plan to handle it—and whether it can woo back the advertisers it has lost—remains to be seen.