Sometimes what’s good on paper isn’t good in practice. Sometimes it veils something far more nefarious in its intentions.
Take YouTube and its recent controversy. In order to combat its definition of ‘extremist content’, the worldwide video-sharing platform responded to the threat of a mass advertising boycott by “implementing ‘broader demonetization policies’ around ‘content that is harassing or attacking people based on their race, religion, gender or similar categories'”.
Honestly, it’s tough to blame them for this approach when “analysts are predicting that Google will lose roughly $750 million as a result of an international ad boycott that kicked off last month, when marketers discovered that their campaigns were running against extremist videos on YouTube.”
“The latest companies to pull their ads from the video platform include Pepsi, Walmart, Starbucks, FX, General Motors, Dish, JP Morgan, Johnson & Johnson, and Lyft, Variety reports. They join AT&T, Verizon, GSK, and Enterprise Holdings, which pulled their ads earlier this week, citing the same concerns.”
Sounds great, right? While YouTube is headquartered in America, the hub of equal and free speech, it still exists as a private company, meaning it can ultimately decide which content it wants on its platform. So if it finds a video that promotes harassment and blind hatred, it has the right to ‘demonetize’ that video or flat-out remove it.
Demonetization is the process of pulling ads from a video, cutting off the revenue a channel can earn from it:
“While creators can get revenue from ads, individual views don’t account for much money until they reach the hundreds of thousands. Making sure your videos can reliably have ads matched with them is essential for creators being able to have long-term revenue.”
Here’s a list of things that may result in demonetization, according to YouTube’s new policy:
- Sexually suggestive content, including partial nudity and sexual humor
- Violence, including display of serious injury and events related to violent extremism
- Inappropriate language, including harassment, swearing and vulgar language
- Promotion of drugs and regulated substances, including selling, use, and abuse of such items
- Controversial or sensitive subjects and events, including subjects related to war, political conflicts, natural disasters and tragedies, even if graphic imagery is not shown.
How idealistic. Unfortunately, I have two major issues with this, and you should as well. For one, most of it is completely subjective; for another, it’s vague. The fifth point, in fact, is absurd in how broadly it’s defined:
“Guidelines that contain something as broad as ‘subjects related to political conflicts’ do not provide creators with useful information. It makes it sound as if YouTube is no longer going to monetize channels that cover current events, which of course is not the case.”
And in the case of subjectivity, who is ultimately deciding what constitutes hate speech, especially in this day and age, when something as simple as challenging a different opinion can be defined as such? If I’m a conservative with millions of subscribers and I have thoughts on illegal immigration, what’s to stop enough people with different beliefs and a large following from reporting me enough times to have my video demonetized?
Take, for instance, the YouTube Heroes program rolled out last September, perhaps one of the greatest attacks on free speech based on subjectivity you’ll ever witness on a social media platform:
“YouTube heroes gives users the option to flag a video for being inappropriate, and as a result you can get your video demonetized by it becoming age restricted or removed completely, which will add a strike to your channel and possibly lead to it being deleted.”
Oh, but it gets better. And by better, I mean much, much worse. Here’s the five-step process:
- Become a hero
- Learn more in seminars
- Unlock super tools that allow you to mass flag videos
- Get behind the scenes access, contact YouTube staff directly, and try new products first
- Top hero perks, basically become a full-time unpaid Google employee.
Imagine my shock when I saw comments were disabled on the official video, which currently sits with a Like/Dislike ratio of 30,722:956,895.
This is where a huge problem lies. A video can get demonetized simply because it offended the wrong person or people. What offends some may not offend others. This isn’t as simple as a hardcore racist saying “I believe Race X is better than Race Y and Race Z is worse than all of them!”. A vast majority of the time it comes down to innocuous beliefs that other people simply don’t agree with.
But again it isn’t as simple as that, either. What it appears to be is an outright attack on YouTube content creators with good intentions. Because this demonetization process isn’t just attacking the likes of virulent racists like David Duke. It’s going after creators like H3H3 Productions, Philip DeFranco and even Jenna Marbles, who “have all had hundreds of videos no longer qualify for advertising revenue, and other YouTubers are claiming they didn’t have a chance to appeal to their demonetization.”
YouTube has demonetized everything from “Vape Nation” to “Thank You for 3 million” with no notification and no option to appeal @TeamYouTube
— Ethan Klein (@h3h3productions) March 29, 2017
@h3h3productions @TeamYouTube dude same, I’ve also had a bizarre selection of videos demonetized with no notification or option to appeal.
— Jenna Marbles (@Jenna_Marbles) March 29, 2017
“It isn’t just large channels that are being affected by these changes — YouTuber Tim TV, who has been a fulltime YouTuber for about six months, told Kotaku that he saw that his revenue was, ‘tanking faster than ever before,’ and that he found the changes ‘terrifying'”.
Here’s a little background from H3H3’s Ethan Klein on just how out of line and lacking in transparency YouTube can be when it rolls out these vague stipulations:
You heard that right. Even tagging things like ‘Suicide’, ‘Rape’, and ‘Drugs’ can get your video demonetized, not taking any of the context whatsoever into mind. That means someone who tagged ‘Suicide’ because they wanted to give advice on suicide prevention, or a rape survivor who wanted to tell their story and tagged ‘Rape’, or a doctor who wanted to give medical advice and tags ‘Drugs’ would have had their videos demonetized.
And the worst part of it all? YouTube didn’t even warn the creators. Just read how lacking in foresight this approach was:
“In 2012, YouTube began demonetizing videos based on new advertising-friendly guidelines. This was not done by people, but by an algorithm that looked at metadata of videos and other factors to decide whether it was likely to be something an advertiser wouldn’t want to be associated with.”
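The context-blind, metadata-based flagging described above can be sketched roughly like this. To be clear, this is a hypothetical illustration of keyword matching on tags and titles, not YouTube’s actual algorithm; the function name, keyword list, and example inputs are all assumptions for demonstration:

```python
# Hypothetical sketch of context-blind, metadata-based flagging.
# NOT YouTube's actual implementation -- just an illustration of why
# matching on tags alone punishes prevention and education content.

SENSITIVE_KEYWORDS = {"suicide", "rape", "drugs", "war", "violence"}

def should_demonetize(title: str, tags: list[str]) -> bool:
    """Flag a video if any metadata keyword matches, ignoring all context."""
    words = set(title.lower().split()) | {t.lower() for t in tags}
    return bool(words & SENSITIVE_KEYWORDS)

# A suicide-prevention video gets flagged exactly like anything else:
print(should_demonetize("How to help a friend in crisis", ["suicide", "prevention"]))  # True
# While an ordinary vlog passes:
print(should_demonetize("My trip to the beach", ["vlog", "travel"]))  # False
```

A matcher like this has no way to distinguish a doctor discussing drug safety from a video glorifying drug use, which is precisely the failure mode the creators above ran into.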
But don’t worry, because everything is better now, right? Well..
“Google currently uses a mixture of automated screening and human moderation to police its video sharing platform and to ensure that ads are only placed against appropriate content.”
Look, we get it. YouTube is a massive platform with billions of videos from all over the world. Sometimes automation is the only way to keep things in check at a scale no human can reach. However, it becomes a significant issue when a YouTuber like Matan Uziel is no longer getting ad revenue on videos interviewing “women about hardship, including sex trafficking, abuse and racism.”
Why were they demonetized? Isn’t it obvious? One of those automated screeners saw “sexually suggestive content”, maybe some “violence”, and “controversial or sensitive subjects and events”, and was programmed to demonetize the videos of a creator with obviously good intentions.
But they’re not alone:
“Dr. Aaron Carroll runs a channel dedicated to healthcare policy and research and discovered this week that 27 of his videos were demonetized and had been for months. It seems likely that the algorithm regularly flagged a program discussing prescription drug costs, the opioid epidemic, and treatments for diabetes because it thought those videos were celebrating illegal drug use.”
How is that for a precedent set by YouTube? If you dared use your large following to discuss the evils of addictive drugs or tell the stories of abuse victims, no ad revenue for you. Oh, and as Ethan explained in the video, they wouldn’t tell you about it, either. You wouldn’t get notified, and your video wouldn’t even become age-restricted. Your video would just be demonetized.
Fortunately, this policy changed last fall. YouTube now:
- Lets you know when a video has been demonetized
- Shows a notice next to demonetized videos
- Allows you to request a manual review of demonetized videos
- Re-monetizes videos that the review finds to be not in violation of YouTube’s ad-friendly policy.
It’s a great gesture, sure. But why did it take four years to correct, and why were channels not notified in the first place?
It was a shoot-first, ask-questions-later policy. In thinking they were doing the right thing and acquiescing to the demands of their advertisers (not surprising, considering YouTube operates at a loss), they negatively affected innocent content creators who treat the platform as a full-time job and livelihood.
As YouTuber Arin ‘Egoraptor’ Hanson said, he wanted YouTube to “be more clear about what advertisers are opposed to having their ads displayed on. What can creators do specifically to make their content more advertiser friendly?”
But to really get into the meat of YouTube and its advertisers’ intentions with subjective censorship and constant threats of demonetization for ThoughtCrime, I don’t think we can go anywhere until we explore what I have dubbed The PewDiePie Situation.
For those who don’t know, PewDiePie is basically the face of YouTube. He has over 54 million subscribers, and his videos are basically him talking into a webcam about one thing or another. His audience mostly consists of the younger generation, mainly middle and high school kids.
But about a month ago, PewDiePie was attacked, seemingly at random, by the Wall Street Journal, which took some out-of-context jokes and videos and decided to go on a character assassination spree.
“According to the Journal’s analysis, over the last six months the YouTuber posted nine videos that included either anti-semitic jokes or Nazi imagery, including one, posted on January 11th, that featured two men holding a banner that stated: ‘Death to All Jews’. Another video, posted January 22nd, featured a man dressed as Jesus saying, ‘Hitler did absolutely nothing wrong.'”
The entire premise was based on Fiverr, a marketplace where buyers pay as little as $5 to have freelancers do absurd things, like having two people dressed in traditional native garb hold up a sign that says ‘Death to All Jews’, or having a man dress as Jesus and say “Hitler did absolutely nothing wrong.” PewDiePie was convinced they wouldn’t do it because of how insane the statements were, but they actually did it.
Out-of-touch, narrative-driven journalists at traditional outlets discovered the videos and went on a crusade to take down the evil PewDiePie empire. They went through his videos, chopped up more out-of-context clips, and said, “See! See! Look how evil he is! How can parents let their children watch this?”
PewDiePie was not contacted by the WSJ to defend himself for their first hit piece.
As a result of this attack, PewDiePie lost out on a partnership with Disney’s Maker Studios. Also as a result, PewDiePie’s 50 million+ subscribers realized traditional media outlets are using out-of-context video clips to defame the character of a YouTuber who had exhibited zero anti-semitic or racist tendencies in the past.
The Wall Street Journal, worth noting, has 2.1 million subscribers. It was also voted as one of the least cool brands by 18-24 year olds.
And isn’t it just ironic that the original hit piece on PewDiePie was written by Ben Fritz, who composed a tweet in 2009 stating: “Just attended my first chanukah party. Had no idea jews were so adept at frying.” Here’s another from 2015 talking about having a “hard on purely for the Nazis”, and one more stating, “well obviously I’m not counting jokes about black people. Those are just funny.”
So what’s the meaning of this? Why is the Wall Street Journal of all publications going after YouTube’s most popular YouTuber? Well, I did some research into the WSJ and have a theory, but let me preface it with this response from PewDiePie on the whole ordeal:
“Old-school media does not like internet personalities because they are scared of us. We have so much influence and such a large voice, and I don’t think they understand that. The story was an attack towards me by the media to try and discredit me, decrease my influence.”
While I would like to personally cite and quote the Wall Street Journal’s findings and rebuttals, I can’t, because I would need to pay for a subscription. It’s emblematic of how a bitter, dying, desperate publication from the old guard lashes out and attacks the new: latching onto a statement or joke that could be misconstrued as racist or anti-semitic, which is basically a death sentence for anyone working in the public eye, and selling that to uninformed readers.
In perfect media collaboration, the Washington Post, Vox (who had the slimy audacity to, once again, use an out of context clip of PewDiePie raising his arm and equating it to a Nazi salute as their cover image for the article), Wired, and Salon were all quick to jump on the “Is PewDiePie a Nazi/Alt-Righter/Racist?” bandwagon.
YouTube content creators, people like PewDiePie, H3H3, and Philip DeFranco, are independent and don’t answer to anyone other than what appeals to their subscribers. They don’t answer to advertisers, high-profile donors, boards of directors, executives, or producers. These are people armed simply with a webcam, a microphone, and a platform reaching tens of millions.
To the traditional media, this isn’t just terrifying, it’s a threat to their information monopoly.
Independent media, courtesy of the unbridled internet and social media platforms like Twitter and YouTube, has been on the rise and has shaken traditional media to its core. Distrust in these institutions is sown as more and more people realize they’re not getting the full story, while independent media, free of influence, provides perspectives that are never discussed.
How do you attack these independents when they have no higher power to answer to? It’s simple. You hit them where it really hurts: their ad revenue. You attack their character through out-of-context clips, enlist critics with opposing beliefs, employ other mainstream outlets to join your crusade, wield broad and extremely vague definitions of ‘extremism’, and use the platform they post on to crack down on them.
But this isn’t just an attack on popular YouTubers. It’s an attack on counter-narratives and content creators not shackled by the constraining chains of producers, boards of directors, and advertisers.
So it’s only natural that these dying publications, in their death throes, like a cornered animal, lash out at whatever threatens them: YouTubers with over 50 million subscribers, or simply any YouTuber developing a following strong enough to take eyes off traditional outlets pushing a narrative delivered from on high.
Remember: “Whoever controls the media, controls the mind.” There is nothing more integral to controlling the whims of the masses than control of information. It should come as no surprise that in the age of “fake news”, a popular YouTuber is getting randomly attacked, advertisers are threatening boycotts, and traditional media outlets are doing their best to defame independent sources of information.
The only question that remains now is, just how long do the traditional media outlets think they have left?