YouTube Rewind: A Thinly Veiled Distraction From Controversy
The platform wants you to forget about its numerous gaffes this year.
To say that this year’s YouTube Rewind is a reaction to the poor reception of past entries would be an understatement. 2018’s iteration became the platform’s most-disliked video, featuring many creators whom the majority of YouTube’s viewership had never heard of (some, like Ninja, were predominantly present on Twitch) while disregarding erstwhile mainstays like PewDiePie due to controversy. This year, YouTube chose to play it safe, and in doing so it stripped Rewind of all the character that made it unique and exciting, however often that character proved a liability rather than an asset.
This plays out in the very first seconds of the video: self-referential statements acknowledge that creators themselves were unhappy with the material they were handed, and specifically with the emphasis on pop culture moments (like Fortnite and K-Pop’s exploding popularity) rather than on moments unique to YouTube. But the platform’s stake in altering the common perception that it views its creators as nothing more than fodder for advertisers is not only a reaction to last year’s Rewind; it is also part of a concerted effort by the company to sanitize its image. That effort follows a myriad of troubling reports: the platform’s role in an alternative influence network of far-right political pundits, its appeal to conspiracy theories including climate denialism, its allowing child predation to run rampant (earning it a sizable fine from the FTC), and a whole host of other concerns that made it harder to keep advertisers on board, prompting fears from creators about further declining ad rates, as has been customary throughout the years.
But the most pressing danger emanating from this year’s Rewind won’t be any unwieldy like-to-dislike ratio; it will be that the YouTube community becomes less proactive about holding the platform accountable for the many sins it has yet to address. There are still many questions about how YouTube plans to handle terms-of-service infractions, given that it granted a get-out-of-jail-free card to Steven Crowder, who has yet to atone for his homophobic and racist remarks toward now-former Vox presenter Carlos Maza, and given how aloof it has been toward PewDiePie’s questionable wardrobe choices even in the presence of troubling precedent. Measures YouTube implemented back in early 2018 were supposed to stem the tide of problematic content from popular creators following the Logan Paul Aokigahara forest incident, but a recent report from The Washington Post indicates there is still a lot of favoritism, and lobbying led by corporate interests, on behalf of YouTube’s troublesome but quite lucrative bunch.
60 Minutes (@60Minutes): How does YouTube determine what videos are too hateful or violent for the site? The company’s CEO showed 60 Minutes some examples. https://t.co/s6KtiDF9qF https://t.co/uJVYfNXptU
This comes on the heels of a 60 Minutes interview with YouTube CEO Susan Wojcicki, in which the company saw fit to let the public in on more of what makes up its moderation scheme. It was an uncontested PR win for a company that has historically struggled to break free from the notion that its moderation efforts are led first and foremost by faulty AI, and that when they are diverted to humans, the outcome is more often than not subpar. TechCrunch’s Connie Loizos noted that the binary distinction Wojcicki drew in that interview between what is harmful and what is not does not inspire much confidence:
It’s a horrifying position for the company to take and Wojcicki to be responsible for, and worse, Wojcicki’s indirect answer to whether YouTube can capably police its own platform is that she knows she can “make it better,” adding, “and that’s why I’m here.” At this point, thirteen years after Google acquired YouTube and five years into its former ad chief’s tenure as CEO, that’s cold comfort.
The strength of YouTube is its community, and it is the part of the platform that the company’s corporate upper echelons reflect the least. No matter what the platform does, no one seems quite happy with the result. We have been promised that the winds of change would eventually make YouTube a place where videos don’t get inexplicably demonetized, or worse yet taken down, based on an indiscernible set of standards that moderators struggle to enforce under the crushing weight of poor wages and toxic workplace culture. That future, once thought imminent, has yet to materialize, and the platform has yet to deliver.
YouTube (@YouTube): in 2018 we made something you didn't like... so in 2019... #YouTubeRewind https://t.co/c71moMNmOd https://t.co/L0dP80SJmI
If last year’s Rewind was a call for advertisers to steer clear of the internal cultural battle being waged over the soul of the platform, this iteration feels like an attempt to reassure creators that the platform isn’t once again embarking on a journey of useless feature additions, or fixes for issues that never were. It is a rundown of what was most watched, based on metrics whose objective merits are hard to dispute. That is essentially YouTube throwing its hands up and pinning the burden of its picks on individual user behavior, which is fair considering how hellish a year this has been for the platform.
No amount of media coverage was enough to deter YouTube from repeating past mistakes, so it is unlikely that praise would be of material worth either. What should remain under consideration is that YouTube, however favorably it tries to position itself, still has a lot to answer for. This year’s Rewind is a poorly disguised distraction from the issue at hand: the platform and its CEO have promised repeatedly to undergo reform and radically alter the way they do business, prioritizing ethical considerations above unconditional growth. What we have seen so far, however, is proof of the polar opposite.
Views continue to reign supreme, and even in instances where the algorithm is supposed to spot problematic patterns and put a stop to them, little human intervention is performed to ensure that this black box of a complex AI, powered by state-of-the-art machine learning, is doing its due diligence and that the content it surfaces isn’t a slight to a community YouTube is supposed to protect. In that way, YouTube has masterfully played the PR game, convincing everyone that its internal corporate culture has become more attuned to creators’ needs, even when little evidence of it shows.