YouTube’s Erratic Decision-Making Is Hurting BreadTube

The platform’s response to its missteps is pieced together with spit and glue.

YouTube has faced many accusations of political bias in the past. Those accusations mainly consist of the notion that YouTube, as a platform, rigs its methods of participation and promotion to downgrade right-leaning content in its algorithmically driven hellscape of a discovery system. Those accusations, however, have remained unsubstantiated, and hinge heavily on a view predominant within the American conservative political class that Big Tech, by virtue of operating out of the liberal haven that is Silicon Valley, panders to progressive sensibilities more than it does to conservative ones.

In practice, that turns out to be mostly untrue. YouTube, Twitter, and Facebook, the leading subjects of that debate, have mostly demonstrated that they have quite a warped sense of what is acceptable within the boundaries of their platforms, often resorting to reactive measures only after political pressure mounts from parties at home or abroad, or after the threat of advertising money being pulled looms on the horizon. It is a tricky field to navigate, considering the sheer volume of content uploaded to these platforms every single day. But whatever the benefit of the doubt may compel us to concede, it is important to examine the evidence at hand and recognize that the pattern of political bias on the three biggest content-sharing platforms does not swing leftward, and examples of that are ample throughout their history.

YouTube in particular has been waging an all-out war on political content regardless of its provenance, aim, or purpose. The algorithm seems unable to draw a healthy distinction between content that promotes extremist views and content cautioning against them, and that puts the entirety of YouTube’s political section in a very tough bind. Essentially, YouTube’s algorithm completely disregards context when judging whether content is eligible to exist on the platform, or at the very least to be monetized. If that context-blindness were applied neutrally, it would be less of an issue: the absence of extremist content on the platform would partly obviate the anti-extremists’ quest to fight it. All too often, though (and it is unclear whether this is purely algorithmic or a poorly calculated choice by YouTube’s content moderation team), anti-extremist content ends up demonetized, outright removed, or, in extreme cases, sees its entire channel banned, while its extremist counterpart enjoys at best a slap on the wrist in the form of demonetization, even as its creator continues to draw ample revenue from alternative funding avenues such as Patreon, direct tips, or merchandise sales, as has become customary.

The most recent example of such an occurrence is what happened to Three Arrows, a channel that specializes in debunking Holocaust denialism and myths about Nazi ideology more broadly. Earlier this month, YouTube pulled down its “Jordan Peterson Doesn’t Understand Nazism” video under the pretense that it violated the platform’s guidelines on hate speech. One would be baffled as to how such a conclusion was reached, but for YouTube, misconstruing commentary as endorsement is a regular serving. Daniel, the creator behind the channel, initially reached out to YouTube through Twitter, a common tactic among BreadTube staples because YouTube’s appeal process is particularly susceptible to public pressure, and got a favorable response.

That, however, would not be the end of Daniel’s worries: not long after YouTube seemed to have correctly disregarded the abuse of its flagging system, an even bigger blunder followed. Daniel’s channel was removed entirely, and it is unclear why that happened in the first place. A preliminary examination of the timeline of events suggests that the initial mass-flagging campaign shifted from targeting one video to targeting his entire channel. Speculation notwithstanding, Dan eventually got his channel reinstated, but it was still a bitter victory, given how easily YouTube’s tone shifted from cooperative to combative, and back to cooperative again.

But this kind of erratic decision-making from YouTube’s content moderation team, and the lack of a rigorous process for manually reviewing mass claims of terms-of-service violations, is symptomatic of YouTube’s inability to deal properly with legitimate claims of community-guidelines violations, turning instead to volume and public pressure as its measure of how severe an action it must take. What this essentially amounts to is YouTube disregarding any complaint about its failures that does not attract an arbitrary amount of public scrutiny, subjecting moderation to mob rule and rendering its flagging system, along with the guidelines governing its application, virtually useless.

Nor is this the first such incident in recent memory. Just last month, the platform was caught in the thick of a major controversy over lackluster enforcement of its community guidelines against right-wing political pundit Steven Crowder. Carlos Maza, who hosted a political commentary series for Vox, cut together a clip of the disparaging homophobic and racist remarks Crowder had made against him while promoting a shirt on his show that spells out “Socialism is for F*gs.” YouTube could have seized the opportunity to distance itself from its spotty record of dealing with LGBT+ content, but what ended up happening instead is that YouTube kept shifting the goalposts of what was required of Steven Crowder before he could resume normal operation. First it was inaction, then a broad suspension of his monetization capabilities invoking the “exceptional measures” policy enacted back in February of 2018, then YouTube walked that back to a weakly worded demand that he remove links promoting his offensive T-shirts. Since then, Steven Crowder’s presence on the platform seems entirely unaffected by the controversy.

Two years earlier, taking down BreadTube content (BreadTube being the name for the left-leaning side of political YouTube) was already par for the course. Natalie Wynn, aka ContraPoints, made a video contesting the right wing’s claim that the left hates free speech. It delved into familiar territory for seasoned progressives, emphasizing that free speech demands a bilateral interpretation, especially when conservatives repeatedly come to the aid of their own while characterizing BreadTube’s demands for equal treatment as an abuse of power.

The video, unfortunately, was not received with the leeway BreadTube might expect from its political opposites. It was mass-flagged for violating YouTube’s policy on spam and misleading content, and was removed from the platform shortly thereafter. After Natalie appealed the decision, YouTube upheld its original assessment, slapping a strike on her channel in the process. Two weeks would pass before YouTube rectified its original stance, prompting collective confusion in BreadTube circles over just how easily the flagging system could be exploited for politically motivated censorship of speech. But even as Natalie’s woes were temporarily tamed, the platform continued to hold quite a distinct grudge against monetizing her content for some indiscernible reason. Most onlookers would chalk it up to the algorithm being trigger-happy about demonetizing LGBT+ content, especially at the time, but ContraPoints’ case presented an especially interesting intersection of YouTube’s lackluster support for LGBT+ creators and its usual propensity for demonetizing, or outright removing, left-wing political content.

Shaun, a BreadTube staple who specializes in response videos to extremist content, has met a similar fate in the past. After Lauren Southern revived the concept of “white genocide” within the political consciousness of YouTube, Shaun took it upon himself to respond to Lauren’s claims so they would not go unchallenged. The video proved popular enough, and so effective, that 4chan’s extremist political board /pol/ mounted a concerted effort to get it taken down on grounds of anti-Semitism. The strategy worked, and Shaun’s video was taken down just a week after it was uploaded, prompting indignant BreadTube creators to mirror it across their own channels in protest. A full ten days after the takedown, the video was restored once YouTube correctly judged the allegations of anti-Semitism fraudulent, but that still raised the question of why a video suggesting the mass displacement of minorities was left untouched by the platform in the first place, while a response video could be taken down so easily once organized action was mounted against its creator.

The phenomenon of censoring left-wing voices on the platform is far from an isolated fluke. BadMouse, Kat Blaque, Peter Coffin, and far too many others to recount have all been met with a similar lack of due diligence when YouTube examined whether their content should stay monetized on the platform, if it stays at all. This happens while the channels of Dave Rubin, Ben Shapiro, Steven Crowder, Stefan Molyneux, Lauren Southern, and a plethora of other right-wing political pundits continue to be consistently favored by YouTube’s discovery algorithm, resulting in a very wide popularity gap between them and their BreadTube counterparts.

This tracks with a very disturbing pattern from conservative politicians who are intent on putting the screws to YouTube over an alleged anti-conservative bias on the platform. But that narrative just does not bear out. The problem is not that YouTube favors right-wing political YouTubers over BreadTube; it is that YouTube completely overlooks how bad the optics are of leaving up a video promoting the very extremist views the company has constantly reformed itself to reduce, while throwing the content meant to moderate their propagation under the bus. At the current stage, progressive political commentators would rather have some tangible reassurance that their main income source cannot be easily targeted, whether through a campaign of mass-flagging or through YouTube’s poor enforcement of its own policies. Conservatives, by contrast, view it as a serious threat to their cultural and intellectual capital if their ill-conceived ideas of social conservatism, even when doused in blatant violations of the policies of the very companies whose freedom they have sworn to protect, are challenged through opposing literature, or simply by YouTube taking more proactive measures to ensure its policies are not abused and are adequately enforced.

YouTube clearly views its battle against escalating scrutiny from politicians as a game of protecting its bottom line, rather than using its unique position as a monopoly in the video-sharing space as a conduit to nurture a more hospitable place for all people regardless of gender, sexuality, race, cultural background, or religious affiliation. Those hypothetical efforts, however, become ever less likely as YouTube continues to pay lip service to conservatives by enacting some of the most bizarrely worded, ill-advised decisions of content removal and demonetization against its vanishingly small minority of BreadTube creators, as if that were going to shift the tide of conservative scrutiny in its favor.

The platform has recently been weathering some of the fiercest winds of bipartisan scrutiny from the leading voices in America’s tech policy debate. Some of it has centered on antitrust, and on how concentration of power allows companies to evade regulatory oversight through sheer magnitude; Elizabeth Warren has adopted that view and vowed to break up Big Tech were she to win the upcoming presidential election. The William Barr-led US Department of Justice was recently reported to be preparing an antitrust probe of Google, and a mere two weeks after that, the FTC was said to be launching its own independent investigation into child predation claims on YouTube.

There seems to be an imminent reckoning Big Tech will have to face from the upper echelons of American government. Whether it is the ever-so-tender political soft spot that is harm to children, the correct claim that Google has leveraged its monopoly to dissuade fair competition in online search and video distribution, or conservatives’ fright at an impending purge of their kind from these platforms, all roads seem to point toward tighter regulation. And there is nothing YouTube would rather have than the reins on its operations being taken completely off, so it can continue to profit from the very lucrative business of coddling right-wing extremist views on the platform, completely unchecked.

Alex Stamos, the former chief security officer at Facebook who recently railed against Mark Zuckerberg’s managerial ineptitude and lack of responsibility, and now a researcher at the Stanford Internet Observatory, made the case that YouTube, whose userbase is comfortably larger than the entire population of China, should assume government-like transparency if it is going to take on the responsibility of regulating speech at a scale that deeply challenges modern ideals of domestic sovereignty. That lack of transparency, and the eagerness to appease a reactionary notion of what it should allow on the platform, is at the core of YouTube’s current upheaval.

YouTube can simply no longer afford to send its CEO on apology tours every time a new controversy arises. Susan Wojcicki has been promising a smoother-running YouTube for more than two years now. In early 2017, after a major pullout of advertisers over claims that their promotional material had appeared next to racist content, Susan made a plea at NewFronts, a pitching festival for advertisers, and vowed to major advertisers that “[they] can and [they] will do better.”

Two years later, speaking to Recode’s Kara Swisher at the Lesbians Who Tech summit in San Francisco, she renewed that pledge, saying that she “[wants] to say there’s more progress to be made” and that she “100 percent [acknowledges] it.” And mere months after that, as the Steven Crowder and Carlos Maza saga was unraveling, when Recode’s Peter Kafka pressed her on the abhorrent nature of so much of what gets uploaded to YouTube, regardless of its political dimensions, she simply responded that she “[acknowledges] that” and that “[they] have tremendous tools at [their] fingertips that [they] can continue to invest in to do a better job.” The Verge’s Casey Newton had an interesting perspective on Susan’s non-apology to creators affected by the lack of effectual policy enforcement, writing in The Interface:

The more I reflect on YouTube’s current moment, the more I believe that the outrage against it stems from the company’s lack of accountability to the world. Whatever decisions YouTube makes, the world has no real recourse, even as creators like Maza suffer real-world harm in the meantime. We focus on what the policies say, and which of them the company chooses to enforce, but the larger story in my mind is the way that YouTube became a quasi-state without also developing a credible system of justice.

When is that change eventually going to come? Google’s track record of mishandling problematic behavior within its own organization has been shoddy at best; so much so, in fact, that more than two thousand Google employees spoke out against the appointment of Kay Cole James, president of the conservative think tank the Heritage Foundation, to its AI ethics board. Soon thereafter, during Pride Month, Google said that any protest of its mistreatment of LGBTQ employees and the recent line of problematic YouTube policy positions would constitute a violation of its code of conduct.

If Google lacks the corporate will to involve itself more closely in the convoluted proceedings of the biggest opinion-shaper on the internet, then regulatory oversight might not be such a bad proposition. The current US administration has shown no real will to pursue this seriously, so the responsibility hinges entirely on an eventual Trump replacement to put anti-discrimination laws back on the table, and to hold major social media platforms accountable for toppling democracies and making the lives of entire communities an insufferable ordeal.