That Time YouTube Thought I Was a Racist …

Friday. I’m working on a how-to guide for a client, which means I’m poring over a dozen or so homegrown videos that their underpaid and over-exploited college-student staff created. The videos are usually pretty dreadful, although this batch isn’t as bad as most, which makes me wonder if they brought in outside help. Really, I don’t care. I just want the pain to be over.

Anyway, it’s no wonder I’m a tad more than half asleep when the final video ends. What wakes me is the collage of YouTube’s “watch next” videos, custom-plucked from obscurity. Just for me.

  • Ann Coulter Owns Whoopi Goldberg on Race and White Guilt

  • 12 Times Michelle Obama Looked Like a Tranny

  • Watch Rachel Maddow get that STUPID SMIRK wiped off her face by Trump’s Election

  • Ben Shapiro Takes Down Bill Maher On “Muslim Ban”

  • Revelation: Dawn of Global Government 2016

I’m in marketing, so I know what this means: YouTube thinks I’m a racist and a masculist. Or maybe I’m being overly sensitive. Maybe it just thinks I’m an alt-right, narrow-minded, ultra-conservative, anti-globalist, pig-headed asshole. Yeah, that’s probably it.

This, incidentally, offends me, and not in a pretend “Why are they calling me this?” way. It’s not like the conversation I had a few months ago, in which someone told me he wished everyone would stop accusing him of being racist simply because he supported Trump.

“Then why did you support Trump?” was my question.

Anyway, this is a big deal and I don’t like the passive-aggressive accusations YouTube is making about me. It’s wrong and I know it. So why doesn’t YouTube’s algorithm agree?

People will think what the algorithm tells them to think

Here’s the deal. I’ve spent most of the last four years trying to game search algorithms, because my jobs kept requiring me to study the tricks people use to get their pages to rank higher. Doing that gives you a decent feel for how most search algorithms work.

For the uninitiated (consider yourself lucky, BTW), a search algorithm is a formula used to figure out what you’re really looking for online. When you use a search engine like Google, that means you should get results that hit the mark. According to Google, its algorithms “rely on more than 200 unique signals or ‘clues’ that make it possible to guess what you might really be looking for.”

YouTube’s algorithm is going to be a little different. Because its universe of possible results is smaller, it doesn’t need to be quite as complex, but it’s likely more predictive. In my case, I wasn’t searching for anything; I was staring at the collage of videos YouTube was sure I was going to want to watch next. Those suggestions were based entirely on predictions. Think of it as high-tech stereotyping.

I probably would have lived my whole life without caring about search algorithms had it not been for a former boss, one of the most pious AND corrupt VPs ever to grace a boardroom. Call him a snake in the grass, a liar, or an essential member of your executive team; I still despised every minute I worked with him. But he did teach me a lot: what a micromanager is, what Mormon priesthood meetings must be like (I have to assume they’re dull events in which some out-of-touch old white guy drones on about something I care nothing about, which is exactly how every team meeting we had would go), and how search algorithms work.

I learned the search part because Google’s search results were this VP’s nemesis. His goal was laudable — land our company’s website in the number one spot for every search term he could think of — and he tried every trick Google hates to make it happen:

  • Keyword stuffing. He spent his weekends counting every occurrence of a keyword in our posted blogs so we’d have a list of things to change on Monday mornings.

  • Clickbait headlines that led with the target keyword and promised something that would earn clicks even if the article never paid it off. Who wouldn’t want to read “Data Warehouse Secrets You Need to Know — Or Else!” that was really just a take on some guy’s high school job at Burger King?

  • Fake sites that linked to our company’s web pages. If Google catches this, you’re done, or so I’m told.

  • Bought fans on Twitter. Google didn’t care about this, but I like bringing it up anyway.

  • A mandate that every article exceed 800 words, which was great when you were dealing with a concept that warranted 50.

As is usually the case with the incredibly corrupt in business, he got away with all of it: Google never noticed, and the board of directors rewarded him with a promotion.

As for Google’s search algorithm, here’s what I learned: Google pays attention to what you do on the web, and not because it’s spying on your shoe purchases for the government. Google wants this info because it uses your past activity to predict your future activity. The better Google is at this, the more you’ll search. The more you search, the more likely you are to click on an ad, and the more money Google makes selling ads that target you. Plus, when your search results are exactly what you want to see, you’re downright giddy, because you don’t have to wade through a pile of irrelevant shit to find out why your cat’s turds are suddenly burnt umber instead of their usual brown.

This is an extreme oversimplification, but you get the idea.

Google’s algorithm and YouTube’s predictions look through your previous searches and what you click on, and they may also consider how long you stay on a page, the specific search terms and phrases you use, your buying habits, what other people who searched those same terms were looking for, how your neighbors feel about the subject, and so on. The algorithm digests all of this information — and plenty more — to decide exactly what it thinks you want to see because, you know, if you wanted a fact, you’d go to a library.
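
Nobody outside Google knows the actual formula, but the basic shape is easy to imagine: give each signal a weight, score every candidate page, show the highest scores first. Here’s a toy sketch in Python. Every signal name, weight, and page in it is something I made up to show the mechanics, not anything Google has published:

```python
# A toy ranking formula. The signal names, weights, and pages below are
# all invented for illustration; the real thing uses hundreds of signals.

SIGNAL_WEIGHTS = {
    "matches_past_searches": 0.35,  # overlap with your search history
    "similar_users_clicked": 0.30,  # what people like you clicked on
    "dwell_time": 0.20,             # how long visitors stayed on the page
    "keyword_relevance": 0.15,      # how well the page matches the query
}

def score(signals: dict) -> float:
    """Blend a page's per-signal scores (each 0 to 1) into one ranking score."""
    return sum(SIGNAL_WEIGHTS[name] * signals.get(name, 0.0)
               for name in SIGNAL_WEIGHTS)

candidate_pages = {
    "vet-clinic-article":   {"matches_past_searches": 0.2,
                             "similar_users_clicked": 0.4,
                             "dwell_time": 0.7,
                             "keyword_relevance": 0.9},
    "randy-parasite-story": {"matches_past_searches": 0.6,
                             "similar_users_clicked": 0.9,
                             "dwell_time": 0.8,
                             "keyword_relevance": 0.5},
}

# The highest blended score wins the top spot, even when its raw keyword
# relevance is worse. That is personalization in a nutshell.
for page in sorted(candidate_pages, key=lambda p: -score(candidate_pages[p])):
    print(f"{score(candidate_pages[page]):.2f}  {page}")
```

Tweak the weights and the whole first page reshuffles, which is why gaming the formula was a full-time obsession for my old VP.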

This means your search for “burnt umber cat turd” turns up information on a fecal parasite that makes you randy. Yes, that’s a true story, and you can read it here. Befittingly, it was #2 in my search results. According to Google, “Algorithms are computer programs that look for clues to give you back exactly what you want.” Apparently I want parasitic Viagra.

More often than not, these algorithms hit the mark. But what happens when you want something you wouldn’t normally be looking for? You know, when you’re trying to objectively research an opposing viewpoint? If the algorithm’s results reflect your normal internet tendencies, your easy-access answers, the first 10, 20, or 30 results, might only give you the side of the story you already know. The rest is up to you to eke out. You have 124,768 search results to dig through, so grab a cup of coffee.
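
To see why that first page tends to agree with you, picture the personalization step as nothing fancier than a boost for whatever resembles your history. A deliberately crude sketch (the leanings, scores, and boost factor are all invented):

```python
# A crude filter-bubble sketch with invented data. Three results start
# out equally relevant; personalization multiplies the score of anything
# that matches what my history says I already believe.

MY_LEAN = "left"  # what my past clicks supposedly say about me
BOOST = 2.0       # made-up personalization multiplier

results = [
    # (title, lean, base_relevance)
    ("Op-ed from the left", "left", 0.5),
    ("Op-ed from the right", "right", 0.5),
    ("Neutral explainer", "none", 0.5),
]

def personalized(lean: str, base: float) -> float:
    """Boost results that match my lean; leave everything else alone."""
    return base * BOOST if lean == MY_LEAN else base

for title, lean, base in sorted(results, key=lambda r: -personalized(r[1], r[2])):
    print(f"{personalized(lean, base):.1f}  {title}")

# The opposing op-ed started out exactly as relevant, but it no longer
# leads the page. Multiply that across 30 results and it's buried.
```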

By the way, this filter-bubble effect is one of the problems that doesn’t get discussed much, even though it feeds the increasing bias in our country and elsewhere. When the low-hanging fruit simply confirms what we already believe, why in the world would we keep going to page 2 of our search results … or page 22 … to get the take from the other side? This isn’t 1978. We want our information fast. And digested. Don’t expect us to work for it.

What Did I Do to Make YouTube Think I Was an A-hole?

Back to YouTube: I will admit I usually appreciate the recommendations it makes for me. On any given Saturday night, when a sip too much of red wine, which means anything over a half-glass these days (damn age!), makes me think it’s a good idea to force my children to watch a Frankie Goes to Hollywood video, and YouTube then suggests Karma Chameleon, the recommendations are spot on. My children should be tortured by Culture Club (but not Simply Red — I don’t want DCFS knocking on my door).

But not this time. On this Friday night, the algorithm is failing for some reason. Sure, I’ve been known to stare at my phone watching the ludicrously vile thoughts that froth out of Alex Jones. Thanks to YouTube, I hated Milo at least a year before my friends knew who he was. I’ve watched some of the most absurd editing ever, courtesy of Paul Joseph Watson, and I’ve even tuned in to see who Tomi Lahren was — and why. But I personally believe I’ve negated each of those instances with my zillions of hours spent with Mother Jones, Samantha Bee and Carpool Karaoke.

Search and prediction algorithms exist to bring you information, and information, by nature, should educate you. Biased results and recommendations don’t help; they just mean your ignorance of opposing views wins, and nobody comes out ahead.

Now here’s the best part: when I try to replicate YouTube’s vile suggestions, I can’t. I play the exact same videos, fast-forwarding each to the end because I need to stay awake. I use the same computer, same browser, and even the same browser session because who shuts their computer down?

Nothing. My suggestions are normal. Stephen Colbert, SNL and cat videos.

So why did YouTube think I was a racist for an hour or two? I have my theories:

  1. The YouTube recommendation algorithm had a temporary hiccup, and I was there to witness it. These things MUST happen, right? Any system built by humans is going to have flaws. Just look at democracy.
  2. Since I was somewhere around viewer 4, 8, or 13 on most of these videos, maybe my list was heavily influenced by the viewing habits of the few people who watched them before me … which means my client. (There’s a toy sketch of how that works after this list.) This option scares me.
  3. My suggestions were hacked by lizard men who control the government as a way to distract me so I couldn’t see them luring my children into the backyard, where the demons who used to run our country could kidnap them and feed off their positive energy. I don’t buy this one at all — my kids are teenagers, so they have no energy, positive or otherwise, and are too lazy to go into the backyard. I’m not even sure they remember where it is.
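
About that second theory: YouTube hasn’t told me how “watch next” works, but co-viewing (people who watched this also watched that) is a standard ingredient in recommenders, and it behaves badly when a video has almost no viewers. A toy sketch, with every viewer and title invented, shows how three strangers can become my entire collage:

```python
# Toy co-viewing recommender. Viewers and titles are invented.
# With only three people having watched the obscure how-to video
# before me, their habits completely determine what I'm shown next.

from collections import Counter

watch_logs = {
    "viewer_1": {"client-howto-3", "Coulter vs Whoopi", "Shapiro vs Maher"},
    "viewer_2": {"client-howto-3", "Coulter vs Whoopi", "Maddow smirk"},
    "viewer_3": {"client-howto-3", "Dawn of Global Government"},
    "me":       {"client-howto-3"},  # I'm viewer 4
}

def watch_next(user: str, logs: dict, n: int = 3) -> list:
    """Recommend what co-viewers watched that `user` hasn't seen yet."""
    mine = logs[user]
    counts = Counter()
    for other, videos in logs.items():
        if other != user and videos & mine:  # shares at least one video
            counts.update(videos - mine)
    return [title for title, _ in counts.most_common(n)]

print(watch_next("me", watch_logs))
# "Coulter vs Whoopi" tops the list every time, and the rest of the
# collage fills in from whatever else those three strangers watched.
```

On a video with millions of viewers, no single person’s habits matter. On a video with three, each one is a third of the signal, which is exactly the situation I was in.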

I’m hoping it’s just a temporarily flawed system. But deep down, I’m guessing it was door #2, so I should probably blame viewers 1, 2 and 3. Maybe there’s something wrong with exploited college students in general. Maybe hatred is how the people who made these videos take out their frustrations over working full-time for low wages so they can pay overpriced tuition (Bernie!!! Where are you when we need you?). Or maybe these are simply the kind of people who would work for a company that puts a higher value on keeping payroll low than on rewarding professionalism and experience.

Or maybe, since these videos ARE notably better than previous ones, the recommendations reflect the viewing habits of the outside vendor who produced them. I like that theory best.

Regardless, I’ve learned my lesson. Unfortunately, it’s not to stop watching clips from the sideshow that is Alex Jones, although for a few days I do stay clear of anything alt-right. But eventually I remember that I need to know what’s going through other people’s heads and the only way for me to do this is by experiencing the world as they do. Not something I’d recommend for most people — watching Alex Jones, listening to Michael Savage or stomaching anything associated with Rush Limbaugh is maddening, mind-numbing and boring.

My takeaway is that Google, YouTube and every other search algorithm around has inherent flaws. But maybe that’s not always a bad thing. Because if I can get a list that’s inundated with messages from the right, doesn’t that mean someone on the right can get suggestions peppered with lefty thoughts like mine, too? Especially if I make an effort to change the predictions through my own viewing habits.

Looks like I’ve got some watching to do.

 
