3 Signs You’re In the AI Cult of Data

Written by crisbeasley | Published 2018/05/31
Tech Story Tags: ai | machine-learning | embodiment | technology | ai-cult-of-data

Believe your own eyes, not just the decimal points.

I grew up in a cult. Since leaving 12 years ago, I’ve studied the tools and techniques used by cults to entrap their members. As you might guess, I hold a deep-seated desire to NEVER be pulled back into one, so a big, loud alarm bell blares in my head when I spot the signs that an organization is relying on groupthink to control its members. Here’s what I see in the tech industry.

1. You’re not allowed to talk to others.

Cults keep members brainwashed by not letting them talk to outsiders. Facebook and Google now apparently scoop up 80% of new PhDs in machine learning. Their employment contracts ensure they cannot have frank conversations about what's happening with anyone outside the company, sometimes not even outside their own team. That's terrifying.

It certainly helps explain how Google could’ve released the thoroughly upsetting AI bot, Duplex, that simulates a voice to make phone calls on your behalf. It’s far too easy for Googlethink to perpetuate when their employees can’t get feedback from people who aren’t brainwashed. If they’d talked to anyone outside their bubble, they’d have gotten a frank “WTF?! That’s hella creepy.”

Google even discourages their employees from using web services made by other companies. I had a bizarre conversation with a long-time Google employee who wouldn't sign up for AirTable to collaborate with me because she only uses Google tools, even in her personal life. She seemingly didn't think of Google as a second party to her. She referred to them as "we," as in "we don't use software made by other companies." It gave me the icks.

Cult leaders teach their members not to trust those outside of the cult. In my upbringing, we were always told "the world" didn't understand us, which kept us from taking anything anyone else said seriously. We were taught to view anyone who wasn't "with us" as the enemy.

Solution: Regard authority figures who tell you you can’t or shouldn’t talk to others about what you’re building with deep suspicion.

We as creators of AI systems must be able to talk to each other and figure out how to build algorithms that benefit all of humanity. It’s a super hard problem, and we’re not going to solve it by hiding in our silos.

2. Mathwashing is the new brainwashing.

Algorithms do not transubstantiate garbage data into capital-t Truth with three significant digits and a decimal point. The tech industry prides itself on making data-driven decisions, yet I see a level of dogma around AI that can only be described as a cult.

Love it or hate it, algorithms are taking over more and more decisions about your life. There was once a day when the bank decided whether or not to loan you money based on their person-to-person interaction with you. Now you have a credit score. There was once a day when your prison sentence was decided by a person in black robes with a gavel in hand. In more and more places, those sentences are now handed down or influenced by an algorithm.

In both cases, algorithms promised a fairer system that would treat everyone the same. In both cases, the biases that colored the in-person judgments were made even worse by the algorithm.

At their best, algorithms model the world in a useful way, but we can never, NEVER allow ourselves to mistake the map for the territory. Algorithms are always wrong, always. At best they are compact, useful, and efficient, but they’re never 100% correct.

The map is not the territory.

Solution: Believe your own eyes.

We can never abandon our human senses. Our eyes and ears tell us whether what we are creating is good, but we have to bother to look and listen. In the tech industry, too often numbers matter more than the rich human stories that tell the how and why, not just the what. We must refuse to blindly believe numbers, which are easy, convenient, and simple, over the human stories, which are often messy, contradictory, and painful.

We need ALL the data, both the quantitative and qualitative. Numbers will never capture the texture needed to have real debates about which cultural value to embody in the digital service you’re bringing into the world. Numbers will not tell you, for example, why trans people’s accounts are being suspended or what impact that has on their lives. There’s no amount of significant digits in the world that can adequately tell that story. We must have the rich, complicated stories in order to correct the errors in our algorithms.

3. Cults control their members’ financial lives.

I know a lot of people who work in tech in Silicon Valley, and though some truly do believe in the mission of their company, far too many of them are there for the paycheck. I’ve been there. I know. Sometimes you tango with the devil on Tuesday.

Typically, cults control their members' financial lives by making them do low-paid or unpaid labor. FaceGoog have done the opposite, paying people so much more than anyone else, even than other tech companies, that people willingly snap on their own handcuffs. Once you buy a house and/or put your kids in private school, your own lifestyle inflation ensures you can't leave without major disruption to your family's life.

“Golden handcuffs” aren’t called that because they honor the agency of the people who are subjected to them. I have a dear friend and former colleague whose one-year anniversary at Facebook just passed. He’s literally counting down the days until his four-year vesting is complete so he can move on. This story isn’t at all interesting because it’s so common, and that’s my point. This form of financial control has become so normalized that we don’t think there’s any other option.

There is no such thing as a "standard" contract. If you're selling your company or negotiating terms of employment, stand in front of the mirror and repeat these words again and again until you can spit them out next time someone throws that bullshit at you.

Solution: If your company asks you to do something you morally object to, don't do it.

I wish I had an easy solution for you, but there isn't one. You will endanger your job by having principles and standing up for them. You might get handed a banker's box to pack up your red stapler and rollerball pens while everybody watches.

“A principle isn’t a principle until it costs you something.”– Bill Bernbach

I stood up for my principles and left my cult at 26 years old. It cost me my marriage to my best friend, being ostracized from my entire social group, and a chunk of my bank account. My jaw locked closed for a month due to the stress of leaving, but I’ve never once looked back.

When you’ve risen up from the ashes you don’t fear the fire nearly so much anymore.

It’s hard as fuck. I get it. When you give yourself permission to burn your whole life to the ground though, you get to write your own check for what kind of life you want to lead. I trust myself to a depth I hadn’t previously known was possible.

I lost the $200k I invested in my own startup and still have debt to pay back against that. Despite that, I'm taking two months off of consulting to write my book now. I've lived in the Bay Area for two years without a normal job. My roommates drive Lyft to make ends meet and guess what… we haven't died or gone hungry. Our rent is on time, and we're not half as stressed out as my friends with important places in an org chart. It's a 100% lie that you must work for FaceGoogTwitZon. You are choosing to work for them.

Like, look… you have your own situation that I don't know jack shit about. I'm not telling you what to do. You know what your capacity for risk-taking is. What I know is true is that no one ever makes anyone do anything. When we think we have no options, it's because we don't like any of the options we can see, but that's not at all the same thing as having no options.

Sometimes it takes a long time to escape from environments where you can’t do the right thing. I deliberated for two miserable years whether to leave the cult. It was the hardest thing I ever did. On August 1, 2004 I gave myself permission to leave if in a year’s time I still wanted to go. That final year passed in tortured slow motion. Every single one of those 365 days I reaffirmed that I wanted to leave. On August 1, 2005 I walked out the swinging double doors of Gospel Assembly and never came back.

We’re all doing the best we can in our given situations, but if you aren’t happy with where the tech industry is headed and this essay can inject some courage in you to do what your wild heart knows is good for you, your community and/or the world, that’s all I could ever hope for.

“Do the best you can until you know better. Then when you know better, do better.”– Maya Angelou

What does your Should Monster say to you?

I believe you can only build technology that’s as human as you are.

I'm writing a guidebook of practical exercises aimed specifically at technology creators to help reconnect you to your body, intuition, and emotional guidance system so that you can build technology that's good for humans. I'll be shipping out card decks and books to alpha testers this June, so if you'd be interested in giving feedback to help shape the project, pop yer email in this here box.

Published by HackerNoon on 2018/05/31