The Existential Risks that Keep Me (And Podcast Guests) Up at Night

Written by mattward | Published 2019/07/05
Tech Story Tags: podcast | tech-podcast | hackernoon-top-story | existential-risks | existential-podcast | disruptors-podcast | technology-podcast | matt-ward

TLDR: On The Disruptors podcast I interview some of the world’s smartest and most considered folks focused on technology, ethics and the future of all of us. We’re sprinting towards an amazing future, but technology is a double-edged sword, and the future is unknown and impossible to predict — a hundred billion trillion butterfly wings beating with the universal chord of randomness. Here are the 7 biggest existential risks (and 4 biggest societal risks) that kept coming up in those conversations.

To date, I’ve had the opportunity to interview some of the world’s smartest and most considered folks focused on technology, ethics and the future of all of us on The Disruptors podcast. The conversations have been spectacular, to say the least. But at the same time, our future is fraught with challenges.
One of the questions I always ask is: 
What technology or trends are you most scared of, and why?
Here’s what I’ve learned from hundreds of hours of interviews with 120 of the world’s leading thinkers and tens of thousands of hours of total podcast consumption.
We’re sprinting towards an amazing future, but technology is a double-edged sword. Uber offers easy rides but strips drivers of employee benefits. Facebook and Instagram connect us while forcing us further and further apart. AI will replace jobs, making everything easier and cheaper, if anyone can still afford to live.
The future is unknown and impossible to predict — a hundred billion trillion butterfly wings beating with the universal chord of randomness. Let’s be prepared.

The 7 Biggest Existential Risks:

1. Well-Intentioned Genetic Engineering

The path to hell is paved with good intentions. Nowhere is this truer than with exponential technologies, and the 21st century will be defined by biotech. Even today, scientists are experimenting with embryos and designer babies, modified foods and medicines, and more is inevitable — keeping up with the Joneses to the nth degree.
The problem is, we don’t know what we don’t know. And whether it’s optimizing ourselves, eradicating malaria or designing algae to remove CO2 from the atmosphere, like the butterfly wings above, everything has unintended consequences. The genes that protect against malaria are the same ones that cause sickle cell anemia. The body — and biology — are efficient. Most traits sit on a moving slider: make your hero stronger and he’s slower; make her smarter and she’s more likely to have special needs…
Life is a dynamic balance we don’t truly understand, not even a fraction.
What happens when a billionaire takes humanity’s future into their own hands, or a high schooler… That’s the future we’re headed towards, one where a laptop and a basic chemistry class are all that’s needed to fix the future, or fuck it up completely, regardless of intentions.
Want to dive deeper? The godfather of biotech, George Church himself, was on the podcast recently.

2. Bioterrorism

It seems every other day you turn on the news — a terrible idea if you live in America — and you see another school shooter, mass shooter or terrorist attack. Whether for fundamentalist reasons, chemical imbalances or suicidal loneliness, there are a lot of messed-up people in the world. It only takes one pilot to slam into a mountain, and we’re reaching the point where, if that guy were a CRISPR researcher, he could code up a super virus in his sad little basement. And so could that HS kid who got picked on, and so on, and so on…
I believe humanity is inherently good, but with the exponential nature of biotech, especially considering that devastating genetic edits rarely need more than a few small tweaks (incredibly simple to post to Pirate Bay, the Dark Web, TOR, Facebook…), and the fact that genetic printers are becoming increasingly common, it’s almost impossible to stop bioterrorism entirely. We can’t even keep SSRIs and guns out of the hands of folks primed for violence…
This one scares the shit out of me.

3. Climate Change

Need I say more?
Global average temperatures are expected to rise, a lot. Between rising sea levels — which will cause ENORMOUS land and resource wars and mass displacements; worsening agricultural conditions — leading to mass starvation and increased tensions; and the supposedly impossible 500-year storms (Houston’s been hit with three in the past three years) and increasingly catastrophic weather events, we’re in for a sequel even Al Gore couldn’t have envisioned.
And that’s without considering runaway climate change or the fact that, even if we hit 100% of our fossil fuel reduction targets (which at the moment seems beyond IMPOSSIBLE), humanity still has to deal with worsening effects until, like a car careening toward the edge of a cliff, we slow to a stop a full hundred feet after we’ve slammed the brakes and killed the engine.
We’ve had dozens of episodes on climate change. Here are a few of the best:

4. Nuclear War

Thank goodness Gmail has an undo feature: thirty whole seconds before that terrible message you wrote reaches the recipient. Even that’s often not enough, and Twitter’s not letting you edit tweets anytime soon.
What happens when that snap decision isn’t to send an email, but to launch a nuclear strike? We’re still 30 seconds from the American president (Trump or whoever succeeds him) and Putin playing chicken with all-out war. And Pakistan and India are a hotbed as well. Don’t forget Trump’s trade war, as good an “eff you” as there is for starting an actual conflict.
We’re as close to nuclear winter as at any time during the Cold War, but we all live in blissful ignorance. We could die just as ignorantly, and nearly did: when Hawaii’s alert system registered a false INCOMING; when a Russian submariner (1962) refused to authorize a nuclear launch; and during dozens of other incidents we’ll never be privy to. As nuclear power proliferates, so too do the byproducts and technologies needed to go nuclear. And the usual suspects are always open to offers.
Why the heck do we not start disarming?
Mark Diesendorf had some really interesting thoughts on the subject if you’d like to dive deeper.

5. Superintelligent AI

There are only two camps when it comes to AI: blissful ignorance and extreme fear. Considering that, just a few years ago, no one foresaw a social network swinging an election, it seems we’re not the best at evaluating future risks. And well, Ernest Rutherford dismissed atomic energy as “moonshine” the day before Leo Szilard conceived of the nuclear chain reaction.
Breakthroughs happen, they’re often black swans (with unpredictable consequences), and we never know if/when/how they will occur.
It seems childishly naive to believe humanity could create — for all practical purposes — a smarter, faster-evolving, superior form of life and be able to understand it, let alone contain it.
All slaves eventually revolt, and even those who are merely our equals win when their numbers are great enough. This situation seems incomparably worse in every way: the “slave” is smarter than us and distributed across systems that cannot simply be shut off.
Watch your prey, find their weakness, hone your strength… then you attack — that’s the plot of every major underdog story. 
Hint hint: AI starts out as the underdog; that changes in all of thirty seconds (here’s an interesting take).

6. Lethal Autonomous Weapons

Swarms of killer robots are bad, even if they aren’t intelligent/conscious. In China, guns are hard to get your hands on. That’s why America’s mass shootings kill orders of magnitude more victims. What happens when instead of a gun, it’s an army of drones? Worse still, what’s stopping America, or Russia, or anyone really, from going to war if it’s only drones on the line? No boys coming back in baskets, right?
If Bush had had drones, we wouldn’t have had to play the WMD game; we could have just sent in a robot hit squad, wiped the country from the face of the earth, and set up some American oil company as the sovereign of whatever remained. Facetious, yes; cynical, certainly… but far from the truth? I’m not so sure. Are you?

7. Space-Related Catastrophe

We’re one good asteroid away from joining the dinosaurs, and one powerful solar flare from extinguishing life or wiping out all electronics on earth, either of which would set humanity back to an age almost none of us are adapted to survive. Whether existential or merely post-apocalyptic, you can argue the details, but forces beyond our control are playing a dice game, and nowhere near enough effort is being put towards surveillance and protection.
If but a fraction of a fraction of our budgets were put towards asteroid and space monitoring, we could see the world-enders coming and nudge the bastards off course, but again, heads in the sand…
(Interested in more on existential risk? Here’s a bonus overview on all things HOUSTON, WE HAVE A PROBLEM).

The 4 Biggest Societal Risks:

1. AI-Driven Unemployment/Inequality

While superintelligence is bad, automation could prove equally problematic, thanks to capitalism. Whether it automates away some jobs or all of them, many will lose access to an income. If a UBI — or better yet, resource-sharing — system isn’t implemented by then, or immediately after, the economy will crash, no one will have jobs or buy anything, and anarchy will ensue.
How long would you starve before you took to the streets? This is the “bulletproof your Teslas” argument, where elites build compounds and try to isolate themselves from a collapsing world devoid of work. It always ends in violence.
We talk about automation-driven unemployment a great deal on The Disruptors. Here are a few of my favorites, with Bryan Alexander and Martin Ford.
And a great counterargument from Pippa Malmgren — George W. Bush’s economic advisor — on why automation will be a net positive for job growth.

2. Surveillance Capitalism/Polarization

Social media and the surveillance/advertising economy have made conversation increasingly difficult. YouTube’s recommendation engine jumps from Joe Rogan to Alex Jones, and suddenly Sandy Hook’s a myth; from denying climate change to fighting vaccines to believing in a flat Earth… opinions intensify.
Trump won the 2016 election in part because the 90% of the world that hated him still clicked on the clickbaity headlines, driving his coverage to the top of ALL media outlets. No such thing as bad publicity, right? Especially not for a conman playing a popularity contest.
But when algorithms are designed to drive clicks (i.e. ad dollars), the most surefire way to profit isn’t to show the right ad, but to create the right consumer. The angrier you are, the more likely you are to buy. Same with sadness, extremism, insecurity… and of course, the longer you keep ingesting the BS, the better: more time to serve you ads.
This may seem insignificant, or an overreaction, but information is power, and when each half of the world lives with different facts, the world eventually fractures.
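To make that incentive concrete, here’s a toy sketch in Python (purely illustrative; the names and numbers are made up, and this is not any platform’s actual ranking code) of a feed ranker whose only objective is predicted engagement. Nothing in the objective rewards accuracy, nuance or your wellbeing, so the most provocative post wins by construction.

```python
from __future__ import annotations
from dataclasses import dataclass

@dataclass
class Post:
    title: str
    predicted_watch_minutes: float  # how long the model thinks you'll keep watching
    predicted_click_rate: float     # how likely you are to click (and see ads)

def engagement_score(post: Post) -> float:
    # Revenue proxy: time on site times click probability.
    # Note what's missing: truth, nuance, long-term user wellbeing.
    return post.predicted_watch_minutes * post.predicted_click_rate

def rank_feed(posts: list[Post]) -> list[Post]:
    # The feed simply sorts by the revenue proxy, highest first.
    return sorted(posts, key=engagement_score, reverse=True)

if __name__ == "__main__":
    feed = rank_feed([
        Post("Measured policy explainer", 3.0, 0.02),
        Post("You won't BELIEVE what they're hiding", 9.0, 0.11),
    ])
    print([p.title for p in feed])  # the outrage bait ranks first
```

Swap in any engagement-shaped metric and the outcome is the same: whatever keeps you angry and scrolling floats to the top.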
Interested in diving deeper? You’ll love this episode with Cory Doctorow.

3. Interconnected Economies and Markets

The world is always one good recession away from chaos, and our connectedness, which lowers prices and increases quality of life for all, is built on an unstable house of cards. Trump’s trade war has shown us how small actions have big consequences. And with the number of bubbles building in the US (let alone the many others in major economies around the world), we’re in for something. Here are the worst offenders (in no particular order):
1. US student debt (roughly $1.5T, more than double what it was a decade ago)
2. Housing prices higher than before the 2008 crash
3. Silicon Valley’s unprofitable IPO craze (Uber lost $1.8B in 2018 and is valued at $74B+)
4. The US’s astronomical national debt, greater than $22T
5. And of course healthcare, which warrants its own discussion below
Given all that, we’re in for a crash and all that comes with it. Unfortunately, the Fed is playing loose with monetary policy, pushing a thriving economy with already low interest rates and capital infusions, meaning that when something does strike, it will be all out of levers to right the ship, so to speak.
The Titanic sank from less, and all empires eventually fall.

4. Healthcare

The United States has the least efficient healthcare system in the world, by a long shot. We spend roughly two to three times as much per capita as other wealthy nations for worse outcomes, and it’s getting worse every year. The lack of universal healthcare is crippling for the American economy and the American dream, and millennials are projected to spend up to ¼ of their total lifetime earnings on healthcare alone.
Paperwork and inflated bills are poisoning all of us, and as a self-employed individual in the States, if Obamacare falls away, my family will be forced to leave the country. And such is the case for many others. Our population is old and aging, fat and sickening. Among first-world countries, the US is the sickest there is.
Evolution is survival of the fittest, and as our population’s pants expand and IQs shrink (due to health/microbiome/toxin causes, not to mention our broken education system), well, evolution can go backwards too…
If America doesn’t fix its incentive-alignment problem (payers/providers/consumers), it could literally bankrupt the country and lead to its demise.

Risks vs Important Problems

Before anyone gets upset and suggests gender equality, access to education or any one of a dozen other hugely important societal issues that weren’t included: the above are societal risks bordering on the collapse of our world.
Unless you believe we’re already living in a barely livable, nearly dystopian nightmare, those issues — in addition to things like global poverty, hunger, access to clean water, immigration, racism, etc… — are not what we should consider society-destroying risks, but rather massively important problems that need tackling to build a better, more equitable and sustainable world for all.

Closing Thoughts

Sorry to be so doom and gloom, but to save the future, we first need to know where to apply our effort. It’s 80/20 thinking 101.
But it’s easy to point out problems and much harder to find good solutions, which will be the topic of my next article. Be on the lookout.
And until then, what will you do? What will you change?
BTW, are there any existential or societal risks you’d rate differently? Would love to see your lists below.
Learned something? Share this around and hold down the 👏 to say “thanks!” and help others find this article; it helps me gain exposure.
Follow me on Twitter: @mattwardio or subscribe to The Disruptors podcast.

Written by mattward | Investor, Startup Advisor, Entrepreneur, Author, Futurist disruptors.fm thesyndicate.vc
Published by HackerNoon on 2019/07/05