
Climate Change is a Bigger Existential Threat Than AI

AI, Climate Change, and the ending of things

By Alex Mell-Taylor
Photo by Li-An Lim on Unsplash

It's been the year of fretting about AI. While there are many ethical considerations with AI, labor being one of the primary ones (see The Work of Art in the Age of AI), what AI evangelists talk about is often rooted in "Longtermist" concerns such as the end of human civilization as we know it. "Mitigating the risk of extinction from AI should be a global priority alongside other societal-scale risks such as pandemics and nuclear war," reads a statement signed by CEOs, academics, and other members of the business elite.

This framing has always been precarious, not only because it overstates the current level of this technology (AI is nowhere near Skynet levels) but because it undercuts the actual existential threat we are currently facing — i.e., climate change. Over the next few years, our society will be shaken to its core, not by AI but by our warming world, and any conversation that is not grounded in dealing with these concerns is fundamentally not serious and a red herring.

Existential Concerns about AI are a fantasy

To be clear, some concerns about AI are valid. Like with most things under capitalism, technology over the last decade has been used not to help society as a whole but to extract wealth into narrower and narrower hands. From ridesharing apps to social media, the pattern has been clear: "disruption" is, in actuality, the practice of regulatory arbitrage (i.e., taking advantage of gaps in government policy) to increase profitability.

This economic reality is not what people like the AI alarmists who signed that statement are referencing when they speak about the existential threat of AI. Signatory Geoffrey Hinton, for example, often called the "father of modern AI," is heavily influenced by the philosophy of Longtermism, or the idea that influencing far-future outcomes is a key moral priority. He told The New York Times that while his immediate worries were about job insecurity and misinformation, his deeper concern was AI becoming more intelligent than humans.

This type of rhetoric is common among Longtermist adherents. A frequently cited existential fear about AI is the "paperclip dilemma." In short, a massive, interconnected AI is instructed to produce as many paperclips as possible and, in the process, converts all of humanity, maybe even all organic matter, into paperclips.

It sounds scary, and the paperclip problem is an interesting thought exercise worthy of being written about in science fiction (and it has been), but here, in reality, we are nowhere close to this scenario unfolding. Some experts have described current AI as merely a "stochastic parrot." It simply cannot do what this scenario requires, and it may never be able to.

The modern economy still requires the labor of real people, and even with the integration of more AI, that will remain the case for the foreseeable future. We are not at risk of everyone being turned into paperclips, because that would require real people to make those decisions, and if that happened, it would be the result of a much bigger problem than AI.

Furthermore, Longtermism has been criticized for being quite vague, as what constitutes a "future good" is not as clear-cut as its adherents often argue. At its most extreme, this philosophy can lead to a hyper-utilitarian outlook, allowing those in power to focus on far-off problems rather than the systems of harm they directly benefit from. As Parmy Olson writes in Bloomberg:

“Silicon Valley technologists…certainly mean well. But following their moral math to the extreme ultimately leads to neglecting current human suffering and an erosion of that other very human feature — empathy.”

Instead of causing a techno-apocalypse, the real danger of AI is that we will implement this technology uncritically so that a handful of rich people can continue the era of wealth extraction that began with Web 2.0 and, in the process, exacerbate preexisting discrimination on a massive scale. Because AI is trained on datasets produced by people, it recreates the same systemic biases as those people. We have already seen facial recognition and hiring software discriminate against people of color, and that should give us pause.

These are the actual dangers we have to worry about when it comes to AI. We need not fret about AI becoming Skynet, GLaDOS, or HAL 9000, but rather that algorithms will continue to be used to discriminate against marginalized identities and classes. The solution is not to ward off or prohibit AI, "deep learning," or whatever you want to call it, but to place these resources in the hands of the public so that they may be properly audited and changed. As several AI academics note:

“We should be building machines that work for us, instead of “adapting” society to be machine readable and writable. The current race towards ever larger “AI experiments” is not a preordained path where our only choice is how fast to run, but rather a set of decisions driven by the profit motive. The actions and choices of corporations must be shaped by regulation which protects the rights and interests of people.”

However, such a solution would mean denying Silicon Valley its latest toy. So, instead, we get monologues about science fiction that move the debate away from these more sensible actions. Silicon Valley executives and founders get to frame themselves as Pandoras, benevolently warning us in advance of the boxes they had to open.

And while these leaders overinflate the significance of AI, fearmongering about how it could upend all of human civilization in the next five years, climate change threatens us all, not simply in the future, but right now.

Unlike GLaDOS and Skynet, Climate Change is Real

It cannot be overstated how existential a risk climate change is to our overall security as a species. Even by conservative estimates, the projected effects of the carbon we have already committed to emitting can potentially change everything we take for granted.

The rise in sea levels is the most common example brought up. Current estimates place the rise at 1 foot (30 cm) by 2050 (less than 30 years from now) and an additional 2 feet by the end of the century, though the amount will vary by region. Two feet may not sound like a lot, but it will have a devastating effect on coastal regions across the world. Cities like Amsterdam, Bangkok, and Shanghai will simply be gone.

In America, the East and Gulf Coasts are expected to be hit far worse on average. According to research from the group Climate Central, if pollution remains completely unchecked, the coastline of Louisiana — New Orleans included — is unlikely to survive the transition. Florida, from Miami to Jacksonville, will come to know devastating floods as a routine part of life. In fact, from Savannah to Boston, few major cities on the East Coast will be spared from the effects of sea rise, with Boston likely sharing New Orleans and Miami's fate as one of the cities that doesn't make it.

We could spend the rest of the article analyzing the effects of these changes alone — of what happens when hundreds of millions of people are forced to move inland. There are hundreds of thousands of people in Boston alone: Where do they go, and how will they be treated once they move? In her 2021 essay for The Intercept titled A Climate Dystopia In Northern California, Naomi Klein described how the material stresses of the California wildfires pushed the liberal college town of Chico to move to the right (a situation that does not appear to have improved). As she writes in the article:

“Today, Chico, with its brutal crackdown on unhoused people in the grips of a deadly pandemic and in the midst of serial wildfire disasters, does not demonstrate community “resilience.” It demonstrates something else entirely: what it looks like when the climate crisis slams headlong into a high-end real estate bubble and social infrastructure starved by decades of austerity. It also shows what happens when locally developed climate justice plans are denied the federal and state financing that they need to rapidly turn into a lived reality.”

Imagine Chico on a countrywide, even global scale. We are already experiencing a surge of xenophobia and far-right fascism from the early effects of climate change. By 2050, the instability from what some estimates place at over 1 billion climate refugees will have taken its toll. I can't predict what that kind of world will look like, but it will be utterly unrecognizable from today.

Part of the reason for this xenophobia is that we will be working with fewer resources overall. We know that an increase of 1.5 degrees Celsius, which is all but inevitable given current pollution levels, will have major impacts on crop production. Some crops, such as corn, could see a marked decline, while wheat's growing range could expand. However, even in optimistic scenarios, climate change will increase the likelihood of "killing-degree days," when crops cannot grow at all and may even die off. These drawbacks will require immense shifts in production if we want to continue producing food at our current levels (so people don't, you know, starve to death).

Another primary concern is water. Something that often gets lost on people is that the water cycle (i.e., the continuous movement of water through the atmosphere) rests on a very delicate balance. While water is technically renewable, much of it is stored in reserves that replenish very slowly. Canada, for example, claims to have 20% of the world's freshwater, but only 7% of that is "renewable," meaning it cycles regularly through a particular area within a human timeframe. The rest is stored in lakes, underground aquifers, and glaciers that will not necessarily be replenished quickly in human terms if tapped out, adding an element of scarcity that most people rarely consider when it comes to water.

And so, as we add further stress to the water cycle through human consumption and rising temperatures, some regions will have less water overall. One only needs to look at the situation unfolding in communities dependent on the Colorado River to understand this problem. Nearly a century of overuse has left the Southwest facing significant water shortages.

Overall, there will be less food, less water, and less land, and that will create a toxic sociopolitical cocktail that has the potential to push us toward a more authoritarian world. That is the problem affecting us in this decade, not some weird sci-fi BS that might never come to pass.

Yet we are not seeing the investments needed to do much about it. Joe Biden may have signed the "largest environmental bill" in US history (a low bar, given how bad our country has been in this area), but this $375 billion investment over a decade (roughly $37.5 billion a year) is inadequate compared to what is needed to prevent the scenarios I have described from coming to pass. Its passage also came at the expense of approving more oil and gas permits, which makes the entire effort tenuous. We don't have much time to dick around here. Increased investment in the green economy cannot come at the expense of new emissions if we want to enjoy future luxuries such as the city of Boston still existing.

We should treat climate change as a crisis, forcing companies to pay the cost of transitioning to a more ecologically stable world upfront. Instead, we are watching billions being sunk into AI (several billion coming from the federal government): a technology that is costly to train, both financially and ecologically, and does nothing to solve the problem at hand.

It feels like a strange distraction. If AI causes the apocalypse, it will be because of neglect, not a robotic hivemind nuking humanity into oblivion.

A Warming Conclusion

It cannot be emphasized enough how frustrating this conversation on AI is because it entertains the runaway fantasies of the rich at the expense of the present. AI does have drawbacks we have to worry about. It perpetuates the systemic biases of current society and risks creating a culture where racial and class-based inequalities are exacerbated in the name of "progress."

Yet even these worrying problems pale in comparison to the threat of climate change, which is an existential threat to human civilization in the here and now. If we do not start making serious investments in our climate and, paradoxically, divestments from our capitalist system, then many of us will not survive the chaos of the next decade.

Most of us do not have a New Zealand or Mars compound to retreat to. We need to focus on the problems we face, not the paranoid ramblings of rich people's imaginations.


