Worshipping False Gods

When Values Go Rogue

By Conor McCammon
Photo by Peter Chiykowski on Unsplash

When I first started studying philosophy, I went deep. I realised that all of my cherished beliefs and ethical convictions could be incoherent, unjustified, delusional, even morally bad. I went into studying ethics with an open mind, more open than any of my classmates’, I suspect. Of course, to appropriate Eliezer Yudkowsky, there is more than one rationalist virtue. Merely being maximally open to changing one’s mind and considering arguments at face value isn’t enough (although it’s a wonderful start), because on its own it led me to views that I can now safely describe as the convictions of a supervillain. At one point I was a negative utilitarian so extreme that, if presented with a button that would destroy the universe, I would have hit it unilaterally and without flinching.

Now, it’s certainly true that people on average don’t place anywhere near enough weight on the disvalue of extreme suffering in the world and the urgency of addressing it. But my response to anyone who told me that my convictions were supervillain-y was: “That’s just your silly deontic intuition leading you astray. Sometimes things sound bad at face value, but on reflection the arguments are very strong. For example, there are lots of drugs that ruin people’s lives, so our intuition says we should ban them. On reflection, however, it’s possible to realise that banning them would actually make the problem worse, despite our prior intuitions. I’m saying a similar thing about ethics and its real-world implications. You have to be willing to consider counterintuitive positions.” All of this was said by a young man who believed what he said, but who also wanted to score points by being Deeply Wise. Sometimes, though, contrarianism is just contrarianism, and there’s a deep reason why blowing up the universe seems like a terrible idea: because it is a terrible idea.

*

A couple of years ago I was at the big EAGx conference in Australia. I met a man a few years older than me and we started chatting. He was smart, and I could tell he’d thought deeply about politics, ethics and psychology, among other topics. He was nice too, buying people drinks and listening intently to what anyone had to say. I was really enjoying our conversation.

At some point we got around to talking about governance and ethical theories, and this guy said very matter-of-factly that his ethical theory (stemming from State Consequentialism) was that of ‘Social Consequentialism’. Setting aside the consequentialism part, the important thing to note is that he considered the large, abstract thing called Society (sometimes the word ‘state’ works equally well) to be the fundamental subject of moral worth. Individual people were only instrumentally valuable, insofar as they assisted the stability, functioning, ‘health’, or some other metric of the Society. He did contend that individuals were of deep instrumental importance: you couldn’t have a functioning society without them, and you probably couldn’t have a functioning society without most people being somewhat happy with their lives.

“Ignoring whether that’s true,” I said, “there’s still something deeply wrong with your position. You have it exactly backwards. Individual humans are the ones who have preferences and values, hopes and fears, who are conscious and experience pain and joy and everything else. People are subjects, the fundamental subjects of moral worth. ‘Society’ - insofar as the term points to a coherent entity - exists to serve individuals. Society is instrumental; individuals have intrinsic value. As Scott Alexander said, ‘we were here first’.”

And then he said something so horrible, so bone-chilling that it echoes in my dreams.

He said “well, it seems like we simply have fundamentally differing moral intuitions. Agree to disagree.”

Agree to disagree. About whether human lives were intrinsically worth anything. About whether people were any more than merely useful pieces of a larger, abstract system which itself was the sole subject of true value in the world.

“I’m sorry,” I said. “But I can’t agree to disagree on this. Our disagreement is terribly fundamental. It reminds me of three things: the Chinese Mohists (who were pretty cool for State Consequentialists), authoritarian regimes (not to accuse you of being a fascist, but both Nazi Germany and the USSR advocated the subjugation of the individual to the ‘true’, ‘ultimate’ good of the Nation/Society/State), and Scott Alexander’s Ascended Economy, wherein economic activity transcends human beings, creating systems of immense wealth and capacity with no humans around to benefit (Bostrom calls this a Disneyland with No Children; Nick Land calls it ‘utopia’ because he’s insane or evil or both). I’m not sure that your particular view endorses a fascist or Landian outcome, but it doesn’t preclude one. In this world a healthy society happens to require human beings who are relatively healthy, with some freedom of movement, personal relationships, et cetera. But if you found a method that would lead to a better-functioning society by torturing everyone in cages, you would be committed to bringing that world about. That should make it obvious that your values are misaligned.”

“That argument reminds me of arguing about consciousness using p-zombies. Just because such a world is conceivable to you doesn’t add any weight to your argument. We find ourselves in this world, where a society is fundamentally composed of individuals and where individual preferences generally correspond to the functioning of society. Think pragmatically about this: you and I would end up agreeing on most questions of policy. Even in cases where a weak-manned version of my worldview would say ‘I support the War on Drugs because it represents a strong society’, I will be swift to point out that my actual view is ‘I condemn the War on Drugs because it is ineffective in its aims and does more damage to the social fabric than a different set of policies would’. I assume you would agree, on different grounds, that the War on Drugs is bad. So you see, we will rarely disagree, particularly not on any matters you tend to find important.”

I’m a very agreeable person generally, particularly in the sub-facet of politeness, which means I am often too eager to dispel conflict and find common ground. So at this point in the conversation I conceded that our normative worldviews converged on many of the same positions for any given real-world issue. Then the larger group that had been listening changed topics, and the conversation ended.

*

Amongst the sampling of ‘average’ or ‘normal’ people I know, there is a strong tendency to ascribe any problem in the world to outright malevolence. In the EA and rationality communities, people tend to do the opposite: to overemphasise the structural, systemic, impersonal causes of a problem (see my post ‘Malevolence, Systems, Blame’). This systemic view often slides into a denial that actual malevolence even exists. And sure, it’s less common than the average person believes. But there are malevolent people around: people who actually want to do harm, or who don’t mind doing great harm in service of a goal like self-interest.

Often more dangerous, and less obvious, than the self-interested are those who are happy to do harm in the service of some ideological goal - either because they think they have identified a cause more important than preventing harm, or because they have a different definition of harm altogether. The man I met at the EA conference was this latter kind. Polite, smart, interesting, civil, charismatic, normal. And above all, principled. It just so happened that these principles had put the cart before the horse, deciding that society was the ultimate subject of moral value rather than, you know, people. He was not a classic supervillain. In fact, he would have been close to the classic hero, had he not made such a fundamental mistake in building his theory of value. But of course, I couldn’t communicate this to him: our argument bottomed out at claims that each of us considered self-evident. I didn’t have the language or understanding at the time to push past this seeming ‘fundamental normative disagreement’ by delving into meta-ethics with him. I just knew he was wrong.

And he was also wrong about us being roughly ‘on the same team’. We were not on the same team, not remotely. My visualisation of his beliefs was of a blind and soulless god, a Lovecraftian horror totally indifferent to human values except when they happened to align with its own ends. A demon which would readily eat children alive if only the act would sustain the mindless beating of its Eldritch heart.

I call this failure mode Worshipping False Gods. You can have everything else right: you can be smart, conscientious, and committed to the Good. All it takes is one fundamental error in your theory of the good, and unspeakable evils become permissible in its name. This is the orthogonality thesis applied to human goals.

Why write this post? Two reasons. First, to warn people against worshipping false gods: if your worldview commits you to some obviously horrible action, perhaps reconsider your worldview before dismissing the horror as ‘merely a maladaptive failure of moral intuition’. Second, to warn people against those who worship false gods. Even if their values align with ours pragmatically, they are not on our side. Given the right circumstances and opportunity, they will sacrifice everything we hold dear for some monstrous abstraction (see Nick Land).
