
Computomorphism and the Compulogical Fallacy

Modern Day Cognitive Biases

By Everyday Junglist · Published 7 months ago · 4 min read
Images like this are a big part of the problem. Image by Gordon Johnson from Pixabay

Anthropomorphism

The attribution of human traits, emotions, or intentions to non-human entities. It is considered to be an innate tendency of human psychology.

Computomorphism

The attribution of traits, characteristics, or capabilities of computers to humans and other biological entities and systems. The tendency to use analogy to computers and computing to explain and simplify complex biological systems and phenomena. It is definitely an acquired tendency of human psychology and is seen most often in pop culture writing about technology and neuroscience. The application of computomorphism in neuroscience gives rise to the compulogical fallacy.

The compulogical fallacy

In their classic work, Philosophical Foundations of Neuroscience, M.R. Bennett and P.M.S. Hacker gave the name mereological fallacy to the logical disorder plaguing much neuroscientific thought at the time. Then, as now, neuroscientists commonly assigned various cognitive attributes to the brain that can only logically be attributed to a whole human being. Examples include having memories, desiring things, seeing, tasting, judging, evaluating, and so on. Their intent was to show the logical contradictions that arise from this erroneous view of cognition.

In an analogous fashion, the computer sciences and other technological fields, along with those who write and talk about them, have assigned various cognitive attributes to computers that can only logically be assigned to human persons and some (non-human) animals. This is named the compulogical fallacy in honor of Bennett and Hacker's work. Table 1 compares the two fallacies.

Table 1: Mereological fallacy vs. Compulogical fallacy

In essence, the compulogical fallacy describes the logical contradictions that arise when we ascribe characteristics, behaviors, attributes, skills, or abilities to machines and computers (abiological entities) that can only rightly be ascribed to human beings and some (non-human) animals (biological entities).

Machine 'learning' as an example of the compulogical fallacy

The term machine learning is but one example of this fallacy. The two words, each by its very definition, when combined in that order produce a term that is a logical contradiction and name something that is logically impossible: a learning machine. A machine cannot learn, for if it did it would no longer be a machine. The same can be said of any computer (machine) and intelligence. A truly intelligent computer/machine, were it someday possible to create, or were it to be "born" or to "emerge," would no longer be a computer/machine but something else entirely, something neither human nor machine.

No one approach to this problem works best, but there are at least three possible solutions.

1. Redefine the words 'machine' and 'learning'. This would be the most difficult approach, as each word's meaning has been fixed in the English lexicon, with its standard, accepted definition, for well over 100 years.

2. Argue that the act of creating the term somehow changes the meanings of the words of which it is composed. Proponents of the term machine learning do not believe its use requires any justification, so this is the implicit position they have taken. Thinking in this fashion requires one to accept a Wittgensteinian view of language as meaning in use. It presupposes a fluidity in language that allows the meanings of words and terms to change over time based on how they are actually used. Wittgenstein would likely appreciate the support for his position on meaning as use; however, he would almost certainly object to the term machine learning, because he was very much against the invention and use of technical terms that obfuscate rather than clarify. Machine learning clarifies nothing about how computers actually function, which is, even in the most modern computers, via the act of computation, not thinking or learning, which remain solely the province of humans and some non-human animals (see the sketch after this list).

3. Drop the term machine learning and replace it with something descriptive and logically coherent. This last option would be the most appropriate and the easiest, though there seems to be very little chance of it ever happening, as the offending term has been in use for so long. Instead, the proponents of machine learning have chosen none of the above and continue to insist on using an absurd term, without any acknowledgement of its absurdity, to describe something they believe is a foundational field critically important to many aspects of modern computing.
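
To make concrete the claim in option 2, that computers function via computation and nothing more, here is a minimal, purely illustrative Python sketch. The data points and variable names are invented for the example; the routine is simply a textbook-style instance of what gets marketed as machine 'learning': fitting a slope to a handful of numbers by gradient descent. Every step it performs is ordinary arithmetic on stored values.

```python
# A deliberately tiny "machine learning" routine: fitting y = w * x to a few
# points by gradient descent. Every step is arithmetic on stored numbers;
# computation in the strict sense, nothing more.

data = [(1.0, 2.1), (2.0, 3.9), (3.0, 6.2)]  # (x, y) pairs, invented for illustration
w = 0.0              # the single parameter the procedure adjusts
learning_rate = 0.01

for step in range(1000):
    # gradient of the mean squared error with respect to w
    grad = sum(2 * (w * x - y) * x for x, y in data) / len(data)
    w -= learning_rate * grad  # a rote arithmetic update, repeated mechanically

print(f"fitted slope: {w:.3f}")  # settles near 2, the slope implied by the data
```

Whatever grand label is attached to the result, nothing in that loop resembles learning as a human being does it. It is repeated calculation, that is, computation.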

Tags: social media, tech, science, psychology

About the Creator

Everyday Junglist

Practicing mage of the natural sciences (Ph.D. micro/mol bio), Thought middle manager, Everyday Junglist, Boulderer, Cat lover, No tie shoelace user, Humorist, Argan oil aficionado. Occasional LinkedIn & Facebook user
