Class Warfare Blog

March 18, 2020

Can Software Be Conscious?

Filed under: Morality,Technology — Steve Ruis @ 11:55 am

This was a question addressed at a conclave of philosophers, software developers, and their ilk, but I think this is the wrong question. I think the question is can a “computer” be conscious, with the word “computer” standing in for some combination of hardware and software.

A major leg up on being “conscious” is being self aware. Researchers devised the mirror test to see whether animals recognize their reflections as themselves, as another animal, or not at all. I don’t know if this test really tests for that, and I do not know whether any other animal has “passed” it, but it does show the importance consciousness researchers place upon being self aware.

Now, let me begin my argument with a thought experiment. In my sport we depend a great deal on proprioception, which is the awareness of where our body parts are in space. For example, you can pick up a glass of water, close your eyes, and take a drink from that glass with no problem. We “know” where our hand is, we can feel the glass in it, and we “know” where our mouth is; we don’t need to be “talked down” (à la every airplane crisis movie ever made) to deliver the glass to where it is intended.

This ability is not obvious to us, but any disruption of it results in quite some confusion. For example, if you get an injection of a painkiller in your gum for some dental work and part of your tongue goes numb, don’t expect to be able to talk and be understood until that anesthetic wears off. The position of our tongue in our mouth is necessary information for forming sounds.

Another sense we suffer from losing is our sense of balance. If you have ever had extreme vertigo, you will know what I mean. But if we have a stuffed-up nose from a cold or otherwise, we seem to get by quite adequately without a sense of smell.

Now, as to the thought experiment. You are lying on a bed and you lose your senses, one by one. First, you lose your sense of sight, which means full “fade to black,” not just what you can still see when you close your eyes. Then your sense of smell, then hearing, then touch, then taste, then your sense of balance, and finally proprioception. You cannot feel the bed under you or the breeze blowing in the window or hear the birds chirping outside.

This is why the science fiction trope of the brain in a jar doesn’t work. How long do you think you could remain sane in this state? You couldn’t even scream for help; it is questionable whether you would even be able to vocalize.

Software does not have “sensory input” without hardware. And it seems that we are rapidly developing sensory inputs for computers. A common theme of news commentaries is face recognition software, which is, of course, dependent upon video feeds as “sensory input.” An article headline in this week’s Science News is “An AI that mimics how mammals smell recognizes scents better than other AI.” AI stands for artificial intelligence or “super-duper computer.” Computers have, for quite some time, had the ability to monitor the temperature of their CPUs and can tell you if they are experiencing a “fever.”
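That kind of self-monitoring is simple enough to sketch in a few lines. Everything below is a hypothetical illustration — the function names, the 85 °C threshold, and the simulated sensor are my own inventions, not any real system’s API (on some platforms a library like psutil can expose actual CPU temperatures):

```python
import random

# Assumed "fever" threshold for illustration only, not a real spec.
FEVER_THRESHOLD_C = 85.0

def read_cpu_temp_c():
    """Hypothetical sensor: simulate a CPU temperature reading in Celsius."""
    return random.uniform(35.0, 95.0)

def check_fever(temp_c, threshold=FEVER_THRESHOLD_C):
    """The self-monitoring step: compare a sensory input to a norm."""
    return temp_c > threshold

reading = read_cpu_temp_c()
status = "fever" if check_fever(reading) else "normal"
print(f"CPU at {reading:.1f} °C: {status}")
```

The point is not the thermometer but the loop: a “sense” plus a process that watches that sense and reacts to it.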

It is not a big stretch of the imagination that if we continue to add “senses” to computers and allow those computers to monitor their sensory inputs, we will have a much greater likelihood that one of those AIs will become self aware.

Now, I am sure that some people will argue that these computers would only be simulating self awareness or some other such construct, but since I do not see that we fully understand our own self awareness, I do not see how we could build machines whose self awareness is exactly the same as ours. Nor do I see that that is a necessary condition. Self awareness is self awareness, no matter the mechanism.

I have read science fiction and fantasy for at least 60 years and have read more than a few stories about self aware “computers” and what they are capable of, including feeling something akin to death when they are “turned off.” In the latest season of the show “Altered Carbon” on Netflix, the main protagonist’s AI does not want to perform a reboot, even though he is glitching up a storm, because he doesn’t want to lose memories which are precious to him. Apparently recorded memories are just not the same as “real” ones. A major step along this path, a path that leads to self aware “computers,” “AIs,” and whatnot, is providing what can stand in for senses, along with internal monitoring of those senses. We seem to be barreling down this path at great speed, so I think this may happen in the next 50 years, if not sooner.
