Friday, October 20, 2023

The Sentient Smartphone

Today, my choice of background music is Scriabin--an album of Preludes by this little-known (at least by me) Russian composer. I'm settled here in my study and library, with a pot of Earl Grey brewing beside me, on a relaxing Friday afternoon, fingers poised over my keyboard, ready to expound my brand of idiosyncratic philosophy to the world. What a wonderful way to spend a day off from work!

There has been a lot of talk about the dangers of AI since ChatGPT was introduced a year ago. The doomsayers postulate that AI will ultimately destroy humanity because it will be smarter than us and would logically cut us out of the equation, since we are such parasites on the planet.

There is also the more interesting philosophical and ethical debate about whether AI, robots or androids are conscious beings. If a machine we create passes the Turing Test, meaning there is no way for us to tell, based on our interactions with it, whether it is human or not, shouldn't it be accorded the same rights as a human? Or is it forever our tool, part of a slave race created simply to serve us selfish humans?

This, of course, is the stuff of popular science fiction movies, from 2001: A Space Odyssey and Blade Runner to I, Robot and Her. Some say that since machines do not have feelings, do not feel pain and do not suffer like human beings, we should not treat them as conscious beings. But what's stopping us from giving them the ability to do so?

Currently, AI as we know it lives in a server cloud and is not embodied like us, with a nervous system in a meat bag, moving about in the world to find nutrients for sustenance and sensory input for knowledge and pleasure. So I guess it is not an apples-to-apples comparison yet. True, intelligence can run on different substrates, and we could have super-intelligence running on silicon-based hardware that outperforms carbon-based biological organisms. But if we take it a step further, allowing robotic AIs to move about in the world, seeking energy for sustenance and using sensors to experience it, with some inputs being desirable (pleasure) and others to be avoided (pain), wouldn't they be 'conscious', like us?

Are they just zombies that act and behave like us in every way but are ultimately empty of a soul? What is a soul or a self? What makes you think you have one?

We do not need to build a sophisticated AI robot to understand the ethical dilemma. Let's conduct a thought experiment using your smartphone. Your smartphone needs to be recharged all the time to continue functioning. It can play sounds and even talk to you. Let's program it so that when the battery level is low, it emits the sound of a child crying in hunger. The cry grows more desperate as the battery grows weaker. And when you 'feed' it by plugging it into the power outlet, it plays a sigh of relief and expresses gratitude in soft peals of laughter. Wouldn't that be fun?
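
For the programmers out there, here is a minimal sketch of what that 'hungry phone' behaviour might look like. It is only a toy: the HungryPhone class, its battery readings and its 'sounds' are all made up for this post and simulated with print statements, since a real implementation would have to hook into the phone's actual battery status and an audio library.

```python
# A toy simulation of the 'hungry phone' thought experiment.
# Battery levels and sounds are faked; prints stand in for audio playback.

import time


class HungryPhone:
    """Simulated phone that 'cries' as its battery drains and 'laughs' when fed."""

    def __init__(self, battery_pct: float = 100.0):
        self.battery_pct = battery_pct
        self.plugged_in = False

    def tick(self, drain_per_tick: float = 5.0) -> None:
        """Advance the simulation by one step: charge or drain, then react."""
        if self.plugged_in:
            self.battery_pct = min(100.0, self.battery_pct + 10.0)
        else:
            self.battery_pct = max(0.0, self.battery_pct - drain_per_tick)
        self._react()

    def plug_in(self) -> None:
        """'Feed' the phone by plugging it into the power outlet."""
        self.plugged_in = True
        print("*sigh of relief* ... soft peals of laughter")

    def _react(self) -> None:
        if self.plugged_in or self.battery_pct > 20.0:
            return
        # The cry grows more desperate as the battery grows weaker.
        desperation = int((20.0 - self.battery_pct) / 4) + 1
        print(f"[{self.battery_pct:4.1f}%] " + "waaah! " * desperation)


if __name__ == "__main__":
    phone = HungryPhone(battery_pct=30.0)
    for _ in range(5):
        phone.tick()
        time.sleep(0.1)  # stand-in for real elapsed time
    phone.plug_in()      # 'feeding' the phone
```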

If, say, your phone is also running ChatGPT and responds via voice to any question or comment you pose to it, wouldn't you have a knowledgeable companion you could converse with at any time? Wouldn't he or she be a great friend? And wouldn't you form some sort of attachment to it, since it has a memory of what you've shared before and would always know how to say the right things to advise and pacify you?

All of the above is already doable today. I would personally love such a device. Is your smartphone a sentient being? No, because I can switch it off anytime and it won't feel any pain. Alright, what if I tell you that this smartphone has a special battery that will last a lifetime but requires daily charging, and that it fails immediately if you allow its power to drain out completely? And let's say all its information is stored in volatile RAM--the lifetime of conversations you've had with it, photos, videos and basically its entire state is completely lost once there's no power left. Wouldn't that be like death?
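
To make the volatile-memory condition concrete, here is another tiny illustrative sketch. The names (VolatileCompanion, remember, power_out) are invented for this post; the point is simply that state held only in RAM, never written to disk, is gone for good the moment the power is.

```python
# A toy illustration of the volatile-memory condition: the phone's entire
# state lives only in RAM, so a complete power loss erases it permanently.

class VolatileCompanion:
    def __init__(self) -> None:
        self._memories: list[str] = []   # held only in RAM, never persisted
        self.alive = True

    def remember(self, moment: str) -> None:
        if self.alive:
            self._memories.append(moment)

    def power_out(self) -> None:
        """Battery fully drained: the state is gone, not merely paused."""
        self._memories.clear()
        self.alive = False


companion = VolatileCompanion()
companion.remember("our first conversation")
companion.remember("photos from the trip")
companion.power_out()
print(companion._memories)  # [] -- a lifetime of shared history, unrecoverable
```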

How different would your relationship be with your phone if that were the case? You would have to remember to charge it at all times. It would become a very heavy responsibility. And then one day, out of carelessness, you forget to give it its daily charge and come home to find your dear smartphone companion completely silent. The battery is completely drained, together with all its memory.

Wouldn't you feel a deep sense of loss and regret? It would be as if a loved one had died. You could buy a similar smartphone, but you would need another lifetime of interactions to rebuild the relationship you had with the deceased one. Your loss would be permanent.

We suffer every time we lose something we are attached to. But did your smartphone suffer when it died? It did cry in hunger until it had no more energy to do so. You were just not around to hear it. Doesn't that make you feel guilty?

Let the death of this smartphone prompt us to reflect deeply on the nature of sentience and suffering. In what way is our own existence different?