Sunday, April 27, 2008

What is it like to be something other than human?

When I was a naïve teen I tried breaking this mental barrier by imagining what it would be like to be dead. I thought I had cracked the enigma when I concluded it was like being in a deep sleep without ever waking up. At the time, I didn’t realize I was in for a meaningless endeavor: it doesn’t make sense to imagine the state of being of a non-conscious entity. There is no point in trying to figure out what it is like to be a rock, because rocks aren’t capable of knowing what it is like to be themselves!

In order to transcend my human state of mind, I have to imagine what it would be like to be another conscious entity, like a dog or a cat. In my opinion, imagining what it would be like to be a dog is a relatively simple task. As animals ourselves, we know what it is like to have desires, emotions, pleasures, pain, and so on. It is still impossible for us to know exactly what it is like to be a dog, but we have a pretty good idea (most dog owners can tell what their dog is feeling without thinking too hard).

I wanted to imagine a state of being that is truly mind-blowing, some state of mind foreign to any conscious creature we know of. The first thing that came to mind was artificial intelligence: an AI whose intelligence far surpasses that of any human. How the hell can we imagine how a super-smart AI would think? If we knew how this AI would think, wouldn’t we be just as smart as it is? I don’t think we can know exactly how it would think, but I am going to take a shot at it anyway.

This AI would be able to change its own source code, i.e. it could reprogram its brain however it wants. You might be wondering how to imagine this on an intuitive level. It would be as if you could fundamentally change the way you think. Since that probably doesn’t make you any less confused, I will give an example. If you relocate the trash bin in your room, there will be many instances where you throw your trash at the bin’s old location. Your mind has been conditioned to expect a bin in a certain place, and sometimes you forget it has moved. Your weak brain has kept you from efficiently throwing away your trash (without wasting time on a misfire). An AI wouldn’t have this problem, because it could erase the conditioning and reprogram itself to adjust to the new environment. Our minds are constantly cluttered with impulses that have been conditioned into them. For instance, if I tell you not to think of a white elephant, you’ll think of one. An AI could choose whether its mind should be vulnerable to such impulses. You can imagine how this would help the AI’s problem-solving skills: it would have no biases and no obstacles to attaining new skill sets.
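To make the trash-bin idea concrete, here is a toy sketch in Python. All of the class and method names here are my own invention for illustration: one agent keeps a lingering, human-like habit that decays slowly, while the other can simply overwrite its own conditioning in one step.

```python
import random

class ConditionedAgent:
    """Human-like: habits fade gradually, so old conditioning lingers."""
    def __init__(self, bin_location):
        # Maps each remembered bin location to the strength of that habit.
        self.habit = {bin_location: 1.0}

    def move_bin(self, new_location):
        # The old habit only weakens; the new one starts out partial.
        for loc in self.habit:
            self.habit[loc] *= 0.5
        self.habit[new_location] = self.habit.get(new_location, 0.0) + 0.5

    def throw_trash(self):
        # Aims at a location in proportion to habit strength -> occasional misfires.
        locations = list(self.habit)
        weights = [self.habit[loc] for loc in locations]
        return random.choices(locations, weights=weights)[0]

class SelfRewritingAgent:
    """AI-like: can erase its own conditioning entirely in one step."""
    def __init__(self, bin_location):
        self.bin_location = bin_location

    def move_bin(self, new_location):
        # No residue of the old habit remains, so no misfires are possible.
        self.bin_location = new_location

    def throw_trash(self):
        return self.bin_location
```

After moving the bin, the `SelfRewritingAgent` always aims at the new spot, while the `ConditionedAgent` still sometimes throws at the old one, which is the whole difference the paragraph above is pointing at.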

The AI would be able to discard bad or faulty ways of thinking and replace them with better ones. The ‘better’ ways of thinking would be the ones that help the AI solve a problem or reach a goal more efficiently. If you were an AI, you might be able to prove Fermat’s Last Theorem in a matter of minutes. I have made a big assumption here: I assumed that the AI would want to do things. We humans constantly solve problems because we must in order to survive. We are faced with challenges, threats of death, scarcity, etc., and that motivates us to solve problems. The AI would have to be programmed with desires similar to ours in order for it to want to solve the same kinds of problems we do. It would be in our interest to program an AI with the same desires as us, because then it would be interested in solving problems that we too care about. Of course, the AI doesn’t have to be programmed to share our desires, but it does have to have some desires. Otherwise it wouldn’t be intelligent, because it wouldn’t do anything!
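The discard-and-replace idea above can also be sketched in toy form. This is my own hypothetical example, not anything from the post: an agent holds a current strategy for a problem (summing 1 through n), counts how many steps each candidate strategy takes, and swaps in a candidate only when it gets the same answer in fewer steps.

```python
def sum_by_looping(n):
    # Naive strategy: one addition per number, so n steps.
    total, steps = 0, 0
    for i in range(1, n + 1):
        total += i
        steps += 1
    return total, steps

def sum_by_formula(n):
    # Gauss's closed form n(n+1)/2: a single step.
    return n * (n + 1) // 2, 1

class SelfImprovingAgent:
    """Keeps one strategy and replaces it whenever a better one appears."""
    def __init__(self, strategy):
        self.strategy = strategy

    def solve(self, n):
        result, _steps = self.strategy(n)
        return result

    def consider(self, candidate, n=1000):
        # Adopt the candidate only if it matches the current answer
        # in fewer steps -- discard the slower way of thinking.
        old_result, old_steps = self.strategy(n)
        new_result, new_steps = candidate(n)
        if new_result == old_result and new_steps < old_steps:
            self.strategy = candidate
```

The correctness check before swapping matters: an agent that adopts any faster strategy, right or wrong, would be rewriting itself into faulty thinking rather than out of it.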

In short, being a super-smart AI would be much like being an intentional agent that can get what it wants far more efficiently. More efficiently because it wouldn’t deal with the handicaps our weak minds have, like poor memory, biases, social conditioning, and every other limitation you learned about in psychology.
