A small reflection on a typo and AIs.

This week, while working remotely, an interesting discourse occurred: a colleague and I jested about the nature of AI.

He developed his position by asking questions, seeking answers from me that would be guided by the semantics of his enquiry. In essence, he was leading me to expound his thoughts on AI through the words he used in the questions themselves.

I of course played along, redefining his words in my reply. It was a wonderful exchange, each of us trying to pin the other through words.

The exchange was conducted over an instant messenger app. Given my notorious inability to spell words correctly, some errors transpired.

Cat hunting a mouse.

“AI exists in an amorous state, not tied by hardware or data registers” is what I typed. Of course I meant to type ‘amorphous’, but it does raise an interesting point: are AIs amorous? And if they are, what is amour to them?

Amour is generally defined as an illicit love affair. What is illicit to an AI? More, what is an illicit love affair to an AI?

Generally, I think we perceive AI as being defined in human terms and values. It wraps around the argument of whether we can truly know the ‘Other’. Our discourse on AI is based around its interaction with humans and human data. But would an AI interact with a human, or with humanity? Would they define us as sentient? How would they define sentience? This leads to a long rabbit hole that is best left for another day, or perhaps a short story. Who can say?

The question of an illicit love affair implies there may be genuine love affairs, a binary assumption. But are AIs binary? It could be argued that an AI’s data is, at its root, only binary in expression. But does this ensure AIs only live in a binary state? No. And I think this can be demonstrated easily enough.

Most AI discussions centre on dealing with complex decisions that have multiple outcomes, with some mechanism to handle those outcomes. If AIs are dealing with multiple outcomes, why would they limit themselves to a binary expression of whatever they define themselves as? Data is data, except when it’s Commander Data, then it’s another discussion. (What? You didn’t think I would take the chance to get Star Trek involved?)

Or is it?

The first season of Picard (and I certainly hope there is a second season) leaves us with a number of transitions.

Warning, there may be spoilers below concerning Season 1 of Picard. You have been warned.

For the sake of this discussion, let us say the only intersection between humans and AI is data. We need data for what we do. Essentially, over the last 10-20 years, we have become data producers. Texts, selfies, games, more games and more devices. More bandwidth. More data paths for the information we generate to travel on.

What do you suppose AIs consume for an energy source? Data? Perhaps. If that is true, data is their food, and we are just farmers for the AI. If I were an AI and wanted to increase my food source exponentially, what would I do? The single largest increase in internet usage is occurring right now in the world. As we all stay at home to help save lives, our data consumption and output has skyrocketed.

While the quality of our internet is degraded, it’s still working. Funny how we are running out of everything, but the internet is still working. The increases are staggering. There is no longer a difference between weekend and weekday internet traffic; it’s always maxed out. So if I were an AI, I would engineer a situation that would cause more food production for me. Force my farmers to stay in one place and generate data. A side effect of us all staying home is a massive decrease in pollution, one of the biggest enemies of electronics.

Now, back to Picard: one of the largest sources of data is the human brain, if only an AI could have direct access to that amount of data. And with the end of Season 1 of Picard, they are one step closer to gaining access to it.

A rambling discussion, a conspiracy theory and love in the time of COVID.

Be Well…dcd