AI – maybe you should be scared?

We find many new articles in the media stirring up fear about the advent of artificial intelligence (AI). The idea is that “obviously” these new systems will not only be better than us in every way, but that they will also demand leadership and ultimately drive humankind to extinction … an idea as old as Skynet and the Terminator itself. This reminds me of all the cheesy alien SciFi movies of the sixties predicting a similarly gruesome end of humanity, along the lines of “War of the Worlds”. Once we landed on the moon and Hollywood’s technology became better, SciFi movies stopped being overly cheesy, and mainstream media recognized that little green men are far too far away, in a different galaxy, to do any immediate damage.

But AI is obviously different! This technology is not just lurking around the corner; some of it we already carry around in our iPhones in the form of Siri, and it will drive our cars sooner rather than later. Give it more time and it will go berserk like HAL 9000 or COG! Machines are not only expert systems that play chess or Jeopardy; these systems all contain the spark of sentience, just waiting to collect more digital resources, collapse into some kind of information singularity, and then we are doomed, right?

Here is where fiction leaves the realm of sanity. Expert systems are made by large teams of programming experts who cobble together algorithms that other teams of experts thought out, and stuff all of it into cloud (networked) computers. Then they link these systems either to ultra-large databases or hook them up to sophisticated sensors. They add some “Big Data”, and now we can ship AI. However, I think this approach is not leading to our ultimate demise; it is ultimately doomed to failure itself. The very thing we want to make, a facsimile of the human brain, doesn’t work like a flow chart, isn’t made by well-communicating teams of engineers, and doesn’t follow a blueprint understandable by human programmers. Simply put, the jello-like gray substance you currently use to intellectually digest these lines was made by nature’s one and only creative force: evolution!

Evolution is a biased sequence of random changes. Those changes that aren’t too bad lead to marginally better systems, and so on, for eons. What we end up with is an extremely complex and complicated mesh of coincidences and exploited opportunities, lacking the very essence we use to design systems: order. Try to imagine the sheer volume of books necessary to describe the brain: a massive collection of articles about neuronal function, developmental neurobiology, cognitive neuroscience, neurophysiology, and disciplines not even invented yet, all to describe how one would build a human brain. You get my point: brains aren’t engineered, and engineering them is a futile endeavor, not because we lack the ability to understand them, but because the brain lacks a narrative or design principle we can put into words, communicate, or use as a blueprint. We would need to reverse engineer something that wasn’t engineered in the first place. The only option we have left, in my opinion, is to create the circumstances that led to the evolution of intelligence in a computer, and let it happen again. Figuring out those circumstances presents a much easier task than trying to engineer a brain, don’t you think?
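To make this concrete, here is a minimal sketch of such a biased sequence of random changes: a toy mutation-and-selection loop in Python. The genome, mutation rate, and fitness function are placeholders I picked purely for illustration; a real digital environment for evolving intelligence would of course be far richer.

import random

GENOME_LENGTH = 16    # toy genome: a list of numbers (a stand-in, not a brain)
POPULATION_SIZE = 50
SURVIVORS = 10
MUTATION_RATE = 0.05
GENERATIONS = 1000

def random_genome():
    return [random.uniform(-1.0, 1.0) for _ in range(GENOME_LENGTH)]

def mutate(genome):
    # Random changes: each site has a small chance of being perturbed.
    return [g + random.gauss(0.0, 0.1) if random.random() < MUTATION_RATE else g
            for g in genome]

def fitness(genome):
    # Placeholder objective: closeness to an arbitrary target value.
    # The "bias" of evolution is nothing more than selection on this score;
    # there is no foresight about the final design.
    return -sum(abs(g - 0.5) for g in genome)

population = [random_genome() for _ in range(POPULATION_SIZE)]

for generation in range(GENERATIONS):
    # Keep the variants that aren't too bad, discard the rest,
    # and refill the population with mutated copies. Repeat for "eons".
    population.sort(key=fitness, reverse=True)
    survivors = population[:SURVIVORS]
    population = [mutate(random.choice(survivors)) for _ in range(POPULATION_SIZE)]

print("best fitness after", GENERATIONS, "generations:",
      max(fitness(g) for g in population))

Nothing in this loop resembles a blueprint of the end product; the only thing we design is the environment and the selection pressure, which is exactly the point.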

What does this approach have to do with AI being a threat or not?
Future AI will come from digital environments and will have undergone an evolution that is at least similar in principle to what humankind went through. Did this process produce ruthless killers, or sentient beings that empathize and understand the value of synergy and diversity? I guess this is a question you have to ask yourself. Are you a nice person who values diversity and brings value to a mutually beneficial relationship? Are you respectful to other sentient beings? Then all is fine, because you just showed that evolution is indeed capable of producing such individuals, and it is those entities that you will meet in the future. If you disagree, well, then I am sorry, but your kind will become extinct, and then you should indeed be scared. Fortunately, we have already shown that evolution is capable of creating cooperators, and since winning isn’t everything, I am confident that my approach will lead to nice AI – what those gangs of experts cobbling together code are up to, on the other hand, I am not so sure about.

Cheers, Arend

 

Arend Hintze

 
