Stop Calling It Intelligence

Written by vsbmeza | Published 2017/07/10
Tech Story Tags: artificial-intelligence | ai | etymology | future | future-of-work


The term Artificial Intelligence was coined in 1956, when we had little, if any, practical experience in the field. Since then, novelists, journalists and optimistic scientists have fuelled the misconception that it names a system that replaces the human way of thinking.

Recently I’ve had to try and explain this concept on more than one occasion.

Will it replace humans soon?
When will code write itself?
Will AI just take all the work off your hands?
Can we plug AI into our system to make our product better?

I’ve also had to push back against trend-induced investor demands to put AI into everything.

Dozens of questions like these, from all sorts of people, have led me to firmly believe that the terminology has been wildly abused and should be changed.

What we refer to as AI has nothing in common with what people believe it to be. Can it solve problems? Yes. Can it recognise patterns? Yes. Can it think? No. Is it intelligent? No.

It’s a very clever mathematical tool used in very specific situations. It adjusts for its mistakes over time to get better at the one thing it’s trained to do. You give it well-specified inputs, and it gives you some output which you then need to make sense of.

A good example is a coin machine. You throw in some coins and it knows how much you’ve put in. An “AI” is practically that, except that instead of having a declarative decision rule programmed into it, its creators have fed the software hundreds of thousands of examples to adjust its decision system to the point where it sort of works.
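
To make that contrast concrete, here is a minimal sketch in Python. The coin denominations, diameters and the single-threshold “training” are all invented for illustration; real systems fit far more parameters from far more data, but the shape of the difference is the same.

```python
# A hypothetical coin sorter, two ways. All diameters, labels and
# example measurements below are invented for illustration.

# 1) Declarative: a person writes the decision rule by hand.
def classify_by_rule(diameter_mm):
    return "10c" if diameter_mm < 20.0 else "50c"

# 2) Learned: the rule (here, a single threshold) is derived from
#    labelled examples instead of being written down directly.
def train_threshold(examples):
    tens = [d for d, label in examples if label == "10c"]
    fifties = [d for d, label in examples if label == "50c"]
    # Place the boundary halfway between the two class averages.
    return (sum(tens) / len(tens) + sum(fifties) / len(fifties)) / 2

examples = [(18.9, "10c"), (19.2, "10c"), (24.1, "50c"), (23.8, "50c")]
threshold = train_threshold(examples)

def classify_learned(diameter_mm):
    return "10c" if diameter_mm < threshold else "50c"

print(classify_learned(19.0))   # -> 10c: it generalises from examples,
                                #    but only for this one narrow task
```

Neither version understands coins; the second one simply had its single number tuned by data instead of by a programmer.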

I don’t want to go into specifics; you can find plenty of resources online that explain how these systems work. The bottom line is: an image recognition system will never, on its own, develop the ability to recognise music unless someone first turns the sound into an image (a spectrogram, for instance). It cannot think!

Also, it’s not plug and play. You can’t just take “an AI” off the shelf and replace your workforce. You could make your workforce’s life easier by using sophisticated systems to provide extra information, but that’s about it.

So here’s where my head’s at:

  • The name is ambiguous and misleading
  • It doesn’t do what people think it does
  • It’s a lot of work to create and use
  • People create it
  • People train it
  • People use it

In my book, there’s nothing Artificial about it, and the intelligence sits with the people creating, training and working with the system.

Can we start calling it Advanced Analytics instead?

But Meza, what about all the nice predictions it can do?

For that, I have to reiterate: the predictions are observed and recognised by the people or the systems consuming the output of the “AI”. You get a set of numbers that mean something, which you then have to interpret and explain.
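
As a toy illustration (the labels and scores below are made up), here is what “a prediction” actually looks like before a person decides how to read it:

```python
# Hypothetical raw output of an image classifier: just numbers.
scores = {"cat": 0.71, "dog": 0.23, "fox": 0.06}

# The "prediction" only exists once a person (or downstream code)
# decides how to read these numbers: pick the maximum, apply a
# confidence cutoff, and so on. Both rules here are human choices.
best_label, best_score = max(scores.items(), key=lambda kv: kv[1])
if best_score >= 0.5:
    print(f"probably a {best_label}")
else:
    print("not confident enough to say")
```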

But Meza, it learns by trial and error like a human!

Does it take you two million examples to learn to recognise a specific coin, and even then never with 100% accuracy?

(Edit) I know that, academically, the term Artificial Intelligence can justifiably be applied. I’m not questioning that. What I’m trying to work out here is how to make it less ambiguous and misleading for non-academic people.

I had to get this off my chest. Let me know if you agree or disagree. I would love to find a better term for this. Do you have any ideas?

