
The other night I attended a dinner with a dozen CEOs of AI startups. Once again, I heard a near-universal discomfort with the term “artificial intelligence” as they sipped Pinot Noir and fumbled to describe what they do.
“We’re not really trying to create intelligence that’s artificial,” said the CEO of a product strategy company. Another, who has built AI-based payment technologies, found the term dystopic. “Too many people think AI means the Terminator,” he said.
An investor in AI briefly defended the term as meaning simply machines that people make that are smart. However, after several entrepreneurs challenged her on the term’s vagueness, she conceded it’s hard to define what is meant by “intelligence” when talking about machines—or humans.
“Artificial intelligence” goes back at least to 1956, when computer and cognitive scientist John McCarthy used it as the theme of a conference at Dartmouth College. Since then, as my dining companions suggested, this moniker has taken on a multitude of meanings.
For those who hope to create machines with a conscious intelligence on par with, or better than, that of humans, AI is the goal. Others consider that unattainable (at least in the near term), unnecessary, or possibly dangerous, recalling 2001: A Space Odyssey’s HAL or The Terminator’s antagonist, Skynet.
These AI skeptics argue that we don’t really need machines to replace those cognitive functions that humans will be better at for the foreseeable future—like quick pivots of logic and imagination when the unexpected occurs, or even accurately identifying every image of a cat, something computers still have a hard time doing.
Many advocate a different sort of AI—“Augmented Intelligence”—meaning machines that support and augment human intelligence and endeavors, rather than mimicking or attempting to replace us.
Another way to describe this is “intelligence amplification,” or “IA,” a term first used by cybernetics pioneer William Ross Ashby, also in 1956. Inventor Douglas Engelbart followed with his famous 1962 report, Augmenting Human Intellect. But neither “augmented intelligence” nor “intelligence amplification” has caught on.
Most computer scientists prefer terms like “machine learning” or “deep learning.” Or they refer to core technologies like NLP platforms, predictive APIs, and speech recognition. Most use “AI,” as in “artificial intelligence,” primarily when they’re raising money, since right now investors can’t seem to get enough of AI, whatever they think it is.
“AI is hot, and every company worth its stock price is talking about how this magical potion will change everything,” Om Malik wrote last year in The New Yorker. “If the hype leaves you asking ‘What is AI, really?,’ don’t worry, you’re not alone. I asked various experts to define the term and got different answers.”
None of the alternative terms, of course, come close to the sexiness of “artificial intelligence.” Even “machine learning” isn’t quite right. To the uninitiated, it sounds like dishwashers or blenders that can learn stuff.
So what shall we call this tangle of various sorts of advanced computing that are providing the smarts for everything from autonomous vehicles to lightning-quick trades on financial markets, and can analyze millions of medical images in the blink of a radiologist’s eye?
Is there a more precise and acceptable term? One that would also be cool if incorporated in the title of a film, or as a buzzword in a business plan?
One potentially relevant Latin word is callidus. It means “smart” or “clever,” although it’s a mouthful and doesn’t sound all that impressive. And “Callidus Intelligence,” or “CI,” sounds redundant and flat.
Another thought would be to borrow the Latin title of Alex Garland’s 2015 film Ex Machina. This term literally means “out of the machine,” which sounds promising, and works well abbreviated: “The venture capitalists were very happy to see the company using EM to power its super-smart gizmo.”
I could go on with more candidate terms, but in the meantime I modestly suggest an interim moniker—Deep Computing. It refers to the deepening power and reach of modern computers, with an additional allusion to the profundity of an endeavor that may reframe much of what we do as humans, if not what it means to be human. The word “computing” is less threatening than “intelligence,” and is probably more accurate for machines that aren’t actually intelligent—whatever we mean by that.
“Deep computing” is not as provocative as “artificial intelligence,” of course. This may be a good thing. For now, I’m going to see what happens when I work it into casual conversations. This will have to do until, perhaps, the machines learn enough to come up with a name for themselves.