AI is eating the world


I recently gave a talk in Paris to a group called Women in the Digital Economy, or WIDE. They asked me to talk about the future of technology, with special attention to AI. That turned out to be relatively straightforward since I have been working with Traction on developing a map for understanding the technologies that are shaping the future, as described in Six Points On The Map Of Emerging Tech. I took the early research and analysis in that project and presented it in Paris. My main contention is that AI is eating the world. This is the presentation, and more or less what I said in the talk.

I was asked to talk about technologies shaping our future.

I’ll start with the bottom line first:

AI is eating the world.

End of talk. [I mimed a mic drop and started to walk off. (PS: I'm not sure the French know what a mic drop is.) Then I returned to the podium.]

No, I have a lot of things to say, but it’s good to have the end in mind as we start out on our journey today.

‘We make our tools and they shape us’. A great, great line, often attributed to Marshall McLuhan, but he never wrote it. It was a colleague, John Culkin, writing about McLuhan’s work.

Tools can have outsized impacts. The printing press was the prologue to the Renaissance, electricity sparked the industrial revolution, and the smartphone has dialed us into a brave new world.

We have been culturally, economically, and philosophically remade by the tools we hold, use, and think through.

But we seldom think of books or metal or plumbing as technology, because we have become blasé, overlooking the artifacts of a technology-overloaded world.


Just for example: what is the most common wearable technology in the world? Fabrics, not Fitbit. And don't forget eyeglasses and jewelry.

As Cascio’s quote points out, we naturally don’t notice technologies that we grew up with as such.

And what about predicting the future of technology — which is what I am charged with today?

John Feffer wrote, in his recent scifi work, Splinterlands:

It’s always safe to assume that [the future] will be more of the same. Just faster, cheaper, and more out of control.

And that’s where we’re headed.

I have been working with Traction to create a map, one that captures technologies and trends, and that will help to understand what is likely to come.

But the map is not the territory.

However, like all models, a map can help us gain insight and suppress unneeded details to highlight essential factors.

We started with a list of emerging technologies—not necessarily even the right ones.

I asked a sample of a few hundred highly qualified folks (CIOs, directors of IT, business leaders) a handful of questions about the importance of various technologies and their rate of adoption.

I was hoping to find out which technologies are foundational: tech that other tech is built upon. And to find the intertwined core of foundational technologies that defines an era of technological advance.

That’s what we’ll find in that overlap in the chart.

What surprised me:

VR/AR: maybe too linked to gaming for people to imagine its impact on the out-of-the-office workforce, like construction, manufacturing, retail, service, and health care. That is the non-white-collar workforce, those doing hard work, not executives, creatives, or so-called knowledge workers. VR/AR will probably need to be broken apart, not treated as one technology.

Robotics: really should be counted as part of AI, because it is more important in that context, and is already widely deployed.

Containers and DevOps: should be part of Cloud.

Some might score better if collated: Conversational Marketing, Chatbots, and Voice Interfaces, for example.

If we limit to just the big six, that [above] is what we see.

Above are the rate-of-adoption results.

It lines up pretty well with importance, except for AI, because AI hasn't really hit yet.

If we plot the big six rated most important against the numbers for rate of adoption, here’s what we see [above]. Let’s walk through, one by one.

Cybersecurity is unlike the others: it's defensive. It's all cost, with no benefit aside from avoiding the depredations of criminals. It's insurance.


Some stats:

  • The global cost of cybercrime will reach $2 trillion by 2019, a threefold increase from the 2015 estimate of $500 billion.
  • Last year, IDG reported 38 percent more cybersecurity incidents than the year prior.
  • Just half of the global organizations PwC surveyed reported that they already use advanced big data analytics to model for and identify threats.
  • Meanwhile, AI and machine learning techniques are adding significant muscle to fraud detection and application security efforts.
  • An Osterman Research survey of 540 organizations in North America, the U.K. and Germany revealed that nearly half had sustained ransomware attacks in the last year, like Wannacry.
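The fraud-detection bullet above is worth making concrete. A minimal sketch of the underlying idea, anomaly detection, in plain Python (all transaction amounts are invented for illustration): model what normal looks like, then surface what doesn't fit.

```python
import statistics

def flag_anomalies(amounts, threshold=3.0):
    """Flag values more than `threshold` standard deviations from the
    mean: the crudest possible 'fraud score'. Real systems learn far
    richer features, but the idea is the same -- model normal behavior,
    then surface what doesn't fit it."""
    mean = statistics.fmean(amounts)
    stdev = statistics.pstdev(amounts)
    return [x for x in amounts if stdev and abs(x - mean) / stdev > threshold]

# Routine card transactions, plus one that doesn't fit the pattern.
normal = [12.5, 9.99, 14.2, 11.0, 13.7, 10.5, 12.0, 9.5, 11.8, 13.1]
flagged = flag_anomalies(normal + [900.0])
```

The machine-learning versions of this replace the hand-picked threshold with models trained on millions of labeled transactions, which is where the "significant muscle" comes from.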

One fact shapes all other discussion about AI's coming impact: the experts are divided, with roughly half believing AI (if left unregulated) will lead us to a world of widespread joblessness, and roughly half believing we will invent new work and retrain those dispossessed.

I fall into the first group, the AI Doubters. I'll quote Ben Bernanke, the former head of the Federal Reserve, who is a Doubter as well:

You have to recognize realistically that A.I. is qualitatively different from an internal combustion engine — in that it was always the case that human imagination, creativity, social interaction, those things were unique to humans and couldn’t be replicated by machines. We are coming closer to the point where not only cashiers but surgeons might be at least partially replaced by A.I.

McKinsey estimates that as much as 30% of all occupations could be replaced by AI, and other research shows that hundreds of occupations are at risk of being completely taken over by AI.

This is a terrible time to become an Uber driver, but all roles where humans aren’t very good at the work are at risk. There are lots of things we do where our cognitive biases and limitations get in the way—like hiring and promoting people, and reviewing luggage scans at the airport — where AI is likely to do much better at a fraction of the cost.

The evidence of history shows that those whose livelihoods are taken over by others (whether by AI, economic obsolescence, or offshoring) have a very hard time shifting into new work. Take a look at Amy Goldstein's Janesville, for example, about the GM plant that closed in Janesville, Wisconsin in 2008: most of the laid-off workers never made a strong transition, ending up in worse jobs with less pay. They are the Left Behinds.

AI will reach into the foundational layers of the next generation of software and hardware, and AI’s tendrils will reach into every niche of business and society, unless we create regulations to slow or prohibit it.

Marc Andreessen famously said ‘Software is eating the world’. Jensen Huang, CEO of Nvidia, says ‘AI is eating software’. So, by extension, AI is eating the world.

AI will impact everything in the economy from the bottom to the top.

  • Driverless cars are largely an AI problem, once the lidar sensors become cheap enough to deploy at a mass scale.
  • Andrew Ng, of Stanford and Google, says ‘AI is the new electricity’.
  • The internet giants are designing chips that will allow AI-intensive software to run thousands of times more efficiently on computing devices, like our smartphones, cars (yes, cars are computing devices, just with wheels and motors), and services in the cloud, ranging from Netflix personalization to medical diagnostic tools. It will become foundational, like the communications protocols that make the Internet work.

A few data points:

  • In March, BlackRock (the largest investment house in the world), handed control of $30B in assets to AI-based software, laying off 37 fund managers, and turning an additional 50 or so managers into ‘advisors’.
  • The US Army is training AI to shoot by monitoring the brainwaves of skilled soldiers: yes, killer robots are coming.
  • Researchers are turning to AI to handle the growing backlog and bottlenecks in science, because there is too much research being done that isn't being analyzed by other scientists, and at the same time the level of effort needed to create significant breakthroughs has been increasing for decades.


In terms of the workplace, very soon we will be relying on AI, underlying or fronting applications, to get our jobs done. In the very near term we'll start to see AI (or bots) as coworkers, and at some point a bot may ask us to help it get something done, or task us with work for a project the AI is heading up.

Driverless is coming to work.

Cloud computing is a foundational and rapidly maturing technology.


Some stats:

  • Data centers are still the norm in enterprise IT, but cloud spend could soon catch up.
  • Cloud computing will get close to 100% of data center spend in less than 10 years according to Mary Meeker’s 2017 Internet Trends report.
  • Software as a Service (SaaS), especially, will see a big boost, with many organizations turning to the model. According to another report from BetterCloud, some 73% of organizations said more than 80% of their apps will be SaaS by 2020.
  • Public and private cloud spending both increased, but concerns are shifting from security to compliance and lock-in, the report said.
  • Cloud growth is also enabling innovations in edge computing, elastic databases, containers, and microservices, which are changing the way IT thinks about infrastructure.
  • Amazon Web Services has slowing revenue growth, but the reason is that major companies are buying multiple years in advance to get steep discounts. That locks them in, and gives Amazon smooth revenue forecasts.

Concerns about security in the cloud are falling:

Google has 600 engineers focusing full-time just on security. ‘When you have that kind of scale, people realize big public clouds are the most secure place to be’, Google cloud chief Diane Greene said.

I really think of Big Data and Data Analytics as twins. All the interesting challenges in analytics have to do with big data.

However, the fact is that data as an application area is being confronted with the limitations of conventional approaches. Old-school Business Intelligence software is still in use at many companies, and more modern data analytics/big data tools like Splunk, Hadoop, and Cloudera are being deployed at larger companies. The new bottleneck is the supply of data scientists to make these new tools go. And behind it all is the exponential growth in data and sources of data, such as is likely to come from IoT.


Data centers are becoming a cloud service, where companies don’t have to buy and build hardware and deploy software: they are buying it by the compute-minute.

Companies are demanding ‘web scale IT’ — like what they get from Amazon, Google, Facebook, and Microsoft — and it’s unclear that these new ‘big data’ companies can deliver.

Lastly, this is another area where AI may eat the lunch of older techniques. With so much data, and more arriving so quickly, data scientists and quants may not be able to keep up. At some point (which may already be here for really large multinationals) there will be so much data that the only way to make sense of it, to determine how to divide it up and analyze it, to find correlations and make predictions, will be to point machine learning at it and get out of the way.
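What "pointing machine learning at it" means, at its simplest: unsupervised methods can carve data into groups without being told what to look for. Here is a bare-bones k-means sketch in plain Python (the toy "customer" data is invented for illustration); real deployments use distributed tooling over far richer data, but the principle is the same.

```python
import random

def kmeans(points, k, iters=20, seed=7):
    """Bare-bones k-means: assign each point to its nearest centroid,
    then move each centroid to the mean of its cluster, and repeat."""
    rng = random.Random(seed)
    centroids = rng.sample(points, k)
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            nearest = min(range(k),
                          key=lambda j: sum((a - b) ** 2
                                            for a, b in zip(p, centroids[j])))
            clusters[nearest].append(p)
        centroids = [tuple(sum(vals) / len(cl) for vals in zip(*cl))
                     if cl else centroids[j]
                     for j, cl in enumerate(clusters)]
    return centroids, clusters

# Two obvious blobs of data: the algorithm finds them without being
# told which point belongs where -- no analyst in the loop.
blob_a = [(1.0 + x * 0.1, 1.0 + y * 0.1) for x in range(3) for y in range(3)]
blob_b = [(8.0 + x * 0.1, 8.0 + y * 0.1) for x in range(3) for y in range(3)]
centroids, clusters = kmeans(blob_a + blob_b, k=2)
```

The appeal for overloaded analytics teams is exactly this: the structure falls out of the data rather than out of a human's hypothesis.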

The Internet of Things includes a number of immensely important and exploding areas, involving networks of devices, vehicles, buildings and other physical and virtual nodes, including the hardware and software infrastructure and protocols to allow them to communicate to each other and with traditional computing services and people.

In the near future, everything that is electrical will be a candidate for IoT instrumentation, given the convergence of miniaturization, 5G communications, low-power chips, and AI.

Moving things:

  • Driverless transportation
  • Drones
  • Healthcare devices (monitors, IV devices, etc.)
  • Wearables
  • Sensors (GPS, cameras, lidar, etc.)

Stationary things:

  • Sensors, like retail beacons and camera-based visual analysis (AI)
  • Appliances, like TVs, refrigerators, washing machines, ovens, toasters, heating and air conditioning
  • Lights
  • Traffic lights and sensors

Huge things: parts of smart cities, smart homes, electrical grids, power plants.


Some headlines:

  • The smart speaker market is likely to be more than $2.5B in 2017: Amazon became the largest audio brand in Q1 '17 while only available in the US, UK, and Germany.
  • Qualcomm is shipping more than one million chips per day for use in commercial IoT products. For example, more than 125M TVs, home entertainment, and connected home products have shipped with Qualcomm chips.
  • Intel says the ‘passenger economy’ could equal $7T per year
  • Various researchers estimate that we’ll see more than 30B IoT endpoints by 2020, and a total market of $600B by 2023.

I believe all of these projections dramatically underestimate what is coming.

Let’s put it all back together:

So, technologies are either defensive or offensive. Cybersecurity is a defensive set of tools and technologies for the most part; of course there is an offensive side of Cybersecurity as well: the tools and tech that hackers employ.

As my friend Jamais Cascio said, if you want to imagine the fastest application of new technologies, imagine how criminals will use them.

And also on the offensive side, AI has already expanded to subsume Big Data and Data Analytics. And as it continues to grow as a platform underlying new technologies, it will look like this:



So, already today, we have tens of thousands of hackers using AI to try to attack cloud-based data centers, where the internet giants are deploying AI to identify, analyze, and counter those threats. A cyberwar is being waged on our behalf, in the shadows, by implacable artificial enemies.

But that’s just one small corner of the AI economy we are accelerating into.

The only force that could slow AI’s spread would be regulatory, but I don’t see very much action in that direction, despite the expressed fears of AI Doubters like Elon Musk, Stephen Hawking, and — ahem — Stowe Boyd about the possible downsides. Though that would have to be the subject of another talk.

The future is unplanned, unrehearsed, the cumulative effect of billions of individual decisions by billions of individuals and organizations. The transition from today to next week and next year won’t be smooth, predictable, or organized.

The burden, then, is on us to adopt an agile frame of mind, individually, and in our organizations. And keep our eyes open.

Oh, and invest heavily in AI.

It’s far too easy to be complacent, even in the face of overwhelming indications of massively disruptive change on the horizon.

In your business, instead of trying to converge on a single ‘official’ future, explore dozens of alternative, even contradictory futures.

Perhaps you should appoint a '10th man' (of any gender), deputized to consider, and plan responses to, the most dangerous scenarios that threaten your company, even when everyone else thinks those scenarios can never, ever happen.

Remember, Alfred North Whitehead warned us in 1925…

Thank you for your attention.
