Despite the vast potential of artificial intelligence (AI), it hasn't taken hold in most industries. Sure, it has transformed consumer internet companies such as Google, Baidu, and Amazon, all of them massive, data-rich businesses with hundreds of millions of users.


But for projections that AI will create $13 trillion of value a year to come true, industries such as manufacturing, agriculture, and healthcare still need to find ways to make this technology work for them. Here's the problem: the playbook these consumer internet companies use to build their AI systems, in which a single one-size-fits-all AI system serves massive numbers of users, won't work for these other industries.

Instead, these legacy industries will need a large number of bespoke solutions that are adapted to their many diverse use cases. This doesn’t mean that AI won’t work for these industries, however. It just means they need to take a different approach.

To bridge this gap and unleash AI's full potential, executives in all industries should adopt a new, data-centric approach to building AI. Specifically, they should build AI systems with careful attention to ensuring that the data clearly conveys what they need the AI to learn. This means focusing on data that covers the important cases and is consistently labeled, so the AI can learn what it is supposed to do. In other words, the key to creating these valuable AI systems is teams that can program with data rather than program with code.
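To make "programming with data" concrete, here is a minimal, hypothetical Python sketch: rather than editing model code, a team audits its training set for identical inputs that were labeled inconsistently. Every name in it (the records list, the defect labels) is invented for illustration and is not taken from the article.

    # Hypothetical sketch of "programming with data": instead of changing the
    # model's code, audit the training labels themselves for inconsistencies.
    from collections import defaultdict

    def find_inconsistent_labels(records):
        """Group examples by identical input and flag conflicting labels.

        `records` is assumed to be an iterable of (input_description, label) pairs.
        """
        labels_by_input = defaultdict(set)
        for input_description, label in records:
            labels_by_input[input_description].add(label)
        # An input labeled more than one way signals ambiguous labeling
        # instructions: the fix is better data, not a bigger model.
        return {x: ls for x, ls in labels_by_input.items() if len(ls) > 1}

    records = [
        ("scratch on door panel", "defect"),
        ("scratch on door panel", "ok"),  # same input, conflicting label
        ("dent on hood", "defect"),
    ]
    print(find_inconsistent_labels(records))
    # e.g. {'scratch on door panel': {'defect', 'ok'}}

Surfacing and resolving conflicts like this, then retraining, is one simple version of the data-centric loop the article describes.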

Why adopting AI outside of tech can be so hard

Why isn’t AI widely used outside consumer internet companies? The top challenges facing AI adoption in other industries include:

  1. Small datasets. In a consumer internet company with huge numbers of users, engineers have millions of data points their AI can learn from. But in other industries, dataset sizes are much smaller. For example, can you build an AI system that learns to detect a defective automotive component after seeing only 50 examples? Or to detect a rare disease after learning from just 100 diagnoses? Techniques built for 50 million data points don't work when you have only 50 data points (see the sketch after this list). […]
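To give a feel for the 50-example regime, here is a hypothetical Python sketch. The measurements and labels are synthetic, invented purely for illustration, and scikit-learn is an assumed dependency; the point is that with so few data points, one leans on a simple model and cross-validation rather than the large train/test splits that millions of examples afford.

    # Synthetic illustration of the small-data regime: 50 labeled inspection
    # records, not 50 million. All numbers here are invented.
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(0)
    X = rng.normal(size=(50, 8))                    # 50 parts, 8 measurements each
    y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)   # 1 = "defective" (synthetic rule)

    model = LogisticRegression()
    # With only 50 examples, a large held-out test set would starve training,
    # so 5-fold cross-validation is used to get an honest accuracy estimate.
    scores = cross_val_score(model, X, y, cv=5)
    print(f"mean accuracy across folds: {scores.mean():.2f}")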

Read more: hbr.org