Know How

  • Self-learning means the system receives initial instructions, but after that it largely learns on its own from the data you continue to feed it.
  • Machine-learning techniques automate model building to iteratively learn from data and to find hidden insights without being explicitly programmed where to look.
  • Specific, human-like tasks means the system can classify and understand objects and recognize human languages, but the tasks it performs are highly specialized. A system that is designed to drive your car cannot change your oil or clean your garage.
  • In an intelligent way describes how the system is able not only to understand input such as text, voice or video, but also to reason and create output consumable by humans.
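The "iteratively learn from data" idea above can be made concrete with a minimal sketch: fitting a line by gradient descent, where the rule y = 2x + 1 is recovered from examples rather than being explicitly programmed in. The data, learning rate, and iteration count here are all invented for illustration.

```python
# A minimal sketch of iterative learning from data: fit y = w*x + b
# by gradient descent on mean squared error. No ML library is assumed.

def fit_line(xs, ys, lr=0.01, epochs=2000):
    w, b = 0.0, 0.0
    n = len(xs)
    for _ in range(epochs):
        # Gradients of mean squared error with respect to w and b.
        grad_w = sum(2 * (w * x + b - y) * x for x, y in zip(xs, ys)) / n
        grad_b = sum(2 * (w * x + b - y) for x, y in zip(xs, ys)) / n
        w -= lr * grad_w
        b -= lr * grad_b
    return w, b

# Synthetic examples drawn from y = 2x + 1: the model "learns" the rule
# from the data instead of being told where to look.
xs = [0, 1, 2, 3, 4]
ys = [1, 3, 5, 7, 9]
w, b = fit_line(xs, ys)
print(round(w, 2), round(b, 2))  # → approximately 2.0 and 1.0
```

The same loop — predict, measure error, adjust — is what production machine-learning systems automate at scale.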

“Automation through cognitive computing will affect every industry, but most periods of industrialization have led to more workers being employed in more valuable positions, not to a net loss of jobs,” explains Schabenberger.

In the legal profession, cognitive computing is already being used to comb through case files and quickly surface the important ones, a process that previously took weeks or months. But after that work is complete, lawyers and legal assistants are still needed in the courtroom and during legal proceedings.

“Before, we went through periods of industrial automation,” says Schabenberger. “Cognitive computing is about knowledge automation. In the past, our technologies replaced brawn. Now they’re replacing brain.”

In many ways, cognitive computing is a natural extension of existing analytics projects. The challenge for business leaders will be to look for areas where cognitive computing can be applied to business problems.

To find areas in your organization that could benefit from cognitive computing, consider where you have a lot of data, where you might need more automated decisions, or where you might need more personalized interactions with fewer business rules. As the examples above illustrate, the biggest gains may come from assisting your employees, not your customers. Where do you have activities and systems that can be automated or simplified using data?

“The potential use cases for cognitive systems are as wide, varied and rich as the imagination,” said Jessica Goepfert, Program Director for Customer Insights and Analysis at IDC. “Wherever cognitive systems are in play, workers and organizations can expect to be impacted by the power of more information, intelligence and automation.”1

These newer and faster capabilities to process text, speech and images essentially provide more data sources for broadening analytics projects with a cognitive component. “Face recognition, text recognition and image recognition are all input for analytics applications,” explains Schabenberger.

Where can you apply deep analytics to human input and automatically produce output that anticipates a need and is easily consumed? If you can think up answers to that question, you might benefit from cognitive computing.

The doctor and nurse are being assisted

Sethi describes a cognitive analytics application in a health care setting: Imagine you walk into the emergency room with red eyes and a fever. Cognitive systems in a triage room can analyze your vitals, correlate them with your medical and travel histories, and predict with accuracy whether you have the common flu, the Zika virus or some other illness.
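As a toy illustration of how such a triage system might weigh evidence, here is a sketch that scores a few candidate diagnoses from observed signs. The conditions, signs, and weights are entirely invented for illustration and are not medical guidance — a real system would learn such weights from clinical data and patient histories.

```python
# Toy evidence-scoring triage: each candidate illness has made-up
# weights for how strongly each observed sign supports it.

PROFILES = {
    "common flu": {"fever": 1.0, "red_eyes": 0.3, "tropical_travel": 0.0},
    "Zika virus": {"fever": 0.8, "red_eyes": 1.0, "tropical_travel": 1.5},
}

def triage(patient):
    scores = {}
    for illness, weights in PROFILES.items():
        # Sum the evidence contributed by each sign the patient shows.
        scores[illness] = sum(w for sign, w in weights.items() if patient.get(sign))
    return max(scores, key=scores.get), scores

# A patient with red eyes, a fever, and recent tropical travel.
patient = {"fever": True, "red_eyes": True, "tropical_travel": True}
best, scores = triage(patient)
print(best)  # → Zika virus (the highest score under these made-up weights)
```

The point is the shape of the computation — correlate vitals with history, rank hypotheses — not the specific numbers.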

As this health care example illustrates, cognitive technologies are able to understand the world around us, read signs and understand what’s happening – but in a highly focused context to complete a narrow but important task.

“The goal of many cognitive systems is to provide assistance to humans without human assistance,” says Schabenberger. “But it is important to think about who is being assisted by automated systems.” In the health care example above, the doctor and nurse are being assisted as much as the patient.

Likewise, you might imagine robots completing customer service calls, but Schabenberger says it is more likely that existing customer service representatives will be given intelligence from a cognitive computing application, which they can then use to improve their offers and service. So, in this case, it’s the person in the call center who’s being assisted. And ultimately the customer gets better assistance too.

We have many steps to take before we can provide reliable assistance to humans without human intervention, but progress is underway. Cognitive systems are already quietly working behind the scenes of many applications. For example, every Google search or Siri interaction is supported by machine learning and cognitive technologies.

The automated gas pump

The automated gas pump killed a lot of (bad) jobs and created many new jobs for more skilled workers.


New Stores Where You Can Buy Stuff but Not Check Out


  • Large-scale machine learning concerns the design of learning algorithms, as well as scaling existing algorithms, to work with extremely large data sets.
  • Deep learning, a class of learning procedures, has facilitated object recognition in images, video labeling, and activity recognition, and is making significant inroads into other areas of perception, such as audio, speech, and natural language processing.
  • Reinforcement learning is a framework that shifts the focus of machine learning from pattern recognition to experience-driven sequential decision-making. It promises to carry AI applications forward toward taking actions in the real world. While largely confined to academia over the past several decades, it is now seeing some practical, real-world successes.
  • Robotics is currently concerned with how to train a robot to interact with the world around it in generalizable and predictable ways, how to facilitate manipulation of objects in interactive environments, and how to interact with people. Advances in robotics will rely on commensurate advances to improve the reliability and generality of computer vision and other forms of machine perception.
  • Computer vision is currently the most prominent form of machine perception. It has been the sub-area of AI most transformed by the rise of deep learning. For the first time, computers are able to perform some vision tasks better than people. Much current research is focused on automatic image and video captioning.
  • Natural Language Processing, often coupled with automatic speech recognition, is quickly becoming a commodity for widely spoken languages with large data sets. Research is now shifting to develop refined and capable systems that are able to interact with people through dialog, not just react to stylized requests. Great strides have also been made in machine translation among different languages, with more real-time person-to-person exchanges on the near horizon.
  • Collaborative systems research investigates models and algorithms to help develop autonomous systems that can work collaboratively with other systems and with humans.
  • Crowdsourcing and human computation research investigates methods to augment computer systems by making automated calls to human expertise to solve problems that computers alone cannot solve well.
  • Algorithmic game theory and computational social choice draw attention to the economic and social computing dimensions of AI, such as how systems can handle potentially misaligned incentives, including self-interested human participants or firms and the automated AI-based agents representing them.
  • Internet of Things (IoT) research is devoted to the idea that a wide array of devices, including appliances, vehicles, buildings, and cameras, can be interconnected to collect and share their abundant sensory information to use for intelligent purposes.
  • Neuromorphic computing is a set of technologies that seek to mimic biological neural networks to improve the hardware efficiency and robustness of computing systems, often replacing an older emphasis on separate modules for input/output, instruction processing, and memory.
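Of the areas above, reinforcement learning — "experience-driven sequential decision-making" — is perhaps the easiest to make concrete in a few lines. Below is a minimal sketch of tabular Q-learning on an invented five-state corridor with a reward at the right end; the environment, learning rate, and discount factor are illustrative assumptions, not a standard benchmark.

```python
import random

# Tabular Q-learning on a tiny corridor: states 0..4, reward 1.0 for
# reaching state 4. The agent learns which way to move purely from
# experienced transitions, never from an explicit map of the task.

N_STATES, GOAL = 5, 4
ACTIONS = (-1, +1)  # move left or move right

q = {(s, a): 0.0 for s in range(N_STATES) for a in ACTIONS}
random.seed(0)

for episode in range(200):
    s = 0
    while s != GOAL:
        a = random.choice(ACTIONS)  # explore at random; Q-learning is off-policy
        s2 = min(max(s + a, 0), N_STATES - 1)  # walls at both ends
        r = 1.0 if s2 == GOAL else 0.0
        # Q-learning update: nudge the estimate toward reward plus the
        # discounted value of the best action from the next state.
        q[(s, a)] += 0.5 * (r + 0.9 * max(q[(s2, b)] for b in ACTIONS) - q[(s, a)])
        s = s2

# Greedy policy extracted from the learned values for non-goal states.
policy = [max(ACTIONS, key=lambda a: q[(s, a)]) for s in range(N_STATES - 1)]
print(policy)  # → [1, 1, 1, 1]: move right in every state
```

The learned values encode the pattern-to-action shift the list describes: nothing told the agent "go right"; the preference emerged from rewarded experience.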

AI pilot projects: How to choose wisely

A few years ago, I was listening to a vendor pitch with a group of enterprise IT veterans. The sell…

AI used to test evolution’s oldest mathematical model

Researchers have used artificial intelligence to make new discoveries, and confirm old ones, about one of nature’s best-known mimics, opening up whole new…

Cognitive RoundUp -18.08.2019 – Global Edition


AI vs. AI: Cybersecurity battle royale

David and Goliath. The Invasion of Normandy. No matter the generation, we all know some of the storied battles that have withstood the…

Machine Learning Used to Interpret Genetic Influence over Behavior

Mice scamper around while searching for food, but genetics may be the hidden hand regulating these meandering movements. Scientists at…

Rapid adoption of artificial intelligence in agriculture

The AI in agriculture market was valued at USD 600 million in 2018 and is expected to reach USD 2.6 billion by 2025.…

Artificial intelligence to predict water scarcity conflicts

An artificial intelligence tool can predict where conflicts related to water scarcity are most likely to arise. Researchers from the…

Simulating Cancer Cells Using Artificial Intelligence


Five Factors Shaping Data Science

As data science evolves, key challenges are driving organizations to seek innovative solutions to compete in the new AI-driven economy.

Looking to AI to understand how we learn

Two parallel quests to understand learning — in machines and in our own heads — are converging in a small group of scientists…