Know How

  • Self-learning means the system receives initial instructions, but after that it learns largely on its own from the data you continue to feed it.
  • Machine-learning techniques automate model building to iteratively learn from data and to find hidden insights without being explicitly programmed where to look.
  • Specific, human-like tasks means the system can classify and understand objects and recognize human languages, but the tasks it performs are highly specialized. A system that is designed to drive your car cannot change your oil or clean your garage.
  • In an intelligent way describes how the system is able not only to understand input such as text, voice or video, but also to reason and create output consumable by humans.
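The phrase "without being explicitly programmed where to look" can be made concrete with a toy sketch. The perceptron below is a minimal, hypothetical example (the data and the hidden rule are invented for illustration): it is never told the rule, only shown labeled examples, and it adjusts its weights until it reproduces the rule on its own.

```python
# Minimal sketch of learning from data rather than explicit programming.
# The training data encodes a hidden rule (label is 1 when x1 + x2 > 1);
# the model is only ever shown the labeled points, never the rule itself.

def train_perceptron(samples, epochs=50, lr=0.1):
    """Learn weights for 2-D points labeled 0/1 from examples alone."""
    w = [0.0, 0.0]
    b = 0.0
    for _ in range(epochs):
        for (x1, x2), label in samples:
            pred = 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0
            err = label - pred           # 0 when the guess was right
            w[0] += lr * err * x1        # nudge weights toward the label
            w[1] += lr * err * x2
            b += lr * err
    return w, b

def predict(w, b, point):
    return 1 if w[0] * point[0] + w[1] * point[1] + b > 0 else 0

data = [((0, 0), 0), ((1, 0), 0), ((0, 1), 0), ((1, 1), 1), ((2, 1), 1)]
w, b = train_perceptron(data)
```

After training, `predict` classifies the examples correctly even though the decision rule was never written down anywhere in the code — it was built iteratively from the data, which is the essence of the machine-learning bullet above.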

“Automation through cognitive computing will affect every industry, but most periods of industrialization have led to more workers being employed in more valuable positions, not to a net loss of jobs,” explains Schabenberger.

In the legal profession, cognitive computing is already being used to comb through and find important case files quickly, a process that could take weeks or months before. But after that work is complete, lawyers and legal assistants are still needed in the courtroom and during legal proceedings.

“Before, we went through periods of industrial automation,” says Schabenberger. “Cognitive computing is about knowledge automation. In the past, our technologies replaced brawn. Now they’re replacing brain.”

In many ways, cognitive computing is a natural extension of existing analytics projects. The challenge for business leaders will be to look for areas where cognitive computing can be applied to business problems.

To find areas in your organization that could benefit from cognitive computing, consider where you have a lot of data, where you might need more automated decisions, or where you might need more personalized interactions with fewer business rules. As the examples above illustrate, the biggest areas of assistance may come from assisting your employees, not your customers. Where do you have activities and systems that can be automated or simplified using data?

“The potential use cases for cognitive systems are as wide, varied and rich as the imagination,” said Jessica Goepfert, Program Director for Customer Insights and Analysis at IDC. “Wherever cognitive systems are in play, workers and organizations can expect to be impacted by the power of more information, intelligence and automation.”1

These newer and faster capabilities to process text, speech and images essentially provide more data sources for broadening analytics projects with a cognitive component. “Face recognition, text recognition and image recognition are all input for analytics applications,” explains Schabenberger.

Where can you apply deep analytics to human input and automatically produce output that anticipates a need and is easily consumed? If you can think up answers to that question, you might benefit from cognitive computing.

The doctor and nurse are being assisted

Sethi describes a cognitive analytics application in a health care setting: Imagine you walk into the emergency room with red eyes and a fever. Cognitive systems in a triage room can analyze your vitals, correlate them with your medical and travel histories, and predict with accuracy whether you have the common flu, the Zika virus or some other illness.
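The triage idea — correlate a patient's vitals with historical cases and predict the likeliest illness — can be sketched as a nearest-neighbor lookup. Everything below (the features, the case history, the diagnoses) is hypothetical and purely illustrative; a real clinical system would use far richer data, many more features, and rigorous validation.

```python
# Toy 1-nearest-neighbor "triage": find the most similar past case
# and return its diagnosis. All data here is invented for illustration.
import math

# Hypothetical past cases: (fever_C, red_eyes 0/1, recent_travel 0/1) -> diagnosis
HISTORY = [
    ((38.0, 0, 0), "common flu"),
    ((38.5, 0, 0), "common flu"),
    ((38.6, 1, 1), "Zika virus"),
    ((39.0, 1, 1), "Zika virus"),
]

def triage(vitals):
    """Return the diagnosis of the closest historical case."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(HISTORY, key=lambda case: dist(case[0], vitals))[1]
```

A patient presenting with fever, red eyes, and recent travel lands closest to the Zika cases; one with only a mild fever lands closest to the flu cases. The point is the shape of the computation — correlating new vitals against history — not the toy data.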

As this health care example illustrates, cognitive technologies are able to understand the world around us, read signs and understand what’s happening – but in a highly focused context to complete a narrow but important task.

“The goal of many cognitive systems is to provide assistance to humans without human assistance,” says Schabenberger. “But it is important to think about who is being assisted by automated systems.” In the health care example above, the doctor and nurse are being assisted as much as the patient.

Likewise, you might imagine robots completing customer service calls, but Schabenberger says it is more likely that existing customer service representatives would be provided with intelligence from a cognitive computing application that they can then use to improve their offers and service to the customers they are assisting. So, in this case, it’s the person in the call center who’s being assisted. And ultimately the customer gets better assistance too.

We have many steps to take before we can provide reliable assistance to humans without human intervention. But steps are underway. Cognitive systems are already quietly working behind the scenes of many applications. For example, every Google search or Siri interaction is supported by machine learning and cognitive technologies.

The automated gas pump

The automated gas pump killed a lot of (bad) jobs and created many new jobs for more skilled workers.


New Stores Where You Can Buy Stuff But Not Check Out


  • Large-scale machine learning concerns the design of learning algorithms, as well as scaling existing algorithms, to work with extremely large data sets.
  • Deep learning, a class of learning procedures, has facilitated object recognition in images, video labeling, and activity recognition, and is making significant inroads into other areas of perception, such as audio, speech, and natural language processing.
  • Reinforcement learning is a framework that shifts the focus of machine learning from pattern recognition to experience-driven sequential decision-making. It promises to carry AI applications forward toward taking actions in the real world. While largely confined to academia over the past several decades, it is now seeing some practical, real-world successes.
  • Robotics is currently concerned with how to train a robot to interact with the world around it in generalizable and predictable ways, how to facilitate manipulation of objects in interactive environments, and how to interact with people. Advances in robotics will rely on commensurate advances to improve the reliability and generality of computer vision and other forms of machine perception.
  • Computer vision is currently the most prominent form of machine perception. It has been the sub-area of AI most transformed by the rise of deep learning. For the first time, computers are able to perform some vision tasks better than people. Much current research is focused on automatic image and video captioning.
  • Natural Language Processing, often coupled with automatic speech recognition, is quickly becoming a commodity for widely spoken languages with large data sets. Research is now shifting to develop refined and capable systems that are able to interact with people through dialog, not just react to stylized requests. Great strides have also been made in machine translation among different languages, with more real-time person-to-person exchanges on the near horizon.
  • Collaborative systems research investigates models and algorithms to help develop autonomous systems that can work collaboratively with other systems and with humans.
  • Crowdsourcing and human computation research investigates methods to augment computer systems by making automated calls to human expertise to solve problems that computers alone cannot solve well.
  • Algorithmic game theory and computational social choice draw attention to the economic and social computing dimensions of AI, such as how systems can handle potentially misaligned incentives, including self-interested human participants or firms and the automated AI-based agents representing them.
  • Internet of Things (IoT) research is devoted to the idea that a wide array of devices, including appliances, vehicles, buildings, and cameras, can be interconnected to collect and share their abundant sensory information to use for intelligent purposes.
  • Neuromorphic computing is a set of technologies that seek to mimic biological neural networks to improve the hardware efficiency and robustness of computing systems, often replacing an older emphasis on separate modules for input/output, instruction-processing, and memory.
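The reinforcement-learning bullet above — shifting from pattern recognition to experience-driven sequential decision-making — can be illustrated with a tiny tabular Q-learning sketch. The environment below is a made-up five-state corridor, not from any real library: the agent discovers by trial and error that stepping right reaches the reward.

```python
# Toy tabular Q-learning on a 5-state corridor. The agent starts at
# state 0; a reward of 1 sits at state 4. It learns a value for each
# (state, action) pair purely from experience.
import random

N_STATES = 5          # states 0..4; reward at state 4
ACTIONS = [-1, +1]    # step left or right
ALPHA, GAMMA = 0.5, 0.9

def step(state, action):
    nxt = min(max(state + action, 0), N_STATES - 1)
    reward = 1.0 if nxt == N_STATES - 1 else 0.0
    return nxt, reward

def train(episodes=200, seed=0):
    rng = random.Random(seed)
    q = {(s, a): 0.0 for s in range(N_STATES) for a in ACTIONS}
    for _ in range(episodes):
        s = 0
        while s != N_STATES - 1:
            # epsilon-greedy: mostly exploit, sometimes explore
            if rng.random() < 0.2:
                a = rng.choice(ACTIONS)
            else:
                a = max(ACTIONS, key=lambda act: q[(s, act)])
            nxt, r = step(s, a)
            best_next = max(q[(nxt, b)] for b in ACTIONS)
            q[(s, a)] += ALPHA * (r + GAMMA * best_next - q[(s, a)])
            s = nxt
    return q

q = train()
# Greedy policy after training: which way to step from each state
policy = [max(ACTIONS, key=lambda act: q[(s, act)]) for s in range(N_STATES - 1)]
```

After training, the greedy policy points right from every non-terminal state. Nothing in the code encodes "go right"; that behavior emerges entirely from rewarded experience, which is the contrast with pattern recognition the bullet draws.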
