
What is Artificial Intelligence?


Copyright by SwissCognitive

The term artificial intelligence (AI) refers to computing systems that perform tasks normally considered within the realm of human decision making. These software-driven systems and intelligent agents incorporate advanced data analytics and Big Data applications. AI systems leverage this knowledge repository to make decisions and take actions that approximate cognitive functions, including learning and problem solving.

AI, which was introduced as an area of science in the mid-1950s, has evolved rapidly in recent years. It has become a valuable and essential tool for orchestrating digital technologies and managing business operations. Particularly useful are advances such as machine learning and deep learning.

It’s important to recognize that AI is a constantly moving target. Things that were once considered within the domain of AI – computer chess, for example – are now considered routine computing. Today, real-time analytics tools and various connected systems within the Internet of Things (IoT) all tap AI in order to deliver more advanced features and capabilities.

Helping develop AI are the many companies that offer cloud-based AI services. Statista projects that the AI market will grow at an annual rate exceeding 127% through 2025.

By then, the market for AI systems will top $4.8 billion. Consulting firm Accenture reports that AI could double annual economic growth rates by 2035 by “changing the nature of work and spawning a new relationship between man and machine.” Not surprisingly, observers have both heralded and derided the technology as it filters into business and everyday life.

History of Artificial Intelligence: Duplicating the Human Mind

The dream of developing machines that can mimic human cognition dates back centuries. In the 1890s, science fiction writers such as H.G. Wells began exploring the concept of robots and other machines thinking and acting like humans.

It wasn’t until the early 1940s, however, that the idea of AI began to take shape in a real way. After Alan Turing introduced the theory of computation – essentially how algorithms could be used by machines to produce machine “thinking” – other researchers began exploring ways to create AI frameworks.

In 1956, researchers gathering at Dartmouth College launched the practical application of AI. This included teaching computers to play checkers at a level that could beat most humans. In the decades that followed, enthusiasm about AI waxed and waned.

In 1997, a chess-playing computer developed by IBM, Deep Blue, beat reigning world chess champion Garry Kasparov. In 2011, IBM introduced Watson, which used far more sophisticated techniques, including machine learning and natural language processing, to defeat two top Jeopardy! champions.

Although AI continued to advance over the next few years, observers often cite 2015 as the landmark year for AI. Google, Amazon Web Services, Microsoft Azure, and others began to step up research and improve AI capabilities, computer vision, and analytics tools.

Today, AI is embedded in a growing number of applications and tools. These range from enterprise analytics programs and digital assistants like Siri to autonomous vehicles and facial recognition systems.

Different Forms of Artificial Intelligence

Artificial intelligence is an umbrella term that refers to any and all machine intelligence. However, there are several distinct and separate areas of research and use – though they sometimes overlap. These include:

  • General AI. These systems typically learn from the world around them and apply data in a cross-domain way. For example, DeepMind, now owned by Google, used a neural network to learn how to play video games in much the same way humans do.
  • Natural Language Processing (NLP). This technology allows machines to read, understand, and interpret human language. NLP uses machine learning and semantic programming to understand grammar and syntax and, in some cases, the emotions of the writer or of those interacting with a system such as a chatbot.
  • Machine perception. Over the last few years, enormous advances in sensors – cameras, microphones, accelerometers, GPS, radar and more – have powered machine perception, which encompasses speech recognition and the computer vision used for facial and object recognition.
  • Robotics. Robotic devices are widely used in factories, hospitals and other settings. In recent years, drones have also taken flight. These systems – which rely on sophisticated mapping and complex programming – also use machine perception to navigate through tasks.
  • Social intelligence. Autonomous vehicles, robots, and digital assistants such as Siri require coordination and orchestration. As a result, these systems must have an understanding of human behavior along with a recognition of social norms. […]
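The NLP item above can be made concrete with a minimal sketch. Production NLP systems use trained statistical or neural models; the hand-written lexicon, function names, and sample sentences below are purely illustrative assumptions, not any vendor's actual implementation.

```python
import re

# Hypothetical toy lexicon mapping words to sentiment scores.
# Real systems learn such associations from large training corpora.
LEXICON = {"good": 1, "great": 2, "helpful": 1, "bad": -1, "awful": -2}

def tokenize(text):
    """Lowercase the text and split it into word tokens."""
    return re.findall(r"[a-z']+", text.lower())

def sentiment(text):
    """Sum lexicon scores over tokens: >0 positive, <0 negative."""
    return sum(LEXICON.get(tok, 0) for tok in tokenize(text))

print(sentiment("The assistant was great and very helpful"))  # 3
print(sentiment("An awful experience"))                       # -2
```

Even this toy version shows the basic NLP pipeline the bullet describes: segment raw text into units, then map those units to meaning (here, a crude sentiment score) – steps a chatbot performs with far richer models of grammar, syntax, and context.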
