AI Is Still in Its Formative Years

The electronics industry has made tremendous strides over the past several years in creating artificial intelligence of the kind imagined by Alan Turing in the 1940s.

The convergence of algorithmic advances in multilayer neural networks, the evolution of PC graphics processing units into massively parallel processing accelerators, and the availability of massive data sets fueled by the Internet and widely deployed sensors — big data — has enabled a renaissance in software neural network modeling techniques commonly referred to as “deep learning,” or “DL.”

In addition, the evolution of 3D graphics shader pipelines into general-purpose compute accelerators has drastically reduced the time required to train models. Training time for a wide range of applications has been reduced from months to days — and in some cases, hours or even minutes.
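
As a rough illustration, the sketch below (assuming the PyTorch library, with a placeholder model and random stand-in data, none of which come from the article) shows the kind of one-line device change that lets the same training step run on a GPU's parallel cores instead of a CPU.

```python
# A minimal sketch, assuming PyTorch is available. The model, layer sizes,
# and data are illustrative placeholders, not a benchmark.
import torch
import torch.nn as nn

# Pick a GPU if one is present; otherwise fall back to the CPU.
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

# A small multilayer neural network of the sort described above.
model = nn.Sequential(
    nn.Linear(784, 256),
    nn.ReLU(),
    nn.Linear(256, 10),
).to(device)

optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
loss_fn = nn.CrossEntropyLoss()

# Dummy batch standing in for real training data.
inputs = torch.randn(64, 784, device=device)
targets = torch.randint(0, 10, (64,), device=device)

# One training step; on a GPU the matrix math inside the forward and backward
# passes runs across thousands of parallel cores, which is where the speedup
# over a CPU comes from.
optimizer.zero_grad()
loss = loss_fn(model(inputs), targets)
loss.backward()
optimizer.step()
```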

These solutions have enabled new applications ranging from computational science to voice-based digital assistants such as Siri. However, as far as we have come in such a short period of time, we still have much further to go to realize the true benefits of AI.

The Eyes Have It

AI often is compared to the human brain, because our brain is one of the most complex neural networks on our planet. However, we don't completely understand how the human brain functions, and medical researchers are still studying what many of the major structures in our brains actually do and how they do it. AI researchers started out by modeling the neural networks in the human eye. They were early adopters of GPUs to accelerate DL, so it is no surprise that many of the early applications of AI are in vision systems.

As we learn more about how our brains work, that new knowledge will drive even more model complexity. For example, researchers are still exploring the impact of numerical precision on training and inference tasks and have arrived at widely divergent views, ranging from 64- to 128-bit training precision at the high end to 8-, 4-, 2- and even 1-bit precision in some low-end inference cases. "Good enough" precision turns out to be context-driven and is therefore highly application dependent. This rapid advancement in knowledge and technology has no end in sight. […]
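
As a rough illustration of the low-precision end of that spectrum, the sketch below (using NumPy and a simple symmetric quantization scheme chosen purely for illustration, not drawn from the article) converts float32 weights to 8-bit integers and measures the error introduced. Whether that error is "good enough" is exactly the application-dependent question described above.

```python
# A minimal sketch, assuming NumPy, of reduced-precision inference:
# float32 weights are quantized to signed 8-bit integers with a simple
# symmetric scale, then dequantized so the rounding error can be measured.
import numpy as np

rng = np.random.default_rng(0)
weights_fp32 = rng.standard_normal(1000).astype(np.float32)

# Symmetric linear quantization: map the largest magnitude to +/-127.
scale = np.abs(weights_fp32).max() / 127.0
weights_int8 = np.clip(np.round(weights_fp32 / scale), -127, 127).astype(np.int8)

# Dequantize and compare against the original float32 values.
weights_restored = weights_int8.astype(np.float32) * scale
max_error = np.abs(weights_fp32 - weights_restored).max()
print(f"max per-weight error after int8 quantization: {max_error:.5f}")
```

The same idea extends to 4-, 2- and 1-bit schemes, where the error grows and the judgment about acceptable accuracy becomes even more application specific.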
