
A Decade Of AI: Most Defining Moments 2010-20


Be it access to world-standard courses, platforms, libraries, frameworks or hardware, this was the decade when AI went mainstream.

Copyright by www.analyticsindiamag.com

AI — yesteryear’s wildest science fiction is now an integral part of our lives. It wasn’t like this a decade ago. People were talking about, theorising about and experimenting with AI for sure, but what happened in the last decade has made AI far more tangible. This was the decade when AI went mainstream. Be it access to world-standard courses, platforms, libraries, frameworks or hardware — everything just fell into place. And it wouldn’t be an exaggeration to say that what was accomplished in the last ten years single-handedly fortified the foundations of our future.

In this article, we look at a few of the most important breakthroughs that directly or indirectly have made AI a household name.

Convolutions Galore

The year 2012 was one of the most important years in the history of AI. This was the year when the power of convolutional neural networks (CNNs) was truly realised at the famous ImageNet competition, where participants were tasked with accurately classifying the objects in images. “AlexNet”, a CNN designed by Alex Krizhevsky and published with Ilya Sutskever and Geoffrey Hinton, nearly halved the prevailing error rate on ImageNet visual recognition, bringing it down to 15.3 per cent. Around the same time, Google’s neural network taught itself to detect cats with 74.8% accuracy and faces with 81.7% accuracy from YouTube videos. The success of facial recognition in phones and malls today can be traced back to this work in 2012. The improved accuracies gradually allowed researchers to deploy models for medical imaging with great confidence. From retinopathy to cancer diagnosis, from kidney disease to assisted surgery, the field of medicine is gearing up for a very exciting decade ahead.
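The core operation behind CNNs like AlexNet is convolution: a small filter slides across an image and responds wherever the pixels match its pattern. The toy sketch below (pure Python, not AlexNet itself; the vertical-edge kernel is an illustrative choice) shows how a single filter turns an image into a feature map:

```python
# Toy "valid" 2D convolution (cross-correlation), the building block of a CNN.
# Real networks stack many such filters and *learn* their weights from data.

def conv2d(image, kernel):
    """Slide kernel over image (no padding) and return the feature map."""
    ih, iw = len(image), len(image[0])
    kh, kw = len(kernel), len(kernel[0])
    out = []
    for i in range(ih - kh + 1):
        row = []
        for j in range(iw - kw + 1):
            # Dot product of the kernel with the image patch at (i, j).
            s = sum(image[i + a][j + b] * kernel[a][b]
                    for a in range(kh) for b in range(kw))
            row.append(s)
        out.append(row)
    return out

# An image whose right half is bright, and a vertical-edge detector.
image = [[0, 0, 1, 1],
         [0, 0, 1, 1],
         [0, 0, 1, 1]]
kernel = [[1, -1],
          [1, -1]]

feature_map = conv2d(image, kernel)
# The filter responds only at the dark-to-bright boundary in the middle column.
```

A classifier like AlexNet chains hundreds of such learned filters with nonlinearities and pooling, so early layers detect edges and later layers detect object parts.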

The 2017 paper “Attention Is All You Need” by A Vaswani et al. created a cascade effect that enabled machines to understand language like never before. Thanks to the Transformer architecture, AI can now even write fake news and tweets, and has the potential to cause political instability. What followed the introduction of Transformers was Google’s release of the BERT model, which the search giant uses for keyword prediction and even SEO ranking, among many other things. As BERT became the de facto standard for natural language processing (NLP) models, other companies such as Microsoft and NVIDIA started catching up by piling on parameters. While NVIDIA’s Megatron came with 8 billion parameters, Microsoft’s Turing-NLG model had 17 billion parameters. Then OpenAI (now partnered with Microsoft) turned the tables with its GPT models. While GPT-2 showed great promise, the real winner was GPT-3. […]
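The mechanism the Vaswani et al. paper is named after is scaled dot-product attention: each query scores every key, the scores become softmax weights, and the output is a weighted mix of the values. A minimal pure-Python sketch of that single equation (a toy with hand-picked vectors, not a full Transformer):

```python
import math

def softmax(xs):
    """Numerically stable softmax over a list of scores."""
    m = max(xs)
    es = [math.exp(x - m) for x in xs]
    total = sum(es)
    return [e / total for e in es]

def attention(Q, K, V):
    """Compute softmax(Q.K^T / sqrt(d)) . V for lists of vectors."""
    d = len(K[0])
    out = []
    for q in Q:
        # Similarity of this query with every key, scaled by sqrt(d).
        scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d)
                  for k in K]
        weights = softmax(scores)
        # Weighted average of the value vectors.
        out.append([sum(w * v[j] for w, v in zip(weights, V))
                    for j in range(len(V[0]))])
    return out

# One query that aligns with the first of two keys, so the output
# leans towards the first value vector.
Q = [[1.0, 0.0]]
K = [[1.0, 0.0], [0.0, 1.0]]
V = [[10.0, 0.0], [0.0, 10.0]]

out = attention(Q, K, V)
```

Models like BERT and GPT run many such attention "heads" in parallel over every token at once, which is what lets them relate distant words in a sentence without recurrence.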

Read more: www.analyticsindiamag.com

