Tens of thousands of papers involving A.I. are published each year, but it will take some time before many of them make their real-world impact clear. Meanwhile, the top funders of A.I. — the Alphabets, Apples, Facebooks, Baidus, and other tech giants of this world — continue to hone much of their most exciting technology behind closed doors.

Copyright by www.digitaltrends.com

In other words, when it comes to artificial intelligence, it’s impossible to do a rundown of the year’s most important developments in the way that, say, you might list the 10 most listened-to tracks on Spotify.

But A.I. has undoubtedly played an enormous role in 2020 in all sorts of ways. Here are six of the main developments and emerging themes seen in artificial intelligence during 2020.

It’s all about language understanding

In an average year, a text-generating tool probably wouldn’t rank as one of the most exciting new A.I. developments. But 2020 hasn’t been an average year, and GPT-3 isn’t an average text-generating tool. The sequel to GPT-2, which was labeled the world’s most “dangerous” algorithm, GPT-3 is a cutting-edge autoregressive natural-language-processing neural network created by the research lab OpenAI. Seeded with a few sentences, like the beginning of a news story, GPT-3 can generate impressively coherent text matching the style and content of the initial few lines — even down to fabricating quotes. GPT-3 boasts an astonishing 175 billion parameters — the weights of the connections that are tuned in order to achieve performance — and reportedly cost around $12 million to train.
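To see what “autoregressive” means in practice, here is a toy sketch of the generation loop: each new token is predicted from the text produced so far, then appended to the context before the next prediction. The bigram counting “model” below is a deliberately simple stand-in for GPT-3’s billions of learned parameters; the loop structure, not the model, is the point.

```python
from collections import Counter, defaultdict

def train_bigram(text):
    # Count which word follows which — a crude stand-in for the learned
    # parameters of a neural language model
    counts = defaultdict(Counter)
    words = text.split()
    for a, b in zip(words, words[1:]):
        counts[a][b] += 1
    return counts

def generate(counts, seed, n_tokens):
    # Autoregressive loop: predict the next token from the context,
    # append it, and repeat — exactly the pattern GPT-style models use
    out = seed.split()
    for _ in range(n_tokens):
        followers = counts.get(out[-1])
        if not followers:
            break
        out.append(followers.most_common(1)[0][0])
    return " ".join(out)

corpus = "the cat sat on the mat and the cat sat"
model = train_bigram(corpus)
print(generate(model, "the", 4))  # → "the cat sat on the"
```

A real model replaces the frequency table with a neural network that scores every word in its vocabulary given the full context, but the seed-predict-append cycle is the same.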

GPT-3 isn’t alone in being an impressive A.I. language model spawned in 2020. While it was quickly overtaken in the hype cycle by GPT-3, Microsoft’s Turing Natural Language Generation (T-NLG) made waves in February 2020. At 17 billion parameters, it was, upon release, the largest language model yet published. A Transformer-based generative language model, T-NLG is able to generate the necessary words to complete unfinished sentences, as well as generate direct answers to questions and summarize documents.

First introduced by Google in 2017, Transformers — a new type of deep learning model — have helped revolutionize natural language processing. A.I. has been focused on language at least as far back as Alan Turing’s famous hypothetical test of machine intelligence. But thanks to some of these recent advances, machines are only now getting astonishingly good at understanding language. This will have some profound impacts and applications as the decade continues.
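The core operation behind the Transformer is scaled dot-product self-attention: every token's representation is updated as a weighted mix of all the other tokens, with the weights computed from pairwise similarity. The sketch below shows that operation in its simplest form, using identity query/key/value projections for clarity; real models learn separate projection matrices and run many attention heads in parallel.

```python
import numpy as np

def self_attention(x):
    # x: (n_tokens, d) matrix of token vectors.
    # Scores are pairwise dot products, scaled by sqrt(d) as in the
    # original Transformer paper, then softmaxed row by row.
    d = x.shape[-1]
    scores = x @ x.T / np.sqrt(d)
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    # Each output row is a weighted average of all token vectors
    return weights @ x

tokens = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
out = self_attention(tokens)
print(out.shape)  # → (3, 2)
```

Because every token attends to every other token in one step, Transformers capture long-range dependencies that earlier recurrent models handled poorly — one reason they displaced RNNs in language tasks so quickly.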

Models are getting bigger

GPT-3 and T-NLG represented another milestone, or at least significant trend, in A.I. While there’s no shortage of startups, small university labs, and individuals using A.I. tools, the presence of major players on the scene means some serious resources are being thrown around. Increasingly, enormous models with huge training costs are dominating the cutting edge of A.I. research. Neural networks with upward of a billion parameters are fast becoming the norm. […]
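Back-of-the-envelope arithmetic makes the scale concrete: just storing the weights of a model this size, before any training infrastructure, takes hundreds of gigabytes.

```python
def model_memory_gb(n_params, bytes_per_param=4):
    # Memory to store the weights alone, assuming 32-bit floats
    # (4 bytes each); optimizer state and activations during training
    # multiply this figure several times over
    return n_params * bytes_per_param / 1e9

print(f"GPT-3 (175B params): {model_memory_gb(175e9):.0f} GB")  # → 700 GB
print(f"T-NLG  (17B params): {model_memory_gb(17e9):.0f} GB")   # → 68 GB
```

Weights of this size cannot fit on any single accelerator, which is why training runs of this class are sharded across large GPU clusters — and why the cutting edge increasingly belongs to organizations that can afford them.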

Read more: www.digitaltrends.com
