Natural language technologies are quickly gaining traction in the enterprise.
Copyright by www.eweek.com
Every week, more companies are rolling out text analytics solutions, recognition systems, chatbots and every other language-processing use case imaginable.
It’s no surprise that over the past three years natural language processing (NLP) has become one of the most dominant domains in data science. NLP is also an umbrella term for subfields such as natural language understanding (NLU), natural language generation (NLG) and natural language interaction (NLI).
With companies such as Google and Microsoft producing new discoveries on a consistent basis, NLP has made quantum leaps in accuracy, speed and methodology, aiding computer scientists as they tackle complex problems. Today, NLP is one of the most researched fields in artificial intelligence (AI).
In this eWEEK Data Points article, Dillon Erb, co-founder and CEO of Paperspace, describes five important NLP trends impacting the enterprise in 2021 and helps distinguish between trends that have legitimate promise and those that are overhyped.
Data Point No. 1: True NLU has a long way to go.
Human language is complex; language is, after all, a proxy for human thought. That makes natural language understanding (NLU) one of the so-called “AI-hard” problems, because to “solve” NLU is to “solve” generalized AI.
But as a subtopic of NLP, NLU gets a lot of attention because the possible applications are so exciting, and many enabling technologies are already showing real value in the wild as partial NLU solutions.
So while the first data point is that complete NLU is not close right now, it’s also true that NLU applications are already proving themselves in the enterprise.
NLU is used to perform sentiment analysis on customer help requests and to understand questions posed to digital assistants like Siri. It also translates text between languages in multilingual neural machine translation services, like Google Translate.
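To make the sentiment-analysis use case concrete, here is a deliberately minimal lexicon-based scorer. This is a toy illustration with made-up word lists, not any vendor's actual system; production NLU services use trained neural models rather than keyword matching:

```python
# Toy lexicon-based sentiment scorer (illustrative only).
# Real sentiment-analysis systems use trained statistical or
# neural models, not hand-written word lists like these.
POSITIVE = {"great", "helpful", "fast", "love", "thanks"}
NEGATIVE = {"broken", "slow", "refund", "angry", "terrible"}

def sentiment(text: str) -> str:
    words = [w.strip(".,!?") for w in text.lower().split()]
    score = sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)
    if score > 0:
        return "positive"
    if score < 0:
        return "negative"
    return "neutral"

print(sentiment("Thanks, the new dashboard is great and fast!"))   # positive
print(sentiment("My order arrived broken and support was terrible."))  # negative
```

Even this crude sketch hints at why the real problem is hard: negation, sarcasm and context all defeat word counting, which is why enterprises reach for learned models instead.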
Data Point No. 2: Models are improving rapidly, and this is inspiring enterprises to prepare.
Arguably the most famous language model in the world right now is GPT from OpenAI. The latest version, GPT-3, was released earlier this year.
In terms of GPT’s progress, GPT-2 launched in February 2019 and made a significant impact with its 1.5 billion parameters. GPT-3 launched nearly 18 months later with 175 billion parameters, an increase of two orders of magnitude. These releases instantly generated worldwide media attention because of the implications for fake news generation, generative art, codebases that write themselves and more.
The progress of language models like GPT is inspiring many new approaches and applications for machine intelligence in the enterprise, from writing web apps by describing them in English to mimicking the language patterns of public figures to training on medical literature to suggest diagnoses. […]
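The core idea behind all of these models is the same: predict the next word from the words that came before, then generate text by sampling. A minimal bigram sketch shows the mechanism on a hypothetical ten-word corpus; GPT-3 does the same job conceptually, but with 175 billion learned parameters instead of raw pair counts:

```python
import random

# Toy bigram language model: count which word follows which in a
# tiny corpus, then generate text by sampling a plausible next word.
# GPT-style models replace these counts with billions of learned
# parameters, but the predict-the-next-word objective is the same.
corpus = "the model writes text . the model reads text . the user reads text .".split()

bigrams = {}
for prev, nxt in zip(corpus, corpus[1:]):
    bigrams.setdefault(prev, []).append(nxt)

def generate(start: str, length: int = 6, seed: int = 0) -> str:
    rng = random.Random(seed)  # seeded for repeatable output
    words = [start]
    for _ in range(length):
        choices = bigrams.get(words[-1])
        if not choices:
            break
        words.append(rng.choice(choices))
    return " ".join(words)

print(generate("the"))
```

The jump from this sketch to GPT-3 is one of scale and architecture, not of objective, which is why parameter counts have become the headline metric for each new release.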