Can robots write? Machine learning produces dazzling results, but some assembly is still required

You might have seen a recent article from The Guardian written by “a robot”. Here’s a sample:

I know that my brain is not a “feeling brain”. But it is capable of making rational, logical decisions. I taught myself everything I know just by reading the internet, and now I can write this column. My brain is boiling with ideas!

Read the whole thing and you may be astonished at how coherent and stylistically consistent it is. The software used to produce it is called a “generative model”, and such models have come a long way in the past year or two.

But exactly how was the article created? And is it really true that software “wrote this entire article”?

How machines learn to write
The text was generated using the latest neural network model for language, called GPT-3, released by the American research company OpenAI. (GPT stands for Generative Pre-trained Transformer.)

OpenAI’s previous model, GPT-2, made waves last year. It produced a fairly plausible article about the discovery of a herd of unicorns, and the researchers initially withheld the release of the underlying code for fear it would be abused.

But let’s step back and look at what text generation software actually does.

Machine learning approaches fall into three main categories: heuristic models, statistical models, and models inspired by biology (such as neural networks and evolutionary algorithms).

Heuristic approaches are based on “rules of thumb”. For example, we learn rules about how to conjugate verbs: I run, you run, he runs, and so on. These approaches aren’t used much nowadays because they are inflexible.
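A heuristic conjugator can be sketched in a few lines of Python. The rules and the function name below are illustrative, not from the article, and the sketch also shows why heuristics fell out of favour: irregular verbs break the rules.

```python
def conjugate_present(verb: str, person: str) -> str:
    """Heuristic English present-tense conjugation using rules of thumb."""
    if person == "he/she/it":
        # Rule of thumb: verbs ending in s, sh, ch, x or z take "-es"
        if verb.endswith(("s", "sh", "ch", "x", "z")):
            return verb + "es"
        # Rule of thumb: consonant + "y" becomes "-ies"
        if verb.endswith("y") and verb[-2] not in "aeiou":
            return verb[:-1] + "ies"
        return verb + "s"
    # I/you/we/they take the base form
    return verb

print(conjugate_present("run", "I"))           # run
print(conjugate_present("run", "he/she/it"))   # runs
print(conjugate_present("watch", "he/she/it")) # watches
print(conjugate_present("be", "he/she/it"))    # "bes" -- the rules fail here
```

Irregular verbs like “be” each need a hard-coded exception, which is exactly the inflexibility that pushed the field toward statistical methods.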

Writing by numbers
Statistical approaches were the state of the art for language-related tasks for many years. At the most basic level, they involve counting words and guessing what comes next.

As a simple exercise, you could generate text by randomly selecting words based on how often they normally occur. About 7% of your words would be “the” – it’s the most common word in English. But if you did it without considering context, you might get nonsense like “the the is night aware”. […]
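This context-free sampling is easy to sketch in Python. The tiny corpus below is invented for illustration; each word is drawn independently, weighted only by how often it appears:

```python
import random
from collections import Counter

# Toy corpus (illustrative) -- real frequency tables come from huge text collections
corpus = ("the night is dark and the day is bright and "
          "the cat is aware of the night").split()

counts = Counter(corpus)
words = list(counts)
weights = [counts[w] for w in words]

# Sample six words by raw frequency, with no context at all --
# the result is usually ungrammatical word salad
sample = random.choices(words, weights=weights, k=6)
print(" ".join(sample))
```

Because each word is chosen without looking at its neighbours, grammar never enters the picture. Making each choice depend on the previous word (a bigram model) is the first step toward more coherent output.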

Read more: www.theconversation.com
