The most impressive language generator yet

OpenAI’s GPT-3 language generator is lighting up the internet.

OpenAI’s new machine learning language generator, GPT-3, is currently an internet darling. With a catholic knowledge of the English canon, drawn from the vast corners of the internet, the AI can generate writing that, at times, reads as well as anything a human could compose.

It’s a comprehensive step beyond AI’s usual toying with language, and perhaps a hint that machine learning could soon assume as powerful a role in writing as it is assuming in medicine and robotics.

GPT-3 is the most powerful language generator yet made, according to MIT’s Technology Review. Inside any neural network’s black box are parameters, the numerical weights the model tunes during training: GPT-2, released last year, had a massive 1.5 billion parameters; GPT-3, in comparison, has 175 billion, one of those hard-to-comprehend numbers.
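For the curious, a parameter is just a learned number. Here is a minimal sketch of what a parameter count means in practice, tallying the weights of the much smaller, openly released GPT-2 via Hugging Face’s transformers library (an assumption of convenience, since GPT-3 itself isn’t publicly downloadable):

```python
# A minimal sketch: tally the learned weights ("parameters") of the small,
# openly released GPT-2 model. GPT-3 itself is not downloadable, so GPT-2
# stands in here purely for illustration.
from transformers import GPT2LMHeadModel

model = GPT2LMHeadModel.from_pretrained("gpt2")  # the 124M-parameter variant
n_params = sum(p.numel() for p in model.parameters())
print(f"{n_params:,} learned parameters")  # ~124 million; GPT-3 has ~1,400x more
```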

Sucking its vast training text, Charybdis-like, into its maw, GPT-3 can now spit out, based on all it has “read,” whichever words should follow a given prompt.

Styles and Shortcomings

GPT-3 uses its vast dataset to make a mathematical prediction about which words, and in what order, will best complete a given prompt. Some of the results have been deeply impressive, despite the ghostly quality AI writing always seems to carry.
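To make that concrete, here is a minimal sketch of that next-word prediction loop, again with the openly released GPT-2 standing in for GPT-3; the model name, prompt, and sampling settings are illustrative assumptions, not GPT-3’s actual configuration:

```python
# A minimal sketch of autoregressive generation: the model repeatedly
# predicts a probability distribution over the next token and samples
# from it, extending the prompt one token at a time. GPT-2 stands in
# for GPT-3, which is only reachable through OpenAI's private API.
from transformers import GPT2LMHeadModel, GPT2Tokenizer

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")

prompt = "It was a bright cold day in April, and the clocks were striking thirteen."
input_ids = tokenizer.encode(prompt, return_tensors="pt")

output = model.generate(
    input_ids,
    max_length=60,     # prompt tokens plus the continuation
    do_sample=True,    # sample rather than always taking the top word
    top_p=0.9,         # nucleus sampling: draw from the top 90% of probability mass
    temperature=0.8,   # flatten the distribution slightly for variety
)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```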

If given a prompt with a sci-fi bent — say, the opening line to George Orwell’s 1984, from the Guardian‘s Alex Hern — it will return a suitably sci-fi result. 

“It was a bright cold day in April, and the clocks were striking thirteen,” Orwell begins.

“I was in my car on my way to a new job in Seattle. I put the gas in, put the key in, and then I let it run,” the language generator continued. “I just imagined what the day would be like. A hundred years from now. In 2045, I was a teacher in some school in a poor part of rural China. I started with Chinese history and history of science.”

It’s a hair disjointed, but read as a whole, it does feel like a particularly promising freshman comp assignment.

People playing with GPT-3 have prompted it to write poetry and prose in the styles of particular authors, compose music, and generate code. It has even written an article about itself. (Your correspondent will fight an AI to keep his job.)

Little wonder that the Twitter cage is rattling.

GPT-3 is not a perfect mimic, however, nor an artificial general intelligence. Kevin Lacker’s Turing test of GPT-3 reveals some of its shortcomings.

The language generator has little trouble answering trivia-style questions; it knows who won the 1995 World Series, how many eyes a giraffe has, and the human life expectancy in the United States.

It proved pretty good at answering common sense questions, too, knowing that an elephant is heavier than a mouse and that a pop can is heavier than a paperclip. But give it a prompt that’s a bit odd (Lacker asked whether a pencil is heavier than a toaster) and you can trip it up.

(GPT-3’s response: the pencil is heavier. This is likely because the literature on comparing pencils and toasters is probably pretty … light.)

Interestingly, it won’t admit it is wrong, and it won’t call out a question that’s nonsensical. When asked how to sporgle a morgle, it replied “with a sporgle” (uh, duh).
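Probes like Lacker’s are easy to reproduce if you have API access. Here’s a rough sketch, written against the legacy OpenAI Completions API available at GPT-3’s launch; the engine name and few-shot Q&A format are assumptions modeled on how early testers framed their prompts:

```python
# A rough sketch of a Q&A probe in the style of Lacker's Turing test,
# using the legacy OpenAI Completions API available at GPT-3's launch.
# The engine name and prompt format are illustrative assumptions.
import openai

openai.api_key = "YOUR_API_KEY"  # placeholder

# A few example Q&A pairs steer the model toward short, factual answers.
prompt = (
    "Q: Who won the World Series in 1995?\n"
    "A: The Atlanta Braves.\n"
    "Q: How many eyes does a giraffe have?\n"
    "A: Two.\n"
    "Q: Is a pencil heavier than a toaster?\n"
    "A:"
)

response = openai.Completion.create(
    engine="davinci",   # the original GPT-3 base engine
    prompt=prompt,
    max_tokens=16,
    temperature=0.0,    # minimize randomness when probing for facts
    stop="\n",          # cut the completion off at the end of the answer line
)
print(response.choices[0].text.strip())
```

Fed a nonsense question in that same format, the model dutifully answers anyway, which is exactly the failure mode Lacker flagged.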

But even OpenAI co-founder Sam Altman tried to temper things a bit.

“The GPT-3 hype is way too much,” he tweeted. “It’s impressive (thanks for the nice compliments!) but it still has serious weaknesses and sometimes makes very silly mistakes. AI is going to change the world, but GPT-3 is just a very early glimpse. We have a lot still to figure out.”

Still, while it’s true that GPT-3 is not an artificial general intelligence, or even intelligent in any philosophical sense (“Music is the most advanced form of mathematics” is a sophomoric, one-blunt insight, GPT-3), that does not mean it’s a mere chatbot toy.

With the ability to write, with varying degrees of fluency, everything from fiction to music, poetry to technical documentation, journalism to code, GPT-3 could prove to be a powerful tool. Any task that requires the written word could be augmented, or even automated, by a high-quality language generator.

And GPT-3 might just be the lede.
