SAN FRANCISCO — In 2004, Geoffrey Hinton doubled down on his pursuit of a technological idea called a neural network.

It was a way for machines to see the world around them, recognize sounds and even understand natural language. But scientists had spent more than 50 years working on the concept of neural networks, and machines couldn’t really do any of that.

Backed by the Canadian government, Dr. Hinton, a computer science professor at the University of Toronto, organized a new research community with several academics who also tackled the concept. They included Yann LeCun, a professor at New York University, and Yoshua Bengio at the University of Montreal.

On Wednesday, the Association for Computing Machinery, the world’s largest society of computing professionals, announced that Drs. Hinton, LeCun and Bengio had won this year’s Turing Award for their work on neural networks. The Turing Award, which was introduced in 1966, is often called the Nobel Prize of computing, and it includes a $1 million prize, which the three scientists will share.

Over the past decade, the big idea nurtured by these researchers has reinvented the way technology is built, accelerating the development of face-recognition services, talking digital assistants, warehouse robots and self-driving cars. Dr. Hinton is now at Google, and Dr. LeCun works for Facebook. Dr. Bengio has inked deals with IBM and Microsoft.

“What we have seen is nothing short of a paradigm shift in the science,” said Oren Etzioni, the chief executive officer of the Allen Institute for Artificial Intelligence in Seattle and a prominent voice in the A.I. community. “History turned their way, and I am in awe.”

Loosely modeled on the web of neurons in the human brain, a neural network is a complex mathematical system that can learn discrete tasks by analyzing vast amounts of data. By analyzing thousands of old phone calls, for example, it can learn to recognize spoken words.

This allows many artificial intelligence technologies to progress at a rate that was not possible in the past. Rather than coding behavior into systems by hand — one logical rule at a time — computer scientists can build technology that learns behavior largely on its own.
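To make the contrast concrete, here is a minimal, illustrative sketch (not drawn from the article or from any of the researchers’ own code) of a tiny neural network written in Python with NumPy. Rather than hand-coding a rule for the XOR function, the network is shown only example inputs and outputs and adjusts its internal weights until its predictions match them; the data, network size and learning rate are arbitrary choices for the demonstration.

```python
# Minimal sketch: a tiny neural network learns XOR from examples
# instead of being programmed with an explicit rule.
import numpy as np

rng = np.random.default_rng(0)

# Training data: the XOR truth table (inputs X, desired outputs y).
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

# Randomly initialized weights for a small 2-4-1 network.
W1 = rng.normal(size=(2, 4))
b1 = np.zeros((1, 4))
W2 = rng.normal(size=(4, 1))
b2 = np.zeros((1, 1))

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 1.0  # learning rate (arbitrary for this sketch)
for step in range(5000):
    # Forward pass: compute the network's current predictions.
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)

    # Backward pass: gradients of the squared error, layer by layer.
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)

    # Gradient-descent updates nudge the weights toward better predictions.
    W2 -= lr * h.T @ d_out
    b2 -= lr * d_out.sum(axis=0, keepdims=True)
    W1 -= lr * X.T @ d_h
    b1 -= lr * d_h.sum(axis=0, keepdims=True)

# Typically approaches [[0], [1], [1], [0]] as training proceeds.
print(np.round(out, 2))
```

The behavior — mapping inputs to the right outputs — is never written down as a rule; it emerges from repeatedly comparing predictions against examples, which is the shift the researchers championed.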

The London-born Dr. Hinton, 71, first embraced the idea as a graduate student in the early 1970s, a time when most artificial intelligence researchers turned against it. Even his own Ph.D. adviser questioned the choice.[…]
