
Why Deep Learning, and Why Now?

The hype around deep learning is all the rage today, as companies across industries seek to use advanced computational techniques to find useful information hidden across huge swaths of data. While the field of artificial intelligence (a term with many definitions, but in general a machine completing complex tasks in a way that mirrors human intelligence and improves over time) is decades old, breakthroughs in the field of artificial neural networks (simplified, abstract models of the human brain, usually organized in layers of many nodes, where each layer performs simple computations on its input and passes the result to the next layer until the final layer produces the answer) are driving the explosion of deep learning.
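The layer-by-layer picture in that definition can be made concrete with a small sketch. The following is a minimal, illustrative feed-forward network in plain NumPy; the layer sizes and the ReLU activation are assumptions for the example, not details from the article. Input flows through each layer, each layer does a simple computation and hands its result to the next, and the final layer produces the answer.

```python
import numpy as np

# Minimal feed-forward neural network sketch: each layer computes a simple
# function of its input and passes the result on to the next layer.
# Layer sizes and the ReLU activation are illustrative assumptions.

rng = np.random.default_rng(0)

def make_layer(n_in, n_out):
    """Randomly initialized weights and biases for one layer."""
    return rng.normal(scale=0.1, size=(n_in, n_out)), np.zeros(n_out)

layers = [make_layer(4, 8), make_layer(8, 8), make_layer(8, 3)]

def forward(x, layers):
    """Pass the input through every layer; the last layer's output is the answer."""
    for i, (w, b) in enumerate(layers):
        x = x @ w + b
        if i < len(layers) - 1:       # hidden layers apply a simple nonlinearity
            x = np.maximum(x, 0.0)    # ReLU
    return x

x = rng.normal(size=4)                # a single input with 4 features
print(forward(x, layers))             # 3 output values from the final layer
```

In a real deep learning system the weights are not random; they are adjusted during training so that the network's final-layer answers match known examples, which is why the large training datasets discussed below matter so much.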

Decades-old ideas now work

Attempts at creating artificial intelligence go back decades. In the wake of World War II, the English mathematician and codebreaker Alan Turing penned his definition for true artificial intelligence. Under what became known as the Turing Test (a test Turing constructed in the 1950s to gauge a machine's intelligence: it is considered passed when a human cannot tell whether they are interacting with a machine or another human; the test has sparked controversy but remains an important concept), a conversational machine would have to convince a human that they were talking to another human. It took 60 years, but a computer finally passed the Turing Test back in 2014, when a chat bot (chatbots are computer programs engineered to converse with humans in spoken or written form, usually in dialogue systems with a limited topic range, for example answering basic customer questions or helping you buy the correct train ticket) developed by the University of Reading and dubbed “Eugene” convinced 33% of the judges convened by the Royal Society in London that he was real. It was the first time the 30% threshold had been exceeded. Since then, the field has exploded as computers get closer to delivering human-level capabilities. Consumers have been inundated with an array of chat bots (a bot, more generally, is a piece of code that performs a predefined set of actions on someone's behalf, such as managing Twitter followers, answering email requests, or reordering supplies when an item runs low) like Apple's Siri, Amazon's Alexa, and Microsoft's Cortana that use machine learning to answer questions.

Understanding Deep Learning and Neural Networks

Researchers have found that the combination of advanced neural networks, the ready availability of huge masses of training data, and extremely powerful distributed GPU-based systems has given us the building blocks for creating intelligent, self-learning machines that can rival humans in understanding. As Google Fellow Jeff Dean explains, the rapid evolution of big data technologies (collections of data so large that humans cannot sift through all of them in a timely manner, though algorithms can usually surface patterns that would otherwise stay hidden from human analysts) over the past decade has positioned us well to imbue machines with near human-level understanding. “We now have a pretty good handle on how to store and then perform computation on large data sets, so things like MapReduce and BigTable that we built at Google, and things like Spark and Hadoop really give you the tools that you need to manipulate and store data,” the legendary technologist said during last June’s Spark Summit in San Francisco. “But honestly, what we really want is not just a bunch of bits on disk that we can process,” Dean continues. “What we want [is] to be able to understand the data. We want to be able to take the data that our products or our systems can generate, and then build interesting levels of understanding.”  […]
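The “tools to manipulate and store data” Dean mentions all build on the same map-and-reduce pattern. The toy sketch below is not Google's MapReduce or Spark itself, just a hedged, self-contained illustration of the idea in plain Python; the sample documents are invented for the example. Work is mapped over pieces of data independently (which is what lets clusters scale), then the partial results are reduced into one answer.

```python
from collections import Counter
from functools import reduce

# Toy illustration of the map/reduce pattern behind tools like MapReduce and
# Spark: apply a function to each chunk of data independently, then merge the
# partial results. The sample "documents" below are made up for the example.

documents = [
    "we want to understand the data",
    "we want to build understanding from the data",
]

# Map step: count words in each document on its own (easily parallelized).
partial_counts = map(lambda doc: Counter(doc.split()), documents)

# Reduce step: merge the per-document counts into a single result.
total_counts = reduce(lambda a, b: a + b, partial_counts, Counter())

print(total_counts.most_common(3))
```

Counting words is exactly the kind of “bits on disk that we can process” computation Dean refers to; the harder step he describes, building understanding from the data, is where the neural networks discussed above come in.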
