As revenues and research output soar in the field of AI, global competition between the United States, China and Europe heats up.
The idea of artificial intelligence (AI) — systems so advanced they can mimic or outperform human cognition — first came to prominence in 1950, when British computer scientist Alan Turing proposed an ‘imitation game’ to assess whether a computer could fool humans into thinking they were communicating with another human. A decade later, researchers at Stanford University in California built MADALINE, the first artificial neural network applied to a real-world problem. Their system, modelled on the brain and nervous system, learnt through trial and error to eliminate echoes on telephone lines.
Since then, the rise of AI has been enabled by exponentially faster and more powerful computers and by large, complex data sets. Techniques such as machine learning, whereby a system identifies patterns in large data sets, have demonstrated the potential for AI to be both practical and profitable.
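To make the phrase ‘identifies patterns in data’ concrete, here is a minimal sketch of a typical machine-learning workflow in Python using scikit-learn. The synthetic data set and the choice of a decision-tree classifier are illustrative assumptions for this sketch, not details drawn from the article.

```python
# Minimal pattern-finding sketch: fit a classifier to labelled data,
# then test whether the patterns it found generalise to unseen data.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.metrics import accuracy_score

# Synthetic stand-in for a real data set: 1,000 samples, 20 features,
# two classes (purely illustrative).
X, y = make_classification(n_samples=1000, n_features=20, random_state=0)

# Hold out a quarter of the data to check generalisation.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0
)

# 'Learning' here means finding feature thresholds that separate the classes.
model = DecisionTreeClassifier(max_depth=5, random_state=0)
model.fit(X_train, y_train)

# Accuracy on the held-out set indicates whether the learnt patterns
# reflect real structure rather than memorised noise.
print("Held-out accuracy:", accuracy_score(y_test, model.predict(X_test)))
```

The same fit-then-predict recipe scales from this toy example to the production systems described below; what changes is the volume of data and the complexity of the model, not the basic workflow.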
Today, AI forms the basis of computer systems handling tasks such as voice recognition and translation on smartphones, piloting driverless cars, and controlling robots that automate chores in homes and factories. In research, AI is being used in a growing number of applications, such as processing the enormous amounts of data that underpin fields including astronomy and genomics, producing climate models and weather forecasts, and identifying signs of disease in medical imaging.