Most computers and algorithms — including, at this point, many artificial intelligence (AI) applications — run on general-purpose circuits called central processing units or CPUs.

 

Copyright: venturebeat.com – “What is AI hardware? How GPUs and TPUs give artificial intelligence algorithms a boost”

When certain calculations are performed often, though, computer scientists and electrical engineers design special circuits that can do the same work faster or with greater accuracy. Now that AI algorithms have become so common and essential, specialized circuits or chips are becoming common and essential as well.

These circuits come in several forms and are deployed in different locations. Some speed up the creation of new AI models. They use many processing circuits in parallel to churn through millions, billions or even more data elements, searching for patterns and signals. These are used in the lab, at the beginning of the process, by AI scientists looking for the best algorithms to understand the data.

Others are deployed at the point where the model is put to use. Some smartphones and home automation systems have specialized circuits that can speed up speech recognition or other common tasks. They run the model more efficiently where it is used, offering faster calculations and lower power consumption.
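For illustration only (the article itself contains no code), the sketch below assumes the TensorFlow Lite converter, one common way a trained model is shrunk and optimized so that a phone or home device can run it with less power; the tiny Keras model is a hypothetical stand-in for a real trained network.

```python
# Hedged sketch, assuming TensorFlow/TensorFlow Lite: convert a trained model
# into a compact form suitable for on-device accelerators.
import tensorflow as tf

# Hypothetical stand-in for a trained model (e.g. a keyword-spotting network).
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(16,)),
    tf.keras.layers.Dense(8, activation="relu"),
    tf.keras.layers.Dense(2),
])

# Default optimizations (including quantization) shrink the model so that
# mobile hardware can run it with faster calculations and lower power draw.
converter = tf.lite.TFLiteConverter.from_keras_model(model)
converter.optimizations = [tf.lite.Optimize.DEFAULT]
tflite_model = converter.convert()

with open("model.tflite", "wb") as f:
    f.write(tflite_model)
```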

Scientists are also experimenting with newer designs for circuits. Some, for example, want to use analog electronics instead of the digital circuits that have dominated computers. These different forms may offer better accuracy, lower power consumption, faster training and more.

What are some examples of AI hardware?

The simplest examples of AI hardware are the graphics processing units, or GPUs, that have been redeployed to handle machine learning (ML) chores. Many ML packages have been modified to take advantage of the extensive parallelism available inside the average GPU. The same hardware that renders scenes for games can also train ML models, because in both cases many tasks can be done at the same time.
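As a minimal sketch (assuming the PyTorch library, which is not named in the article), here is how an ML package hands the same parallel matrix arithmetic to a GPU when one is present; the layer sizes and batch are arbitrary placeholders.

```python
# Minimal sketch, assuming PyTorch: run a small model's forward pass on a GPU
# if one is available, otherwise fall back to the CPU.
import torch
import torch.nn as nn

device = "cuda" if torch.cuda.is_available() else "cpu"

# A tiny fully connected network; its matrix multiplications are the kind of
# work a GPU spreads across thousands of cores at once.
model = nn.Sequential(
    nn.Linear(64, 128),
    nn.ReLU(),
    nn.Linear(128, 10),
).to(device)

# A batch of 256 examples processed in parallel in a single forward pass.
batch = torch.randn(256, 64, device=device)
outputs = model(batch)
print(outputs.shape)  # torch.Size([256, 10])
```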


Some companies have taken this same approach and extended it to focus only on ML. These newer chips, sometimes called tensor processing units (TPUs), don’t try to serve both game display and learning algorithms. They are completely optimized for AI model development and deployment.[…]

Read more: www.venturebeat.com