Could analog artificial intelligence (AI) hardware – rather than digital – tap fast, low-energy processing to rein in machine learning’s rising costs and carbon footprint?
Copyright: venturebeat.com – “How analog AI hardware may one day reduce costs and carbon emissions”
Researchers say yes: Logan Wright and Tatsuhiro Onodera, research scientists at NTT Research and Cornell University, envision a future where machine learning (ML) will be performed with novel physical hardware, such as devices based on photonics or nanomechanics. These unconventional devices, they say, could be applied in both edge and server settings.
Deep neural networks, which are at the heart of today’s AI efforts, hinge on the heavy use of digital processors like GPUs. But for years, there have been concerns about the monetary and environmental cost of machine learning, which increasingly limits the scalability of deep learning models.
A 2019 paper out of the University of Massachusetts, Amherst, for example, performed a life cycle assessment for training several common large AI models. It found that the process can emit more than 626,000 pounds of carbon dioxide equivalent — nearly five times the lifetime emissions of the average American car, including the manufacturing of the car itself.
At a session with NTT Research at VentureBeat Transform’s Executive Summit on July 19, the company’s CEO, Kazu Gomi, said machine learning doesn’t have to rely on digital circuits; it can instead run on a physical neural network, a type of artificial neural network in which physical analog hardware, rather than software, emulates neurons.
“One of the obvious benefits of using analog systems rather than digital is AI’s energy consumption,” he said. “The consumption issue is real, so the question is what are new ways to make machine learning faster and more energy-efficient?”
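To make the concept concrete, here is a minimal toy sketch (a pure software simulation written for this post, not NTT Research’s actual design): each layer stands in for an analog medium whose adjustable physical settings play the role of weights, and whose naturally saturating response takes the place of a digital activation function.

```python
import numpy as np

# Toy simulation (not NTT Research's design) of a "physical neural network":
# each layer is an analog medium whose adjustable settings act as weights and
# whose natural saturating response supplies the nonlinearity.

def physical_layer(signal, settings):
    # The medium mixes the incoming signal according to its physical settings
    # (in real hardware: optical couplings, mechanical stiffnesses, ...).
    mixed = settings @ signal
    # ...and responds nonlinearly, e.g. saturating at large amplitudes.
    return np.tanh(mixed)

rng = np.random.default_rng(1)
settings_1 = rng.normal(size=(16, 4))   # analog parameters of the first medium
settings_2 = rng.normal(size=(3, 16))   # analog parameters of the second medium

x = rng.normal(size=4)                  # input encoded as four analog signals
out = physical_layer(physical_layer(x, settings_1), settings_2)
print(out)                              # the "measured" output of the stack
```

In real analog hardware the mixing and the nonlinearity would happen in the physics itself, essentially for free; the digital matrix multiplications above are only there to mimic that behavior.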
Analog AI: More like the brain?
In the early history of AI, people weren’t trying to figure out how to build digital computers, Wright pointed out.
“They were trying to think about how we could emulate the brain, which of course is not digital,” he explained. “What I have in my head is an analog system, and it’s actually much more efficient at performing the types of calculations that go on in deep neural networks than today’s digital logic circuits.”
The brain is one example of analog hardware for doing AI, but others include systems that use optics.
“My favorite example is waves, because a lot of things like optics are based on waves,” he said. “In a bathtub, for instance, you could formulate the problem to encode a set of numbers. At the front of the bathtub, you can set up a wave and the height of the wave gives you this vector X. You let the system evolve for some time and the wave propagates to the other end of the bathtub. After some time you can then measure the height of that, and that gives you another set of numbers.”[…]
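Wright’s bathtub picture maps directly onto linear algebra: because linear wave propagation is a linear map of the initial condition, letting the system evolve amounts to multiplying the input vector by a matrix, the workhorse operation inside a neural-network layer. The sketch below (a hypothetical digital stand-in, not code from the article) uses an arbitrary orthogonal matrix in place of the physical propagation.

```python
import numpy as np

# Toy stand-in for the bathtub example: encode a vector as initial wave
# heights, "evolve" the system with a linear, lossless operator, and read out
# the heights at the far end. Linear propagation of the wave is exactly a
# matrix-vector product.

rng = np.random.default_rng(0)

n = 8                         # number of positions across the "bathtub"
x = rng.normal(size=n)        # input vector X, encoded as initial wave heights

# Hypothetical propagation operator: any linear, lossless evolution of the
# wave can be written as an orthogonal matrix acting on the initial heights.
propagate, _ = np.linalg.qr(rng.normal(size=(n, n)))

y = propagate @ x             # heights measured at the other end of the tub

print("input heights :", np.round(x, 3))
print("output heights:", np.round(y, 3))
```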