Could analog artificial intelligence (AI) hardware – rather than digital – tap fast, low-energy processing to solve machine learning’s rising costs and carbon footprint?
Researchers say yes: Logan Wright and Tatsuhiro Onodera, research scientists at NTT Research and Cornell University, envision a future where machine learning (ML) will be performed with novel physical hardware, such as devices based on photonics or nanomechanics. These unconventional devices, they say, could be applied in both edge and server settings.
Deep neural networks, which are at the heart of today’s AI efforts, hinge on the heavy use of digital processors like GPUs. But for years, there have been concerns about the monetary and environmental cost of machine learning, which increasingly limits the scalability of deep learning models.
A 2019 paper out of the University of Massachusetts, Amherst, for example, performed a life cycle assessment for training several common large AI models. It found that the process can emit more than 626,000 pounds of carbon dioxide equivalent — nearly five times the lifetime emissions of the average American car, including the manufacturing of the car itself.
At a session with NTT Research at VentureBeat Transform’s Executive Summit on July 19, CEO Kazu Gomi said machine learning doesn’t have to rely on digital circuits, but instead can run on a physical neural network. This is a type of artificial neural network in which physical analog hardware is used to emulate neurons as opposed to software-based approaches.
“One of the obvious benefits of using analog systems rather than digital is AI’s energy consumption,” he said. “The consumption issue is real, so the question is what are new ways to make machine learning faster and more energy-efficient?”
Analog AI: More like the brain?
In the early history of AI, Wright pointed out, people weren’t trying to work out how to build digital computers.
“They were trying to think about how we could emulate the brain, which of course is not digital,” he explained. “What I have in my head is an analog system, and it’s actually much more efficient at performing the types of calculations that go on in deep neural networks than today’s digital logic circuits.”
The brain is one example of analog hardware for doing AI, but others include systems that use optics.
“My favorite example is waves, because a lot of things like optics are based on waves,” he said. “In a bathtub, for instance, you could formulate the problem to encode a set of numbers. At the front of the bathtub, you can set up a wave and the height of the wave gives you this vector X. You let the system evolve for some time and the wave propagates to the other end of the bathtub. After some time you can then measure the height of that, and that gives you another set of numbers.”
Essentially, nature itself can perform computations. “And you don’t need to plug it into anything,” he said.
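To make the bathtub analogy concrete, here is a minimal sketch of the kind of computation Wright describes: a vector encoded as an initial wave-height profile, evolved by a discretized wave equation, and read out after some time. The grid size, step count, and boundary conditions are illustrative choices, not details from the researchers’ work.

```python
import numpy as np

def propagate(x, steps=200, c=0.5):
    """Encode vector x as an initial wave-height profile, let a
    discretized 1D wave equation evolve it, and read out the result."""
    u = np.array(x, dtype=float)   # current wave heights
    u_prev = u.copy()              # previous step (wave starts at rest)
    for _ in range(steps):
        # Second-order finite-difference update of the wave equation;
        # np.roll gives periodic boundaries (a circular "bathtub").
        lap = np.roll(u, 1) - 2 * u + np.roll(u, -1)
        u, u_prev = 2 * u - u_prev + c**2 * lap, u
    return u

x = np.array([0.0, 1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0])
y = propagate(x)  # "measure the height" after the wave has evolved

# Because the wave equation is linear, propagation is equivalent to
# multiplying x by a fixed matrix: the physics performs the
# matrix-vector product a digital chip would otherwise compute.
print(np.round(y, 3))
```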
Analog AI hardware approaches
Researchers across the industry are using a variety of approaches to developing analog hardware. IBM Research, for example, has invested in analog electronics, in particular memristor technology, to perform machine learning calculations.
“It’s quite promising,” said Onodera. “These memristor circuits have the property of having information be naturally computed by nature as the electrons ‘flow’ through the circuit, allowing them to have potentially much lower energy consumption than digital electronics.”
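A rough sketch of the principle Onodera describes: in a memristor crossbar, the weights live in device conductances and the inputs arrive as voltages, so Ohm’s law and Kirchhoff’s current law produce a matrix-vector product with no digital arithmetic at all. The array size and noise model below are illustrative assumptions, not IBM’s actual design.

```python
import numpy as np

rng = np.random.default_rng(0)

# Weights stored as conductances G; inputs applied as voltages V.
G = rng.uniform(0.0, 1.0, size=(4, 3))   # conductance array (weight matrix)
V = rng.uniform(-1.0, 1.0, size=4)       # input voltages (activation vector)

# Ohm's law + Kirchhoff's current law: each column current sums G_ij * V_i,
# i.e. the currents are I = G.T @ V -- the multiply-accumulate that a
# digital chip would spend energy computing.
I_ideal = G.T @ V

# Real analog devices are noisy; a small Gaussian perturbation on the
# conductances stands in for device variation.
G_noisy = G + rng.normal(0.0, 0.02, size=G.shape)
I_analog = G_noisy.T @ V

print("ideal:  ", np.round(I_ideal, 3))
print("analog: ", np.round(I_analog, 3))
```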
NTT Research, however, is focused on a more general framework that isn’t limited to memristor technology. “Our work is focused on also enabling other physical systems, for instance those based on light and mechanics (sound), to perform machine learning,” he said. “By doing so, we can make smart sensors in the native physical domain where the information is generated, such as in the case of a smart microphone or a smart camera.”
Startups including Mythic also focus on analog AI using electronics – which Wright says is a “great step, and it is probably the lowest risk way to get into analog neural networks.” But it’s also incremental and has a limited ceiling, he added: “There is only so much improvement in performance that is possible if the hardware is still based on electronics.”
Long-term potential of analog AI
Several startups, such as LightMatter, Lightelligence and Luminous Computing, use light rather than electronics to do the computing – an approach known as photonics. This is riskier, less-mature technology, said Wright.
“But the long-term potential is much more exciting,” he said. “Light-based neural networks could be much more energy-efficient.”
However, light and electrons aren’t the only things you can make a computer out of, especially for AI, he added. “You could make it out of biological materials, electrochemistry (like our own brains), or out of fluids, acoustic waves (sound), or mechanical objects, modernizing the earliest mechanical computers.”
MIT researchers, for example, announced last week that they had developed new protonic programmable resistors, a network of analog artificial neurons and synapses that can perform calculations much like a digital neural network by repeating arrays of programmable resistors in intricate layers. They used “a practical inorganic material in the fabrication process,” they said, that enables their devices “to run 1 million times faster than previous versions, which is also about 1 million times faster than the synapses in the human brain.”
NTT Research says it’s taking a step further back from all these approaches and asking much bigger, much longer-term questions: What can we make a computer out of? And if we want to achieve the highest possible speed and energy efficiency in AI systems, what should we physically make them out of?
“Our paper provides the first answer to these questions by telling us how we can make a neural network computer using any physical substrate,” said Wright. “And so far, our calculations suggest that making these weird computers will one day soon actually make a lot of sense, since they can be much more efficient than digital electronics, and even analog electronics. Light-based neural network computers seem like the best approach so far, but even that question isn’t completely answered.”
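One way to read that claim, consistent with the researchers’ published “physics-aware training” approach, is: run the forward pass on the physical system itself, and use a digital model of that system only to estimate gradients for the backward pass. The toy device, target task, and hyperparameters below are invented for this sketch, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(1)

def physical_forward(x, w):
    # Stand-in for a real analog device: a nonlinear transform plus noise.
    return np.tanh(w * x) + rng.normal(0.0, 0.01)

def digital_model_grad(x, w):
    # Gradient of the noiseless digital model: d(tanh(w*x))/dw.
    return x * (1.0 - np.tanh(w * x) ** 2)

# Train w so the "device" maps input 0.5 to output 0.4.
x_in, target, w, lr = 0.5, 0.4, 0.1, 0.5
for _ in range(200):
    y = physical_forward(x_in, w)                 # forward pass on "hardware"
    err = y - target
    w -= lr * err * digital_model_grad(x_in, w)   # backward pass in software

print(f"trained w = {w:.3f}, device output = {physical_forward(x_in, w):.3f}")
```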
Analog AI not the only nondigital hardware bet
According to Sara Hooker, a former Google Brain researcher who currently runs the nonprofit research lab Cohere for AI, the AI industry is “in this really interesting hardware stage.”
Ten years ago, she explains, AI’s massive breakthrough was really a hardware breakthrough. “Deep neural networks did not work until GPUs, which were used for video games [and] were just repurposed for deep neural networks,” she said.
The change, she added, was almost instantaneous. “Overnight, what took 13,000 CPUs took two GPUs,” she said. “That was how dramatic it was.”
It’s very likely that there are other ways of representing the world that could be just as powerful as digital, she said. “If even one of these data directions starts to show progress, it can unlock a lot of both efficiency as well as different ways of learning representations,” she explained. “That’s what makes it worthwhile for labs to back them.”
Hooker, whose 2020 essay “The Hardware Lottery” explored the reasons why various hardware tools have succeeded and failed, says the success of GPUs for deep neural networks was “actually a bizarre, lucky coincidence – it was winning the lottery.”
GPUs, she explained, were never designed for machine learning — they were developed for video games. So much of the adoption of GPUs for AI use “depended upon the right moment of alignment between progress on the hardware side and progress on the modeling side,” she said. “Making more hardware options available is the most important ingredient because it allows for more unexpected moments where you see those breakthroughs.”
Analog AI, however, isn’t the only option researchers are looking at when it comes to reducing the costs and carbon emissions of AI. Researchers are also placing bets on other areas, such as field-programmable gate arrays (FPGAs) used as application-specific accelerators in data centers, which can reduce energy consumption and increase operating speed. There are also efforts to improve software, she explained.
Analog, she said, “is one of the riskier bets.”
Expiration date on current approach
Still, risks have to be taken, Hooker said. When asked whether she thought the big tech companies are supporting analog and other types of alternative, nondigital AI hardware, she said, “One hundred percent. There is a clear motivation,” adding that what is lacking is sustained government investment in a long-term hardware landscape.
“It’s always been tricky when investment rests solely on companies, because it’s so risky,” she said. “It often has to be part of a nationalist strategy for it to be a compelling long-term bet.”
Hooker said she wouldn’t place her own bet on widespread analog AI hardware adoption, but insists the research efforts are good for the ecosystem as a whole.
“It’s kind of like the initial NASA flight to the moon,” she said. “There’s so many scientific breakthroughs that happen just by having an objective.”
And there is an expiration date on the industry’s current approach, she cautioned: “There’s an understanding among people in the field that there has to be some bet on riskier projects.”
The future of analog AI
The NTT researchers made clear that the earliest, narrowest applications of their analog AI work will take at least 5-10 years to come to fruition – and even then will likely be used first for specific applications such as at the edge.
“I think the most near-term applications will happen on the edge, where there are less resources, where you might not have as much power,” said Onodera. “I think that’s really where there’s the most potential.”
One of the things the team is thinking about is which types of physical systems will be the most scalable and offer the biggest advantage in terms of energy efficiency and speed. But in terms of entering the deep learning infrastructure, it will likely happen incrementally, Wright said.
“I think it would just slowly come into the market, with a multilayered network with maybe the front end happening in the analog domain,” he said. “I think that’s a much more sustainable approach.”
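A minimal sketch of what that hybrid deployment could look like: a fixed “analog” front end performs a cheap transform of the raw signal, and only a small digital layer runs conventionally. The random projection standing in for the analog stage, the shapes, and the nonlinearity below are assumptions for illustration, not anything from NTT’s hardware.

```python
import numpy as np

rng = np.random.default_rng(2)

analog_frontend = rng.normal(size=(64, 16))    # fixed physical transform
digital_head = rng.normal(size=(16, 2)) * 0.1  # small trainable digital layer

signal = rng.normal(size=64)                   # raw sensor input, e.g. an audio frame

# The front-end transform is what the physics would do "for free" in an
# optical or mechanical device; only the final layer needs digital compute.
features = np.tanh(signal @ analog_frontend)
logits = features @ digital_head

print(np.round(logits, 3))
```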
Author: Sharon Goldman
Source: VentureBeat