Ever since ChatGPT brought Artificial Intelligence (AI) into the mainstream, the discourse on how AI will change the world as we know it has taken all sorts of dimensions. (Read our last short read for the drama around governance at AI’s most famous startup.) But few would have expected AI to drive a resurgence in analog computing, a technology long considered dead thanks to the wonders that digital computers could achieve. In this well-written piece, Charles Platt explains what is driving analog computing’s comeback: the power-hungry nature of AI.

“…when, say, brute-force natural-language AI systems distill millions of words from the internet, the process is insanely power hungry. The human brain runs on a small amount of electricity, he said, about 20 watts. (That’s the same as a light bulb.) “Yet if we try to do the same thing with digital computers, it takes megawatts.” For that kind of application, digital is “not going to work.”

The author cites Mike Henry, founder of Mythic, a startup claiming to market the “industry-first AI analog matrix processor.”

“…citing the brain-like neural network that powers GPT-3. “It has 175 billion synapses,” Henry said, comparing processing elements with connections between neurons in the brain. “So every time you run that model to do one thing, you have to load 175 billion values. Very large data-center systems can barely keep up.”

That’s because, Henry said, they are digital. Modern AI systems use a type of memory called static RAM, or SRAM, which requires constant power to store data. Its circuitry must remain switched on even when it’s not performing a task. Engineers have done a lot to improve the efficiency of SRAM, but there’s a limit. “Tricks like lowering the supply voltage are running out,” Henry said.

Mythic’s analog chip uses less power by storing neural weights not in SRAM but in flash memory, which doesn’t consume power to retain its state. And the flash memory is embedded in a processing chip, a configuration Mythic calls “compute-in-memory.” Instead of consuming a lot of power moving millions of bytes back and forth between memory and a CPU (as a digital computer does), some processing is done locally.”
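
To see why “compute-in-memory” matters, here is a minimal back-of-envelope sketch in Python. The 175-billion parameter count comes from the quote above; the bytes-per-weight figure, the energy-per-byte numbers and the helper function are our own illustrative assumptions (order-of-magnitude placeholders), not Mythic’s data.

```python
# Illustrative back-of-envelope estimate (not Mythic's numbers): energy spent
# just moving neural-network weights, comparing a conventional design that
# fetches weights from memory on every inference with a compute-in-memory
# design where the weights stay resident next to the compute elements.

NUM_WEIGHTS = 175e9          # GPT-3-scale parameter count cited in the article
BYTES_PER_WEIGHT = 1         # assume 8-bit weights for simplicity

# Assumed energy costs per byte moved -- rough, order-of-magnitude placeholders.
PJ_PER_BYTE_OFFCHIP = 100.0  # off-chip memory access, picojoules per byte
PJ_PER_BYTE_IN_MEMORY = 1.0  # weight already sitting inside the processing chip

def weight_movement_energy_joules(pj_per_byte: float) -> float:
    """Energy (in joules) to touch every weight once for a single forward pass."""
    return NUM_WEIGHTS * BYTES_PER_WEIGHT * pj_per_byte * 1e-12

conventional = weight_movement_energy_joules(PJ_PER_BYTE_OFFCHIP)
in_memory = weight_movement_energy_joules(PJ_PER_BYTE_IN_MEMORY)

print(f"Conventional weight movement: {conventional:.1f} J per inference")
print(f"Compute-in-memory:            {in_memory:.2f} J per inference")
print(f"Roughly {conventional / in_memory:.0f}x less energy spent on data movement")
```

Whatever the exact figures, the point of the quote stands: when the weights never leave the chip, the dominant cost of shuttling billions of values back and forth largely disappears.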

Elsewhere, researchers at Columbia and MIT who are making progress on analog chips shared their findings with the author:

“There are diminishing returns on the digital model, he said, yet it still dominates the industry. “If we applied as many people and as much money to the analog domain, I think we could have some kind of analog coprocessing happening to accelerate the existing algorithms. Digital computers are very good at scalability. Analog is very good at complex interactions between variables. In the future, we may combine these advantages.

…If you have billions of objects—as in a nuclear chain reaction, or synapse states in an AI engine—you’ll need a digital processor containing maybe 100 billion transistors to crunch the data at billions of cycles per second. And in each cycle, the switching operation of each transistor will generate heat. Waste heat becomes a serious issue.

Using a new-age analog chip, you just express all the factors in a differential equation and type it into Achour’s compiler, which converts the equation into machine language that the chip understands. The brute force of binary code is minimized, and so is the power consumption and the heat. The HCDC is like an efficient little helper residing secretly amid the modern hardware, and it’s chip-sized, unlike the room-sized behemoths of yesteryear.”
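
For readers curious what “expressing all the factors in a differential equation” looks like, here is a minimal sketch. It is emphatically not Achour’s compiler or the HCDC’s machine language; it simply states a damped-oscillator equation (a classic analog-computer workload, chosen by us purely for illustration) and steps it forward the brute-force digital way, which is exactly the discrete grind an analog chip would replace by letting voltages evolve continuously.

```python
# A minimal sketch of "expressing the problem as a differential equation".
# Damped oscillator: x'' + 2*zeta*omega*x' + omega^2 * x = 0
# The constants below are illustrative values, not from the article.

OMEGA = 2.0   # natural frequency (rad/s)
ZETA = 0.1    # damping ratio

def derivatives(x: float, v: float) -> tuple[float, float]:
    """Right-hand side of the ODE, written as two first-order equations."""
    return v, -2.0 * ZETA * OMEGA * v - OMEGA**2 * x

# Brute-force digital approach: march the equation forward in small discrete
# steps, one multiply-add at a time. An analog solver would instead let a
# physical circuit settle into the same trajectory in continuous time.
x, v, dt = 1.0, 0.0, 0.001
for _ in range(5000):
    dx, dv = derivatives(x, v)
    x, v = x + dx * dt, v + dv * dt

print(f"x after {5000 * dt:.0f} s of simulated time: {x:.4f}")
```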

One of the researchers told the author this:
“People do wonder why we are doing this when everything is digital. They say digital is the future, digital is the future—and of course it’s the future. But the physical world is analog, and in between you have a big interface. That’s where this fits.”

In conclusion, the author writes:
“There’s going to be a lot of money in AI—and in smarter drug molecules, and in agile robots, and in a dozen other applications that model the muzzy complexity of the physical world. If power consumption and heat dissipation become really expensive problems, and shunting some of the digital load into miniaturized analog coprocessors is significantly cheaper, then no one will care that analog computation used to be done by your math-genius grandfather using a big steel box full of vacuum tubes.

Reality really is imprecise, no matter how much I would prefer otherwise, and when you want to model it with truly exquisite fidelity, digitizing it may not be the most sensible method. Therefore, I must conclude:
Analog is dead.

Long live analog.”

If you are up for some nerdy nostalgia, you will enjoy reading the whole piece with lots of math and electronics thrown in.

If you want to read our other published material, please visit https://marcellus.in/blog/

Note: The above material is neither investment research nor financial advice. Marcellus does not seek payment for or business from this publication in any shape or form. The information provided is intended for educational purposes only. Marcellus Investment Managers is regulated by the Securities and Exchange Board of India (SEBI) and is also an FME (Non-Retail) with the International Financial Services Centres Authority (IFSCA) as a provider of Portfolio Management Services. Additionally, Marcellus is registered with the US Securities and Exchange Commission (“US SEC”) as an Investment Advisor.


