We are witnessing the rapid evolution of large language models (LLMs), whose demands for computing power and energy are growing at an unprecedented rate. Researchers are therefore exploring alternatives to traditional digital architectures, and one of the most promising directions involves analog computers, which perform mathematical operations not by processing bits but by leveraging physical phenomena such as voltages, currents, resistances, or light waves.
A recent study demonstrates an analog computing system that performs key computations involved in neural network training significantly faster than GPUs while using only a fraction of the energy.
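The study's hardware isn't detailed here, but the operation that dominates neural network workloads is the matrix-vector multiply, and it is the textbook case of what analog circuits do well: store the weights as conductances, apply the inputs as voltages, and Ohm's law together with Kirchhoff's current law produce the whole product in a single physical step. The NumPy sketch below is a purely illustrative simulation of that general principle under idealized assumptions, not a description of the chip in the study.

```python
import numpy as np

# Idealized resistive crossbar: weights are stored as conductances G, inputs are
# applied as voltages V. Ohm's law gives the current through each cell (I = G * V),
# and Kirchhoff's current law sums the currents along each output line, so the
# entire matrix-vector product emerges from the physics rather than from arithmetic.

rng = np.random.default_rng(0)

weights = rng.normal(size=(4, 8))   # small weight matrix: 4 outputs, 8 inputs
x = rng.normal(size=8)              # input vector

# Digital reference: the usual matrix-vector multiply.
digital_out = weights @ x

# "Analog" version: map weights to conductances and inputs to voltages, then read
# the summed column currents. Real devices add noise, drift, and limited precision;
# this sketch deliberately ignores all of that.
conductances = weights              # assume a perfect weight-to-conductance mapping
voltages = x
currents = conductances * voltages  # Ohm's law, element-wise
analog_out = currents.sum(axis=1)   # Kirchhoff's current law per output line

print(np.allclose(digital_out, analog_out))  # True: same result, obtained "physically"
```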
Although the full experimental results have not yet been made public, the report suggests that under optimal conditions the analog chip can perform certain operations up to 1,000 times faster than a typical graphics card while consuming as little as one-hundredth of the energy.
This could represent a major breakthrough in the cost and scalability of AI training, particularly as modern models continue to grow, demanding vast computing resources and leaving an immense carbon footprint.
Beyond speed and efficiency, analog computing could also reduce barriers to AI access, since lower energy use and infrastructure costs might allow more institutions and companies to develop and run AI systems.
However, researchers note that analog computers generally offer lower precision and are more error-prone than digital systems, so challenges remain around stability, scalability, and compatibility with existing software tools. Moreover, few organizations are likely to invest heavily in reviving a technology that fell out of widespread use long ago and carries substantial maintenance costs.
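To get a feel for why precision matters, here is a toy NumPy simulation (not a model of any particular chip, and the noise levels are arbitrary): the stored weights are perturbed with random multiplicative noise, mimicking device variation and read noise, and the output drifts further from the exact digital result as the noise grows.

```python
import numpy as np

# Toy illustration of the analog precision problem: perturb the stored weights with
# multiplicative noise and measure how far the result drifts from the exact answer.

rng = np.random.default_rng(1)

weights = rng.normal(size=(256, 256))
x = rng.normal(size=256)
exact = weights @ x

for noise_level in (0.001, 0.01, 0.05):
    noisy_weights = weights * (1 + rng.normal(scale=noise_level, size=weights.shape))
    noisy = noisy_weights @ x
    rel_error = np.linalg.norm(noisy - exact) / np.linalg.norm(exact)
    print(f"{noise_level:.1%} device noise -> {rel_error:.2%} relative output error")
```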
While analog technology could play an important role in improving energy efficiency and democratizing access to AI, it’s unlikely that we’ll see a large-scale shift from digital to analog computing anytime soon. Still, it’s a fascinating example of how old ideas can find new life in the era of artificial intelligence.

