'Thermodynamic computer' can mimic AI neural networks — using orders of magnitude less energy to generate images
Researchers generated images from noise, using orders of magnitude less energy than current generative AI models require.
Scientists have built a "thermodynamic computer" that can produce images from random disturbances in data, that is, noise. In doing so, they have mimicked the generative artificial intelligence (AI) capabilities of neural networks — collections of machine learning algorithms modelled on the brain.
At any temperature above absolute zero, the world buzzes with random fluctuations in energy called thermal noise, which manifest as atoms and molecules jiggling around, atomic-scale flips in the direction of spin (the quantum property that confers magnetism), and so on.
Today’s AI systems — like most other current computer systems — generate images using computer chips where the energy needed to flip bits dwarfs the quantity of energy in the random fluctuations of thermal noise, making the noise negligible.
But a new "generative thermodynamic computer" works by leveraging the noise in the system rather than despite it, meaning it can complete computing tasks with orders of magnitude less energy than typical AI systems require. The scientists outlined their findings in a new study published Jan. 20 in the journal Physical Review Letters.
Stephen Whitelam, a staff scientist at the Molecular Foundry at the Lawrence Berkeley National Laboratory and the author of the new study, drew an analogy with boats in the ocean. Here, waves play the role of thermal noise, and conventional computing can be likened to an ocean liner that "just plows through like it doesn't care — very effective, but very costly,” he said.
If you were to shrink the energy consumption of conventional computing until it was comparable to the thermal noise, however, it would be like trying to steer a dinghy with an outboard motor across the ocean. "It's much more difficult," he told Live Science, and harnessing the noise in thermodynamic computing can help, like "a surfer harnessing wave power."
Conventional computing works with definite binary bit values — 1s and 0s. However, a growing body of research over the past decade has shown that you can get more bang for your buck, in terms of resources like the electricity consumed to complete a computation, by working with probabilities of values instead.
The efficiency gains are particularly pronounced for certain types of problems known as “optimization” problems, where you want to get the most out while putting the least in — visit the most streets to deliver post while walking the fewest miles, for example. Thermodynamic computing could be considered a type of probabilistic computing that uses the random fluctuations from thermal noise to power computation.
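A software caricature of this idea is a small network of probabilistic bits that flip under simulated thermal noise while slowly "cooling," settling into a low-energy answer to a tiny optimization problem. This is a minimal sketch with couplings invented for illustration; real thermodynamic hardware would get its randomness from physical noise rather than a pseudorandom generator:

```python
import math
import random

random.seed(0)

# A tiny Ising-style optimization problem (couplings invented for this sketch):
# choose spins s[i] in {-1, +1} to minimize E = -sum over pairs of J[i][j]*s[i]*s[j].
J = [[0, 1, -1],
     [1, 0, 1],
     [-1, 1, 0]]
n = 3

def energy(s):
    return -sum(J[i][j] * s[i] * s[j] for i in range(n) for j in range(i + 1, n))

s = [random.choice([-1, 1]) for _ in range(n)]
best, best_e = list(s), energy(s)
temp = 2.0

for _ in range(2000):
    i = random.randrange(n)
    field = sum(J[i][j] * s[j] for j in range(n))
    # Probabilistic bit: spin i is resampled from its thermal distribution,
    # so noise does the exploring instead of an exhaustive search.
    p_up = 1 / (1 + math.exp(-2 * field / temp))
    s[i] = 1 if random.random() < p_up else -1
    if energy(s) < best_e:
        best, best_e = list(s), energy(s)
    temp *= 0.999  # slowly cool toward low-energy configurations

print(best, best_e)  # a minimum-energy spin configuration
```

Here the random flips, not a deterministic search, do the work of finding the answer — the same division of labor that thermodynamic hardware exploits physically.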
Image generation with thermodynamic computing
Researchers at Normal Computing Corporation in New York, who were not directly involved in this image generation work, have built something close to a thermodynamic computer, using a network of circuits linked by other circuits, all operating at low energies comparable to thermal noise. The circuits doing the linking could then be programmed to strengthen or weaken the connection they form between the circuits they link — the “node” circuits.
Applying voltages to the system would set a series of voltages at the various nodes, assigning them values that would eventually subside as the applied voltages were removed and the circuits returned to equilibrium.
However, even at equilibrium, the noise in the circuits causes the values of the nodes to fluctuate in a specific way determined by the programmed strength of the connections, the so-called coupling strengths. As such, the coupling strengths can be programmed so that they effectively pose a question that the resulting equilibrium fluctuations answer. The researchers at Normal Computing showed that they could program the coupling strengths so that the equilibrium fluctuations of the nodes solved linear algebra problems.
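The linear-algebra trick can be mimicked in simulation. In the sketch below (an illustration of the principle, not the Normal Computing hardware; the coupling matrix and all parameters are invented), two noisy "nodes" relax under couplings given by a matrix A, and their equilibrium fluctuations directly encode the inverse of A:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 2x2 coupling matrix (symmetric, positive definite); its entries
# stand in for the programmed coupling strengths between the node circuits.
A = np.array([[2.0, 0.5],
              [0.5, 1.0]])

dt, n_steps = 5e-3, 200_000
x = np.zeros(2)
samples = []

# Overdamped Langevin dynamics: a restoring force set by A plus thermal kicks.
# At equilibrium the nodes fluctuate with covariance equal to the inverse of A,
# so simply recording the fluctuations "solves" a linear-algebra problem.
for step in range(n_steps):
    x = x - A @ x * dt + np.sqrt(2 * dt) * rng.standard_normal(2)
    if step > n_steps // 10:  # discard the approach to equilibrium
        samples.append(x.copy())

cov = np.cov(np.array(samples).T)
print(np.round(cov, 2))  # close to np.linalg.inv(A)
```

No arithmetic circuit ever computes the inverse; the physics of the fluctuations does it for free.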
Although the management of these connections offers some control over which question the equilibrium fluctuations in the node values are answering, it does not provide a way to change the type of question. Whitelam wondered whether moving away from thermal equilibrium might let researchers design a computer that could answer fundamentally different types of questions, and whether it would be more convenient, since reaching equilibrium can take a while.
While considering what kinds of calculations might be made possible by moving away from equilibrium, Whitelam found himself considering some research around the mid-2010s, which showed that if you took an image and added noise until no trace of the original image was visible, a neural network could be trained to reverse that process and thus retrieve the image. If you trained it on a range of such disappearing images, the neural network would be able to generate a range of images from a starting point of random noise, including some images outside the library it had been trained on. These diffusion models seemed to Whitelam “a natural starting point” for a thermodynamic computer, diffusion itself being a statistical process rooted in thermodynamics.
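The forward half of that process — drowning an image in noise — is easy to sketch in code. Below, a toy 4x4 "image" (invented for illustration; it is not one of the study's numeral images) is progressively corrupted by Gaussian noise:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy 4x4 "image" (pixel values invented for illustration).
image = np.array([[0, 1, 1, 0],
                  [1, 0, 0, 1],
                  [1, 0, 0, 1],
                  [0, 1, 1, 0]], dtype=float)

n_steps, beta = 50, 0.1  # beta is an assumed per-step noise level
x = image.copy()

# Forward diffusion: at each step, shrink the signal slightly and add noise.
# After enough steps, almost no trace of the original image remains; a
# diffusion model is trained to undo these steps one at a time.
for _ in range(n_steps):
    x = np.sqrt(1 - beta) * x + np.sqrt(beta) * rng.standard_normal(x.shape)

# The surviving fraction of the original signal shrinks geometrically.
signal_fraction = (1 - beta) ** (n_steps / 2)
print(round(signal_fraction, 3))  # prints 0.072
```

After 50 steps only about 7% of the original signal survives beneath noise of roughly unit strength, which is why the end state looks like pure static.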
While conventional computing works in ways that reduce noise to negligible levels, Whitelam noted, many algorithms used to train neural networks work by adding noise back in. "Wouldn't that be much more natural in a thermodynamic setting, where you get the noise for free?" he asked in a conference proceeding.
Borrowing from age-old principles
The way things develop under the influence of significant noise can be calculated from the Langevin equation, which dates back to 1908. Manipulating this equation can yield probabilities for each step in the process of an image becoming shrouded in noise. In a sense, it provides the probability for each pixel to flip to the wrong color as an image is subjected to thermal noise.
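In its standard overdamped form (a textbook statement, not a formula quoted from the study), the Langevin equation balances a deterministic force against a random thermal kick:

```latex
\frac{dx}{dt} = -\frac{1}{\gamma}\frac{\partial U(x)}{\partial x}
              + \sqrt{\frac{2 k_B T}{\gamma}}\,\eta(t),
\qquad \langle \eta(t)\,\eta(t') \rangle = \delta(t - t')
```

Here $U$ is a potential energy, $\gamma$ a friction coefficient, $k_B T$ the thermal energy, and $\eta(t)$ is white noise; the noise term is what makes each step of the dynamics probabilistic rather than deterministic.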
From there, it's possible to calculate the coupling strengths — for instance, circuit connection strengths — needed to reverse the process, removing the noise step by step. This generates an image, something Whitelam demonstrated in a numerical simulation using a library of images containing a "0," a "1" and a "2." The generated image can be one from the original training database or something new: as a bonus, imperfections in the training mean the system can come up with images that are not part of the original dataset.
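A stripped-down version of that noise reversal can be run with no training at all. In the sketch below (an illustration, not Whitelam's simulation), the "data" are one-dimensional Gaussian samples rather than images, which makes the score (the direction in which to nudge each noisy sample) exactly computable:

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical 1-D "dataset": samples from N(mu, s0^2). Because the data are
# Gaussian, the score of the noised data is known in closed form, so we can
# run the reverse process without training a neural network.
mu, s0 = 3.0, 0.5
T, beta = 200, 0.02
alpha = 1 - beta
abar = alpha ** np.arange(T + 1)  # cumulative signal retention per step

def score(x, t):
    # Marginal at step t is N(sqrt(abar_t)*mu, abar_t*s0^2 + 1 - abar_t).
    m = np.sqrt(abar[t]) * mu
    v = abar[t] * s0**2 + (1 - abar[t])
    return -(x - m) / v

# Start from pure noise and denoise step by step (a standard DDPM-style update).
x = rng.standard_normal(5000)
for t in range(T, 0, -1):
    z = rng.standard_normal(x.shape) if t > 1 else 0.0
    x = (x + beta * score(x, t)) / np.sqrt(alpha) + np.sqrt(beta) * z

print(round(x.mean(), 2), round(x.std(), 2))  # near mu=3.0 and s0=0.5
```

Starting from featureless noise, the stepwise nudges recover the statistics of the original data — the same logic a generative thermodynamic computer applies with physical noise and images.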
Ramy Shelbaya, CEO of Quantum Dice, a company that produces quantum random number generators, who was not involved in the study, described the findings as "important." He pointed to areas where traditional methods are starting to struggle to keep up with the ever-increasing demand for more powerful models. Shelbaya's company produces a type of probabilistic computing hardware using quantum-generated random numbers, and, as such, he found it "encouraging to see the ever-growing interest in probabilistic computing and the various computing paradigms closely related to it."
He also flagged a potential benefit beyond the energy savings: "This article also shows how physics-inspired approaches can provide a clear fundamental interpretation to a field where 'black-box' models have dominated, providing essential insights into the learning process," he told Live Science by email.
As generative AI goes, the retrieval of three learned numerals from noise may seem relatively rudimentary. However, Whitelam pointed out that the concept of thermodynamic computing is still just a few years old.
"Looking at the history of machine learning and how that was eventually scaled up to larger, more impressive tasks," he said, "I'm curious to know: can thermodynamic hardware, even in a conceptual sense, be scaled in the same way?"
Whitelam, S. (2025). Generative Thermodynamic Computing. Physical Review Letters, 136(3), 037101. https://doi.org/10.1103/kwyy-1xln

Anna Demming is a freelance science journalist and editor. She has a PhD from King’s College London in physics, specifically nanophotonics and how light interacts with the very small. She began her editorial career working for Nature Publishing Group in Tokyo in 2006. She has since worked as an editor for Physics World and New Scientist. Publications she has contributed to on a freelance basis include The Guardian, New Scientist, Chemistry World, and Physics World, among others. She loves all science generally, but particularly materials science and physics, such as quantum physics and condensed matter.
