Microsoft unveils new liquid-cooled computer chips — they could prevent AI data centers from massively overheating
Microsoft engineers have developed a microfluidics chip-cooling technique that removes heat more efficiently than conventional cold plates and could keep increasingly power-hungry AI chips from overheating.

Microsoft engineers have devised a new way to keep data centers cool — and it might help prevent the next generation of artificial intelligence (AI) hardware from cooking itself to death.
The technology is based on "microfluidics" and involves pumping liquid coolant through tiny channels etched directly into silicon chips.
In a statement, Microsoft representatives said the technology was up to three times more effective at removing heat than conventional "cold plate" methods for cooling data centers.
The company hopes that microfluidics will make it possible for data centers to run more-intensive computational workloads without the risk of overheating, particularly as newer, more powerful AI processors enter the market. These generate far more heat than earlier generations of computer chips, with Microsoft warning that current cooling technology could max out data center performance in "just a few years."
"If you're still relying heavily on traditional cold plate technology, you're stuck," Sashi Majety, senior technical program manager at Microsoft, said in the statement. "In as soon as five years, this could become a ceiling on performance."
Graphics processing units (GPUs) are often used in data centers because they can run multiple calculations in parallel. This makes them ideal for powering AI and other computationally intensive workloads.
To prevent them from overheating, GPUs are typically cooled using metal cold plates. These are mounted on top of the chip's housing and circulate coolant over and around it to draw heat away. However, cold plates are separated from the silicon by multiple layers, which limits how much heat they can extract from the chip.
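To see why those extra layers matter, here is a minimal back-of-the-envelope sketch in Python. The per-layer thermal resistances and the GPU power figure are hypothetical round numbers chosen only to illustrate the stack-up, not measurements from Microsoft's prototype; the point is simply that the chip's temperature rise scales with the total thermal resistance between the silicon and the coolant, and direct-to-die channels cut out most of that stack.

```python
# Illustrative only -- not Microsoft's figures. Compare the thermal resistance a watt of
# heat must cross under a conventional cold plate vs. coolant flowing through the die itself.

# Hypothetical per-layer thermal resistances, in kelvin per watt (K/W), for a cold-plate stack.
cold_plate_stack = {
    "silicon die": 0.01,
    "thermal interface material 1": 0.02,
    "integrated heat spreader (lid)": 0.01,
    "thermal interface material 2": 0.02,
    "cold plate and coolant": 0.02,
}

# With microchannels etched into the die, coolant touches the silicon directly,
# so most of the intermediate layers drop out of the heat path.
microfluidic_stack = {
    "silicon die": 0.01,
    "coolant in etched channels": 0.015,
}

def temperature_rise(stack, power_watts):
    """Steady-state rise above coolant temperature: delta_T = power x total resistance."""
    return power_watts * sum(stack.values())

power = 700.0  # watts, roughly the class of a modern data-center GPU (assumed)
print(f"Cold plate:    {temperature_rise(cold_plate_stack, power):.0f} K above coolant")
print(f"Microfluidics: {temperature_rise(microfluidic_stack, power):.0f} K above coolant")
```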
Cool and groovy
Microsoft's microfluidics technology involves etching grooves the size of a human hair directly into the silicon die — the densely packed computational core of the chip. When coolant is sent directly to the die via this microscopic pipework, heat is carried away much more efficiently.
The prototype chip went through four design iterations. Microsoft partnered with Swiss startup Corintis to develop a layout inspired by leaf veins and butterfly wings — patterns that distribute liquid across branching paths rather than straight lines.
The aim was to reach hotspots more precisely and avoid clogging or cracking the silicon. An AI model optimized these cooling paths using heat maps that showed where temperatures tended to run highest on the processor.
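Microsoft has not published its optimization code, but the basic idea of letting a heat map bias where coolant flows can be sketched in a few lines of Python. The heat map values and channel budget below are made-up illustrative numbers: regions simply receive microchannels in proportion to their share of the total heat, so the densest branching ends up over the hottest silicon.

```python
import numpy as np

# Hypothetical 4x4 heat map of a die (watts per region) -- illustrative numbers only,
# standing in for the measured hotspot data that guided the real channel layout.
heat_map = np.array([
    [ 5,  8, 12,  6],
    [ 7, 30, 35,  9],   # hot cluster near the compute cores
    [ 6, 28, 25,  8],
    [ 4,  6,  7,  5],
], dtype=float)

total_channels = 200  # assumed overall budget of microchannels across the die

# Allocate channels to each region in proportion to its share of total heat.
channel_allocation = np.round(total_channels * heat_map / heat_map.sum()).astype(int)

print("Channels per region:")
print(channel_allocation)
print("Hottest region gets", channel_allocation.max(), "channels;",
      "coolest gets", channel_allocation.min())
```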
Engineers then tested the design on a GPU running a simulated Microsoft Teams workload — a mix of video, audio and transcription services used to reflect typical data center conditions. In addition to carrying away heat much more efficiently, the microfluidic cooling system reduced the peak temperature rise in the GPU's silicon by 65%, according to Microsoft representatives.
Beyond better thermal control, Microsoft hopes microfluidics could allow for overclocking — safely pushing chips beyond their normal operating limits without burning them out.
"Whenever we have spiky workloads, we want to be able to overclock," Jim Kleewein, technical fellow at Microsoft 365 Core Management, said in the statement. "Microfluidics would allow us to overclock without worrying about melting the chip down because it’s a more efficient cooler of the chip."
The company is now exploring how to apply microfluidics to its custom Cobalt and Maia chips and plans to work with fabrication partners to bring the technology into broader use. Future applications may include cooling 3D-stacked chips, which are notoriously hard to design because of the heat that builds up between layers.
"We want microfluidics to become something everybody does, not just something we do," Kleewein said. "The more people that adopt it the better, the faster the technology is going to develop, the better it's going to be for us, for our customers, for everybody."
Owen Hughes is a freelance writer and editor specializing in data and digital technologies. Previously a senior editor at ZDNET, Owen has been writing about tech for more than a decade, during which time he has covered everything from AI, cybersecurity and supercomputers to programming languages and public sector IT. Owen is particularly interested in the intersection of technology, life and work – in his previous roles at ZDNET and TechRepublic, he wrote extensively about business leadership, digital transformation and the evolving dynamics of remote work.