
Extropic debuts probabilistic TSU chip, claims thousands-fold energy savings and challenges Nvidia, AMD, Intel

October 30, 2025



A small startup says it has produced the first functioning hardware for a chip that computes with probabilities instead of strict 1s and 0s, and early tests suggest that larger systems built on the design could tackle real tasks in artificial intelligence and scientific research. The work is an attempt to challenge the familiar processors from Nvidia, AMD, and Intel by changing what counts as a basic computational unit.

Extropic calls its processors thermodynamic sampling units, or TSUs. Rather than forcing every signal into a fixed binary state, the devices use silicon elements that exploit thermodynamic electron fluctuations and shape those fluctuations to represent the odds of different outcomes. The company says the method is well suited to problems that are inherently uncertain, from weather forecasting to generative models that produce images, text, or video.
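As a rough illustration of that idea in software (a generic sketch, not a model of Extropic's actual devices), consider an element whose individual reads are random but whose statistics over many reads recover the probability it encodes:

import numpy as np

rng = np.random.default_rng(seed=0)

def noisy_element(p_one, n_reads=100_000):
    # Toy model of a fluctuating element: each read is a random 0 or 1, and
    # the fraction of 1s over many reads recovers the encoded probability.
    # Purely illustrative, not a description of Extropic's silicon.
    return (rng.random(n_reads) < p_one).astype(int)

reads = noisy_element(p_one=0.7)
print(reads.mean())  # roughly 0.7: the probability lives in the statistics, not in any single read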

The startup argues the architecture will be far more energy efficient when scaled. Extropic claims a potential advantage of thousands of times lower energy use for some workloads once many chips are linked together. With AI companies spending enormous sums to build massive data centers, the firm presents its design as a lower-cost, lower-power alternative to expanding fleets of conventional GPUs and CPUs.

Extropic has shipped its first working chip to a small group of partners that includes frontier AI labs, startups focused on weather modeling, and representatives from several governments, though it has declined to name them. “This allows all sorts of developers to kick the tires,” says Extropic CEO Guillaume Verdon, who before founding the company was known online as Based Beff Jezos and for promoting a techno-philosophy called effective accelerationism, or e/acc. Verdon founded the startup with Trevor McCourt, now its CTO; both worked on quantum computing at Google before pivoting to this line of research.

One of the organizations testing the hardware is Atmo, a startup that builds AI models to produce higher-resolution weather forecasts. Johan Mathe, Atmo’s CEO, says the chips could make it much cheaper to compute the probability of different weather scenarios. Atmo’s customers include the Department of Defense, and Mathe says the ability to run probabilistic computation efficiently would help applications that require rapid, high-resolution risk estimates.

Extropic has also released a software package called THRML that emulates the behavior of its probabilistic chips on a GPU. The tool is meant to let researchers and developers explore the programming model without immediate access to the physical device. Mathe has used both THRML and the real chip. “I was able to run a few p-bits and see that they behave the way they are supposed to,” Mathe says.

The company’s current hardware, labeled XTR-0, combines a field-programmable gate array (FPGA) that can be reconfigured for different tasks with two of Extropic’s first probabilistic chips, called X-0, each of which holds a small number of p-bits. Rather than bits that are strictly 0 or 1, p-bits sample between those states with a probability that can be steered by surrounding circuitry. The prototype is limited in scale, yet company engineers say it demonstrates core elements of the design and the software stack that will be needed for larger machines.
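A minimal software sketch of a steerable p-bit, assuming for illustration that a sigmoid maps the steering bias to the probability of reading a 1 (the chip's actual steering mechanism is not described in that detail):

import numpy as np

rng = np.random.default_rng(seed=1)

def pbit(bias, n_reads=50_000):
    # A p-bit samples 0 or 1 with a probability steered by its bias input.
    # The sigmoid mapping from bias to P(read = 1) is an assumption made for
    # illustration; it stands in for whatever steering the surrounding
    # circuitry applies in the real hardware.
    p_one = 1.0 / (1.0 + np.exp(-bias))
    return (rng.random(n_reads) < p_one).astype(int)

for bias in (-2.0, 0.0, 2.0):
    print(bias, pbit(bias).mean())  # roughly 0.12, 0.50, and 0.88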

“We have a machine-learning primitive that is far more efficient than matrix multiplication,” McCourt says. “The question is, how do you build something on the scale of ChatGPT or Midjourney?”

A paper Extropic posted to arXiv lays out how a future chip with thousands of p-bits might be assembled and used to run a different kind of diffusion model. Diffusion models are a key class of algorithms for generating media and for guiding robotic behavior, and the company suggests a probabilistic hardware substrate could perform those algorithms with lower energy cost and different performance characteristics than floating-point matrix math on GPUs.
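The underlying primitive such hardware targets can be sketched in software as coupled p-bits settling into a Boltzmann-style distribution through repeated conditional sampling. The couplings below are arbitrary illustration values, and this is the generic sampling primitive, not the algorithm from Extropic's paper:

import numpy as np

rng = np.random.default_rng(seed=2)

def gibbs_sample(weights, biases, n_sweeps=2_000):
    # Repeatedly re-sample each p-bit with a probability set by its bias plus
    # the weighted states of its neighbors, a software stand-in for letting
    # coupled probabilistic hardware settle into its natural statistics.
    n = len(biases)
    state = rng.integers(0, 2, size=n)
    samples = []
    for _ in range(n_sweeps):
        for i in range(n):
            field = biases[i] + weights[i] @ state
            p_one = 1.0 / (1.0 + np.exp(-field))
            state[i] = int(rng.random() < p_one)
        samples.append(state.copy())
    return np.array(samples)

# Three coupled p-bits with made-up values: the positive coupling between
# units 0 and 1 makes them tend to take the same value.
W = np.array([[0.0, 1.5, 0.0],
              [1.5, 0.0, -1.0],
              [0.0, -1.0, 0.0]])
b = np.array([0.2, -0.2, 0.0])
samples = gibbs_sample(W, b)
print((samples[:, 0] == samples[:, 1]).mean())  # roughly 0.7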

Extropic says it expects to deliver a larger device next year, a chip the company calls Z-1 that it says will contain 250,000 p-bits. “It could be a huge win,” Mathe says of the forthcoming Z-1.

Investors and founders working on distributed approaches to AI view the idea as potentially consequential. “Their approach to the physics of information processing could prove transformative over the next decade, particularly as conventional transistor scaling hits fundamental limits,” says Vincent Weisser, CEO of Prime Intellect. “If scaled practically, it could deliver orders-of-magnitude improvements in energy efficiency and density, critical for workloads where energy per operation is a bottleneck.”

Verdon and McCourt stress that the current rush to build AI data centers often overlooks the energy required to run and cool those facilities, a cost that keeps rising as models grow. “Even if we have a 1 percent chance of success—and we think it’s much higher than that—it’s worth trying,” McCourt says.
