About a month ago I started down the rabbit hole of what I see as a new frontier in computing: Thermodynamic Computing. This new frontier promises to harness the physics of thermodynamics in new computer hardware to efficiently run the set of algorithms that are themselves built on thermodynamic principles.
Coincidentally, the week after I started looking into this area, the Nobel Prize in Physics was awarded for work on “AI” (machine learning) related to these principles.
At the start of my career, with degrees in Physics and Computer Science, the prospect of getting involved in Quantum Computing seemed like a natural direction. I had a strong interest in Quantum Mechanics (with a minor paper published in a peer-reviewed physics journal as an undergrad), and my major project for my CS degree was in computational complexity analysis (specifically, for the family of functions known as the Primitive Recursive functions).
But I had also gained enough experience working on real-life software projects to know I wanted the satisfaction of building something tangible, and life is about navigating opportunity costs. I knew the challenges facing Quantum Computing could mean an entire career spent without a practical product built. And even now, despite great strides made in the field over the last few decades, the extreme complexity of hardware suitable for Quantum Computing still puts practical use at a distance probably measured in decades.
In terms of hardware, Thermodynamic Computing occupies a space between Classical Computing and Quantum Computing. To understand how, consider the key properties of each paradigm.
Classical Computing is deterministic, based on boolean logic, and builds up layers of abstraction from there. It’s the only kind of computing we currently have practical hardware for. Thermodynamic effects are the enemy: as hardware density gets too high, errors are introduced by the inability to keep circuit elements sufficiently isolated from each other (and not just hypothetically; we are already contending with these issues).
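To make “deterministic, based on boolean logic” concrete, here is a minimal sketch in Python (my own toy example, not a description of any particular hardware): a full adder composed from gates. Everything above it in the classical stack is more layers of exactly this kind of composition.

```python
# A toy illustration of boolean logic plus layers of abstraction: a full adder
# built from deterministic gates. Given the same inputs it always produces the
# same outputs; larger units (n-bit adders, ALUs, CPUs) stack more such layers.

def AND(a, b):
    return a & b

def OR(a, b):
    return a | b

def XOR(a, b):
    return a ^ b

def full_adder(a, b, carry_in):
    """Add two bits plus an incoming carry, returning (sum, carry_out)."""
    partial = XOR(a, b)
    s = XOR(partial, carry_in)
    carry_out = OR(AND(a, b), AND(partial, carry_in))
    return s, carry_out

print(full_adder(1, 1, 0))  # (0, 1), deterministically, every time
```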
Quantum Computing algorithms can be deterministic or non-deterministic (leveraging Quantum Random Number Generators – QRNGs). Their operations go beyond boolean logic, building on superposition and entanglement (qubits vs. bits), and this leads to extreme efficiency for certain types of algorithms (like Shor’s algorithm for factoring integers). Thermodynamic effects are also the enemy here, because the isolation required for quantum effects breaks down amidst the noise of multi-particle interactions. As a result, the hardware is extremely complex and costly, requiring things like temperatures as close as possible to absolute zero, isolation of single particles, and complex error correction to account for the limitations in achieving those constraints.
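As a rough illustration of where the non-determinism comes from, here is a minimal classical simulation of a single qubit in Python (a sketch of the math, not of real quantum hardware): a Hadamard gate puts |0⟩ into an equal superposition, and measurement then samples an outcome probabilistically.

```python
import numpy as np

# A minimal classical simulation of a single qubit, just to illustrate why
# quantum outcomes are probabilistic: the state is a vector of amplitudes,
# and measurement yields each outcome with probability |amplitude|^2.
# (Simulating many entangled qubits this way blows up exponentially, which
# is exactly why real quantum hardware is interesting.)

ket0 = np.array([1.0, 0.0])                    # the |0> state
H = np.array([[1.0, 1.0],
              [1.0, -1.0]]) / np.sqrt(2.0)     # Hadamard gate

state = H @ ket0                               # equal superposition of |0> and |1>
probs = np.abs(state) ** 2                     # [0.5, 0.5]

rng = np.random.default_rng()
print(rng.choice([0, 1], size=10, p=probs))    # e.g. [0 1 1 0 ...]
```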
In Thermodynamic Computing, thermodynamic effects are the hero, not the enemy. The value of this computational capability comes from the extreme utility of algorithms that require stochastic processes, most notably in the field of Probabilistic Machine Learning and related neural network algorithms. Today a large penalty is paid for having Classical Computing simulate Thermodynamic Computing by artificially producing data that can be characterized as “noise”. The promise of hardware that accesses natural noise directly is orders of magnitude more efficiency and less energy use for these algorithms (i.e., faster, cheaper, low-energy AI capabilities).
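To make the “simulated noise” penalty concrete, here is a minimal sketch in Python (a toy example of my own, not Extropic’s or Normal Computing’s method): unadjusted Langevin dynamics drawing samples from a simple energy model. Every step has to generate Gaussian noise with a pseudo-random number generator; the pitch of thermodynamic hardware is that equivalent noise comes essentially for free from the physics of the device.

```python
import numpy as np

# Unadjusted Langevin dynamics sampling from p(x) ~ exp(-E(x)) with
# E(x) = x^2 / 2 (a standard Gaussian). The rng.normal() call is the
# artificially produced "noise" that classical hardware must compute at
# every step; a thermodynamic device would source it from physical noise.

rng = np.random.default_rng(0)

def grad_energy(x):
    """Gradient of E(x) = x^2 / 2."""
    return x

def langevin_samples(steps=10_000, step_size=0.01):
    x = 0.0
    out = np.empty(steps)
    for i in range(steps):
        noise = rng.normal()  # pseudo-random Gaussian noise, generated in software
        x = x - step_size * grad_energy(x) + np.sqrt(2.0 * step_size) * noise
        out[i] = x
    return out

samples = langevin_samples()
print(samples.mean(), samples.std())  # roughly 0 and 1, matching the target Gaussian
```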
What does it all mean in the end?
For me personally, it’s a possible place to focus my career for a while, and the research into Probabilistic Machine Learning is bound to be super interesting and valuable on its own.
For all of us, the implications for accelerating AI capabilities are tremendous, possibly giving us an incredible unlock of capability in the coming decade(s). Imagine what we’ve seen in the area of “AI” in the last couple of years with LLMs and Agents, and then imagine supercharging the ability to deliver that functionality at orders of magnitude less cost. And less energy, which means embedding capability in a broader set of end devices, or lowering the bar for inclusion.
This goes beyond “chat apps” to more obviously important applications; maybe efficiently running these inherently stochastic algorithms will unlock new bioinformatics approaches that yield new therapies, or lower the bar on cost of access, and save lives.
As always, our creativity is the limit.
For current developments in this area, keep an eye on Extropic (https://www.extropic.ai/), which is solely focused on building new Thermodynamic Computing hardware, and Normal Computing (https://www.normalcomputing.com/), which is also playing in this area.
A great resource I’ve found on the subject of Probabilistic Machine Learning is the set of online books available at https://github.com/probml/pml-book/blob/main/README.md.
I found the interview below with the founders of Extropic a great way to hear more about the basics of Thermodynamic Computing.