The Next Horizon: Exploring the Enigma of 128-bit CPUs

You have probably heard the term 'bit' before. If you have had any contact with computing at all, it is far from an empty word. The bit is central to the entire field, and the march through ever-larger bit sizes has been the backbone of technological development: from the pixellated nostalgia of 8-bit video games to the modern 64-bit machines running Google Docs, bit sizes have bannered each new generation of computing technology. So why don't we use 128-bit CPUs? This essay tries to answer that question beneath the question, not by lecturing you on Moore's law, but by looking at why technological development progresses the way it does, and why a simple doubling sometimes doesn't do the job.

What's in a Bit? Unraveling the Basics

The humble bit, a binary digit that can represent a 0 or a 1, runs through the veins of the digital world as its most basic unit of data. When we talk about a CPU's 'bitness', we're really talking about how many bits it can handle at once, which in turn determines how much memory it can address. The leap from 32-bit to 64-bit architectures wasn't just an incremental increase; it expanded the addressable memory space from 4 GiB to a theoretical 16 exbibytes.
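
To make that factor concrete, here is a minimal C sketch (purely illustrative, no real-world API assumed) that works out how many bytes a 32-bit and a 64-bit machine can address:

```c
#include <stdio.h>
#include <math.h>

int main(void) {
    /* An n-bit address can name 2^n distinct bytes. */
    double addr_32 = ldexp(1.0, 32);   /* 2^32 bytes, i.e. 4 GiB  */
    double addr_64 = ldexp(1.0, 64);   /* 2^64 bytes, i.e. 16 EiB */

    printf("32-bit address space: %.0f bytes\n", addr_32);
    printf("64-bit address space: %.0f bytes\n", addr_64);
    printf("ratio: %.0f times larger\n", addr_64 / addr_32);
    return 0;
}
```

Going from 32 to 64 bits doesn't double the address space; it multiplies it by roughly four billion, which is why that particular jump mattered so much.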

The Evolutionary Path: From 1-bit to 64-bit

Tracing the evolution of CPUs is a good way to understand the appeal of simply adding more bits, and how that appeal shaped the industry we have today. The move from 1-bit to 8-bit processors opened the gate to the era of arcade machines and game consoles, a world where digital entertainment started to bloom. Going from 8-bit to 16-bit and then to 32-bit wasn't just adding a few more digits to a number; each step was a revolution in itself, bringing greater numeric precision, richer colours on screen, and more complexity to what computers could do.

The Paradox of 128-bit: A Solution in Search of a Problem

But what about 128-bit? The answer is both technical and practical. Individual elements within compute cores, for example the vector calculation blocks known as SIMD (single-instruction, multiple-data) units, already operate naturally on 128 bits and beyond. But extending this to the entire CPU is out of step with current requirements: a 128-bit address space could represent an astronomically large number of values, far more in the realm of the theoretical than the practical.
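
As a rough illustration of what those 128-bit SIMD units already do, here is a small C sketch using the x86 SSE intrinsics, which operate on 128-bit registers (it only compiles on x86 hardware, and the values are arbitrary):

```c
#include <stdio.h>
#include <xmmintrin.h>   /* SSE: 128-bit vector operations on x86 */

int main(void) {
    /* One 128-bit XMM register holds four 32-bit floats;
       a single instruction adds all four lanes at once. */
    __m128 a = _mm_set_ps(4.0f, 3.0f, 2.0f, 1.0f);
    __m128 b = _mm_set_ps(40.0f, 30.0f, 20.0f, 10.0f);
    __m128 sum = _mm_add_ps(a, b);

    float out[4];
    _mm_storeu_ps(out, sum);
    printf("%.1f %.1f %.1f %.1f\n", out[0], out[1], out[2], out[3]);
    return 0;
}
```

Four additions happen in one instruction, yet the CPU's general-purpose registers, pointers and addresses stay 64-bit: the wide data path exists only inside the vector unit.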

Even quantities that are themselves 128 bits wide, such as IPv6 addresses and the UUIDs used as unique identifiers, are handled perfectly well on 64-bit systems. So we are faced with the odd circumstance where the hardware building blocks for 128-bit CPUs exist, but nobody has the motivation to build them.
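
To see why, here is a minimal sketch (the u128 struct and helper below are hypothetical, not a standard library type) of how a 64-bit machine typically deals with a 128-bit identifier such as a UUID: it simply splits the value into two 64-bit halves.

```c
#include <stdio.h>
#include <stdint.h>

/* A 128-bit identifier (UUID or IPv6 address) stored as two
   64-bit halves, the usual trick on 64-bit machines. */
typedef struct {
    uint64_t hi;
    uint64_t lo;
} u128;

/* Compare two 128-bit values using only 64-bit operations. */
static int u128_equal(u128 a, u128 b) {
    return a.hi == b.hi && a.lo == b.lo;
}

int main(void) {
    u128 uuid_a = { 0x123e4567e89b12d3ULL, 0xa456426614174000ULL };
    u128 uuid_b = { 0x123e4567e89b12d3ULL, 0xa456426614174000ULL };
    printf("equal: %s\n", u128_equal(uuid_a, uuid_b) ? "yes" : "no");
    return 0;
}
```

Two 64-bit comparisons instead of one 128-bit comparison is hardly a bottleneck, which is why formats like these never created real pressure for wider general-purpose registers.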

A Glimpse Into the Future: Will 128-bit Ever Make Sense?

Is it worth future-proofing for 128-bit computing? It's fun to imagine. For now, the open-ended design of the RISC-V ISA means the door isn't exactly closed. It may be that simulating entire universes, or tackling problems of a scale we can barely comprehend, will one day demand it. But for the moment, that remains more of a fun fantasy than a household reality.

Conclusion: The Journey Continues, With or Without 128-bit CPUs

The evolution from 1-bit to 64-bit CPUs reflects a journey of technological advancement grounded in need, ingenuity, and a never-ending urge to see what's over the next horizon. As humanity keeps pushing the limits of computing, progress won't be about adding more bits so much as increasing the efficiency, power and capability to tackle the hard problems of today, tomorrow, and the day after that. Will 128-bit CPUs ever play a role? Good question. But it's not strictly a technical issue; it's a question of what we actually need in order to excel and overcome.

Understanding Consoles

Having traced the trajectory of computing bit sizes, it's worth pausing to consider the role consoles played in the progression from 8-bit gaming to 64-bit immersion, and why that matters to the overall argument. Consoles drove the development of computing 'bits' and made those bits feel tangible, letting us step back and see the trajectory of progress: each increase in bit count translated into a visible increase in what the technology could do for the user. For many people, a jump from 64 to 128 bits would need to be more than a matter of technological progress; it would have to open up new ways to apply that technology to your life and make it fun.

Jun 16, 2024