To Keep Innovating, We Need To Rethink The CPU

Big data processing will be the future of discovery and innovation. Our greatest discoveries will come not from carefully conceiving a theory and testing it, but from throwing every possibility up against a wall of CPUs and seeing what sticks.

And as GigaOm points out, we’re quickly reaching a point where the CPU in its current form just won’t do.

The shift is a move from creating scads of information in a format that can be stored cheaply to being able to process and analyse that information cheaply as well (all the while adding new layers of data thanks to a proliferation of devices and networks). The challenge is that under the current computing paradigm, adding more processing power is problematic, both because it’s becoming more difficult to cram more transistors onto a chip and because those chips and their surrounding servers are sucking up an increasing amount of power.

The solution? Rethink the CPU’s design. GPU computing, while great for the raw number crunching of supercomputing, only suits certain workloads. And packing more and more transistors into different configurations – Intel’s 3D chips, for example – will only go so far. HP has been reconsidering every aspect of the processor, and thinks it has found the future of computing in a new circuit component: the memristor.

HP’s answer is its concept of nanostores, chips that tie memory and processor together using a completely new kind of circuit element called a memristor. The basic premise for HP is that 80 percent of the energy inside a data centre is tied to moving data from memory to the processor and back again. We’re already seeing memory move closer to the processor to speed up computing (that’s what the addition of Flash inside the data centre is about).
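To see why data movement can dominate, here’s a back-of-envelope sketch. The per-operation energy numbers are ballpark assumptions of my own (order-of-magnitude figures often quoted for hardware of this era), not figures from HP or GigaOm:

```python
# Back-of-envelope sketch of why shuttling data can dwarf the cost of computing on it.
# Both energy figures below are illustrative assumptions, not measurements from HP.

PJ_PER_OP = 20           # assumed energy for one arithmetic operation, in picojoules
PJ_PER_DRAM_WORD = 2000  # assumed energy to fetch one 64-bit word from off-chip DRAM

def movement_share(ops, dram_words):
    """Fraction of total energy spent moving data between memory and the processor."""
    compute = ops * PJ_PER_OP
    movement = dram_words * PJ_PER_DRAM_WORD
    return movement / (compute + movement)

# A workload that touches DRAM once for every ten arithmetic operations:
print(f"{movement_share(ops=10, dram_words=1):.0%} of the energy goes to data movement")
```

Under those assumptions, even a workload doing ten operations per memory fetch spends roughly 90 percent of its energy on data movement, which is exactly the imbalance nanostores are meant to attack.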

And even beyond preserving Moore’s Law or staying relevant in the market, the push to keep advancing CPU design matters for the future of science. Take, for example, this New Yorker piece on quantum computing and the leading mind behind it, David Deutsch. Its big idea is that if we built a working quantum computer, it could theoretically process more numbers than there are believed to be particles in the universe.
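To put that in perspective, the state of an n-qubit register is described by 2^n amplitudes, and 2^n overtakes the commonly cited estimate of roughly 10^80 particles in the observable universe at fewer than 300 qubits. A quick sanity check (the particle count is the usual rough estimate, not a figure from the New Yorker piece):

```python
import math

# Commonly cited rough estimate of the number of particles in the observable universe.
PARTICLE_ESTIMATE = 10 ** 80

# Smallest register size n whose state space (2**n amplitudes) exceeds that estimate.
n = math.ceil(math.log2(PARTICLE_ESTIMATE))
print(n, 2 ** n > PARTICLE_ESTIMATE)  # -> 266 True
```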

What’s that good for? It could prime factorise absurdly large numbers in a matter of seconds. And it could test the validity of the Many Worlds Interpretation: the idea that there are parallel universes out there. The theory supposes that there is a different universe for every possible permutation of anything in the universe. One scientist, Peter Shor, developed an algorithm for quantum computers that would potentially support this theory, if we ever had a quantum computer powerful enough to run it on.
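For the curious, the factoring trick at the heart of Shor’s algorithm is a reduction from factoring to period finding. Here’s a minimal classical sketch of that reduction (my own illustration, not code from either article); the brute-force order-finding loop is exactly the step a quantum computer would replace with an exponentially faster quantum subroutine:

```python
import math
import random

def find_order(a, n):
    """Classically brute-force the order r of a modulo n (smallest r with a**r % n == 1).
    This is the exponentially slow step that Shor's quantum period finding replaces."""
    r, value = 1, a % n
    while value != 1:
        value = (value * a) % n
        r += 1
    return r

def shor_classical_sketch(n):
    """Factor n using the classical reduction at the heart of Shor's algorithm."""
    while True:
        a = random.randrange(2, n)
        g = math.gcd(a, n)
        if g > 1:
            return g, n // g        # lucky guess: a already shares a factor with n
        r = find_order(a, n)
        if r % 2:                   # need an even order to split n
            continue
        x = pow(a, r // 2, n)
        if x == n - 1:              # trivial square root of 1, try another a
            continue
        p = math.gcd(x - 1, n)
        if 1 < p < n:
            return p, n // p

print(shor_classical_sketch(15))    # -> (3, 5) or (5, 3)
```

On classical hardware that order-finding loop takes exponentially long as the numbers grow; the whole point of a quantum computer is to do that one step quickly.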

The theory also explains how quantum computers might work. Deutsch told the New Yorker that a quantum computer would be “the first technology that allows useful tasks to be performed in collaboration between parallel universes.” The quantum computer’s processing power would come from a kind of outsourcing of work, in which calculations literally take place in other universes. Entangled particles would function as paths of communication among different universes, sharing information and gathering the results. So, for example, in the case of Shor’s algorithm, Deutsch said, “When we run such an algorithm, countless instances of us are also running it in other universes.”

Now tell me you don’t want to see that happen before you die. [GigaOm and New Yorker]

