Over 400 million transistors are packed on dual-core chips manufactured using Intel's 45nm process. That'll double soon, per Moore's Law. And it'll still be like computing with pebbles compared to quantum computing.

Quantum computing is a pretty complicated subject—uh, hello, quantum mechanics *plus* computers. I'm gonna keep it kinda basic, but recent breakthroughs like this one prove that you should definitely start paying attention to it. Some day, in the future, quantum computing will be cracking codes, powering web searches, and maybe, just maybe, lighting up our Star Trek-style holodecks.

Before we get to the quantum part, let's start with just "computing." It's about bits. They're the basic building block of computing information. They've got two states—0 or 1, on or off, true or false, you get the idea. But two defined states is key. When you add a bunch of bits together, usually eight of 'em, you get a byte. As in kilobytes, megabytes, gigabytes and so on. Your digital photos, music, documents—they're all just long strings of 1s and 0s, segmented into 8-digit strands. Because of that binary setup, a classical computer operates by a certain kind of logic that makes it good at some kinds of computing—the general stuff you do every day—but not so great at others, like finding ginormous prime factors (those things from math class), which are a big part of cracking codes.
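If you want to actually see those 8-digit strands, here's a quick Python sketch (my own toy example—the text and variable names are made up for illustration):

```python
# Every character in your files is stored as a byte: a strand of eight bits.
text = "Hi"
for ch in text:
    byte = format(ord(ch), "08b")  # this character's 8-digit string of 1s and 0s
    print(ch, "->", byte)
# 'H' comes out as 01001000, 'i' as 01101001
```

String a few billion of those together and you've got your music library.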

Quantum computing operates by a different kind of logic—it actually uses the rules of quantum mechanics to compute. Quantum bits, called qubits, are different from regular bits, because they don't just have two states. They can have multiple states, superpositions—they can be 0 or 1 or 0-1 or 0+1 or 0 *and* 1, all at the same time. It's a lot deeper than a regular old bit. A qubit's ability to exist in multiple states—the combo of all those being a superposition—opens up a big freakin' door of possibility for computational powah, because it can factor numbers insanely faster than standard computers can.
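To make that less hand-wavy: a qubit is described by two numbers, the amplitudes of its 0 and 1 states, and measuring it gives you 0 or 1 with probabilities equal to those amplitudes squared. Here's a toy Python sketch of that idea (a classical cartoon of one qubit, not a real quantum simulator—all names are mine):

```python
import math
import random

# A toy qubit: a pair of amplitudes (alpha, beta) for the 0 and 1 states.
# The "0+1" and "0-1" states from the text are equal superpositions that
# differ only in the sign of the second amplitude.
plus  = (1 / math.sqrt(2),  1 / math.sqrt(2))   # the "0+1" state
minus = (1 / math.sqrt(2), -1 / math.sqrt(2))   # the "0-1" state

def measure(qubit):
    """Collapse the superposition: return 0 or 1 with probability amp**2."""
    alpha, beta = qubit
    return 0 if random.random() < alpha ** 2 else 1

print(measure(plus))   # 0 half the time, 1 the other half
```

Both states give you 50/50 odds when measured—the sign difference only matters once qubits start interfering with each other, which is exactly where the quantum magic lives.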

Entanglement—a quantum state that's all about tight correlations between systems—is the key to that. It's a pretty hard thing to describe, so I asked for some help from Boris Blinov, a professor at the University of Washington's Trapped Ion Quantum Computing Group. He turned to a take on Schrödinger's cat to explain it: Basically, you have a cat in a closed box, and poisonous gas is released. The cat is either dead, 0, or alive, 1. Until I open the box to find out, it exists in both states—a superposition. That superposition is destroyed when I measure it. But suppose I have two cats in two boxes that are correlated, and you go through the same thing. If I open one box and the cat's alive, it means the other cat is too, even if I never open the box. It's a quantum phenomenon that's a stronger correlation than you can get in classical physics, and because of that you can do something like this with quantum algorithms—change one part of the system, and the rest of it will respond accordingly, without changing the rest of the operation. That's part of the reason it's faster at certain kinds of calculations.
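The two-cat story can be sketched in a few lines of Python. Fair warning: this only captures the perfect correlation part—a classical coin flip shared between the boxes looks identical here, and the genuinely stronger-than-classical behavior only shows up when you measure entangled pairs in different ways. Names and setup are my own toy example:

```python
import random

# Toy version of Blinov's two-cat story: an entangled pair has only two
# possible joint outcomes, (alive, alive) or (dead, dead), each 50/50.
def open_boxes():
    """'Measure' the pair: opening one box fixes what's in the other."""
    outcome = random.choice([0, 1])  # 0 = dead, 1 = alive
    return outcome, outcome          # both cats share the same fate

# Open a thousand pairs of boxes: the results always match.
results = [open_boxes() for _ in range(1000)]
print(all(a == b for a, b in results))  # prints True
```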

The other, explains Blinov, is that you can achieve true parallelism in computing—actually process a lot of information in parallel, "not like Windows" or even other types of classic computers that profess parallelism.

So what's that good for? For example, a password that might take years to crack via brute force using today's computers could take mere seconds with a quantum computer, so there's plenty of crazy stuff that Uncle Sam might want to put it to use for in cryptography. And it might be useful to search engineers at Google, Microsoft and other companies, since you can search and index databases much, much faster. And let's not forget scientific applications—no surprise, classic computers really suck at modeling quantum mechanics. The National Institute of Standards and Technology's Jonathan Home suggests that given the way cloud computing is going, if you need an insane calculation performed, you might rent time and farm it out to a quantum mainframe in Google's backyard.

The reason we're not all blasting on quantum computers now is that this quantum mojo is, at the moment, extremely fragile. And it always will be, since quantum states aren't exactly robust. We're talking about working with ions here—rather than electrons—and if you think heat is a problem with processors today, you've got no idea. In the breakthrough by Home's team at NIST—completing a full set of quantum "transport" operations, moving information from one area of the "computer" to another—they worked with a single pair of atoms, using lasers to manipulate the states of beryllium ions, storing the data and performing an operation, before transferring that information to a different location in the processor. What allowed it to work, without busting up the party and losing all the data through heat, were magnesium ions cooling the beryllium ions as they were being manipulated. And those lasers can only do so much. If you want to manipulate more ions, you have to add more lasers.

Hell, quantum computing is so fragile and unwieldy that when we talked to Home, he said much of the effort goes into methods of correcting errors. In five years, he says, we'll likely be working with a mere *tens* of qubits. The stage it's at right now, says Blinov, is "the equivalent of building a reliable transistor" back in the day. But that's not to say those tens of qubits won't be useful. While they won't be cracking stuff for the NSA—you'll need about 10,000 qubits for cracking high-level cryptography—that's still enough quantum computing power to calculate properties for new materials that are hard to model with a classic computer. In other words, materials scientists could be developing the case for the iPhone 10G or the building blocks for your next run-of-the-mill Intel processor using quantum computers in the next decade. Just don't expect a quantum computer on your desk in the next 10 years.

*Special thanks to the National Institute of Standards and Technology's Jonathan Home and University of Washington professor Boris Blinov!*