How The 2010s Changed Physics Forever

This decade marked not just one but a series of turning points in the history of physics.

The 2010s were an incredible decade for new knowledge, but more importantly, this decade’s discoveries—and the resounding lack thereof—have changed the way physicists think about their respective fields. Particle physics and astrophysics have entered new eras that will reshape the way researchers do science. New technology based on the framework of quantum mechanics could mark a major shift in computing, materials science, and the way we handle energy.

“It feels like we’re in the middle of a paradigm shift,” Natalia Toro, associate professor in particle physics and astrophysics at Stanford University, told Gizmodo. “It’s still not clear where we’re going, but I think that in 50 years from now, the past decade will be remembered as the beginning of a major shift in our understanding of physics.”

Finding the smallest stuff

This decade brought radical shifts in the way scientists understand both the big and the small. Perhaps most notably, scientists at the Large Hadron Collider, a particle accelerator and collider housed in a 27-kilometre ring near Geneva, Switzerland, discovered evidence of the Higgs boson, the last particle described by the central theory of particle physics, called the Standard Model.

Before 1964, some theories worked pretty well to describe the universe, but they had a problem: They predicted that certain particles physicists already knew to have mass should be massless. Then six scientists (most famously Peter Higgs) released a trio of papers fixing the problem, detailing a mechanism by which mass could emerge in force-carrying particles, called gauge bosons, so those universe-explaining theories would still work. That mechanism required the existence of another particle, called the Higgs boson. Despite many searches, the Higgs boson went undetected until this decade.

The Large Hadron Collider at CERN, the largest science experiment ever constructed, turned on in 2008. On July 4, 2012, researchers around the world crammed into auditoriums and lecture halls to listen as LHC researchers finally announced that they’d discovered evidence of the Higgs in two of the experiment’s building-sized detectors, called ATLAS and CMS. Many touted that all of the particles predicted by the Standard Model had now been found, and thus the model was complete…or was it?

“Saying that we’ve completed the Standard Model implies that we’re done,” Patty McBride, distinguished scientist at the Fermi National Accelerator Laboratory and deputy spokesperson for the CMS Collaboration at CERN, told Gizmodo. “And we’re not.” Plenty of mysteries remain; in fact, around 96 per cent of the stuff in the universe still goes unexplained by the Standard Model.

The Large Hadron Collider has been eerily quiet since 2012. Plenty of interesting results testing the Standard Model have come out since, but no new particles have been found after the Higgs boson. Physicists hoped that CERN would discover evidence of other particles, like the superpartners predicted by supersymmetry. These particles were expected both to explain why gravity is so much weaker than the other forces (consider that all of Earth’s gravity can’t stop a refrigerator magnet from picking up a paper clip) and to serve as the true identity of dark matter, the mysterious stuff that seems to make up the scaffolding of the universe but hasn’t been observed directly. And while there’s still plenty of LHC data to sift through, and the LHC is slated to receive an upgrade that will let it run at a higher rate of collisions, scientists are starting to wonder whether they’ll ever find evidence of these particles.

But the lack of discovery might one day be seen as a turning point in the history of physics. Rather than relying on brute-force, high-energy supercolliders, particle physicists have begun to hunt for new particles with high-precision experiments, which test the Standard Model by looking for small but statistically significant deviations from what the theory predicts. The drought has also encouraged theorists to think outside the box, looking for new explanations for things like dark matter.
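
To see what “small but statistically significant” means in practice, here’s a toy calculation in Python. Every number in it is made up, but the logic, comparing the gap between theory and experiment to the measurement’s uncertainty, is the heart of the precision approach.

    # Toy significance test with entirely made-up numbers: how many
    # standard deviations (sigma) is the measurement from the prediction?
    predicted = 2.00230930    # hypothetical Standard Model prediction
    measured = 2.00230936     # hypothetical experimental result
    uncertainty = 1.5e-8      # hypothetical combined uncertainty

    z = abs(measured - predicted) / uncertainty
    print(f"deviation: {z:.1f} sigma")   # 4.0 sigma here; particle physics
                                         # calls 5 sigma a discovery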

“It’s getting more technologically challenging to push [particle accelerators] to higher energies” to look for new particles, Josh Frieman, professor in the department of astronomy and astrophysics at the University of Chicago, told Gizmodo. “The particle physics community has realised that we need a diversity of approaches… It’s going to be a challenging problem. When you have a challenging problem, you want to bring to bear all of the tools you have in your toolkit, because the new physics is being kind of coy.”

Rippling spacetime itself

This decade revolutionised physics on the largest scales as well. Over a century ago, Albert Einstein’s theory of general relativity predicted that high-energy events could emit disturbances that ripple at the speed of light through spacetime itself, called gravitational waves. Scientists long searched for gravitational waves produced by supernovae or by binary black holes orbiting one another and colliding. Indirect evidence first showed up with the discovery of a binary pulsar (a rapidly spinning neutron star orbiting a companion) called PSR 1913+16. After several years of observation, scientists realised that its orbital period was decreasing exactly as general relativity predicted such a system would lose energy to the production of gravitational waves. But despite other searches, direct evidence failed to materialise.

That is, until this decade. On September 14, 2015, at 5:51am ET (8.51pm AEDT), the two L-shaped facilities of the Laser Interferometer Gravitational-Wave Observatory (LIGO), each composed of a pair of tunnels four kilometres long meeting at a right angle, one in Washington state and the other in Louisiana, recorded their lasers shifting in and out of phase with one another on a detector. These wobbles were the result of two black holes, 29 and 36 times the mass of the Sun, spiralling into one another and merging 1.3 billion light-years away, broadcasting their gravitational waves toward Earth.
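
To get a feel for how lasers “shifting in and out of phase” register a stretch in space, here is a highly simplified toy model in Python of the light reaching the detector as the difference between the two arm lengths changes. The numbers are illustrative only; a real gravitational wave changes the arm lengths by far less than the width of a proton.

    import math

    wavelength = 1.064e-6   # metres; LIGO uses an infrared laser

    def output_power(arm_difference_m):
        # Light makes a round trip down each arm, so a length difference
        # of dL shifts the relative phase by 4*pi*dL/wavelength.
        phase = 4 * math.pi * arm_difference_m / wavelength
        return math.cos(phase / 2) ** 2   # fraction of light at the detector

    # Sweep the arm-length difference from zero to a quarter wavelength:
    for dL in (0.0, wavelength / 8, wavelength / 4):
        print(f"dL = {dL:.2e} m -> detector sees {output_power(dL):.2f}")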

More observations followed, but perhaps the even more groundbreaking discovery came in 2017, when the detectors, now joined by the similar Virgo experiment in Italy, measured gravitational waves at the same moment that telescopes around the world spotted blips of radio, ultraviolet, infrared, and optical radiation coming from the same point in the sky. This outburst of energy was the result of the collision of two neutron stars, city-sized stellar corpses. This single event let scientists learn about the origin of some of the periodic table’s heaviest elements, and it may one day help close a present-day “crisis” in physics over how quickly the universe is expanding.

This paradigm-shifting discovery was a hallmark of multimessenger astronomy, in which scientists combine light with some other particle or wave to observe a single source. Telescopes originally used just visible light, then other wavelengths of electromagnetic radiation, like X-rays or radio waves; now complementary observatories can fold in other messengers from space, like neutrinos or gravitational waves.

“This is the golden age of multimessenger astronomy,” Peter Galison, professor of physics and of the history of science at Harvard University, told Gizmodo.

The field of black hole science experienced a watershed moment in another way when scientists operating the Event Horizon Telescope, a globe-spanning collaboration of radio telescopes, pointed their dishes at the 6.5-billion-solar-mass black hole at the centre of the galaxy M87. The result was the world’s first-ever image of a black hole, or more accurately, of the shadow that a black hole casts on the glowing matter behind it. Though researchers have long seen evidence of these objects, behemoths that warp spacetime so much that light can’t escape their pull, the observation produced the most direct view of one yet. Scientists hope the image has kicked off a new era of black hole science, one in which they can better understand the giant jets of matter that supermassive black holes spew from their centres.

“[Black holes] can shape cosmological-scale phenomena,” Galison said. “We see these objects that emit their light a tiny fraction of the time since the Big Bang. They’re like lighthouses at the edge of the visible universe that flash their beams toward us. Understanding the origin of these jets is of great significance in better grasping… objects that may be shaping the distributions of matter in galaxies.”

Physics in the real world

Perhaps an unsung hero of both astrophysics and particle physics this decade is the increasing use of machine-learning algorithms to sort through huge datasets. The black hole image wouldn’t exist without machine learning, and its use in particle physics is undergoing a “turning point,” Toro told Gizmodo.

This decade also kicked off a new era in technology based on the quirks of quantum physics, like quantum computers. “I think this decade is definitely the one where quantum computers turned from science fiction into something that looks like it’s going to become real,” Peter Shor, the MIT mathematician behind Shor’s factoring algorithm, told Gizmodo.

These quantum devices were famously proposed by Richard Feynman in 1981. They’re intended to solve certain problems that regular computers can’t by exploiting the strange probability mathematics that governs atoms, rather than ordinary logic. Scientists hope they may one day simulate the behaviour of molecules or run certain complex algorithms that those mathematical quirks make possible. Roughly speaking, it’s as if these machines generate probability distributions by flipping coins that can be nudged in midair by pulses of energy; unlike regular probabilities, the quantum version, called amplitudes, can carry negative signs, so when you add the “coins” together, some outcomes cancel out, producing distributions that no set of ordinary flipped coins could match.
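
To make the coin analogy concrete, here’s a minimal sketch in plain Python (no quantum hardware or special library required) of how negative amplitudes behave. Flipping a quantum “coin” once gives 50/50 odds, just like a real coin, but flipping it twice brings the qubit back to certainty, because the minus sign makes one set of outcomes cancel.

    import numpy as np

    # The Hadamard gate H is the quantum version of a fair coin flip.
    # Note the -1: amplitudes, unlike probabilities, can be negative.
    H = np.array([[1, 1],
                  [1, -1]]) / np.sqrt(2)

    state = np.array([1.0, 0.0])   # qubit starts definitely in state 0

    once = H @ state
    print(np.abs(once) ** 2)       # [0.5 0.5]: indistinguishable from a coin

    twice = H @ once               # the negative amplitude cancels state 1
    print(np.abs(twice) ** 2)      # [1. 0.]: certainty again, which no
                                   # sequence of real coin flips can produce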

It was only in 2007 that physicists at Yale invented the “transmon” qubit, a superconducting circuit that acts as an artificial atom and serves as the smallest unit of quantum computing. Today, IBM and Google have both developed 50-plus-qubit machines that are starting to show speedups over classical computers for certain problems. Meanwhile, other companies have debuted similar-sized devices based on atoms held in place by lasers. A whole ecosystem of startups offering software tools or hardware components for these machines has grown up as well.

It may be decades before these machines offer any advantage over classical computers beyond serving as fancy random number generators. They’re incredibly difficult to control and quickly lose their quantumness to stray vibrations or radiation from the outside world. They can also deliver the wrong results: a zero in a binary string where the machine should have spat out a one, for example. Researchers are now working to implement error correction, combining multiple physical qubits into a single “logical” qubit that is far less prone to error. The truly fault-tolerant, universal quantum computer that physicists dream of might require millions of qubits to realise its full potential.
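
Real quantum error correction is subtler than this (a qubit can’t simply be copied, so quantum codes lean on entanglement and indirect “syndrome” measurements), but the classical repetition code sketched below in Python captures the core idea: spread one logical bit across several noisy physical bits, then vote.

    import random

    def encode(bit):
        # One logical bit becomes three physical bits.
        return [bit, bit, bit]

    def add_noise(bits, flip_prob=0.05):
        # Each physical bit independently flips with probability flip_prob.
        return [b ^ (random.random() < flip_prob) for b in bits]

    def decode(bits):
        # Majority vote recovers the logical bit unless 2+ bits flipped.
        return int(sum(bits) >= 2)

    trials = 100_000
    failures = sum(decode(add_noise(encode(1))) != 1 for _ in range(trials))
    print(failures / trials)   # ~0.007, versus 0.05 for a single noisy bit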

But physicists are hopeful that they might find uses for these small, noisy devices, which are still doing something interesting even if they’re not doing it well. Back in 2017, Caltech physicist John Preskill declared that we’d entered a new era of quantum computing, which he dubbed the Noisy Intermediate-Scale Quantum (NISQ) era.

This decade, scientists also incorporated the weirdness of quantum mechanics into new sensing technology, and scientists in China launched a satellite that used quantum mechanics to encrypt a video call between China and Austria. In materials science, researchers may have created the first material that conducts electricity without resistance at nearly room temperature, another discovery decades in the making. And just last year, scientists discovered that they could switch superconductivity on and off in two stacked sheets of graphene with just a twist, a discovery that has generated a deluge of followup work on two-dimensional systems ever since.


The 2010s might not have been the most transformative stretch in the history of physics; the early 20th century produced dozens of discoveries that completely upended the way scientists thought about the universe on the largest and smallest scales. Nor was this a decade of surprises: many of its discoveries were years, even decades, in the making. But it’s hard to deny that historians looking back on the 2010s will see paradigm shifts across all of physics, with new technologies, experimental methods, and ways of thinking that changed the course of the field.

Said McBride: “I think it’s been a great decade for physics.”

