Now We Know How Many Ways We Can Arrange 128 Tennis Balls

Here’s a question worthy of the ball boy at Wimbledon: if you have 128 tennis balls packed into a container, how many different ways can you arrange them? Answer: 10²⁵⁰, more than the total number of subatomic particles in the universe.

That’s the conclusion of a team of researchers at the University of Cambridge in England, as described in a recent paper in the journal Physical Review E. Well, technically they ran computer simulations of 128 “soft spheres” with properties very similar to tennis balls. Both are examples of granular media.

There have been a number of studies using spheres on what’s known as the “packing problem,” but this is different. The Cambridge team was attempting to calculate a property called configurational entropy. Actually, that’s just a fancy jargon word for good old-fashioned entropy — more precisely, the statistical definition devised by the 19th century Austrian physicist Ludwig Boltzmann.

Entropy, the quantity at the heart of the second law of thermodynamics, is a cornerstone of physics. It’s a measure of how much disorder there is in a given system, and hence how much useful work (in the physics sense) can still be extracted from it. The more order there is, the lower the entropy; the greater the disorder, the higher the entropy, and the less useful work available. It’s the reason a hot cup of coffee can never become even hotter while sitting in a colder room, or melted ice cream can never refreeze of its own accord. Instead, coffee cools, ice cream melts into liquid, things in general decay, unless some outside force intervenes to counter the process.

Image credit: S. Martiniani et al./University of Cambridge

Boltzmann redefined entropy as more of a statistical law, applying to molecules en masse, not individually. A pint of ice cream contains billions of atoms. When frozen, those atoms are arranged in a highly ordered crystal lattice. As the ice cream heats up and begins to melt, that order gradually disappears. There are comparatively few ways for the atoms to arrange themselves into an ordered solid, far more ways for them to form a liquid, and more still for them to form a gas.
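Boltzmann’s statistical definition can be written as S = k·ln(W), where W is the number of ways the atoms can be arranged. A minimal sketch, with made-up microstate counts for a toy system (the real numbers are astronomically larger), shows how entropy grows as the count of arrangements grows:

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, in joules per kelvin

def boltzmann_entropy(microstates):
    """Boltzmann's formula S = k_B * ln(W):
    more possible arrangements means higher entropy."""
    return K_B * math.log(microstates)

# Illustrative, made-up arrangement counts for a tiny toy system:
# an ordered solid has far fewer arrangements than a liquid or gas.
for phase, w in [("solid", 1), ("liquid", 1_000), ("gas", 1_000_000)]:
    print(f"{phase:6s}  W = {w:>9,d}  S = {boltzmann_entropy(w):.3e} J/K")
```

With only one possible arrangement (W = 1), the entropy is exactly zero; each factor-of-a-thousand jump in W adds the same fixed increment to S, which is why the logarithm appears.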

Entropy always increases in a closed system. But granular systems — like sand dunes or those 128 packed tennis balls — are different, in that they don’t have thermal fluctuations. They won’t spontaneously shift the arrangement of their atoms. “If you leave them at rest, they will not change, you have to drive them to change, stir them or pack them,” study lead author Stefano Martiniani told Gizmodo. “For a pile of sand, you’ll need some wind or tapping to make it move.”

About 25 years ago, a Cambridge scientist named Samuel Edwards suggested that it should be possible to develop a statistical theory of the behaviour of granular systems, similar to the Boltzmann approach, except instead of energy, the key factor would be volume. It was an ingenious insight, but it wouldn’t work unless you could calculate that entropy — a feat far beyond the computational capabilities at the time.

Even with today’s computers, it’s a daunting challenge to do such a calculation for a system with more than 20 particles. “The brute force way of doing this would be to keep changing the system and recording the configurations,” Martiniani explained in a statement. “Unfortunately, it would take many lifetimes before you could record it all. Also, you couldn’t store the configurations, because there isn’t enough matter in the universe with which to do it.”

Their solution: take just a small sample of all those possible configurations and crunch the probabilities on those. From there, they could extrapolate how many different ways that hypothetical Wimbledon ball boy could arrange 128 tennis balls. That same method could one day help us predict the changing shape of shifting sand dunes over time, or the behaviour of snowy avalanches. Martiniani is applying it to machine learning in hopes of building more efficient AI: how many different ways can you wire a neural network and still have it be functional?
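The spirit of that trick — measure a small random sample, then extrapolate to the whole — can be shown on a toy counting problem. The sketch below is not the paper’s actual method (which computes the volumes of basins of attraction); it’s a simpler, hypothetical analogue: instead of enumerating all 2²⁰ occupation patterns of a one-dimensional lattice, it samples a few at random, measures the fraction that are “valid” (no two occupied neighbouring sites), and scales up.

```python
import random

def estimate_valid_configs(n_cells=20, samples=100_000, seed=0):
    """Toy sample-and-extrapolate count: draw random occupation
    patterns of a 1-D lattice, find the fraction with no two
    adjacent occupied cells, and multiply by the total number
    of patterns (2**n_cells) to estimate the valid count."""
    rng = random.Random(seed)
    total = 2 ** n_cells
    hits = 0
    for _ in range(samples):
        config = [rng.random() < 0.5 for _ in range(n_cells)]
        if not any(a and b for a, b in zip(config, config[1:])):
            hits += 1
    return hits / samples * total

# The exact count here is known (it's the Fibonacci number F(n+2);
# for n = 20 that's 17,711), so the estimate can be checked against it.
print(estimate_valid_configs())
```

The point is that a hundred thousand samples stand in for a million-strong configuration space; for 128 spheres, where the space is of order 10²⁵⁰, sampling is the only option.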

Perhaps someday it could help resolve the so-called landscape problem in string theory. Rather than providing a single unique solution, string theory predicts a vast number of worlds — roughly 10⁵⁰⁰ possible universes. But if there is an underlying minimising function (and many string theorists believe there is, although they have yet to uncover it), Martiniani’s method could be used to figure out exactly how many possible universes there could be under string theory, and how probable each would be.

It’s the universality that gives the Cambridge method its power. “The mechanics of stable packing is just the minimum of a [mathematical] function,” Martiniani said. “The methodology could be used anywhere that people are trying to work out how many possible solutions to a problem you can find.”

That’s the beauty of mathematics. It lets you see hidden connections between things that look very different from each other on the surface, but turn out to share the same underlying dynamics — whether we’re talking about sand dunes, neural networks, the multiverse or tennis balls.

[Physical Review E]

Top image: Atomic Taco/Flickr
