The following is an excerpt from How To: Absurd Scientific Advice for Common Real-World Problems, a new book from cartoonist and science communicator Randall Munroe. How To is available on Amazon now.
We sometimes think of our eyes as a pair of cameras, but human visual systems are so much more sophisticated than any camera — it’s just easy to miss the complexity because it happens automatically. We look at a scene, get a picture in our heads, and we don’t realise how much processing, analysis, and interaction happens to produce that picture.
Cameras generally see all the areas of an image at roughly the same resolution. If you take a picture of this page with a phone camera, a word in the centre of the picture will be made up of about the same number of pixels as a word near the edge. But your eyes don’t work that way — they see very different amounts of detail at the centre of your vision compared to the edges. The actual “pixel grid” of the eye looks very strange: photoreceptors are packed densely in the fovea at the centre of your vision and thin out rapidly towards the periphery.
The reason we don’t notice the wildly varying resolutions is that our brains are used to it. Our visual systems process the image and give us an overall impression that what we’re looking at is simply what the scene looks like, the same thing that would be seen by a camera. This works... until we start comparing our mental picture to what’s produced by actual cameras, and discover that there are a lot of variables that our brains have been adjusting for us behind the scenes.
One of the ways in which cameras and eyes can differ is their field of view. Field of view is responsible for a lot of confusion in photography, and it has some particularly significant effects on selfies.
When you hold a camera close to your face, it makes your features look different. To understand why — and how it affects photos of all kinds — let’s talk about supermoons.
Every now and then, viral internet stories make the rounds spreading wild claims about some upcoming astronomical event.
These are sometimes accompanied by photos of the “supermoon” behind a skyline, like this.
However, when people go outside to take pictures of the Moon, this is what they get:
So what happened? Was the first photo fake?
It might have been, but often it’s not. Instead, it’s a photo taken with a very narrow field of view, through a telephoto lens.
Every photo shows a certain field of view. A wide field of view shows stuff off to the sides, and a narrow one shows only the objects directly in front of the lens.
“Zooming in” means narrowing the field of view. It’s easy to think of zooming in as “getting closer” to the subject, because it makes a small subject get bigger and fill the frame. But zooming in isn’t quite the same as getting closer. When you get closer to a subject, the subject gets bigger within the picture, but the distant background stays the same size. When you zoom in, the subject and the background get bigger.
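The difference falls out of the rule that apparent size scales as one over distance. Here is a quick sketch with made-up distances: a subject 2 m away against a background 100 m away.

```python
# Apparent size scales as 1/distance. Compare "getting closer" with "zooming in"
# for a subject 2 m away and a background 100 m away (made-up example distances).
subject_d, background_d = 2.0, 100.0

# Step 1 m closer: the subject doubles in apparent size,
# but the background barely changes.
subject_growth = subject_d / (subject_d - 1)           # 2.0x
background_growth = background_d / (background_d - 1)  # ~1.01x

# Zooming in 2x, by contrast, magnifies subject and background equally.
zoom_growth = 2.0

print(subject_growth, round(background_growth, 2), zoom_growth)
```

Stepping 1 m closer doubles the subject while growing the background by only about 1 per cent, whereas a 2× zoom scales both equally: that is why close-up shots and zoomed shots of the same subject look so different.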
The reason people get tricked by this difference is that our eyes have only one field of view. We can focus our attention on something at the centre of our vision, but the total area covered by our eyes stays the same. Photos with unusually wide or narrow fields of view can surprise us.
For decades, the rule of thumb among photographers has been that a 50mm full-frame lens produces an image that looks “natural” to people — not too wide and not too narrow. This “natural” lens produces a surprisingly narrow field of view; it’s about 40° wide, similar to the area covered by a hardback book when you hold it about a foot from your face.
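That 40° figure falls straight out of the lens geometry. A small sketch, using the standard 36 mm full-frame sensor width and the usual thin-lens approximation:

```python
import math

# Horizontal field of view of a lens: 2 * atan(sensor_width / (2 * focal_length)).
# A full-frame sensor is 36 mm wide.
def horizontal_fov_deg(focal_length_mm, sensor_width_mm=36.0):
    return math.degrees(2 * math.atan(sensor_width_mm / (2 * focal_length_mm)))

print(round(horizontal_fov_deg(50)))  # ~40 degrees for the classic 50 mm lens
```

Plugging in a 28 mm focal length, a typical full-frame equivalent for phone cameras, gives roughly 65°.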
But smartphones may be in the process of changing all that, because phone cameras have much wider fields of view than old 50mm lenses.
The iPhone X, for example, has a 65° horizontal field of view, letting users fit a wider scene into the frame without having to back up. (It is not, however, quite wide enough for one common photography subject: rainbows. A rainbow covers 83° of the sky, making it slightly too wide to fit in an iPhone frame.)
These wider-angle lenses may have become more common because smartphone users want to take natural-looking pictures of scenes from life, or selfies that show multiple people. It’s hard to take a selfie with a traditional 50mm camera held at arm’s length. And phones make it easy to crop our images after the fact, so it makes sense to err on the side of “too wide” and let users do the zooming and cropping. But the wide field of view comes with a cost: when you use a wide-angle lens to take a picture of a small or faraway subject, it may not show you what you expect.
To a human, the Moon is attention-grabbing. Even if we don’t literally “zoom in” with our eyes, we narrow our attention to isolate it. We use our high-resolution vision to pick out the details of the Moon, ignoring the comparatively boring sky around it.
But a smartphone doesn’t know to “narrow its focus” the way our brains do. To a wide-angle phone camera, the Moon is just another small patch of pixels. To get a good photo of the Moon, you need to zoom in — something smartphones have only a limited ability to do.
If you do have a camera that can zoom in, all the other stuff you might want to include in the picture — like buildings and trees around you — doesn’t fit in the frame anymore. Those things look bigger than the Moon from where you’re standing, even though they’re obviously not (unless your city has unusually lax zoning regulations).
If you want an object to appear small relative to the Moon, you have to move far enough away that it takes up a smaller angle of the sky. For a building, this distance can be pretty large.
In order to take one of those photos that shows a huge moon behind a city skyline, the photographer generally needs to stand miles away from the city. That nice-looking photo probably took a huge amount of work and planning.
The reason buildings look so big in ordinary photos, and the Moon looks so small, is because buildings are so much closer than the Moon. And this brings us back to selfies.
This same wide-angle effect that makes the Moon seem tiny can affect how selfies turn out. When someone takes a photo of their face with a smartphone, their instinct for composition might tell them to hold the phone close enough that their face fills a significant portion of the frame. But at that distance, which is much closer than where someone would usually stand when looking at you, the wide-angle smartphone lens creates an unnatural perspective.
Your nose and cheeks are substantially closer to the camera than your ears and the rest of your head, which makes them look bigger — just like a building in the foreground of a smartphone shot looks bigger than the Moon.
This distortion can make faces look subtly different in ways that we don’t expect. To reduce the effect, hold the phone farther away and zoom in — either within the camera app as you take the picture, or after the fact by cropping it.
How far away should you hold the phone? To minimise perspective distortion between several objects in a frame, the distance to the phone should be much larger than the difference in distance between the nearest and farthest objects.
The difference between the distance to the nearest and farthest visible parts of your face is probably less than a foot, which means that the distortion can change a lot depending on whether you hold the camera up at a normal distance from your face or stretch your arm out all the way. Holding a camera 5 or 6 feet away [about 1.5 to 2 metres] will almost entirely eliminate this kind of distortion, but our arms aren’t long enough for that — which may partly explain the popularity of selfie sticks.
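You can put rough numbers on this with the same one-over-distance rule. The 12 cm nose-to-ear depth below is an assumed figure:

```python
# Relative magnification of the nearest vs farthest parts of a face.
# Apparent size scales as 1/distance, so the ratio is (d + depth) / d.
face_depth_m = 0.12  # assumed nose-to-ear depth of a face

def distortion(camera_distance_m, depth_m=face_depth_m):
    return (camera_distance_m + depth_m) / camera_distance_m

print(round(distortion(0.3), 2))  # ~1.4: the nose looks 40% too big at 30 cm
print(round(distortion(1.5), 2))  # ~1.08: nearly distortion-free at 1.5 m
```

At arm's length the nearest features are exaggerated by tens of per cent; a few feet farther back, the effect almost vanishes.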
Take Cooler Selfies by Messing With Your Field of View
Perspective distortion can change the relative size of parts of your face, but there’s another way it can affect your photos — one that can open up a whole new variety of selfie options. When you zoom in, you change the apparent size of objects in the background. If you’re standing in front of a large object that’s far away, such as a mountain, the camera’s zoom can dramatically affect how big the mountain looks.
If you set up your camera on a timer and walk far away from it, you can make even a fairly small mountain look huge.
Smartphone cameras have limits to how far they can zoom, but if you get a camera with a powerful telephoto zoom lens, you can take some really interesting selfies. You can even recreate those Moon skyline photos — but using your own body instead of a building.
We can use geometry to work out how far away your camera needs to be in order to take a photo of yourself in front of the Moon.
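The geometry is a single line of trigonometry: for you to appear the same angular size as the Moon, the camera must be far enough away that your height spans about half a degree of its view. A sketch, assuming a 1.7 m person:

```python
import math

moon_angular_diameter_deg = 0.52  # the Moon's apparent size, about half a degree
height_m = 1.7                    # assumed height of the person

# For the person to appear the same angular size as the Moon,
# distance = height / tan(angular diameter).
distance_m = height_m / math.tan(math.radians(moon_angular_diameter_deg))
print(round(distance_m))  # roughly 190 metres
```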
This tells us that the camera needs to be about 180 metres [600 feet] away to take a Moon selfie.
Since they don’t make selfie sticks that are 180 metres long, you’ll probably want to set up the camera on some kind of tripod and trigger it remotely.
Lining up a photo like this can be tricky; you need to find an area with a high place to stand and a long, unobstructed view path in the opposite direction from the Moon. The Moon moves quickly, so once everything is lined up, you’ll only have a short window to take a photo — about 30 seconds. It only takes a little over two minutes for the Moon to move out of view completely. (Tools like Google Earth and sky chart apps like Stellarium and Sky Safari can help you plan out the shot.)
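The two-minute figure comes straight from Earth’s rotation, which sweeps the sky past a fixed camera at 15° per hour:

```python
# The sky drifts past a fixed camera at about 15 degrees per hour
# (Earth's rotation; the Moon's own orbital motion changes this only slightly).
moon_deg = 0.52
drift_deg_per_min = 15 / 60

minutes_to_cross = moon_deg / drift_deg_per_min
print(round(minutes_to_cross, 1))  # ~2.1 minutes for the Moon to move its own width
```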
With the right filters, if you’re extremely careful, you can even take a photo like this of the Sun. This may destroy your camera, so consult with your local astronomy club or photography store before trying this yourself. If you don’t, there’s a good chance you’ll set your camera on fire. And never look through an optical viewfinder when you’re pointing a camera at the Sun. Your eye may not work exactly like a camera, but it’s just as easy to burn a hole in.
In principle, you can take a similar photo using even smaller, more distant objects. After the Sun and Moon, the celestial bodies that appear largest in the sky are Jupiter and Venus, which are both around an arcminute in size when close to the Earth and most visible. Using the same geometry from the Moon example, you can work out how far away you’ll need to hold the camera in order to take a selfie with Venus or Jupiter: about 6km.
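It is the same trigonometry as the Moon example, with the angular size shrunk from half a degree to one arcminute (and the person’s height again assumed to be 1.7 m):

```python
import math

planet_arcmin = 1.0  # Venus or Jupiter at their largest, about one arcminute
height_m = 1.7       # assumed height of the person

# Distance at which a person spans one arcminute:
distance_m = height_m / math.tan(math.radians(planet_arcmin / 60))
print(round(distance_m / 1000, 1))  # ~5.8 km, i.e. about 6 km
```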
Holding a camera 6km away will present some obvious challenges.
Atmospheric distortion is greatest when Venus is closest to the horizon, so you’ll want it to be relatively high in the sky — which means you’ll need to be high above the camera. But you want the camera to be fairly high up as well, to get it out of the thick atmosphere.
A good setup would consist of a camera on a mountaintop, with the subject standing on a much higher mountaintop. But finding two climbable mountains the right distance apart that align with Venus on a particular day will take a lot of survey work and planning. You could try to avoid the alignment problem by positioning yourself on a high-altitude aircraft or balloon, but manoeuvring to get yourself in the right position will be extremely difficult and will probably require computer control.
Regardless of which method you choose, getting the alignment right will be an extremely difficult challenge, and whatever picture you take is going to be pretty blurry. Even under the best of conditions, it’s difficult to take a sharp picture of Jupiter or Venus from the ground due to atmospheric distortion. It’s possible no one has managed to take a selfie like this, so if you do, you’ll definitely earn internet bragging rights.
A selfie with Jupiter or Venus would push the bounds of optics and geometry, and would be pretty hard to top... from Earth, that is. If you travel to space, where atmospheric distortion is less of a problem, you can open up new selfie possibilities.
There are several telephoto cameras in space with very high angular resolution, although you may have trouble convincing NASA to let you borrow them. (In theory, by the time this book is published, the James Webb Space Telescope should have finally launched. [Editor’s note: It was delayed again while this chapter was being edited.])
But there’s a way to take a space selfie with an even longer “zoom” than the fanciest space telescope. It’s called occultation, and it’s one of the coolest tricks in astronomy.
When an asteroid passes in front of a star from the point of view of Earth, people with stopwatches scattered around the world can time when the star disappears and reappears, and use those measurements to build up a picture of the asteroid.
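The reconstruction rests on a simple conversion: each observer’s timing becomes one chord across the asteroid’s shadow. A minimal sketch, with an assumed shadow speed and made-up timings:

```python
# Each observer converts a disappearance/reappearance timing into one chord
# across the asteroid's shadow: chord length = shadow speed * duration.
shadow_speed_km_s = 15.0  # assumed typical speed of an asteroid's shadow over Earth

# Hypothetical timings (seconds the star stayed hidden) from three observers:
durations_s = [2.1, 3.4, 1.2]

chords_km = [shadow_speed_km_s * t for t in durations_s]
print(chords_km)  # three parallel slices through the asteroid's silhouette
```

With enough observers spread across the shadow's path, the stacked chords trace out the asteroid's outline.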
This technique can be used to see detail too small or faint for the fanciest telescopes to make out. And it could, in theory, let you take an incredibly distant selfie while in space. All you need is a network of friends on the ground to watch a distant star blink out as you drift in front of it.
Using a distant star, your friends could take a photo of you from a distance of up to several hundred miles. You won’t be able to go any farther than that because your shadow will be lost to diffraction. If you use a distant X-ray source instead of a visible star, the shorter wavelength will reduce the effects of diffraction, and you could conceivably take a picture of yourself standing on the surface of the Moon while your friends observed from the ground.
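The distance limits here come from the Fresnel scale, sqrt(wavelength × distance), which sets how much diffraction blurs a shadow’s edge: the silhouette stays sharp only while that scale is smaller than the features you want to resolve. A rough sketch, assuming you want to resolve features about half a metre across:

```python
# The Fresnel scale sqrt(wavelength * distance) gives the blur of a shadow's
# edge. Solving for distance tells us how far away the shadow can be cast
# before features of a given size wash out.
feature_m = 0.5  # roughly the width of a person (an assumed figure)

def max_distance_km(wavelength_m, feature=feature_m):
    return feature**2 / wavelength_m / 1000

print(round(max_distance_km(550e-9)))  # visible light: ~450 km, a few hundred miles
print(round(max_distance_km(1e-9)))    # ~1 nm X-rays: ~250,000 km, near lunar distance
```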
Just remember: orbital alignments used for occultations are rare and usually don’t repeat, so they take a huge amount of planning — which means you’ll only get one shot.