How Google’s Pixel 4 Is Trying To Stay Ahead In The Smartphone Camera Race

There was only one stand-out feature on the Pixel 3 phones: That fantastic (single-lens!) camera, which got better over time and made the Pixel 3a the best mid-ranger on the market too. Now Google has revealed the follow-ups, the Pixel 4 and the Pixel 4 XL — so can they keep the Pixels on top of the pile in terms of phone cameras?

The single-lens snapper has now become a dual-lens affair, pairing a 12.2MP f/1.7 main camera with a 16MP f/2.4 telephoto camera that delivers 2x optical zoom. Unusually for a smartphone in 2019, there’s no wide-angle lens: “While wide-angle can be fun, we think telephoto is more important,” said Marc Levoy from Google Research (and Stanford University) at the Made by Google launch event.

The Pixel 4, like the Pixels before it, relies on “computational photography” — that’s a term that Levoy himself came up with for a Stanford course and it means “doing less with hard-wired circuitry, and more with code” in his own words.

Essentially, we’re talking about taking several photos in quick succession and combining them into one better end result — smartphone lenses and processing speeds have now evolved to the stage where this can happen in an instant, without you noticing once you’ve pressed the shutter button.

And it means one snap can adjust its exposure to capture the darker parts of a scene, and another can do the same to retain details in the lighter areas. Put them together, and you don’t lose any of the finer details. It’s a technique you’ll now see on plenty of top-end phone cameras, including those from Samsung and Apple.
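If you want a feel for how that merging works, here’s a toy numpy sketch of one classic approach: estimate scene radiance from each exposure, then average the estimates while distrusting clipped or near-black pixels. The weighting scheme here is our own illustrative assumption, not Google’s published pipeline.

```python
import numpy as np

def merge_exposures(frames, exposure_times, clip=0.95):
    """Naive HDR merge: estimate radiance per frame (pixel / exposure time),
    then take a weighted average that distrusts clipped or very dark pixels."""
    out_num = np.zeros_like(frames[0], dtype=float)
    out_den = np.zeros_like(frames[0], dtype=float)
    for frame, t in zip(frames, exposure_times):
        frame = frame.astype(float)
        # Triangle weighting: trust mid-tones most, clipped/black pixels least.
        weight = np.clip(1.0 - np.abs(2.0 * frame - 1.0), 1e-3, 1.0)
        weight = np.where(frame >= clip, 1e-3, weight)
        out_num += weight * (frame / t)   # radiance estimate from this frame
        out_den += weight
    return out_num / out_den              # merged radiance map, not yet tone mapped

# Two simulated frames of the same scene (pixel values in [0, 1], linear light).
rng = np.random.default_rng(0)
radiance = rng.uniform(0.0, 4.0, (4, 4))
long_exp = np.clip(radiance * 1.0, 0.0, 1.0)    # shadows clean, highlights clipped
short_exp = np.clip(radiance * 0.25, 0.0, 1.0)  # highlights intact, shadows dark

merged = merge_exposures([short_exp, long_exp], [0.25, 1.0])
```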

The Pixel 4 camera array also includes a hyperspectral sensor, listed as a “spectral + flicker sensor” on the Google Store. Google hasn’t said much about what it does, but given that hyperspectral imaging can detect many separate channels of light, we’re assuming the data this sensor captures will feed into the Pixel’s algorithms to further improve how its new photo modes work.

What’s new

Every Pixel has featured what Google calls HDR+, where a burst of up to nine pictures is captured every time you hit the shutter button, then averaged out to reduce shadow noise. The first new Pixel 4 feature is Live HDR+, where you’ll see this effect applied as you frame a shot — you won’t have to guess at what the end result might look like.
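The noise-reduction part is just statistics: averaging N frames cuts random sensor noise by roughly the square root of N. A toy demonstration, assuming perfectly aligned frames (the alignment is the hard part the real pipeline has to solve):

```python
import numpy as np

rng = np.random.default_rng(42)
scene = np.full((100, 100), 0.2)   # a flat, dimly lit patch of the scene

# Nine noisy captures, standing in for an HDR+ burst.
burst = [scene + rng.normal(0.0, 0.05, scene.shape) for _ in range(9)]
merged = np.mean(burst, axis=0)

print("single frame noise:", np.std(burst[0] - scene))  # ~0.05
print("merged noise:      ", np.std(merged - scene))    # ~0.05 / sqrt(9) ≈ 0.017
```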

Pixel 4 is also introducing dual exposure controls, sliders that let you adjust the brightness (the capture exposure) and shadows (the tone mapping) before you take a shot (you might already know these sorts of tweaks from apps like Photoshop). If, for example, you want a dramatic silhouette shot rather than the even balance that HDR+ gives you, dual exposure controls make this possible.
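Google hasn’t published the exact transfer curves behind those two sliders, but a plausible toy model is an exposure gain for the brightness control and a gamma-style curve for the shadows control:

```python
import numpy as np

def dual_exposure(image, brightness=1.0, shadows=1.0):
    """Toy stand-in for the Pixel 4's two sliders (pixel values in [0, 1]).

    brightness: an overall exposure gain, applied in linear light.
    shadows: a gamma-style tone-mapping control; > 1 lifts dark regions,
             < 1 crushes them toward a silhouette.
    """
    img = np.clip(image * brightness, 0.0, 1.0)  # the "brightness" slider
    return img ** (1.0 / shadows)                # the "shadows" slider

# A dramatic silhouette: pull the exposure down and crush the shadows.
rng = np.random.default_rng(1)
photo = rng.uniform(0.0, 1.0, (4, 4))
silhouette = dual_exposure(photo, brightness=0.6, shadows=0.5)
```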

When it comes to zoom, Google says the new 2x telephoto lens on the Pixel 4, combined with its existing Super Res Zoom tech working across both lenses, results in superior hybrid zoom. Super Res Zoom, which debuted last year, uses the tiny differences between each of the nine images in a burst to fill in the details as you zoom in. It’s making guesses, but very smart ones.
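Here’s a simplified sketch of the underlying idea, multi-frame super-resolution: scatter each burst frame onto a finer pixel grid at its sub-pixel offset, then average whatever lands in each cell. The given shifts and the plain mean are simplifying assumptions; the real pipeline estimates the offsets from hand shake and merges far more robustly.

```python
import numpy as np

def superres_stack(frames, shifts, factor=2):
    """Toy multi-frame super-resolution in the spirit of Super Res Zoom.

    Each burst frame is scattered onto a grid `factor` times finer, at its
    known sub-pixel (dy, dx) offset, and the accumulated samples are averaged.
    """
    h, w = frames[0].shape
    acc = np.zeros((h * factor, w * factor))
    cnt = np.zeros_like(acc)
    ys, xs = np.mgrid[0:h, 0:w]
    for frame, (dy, dx) in zip(frames, shifts):
        hy = np.clip(np.round((ys + dy) * factor).astype(int), 0, h * factor - 1)
        hx = np.clip(np.round((xs + dx) * factor).astype(int), 0, w * factor - 1)
        np.add.at(acc, (hy, hx), frame)   # accumulate samples on the fine grid
        np.add.at(cnt, (hy, hx), 1)
    out = np.zeros_like(acc)
    np.divide(acc, cnt, out=out, where=cnt > 0)
    return out

# Four frames shifted by half a pixel fill all four sub-pixel sites at 2x.
frames = [np.ones((8, 8))] * 4
shifts = [(0.0, 0.0), (0.0, 0.5), (0.5, 0.0), (0.5, 0.5)]
high_res = superres_stack(frames, shifts)
```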

The technology, Google says, works better than cropping after the picture has been taken — if you pinch-zoom before you take the photo, you should get better results than if you crop it afterward, because of the calculations that are applied as you zoom in.

The Pixel 4 is also smarter when it comes to automatic white balancing, a photography problem that’s very tricky to fix — essentially making sure that white looks white no matter what the lighting conditions are like (if you’re indoors, for example, you’ll often get an orange tinge from the lighting).

Again, it’s a question of training Google’s algorithms to recognise when white should be white: “We’ve been using learning-based white balancing in Night Sight since Pixel 3,” said Levoy on stage. “In Pixel 4 we’re using it in all photo modes, so you get truer colours, especially in tricky lighting.”
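For contrast, the classic non-learned baseline is the gray-world heuristic, which simply assumes the scene should average out to neutral grey. A minimal sketch of that baseline (not Google’s method, which is learned from data):

```python
import numpy as np

def gray_world_awb(image):
    """The classic gray-world heuristic (image: HxWx3, linear RGB).

    Assumes the scene should average out to neutral grey and scales each
    channel to make that true. Google's learning-based white balancer
    replaces heuristics like this with a model trained on real scenes.
    """
    channel_means = image.reshape(-1, 3).mean(axis=0)
    gains = channel_means.mean() / channel_means   # per-channel corrections
    return np.clip(image * gains, 0.0, 1.0)

# An indoor shot with the orange cast of warm lighting.
rng = np.random.default_rng(2)
photo = rng.uniform(0.2, 0.8, (4, 4, 3)) * np.array([1.0, 0.8, 0.6])
corrected = gray_world_awb(photo)
```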

Other improvements are coming to portrait mode, the calculations for which are now applied in RAW mode, Levoy told CNET. The extra camera lens means more information for Google’s machine learning algorithms to work with, and that should mean depth is measured more accurately and at longer distances (each lens captures the scene from a slightly different angle).
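The raw signal behind that is parallax: matching small patches between the two views gives a disparity map, with bigger shifts for closer subjects. A deliberately naive sketch of that cue (the learned and dual-pixel parts of the real pipeline aren’t shown):

```python
import numpy as np

def block_match_disparity(left, right, max_disp=8, patch=5):
    """Toy depth cue from two horizontally offset grayscale views.

    For each pixel in the left image, find the horizontal shift that best
    matches a small patch in the right image; bigger shifts mean closer
    subjects.
    """
    h, w = left.shape
    half = patch // 2
    disp = np.zeros((h, w), dtype=int)
    for y in range(half, h - half):
        for x in range(half, w - half):
            ref = left[y - half:y + half + 1, x - half:x + half + 1]
            best_cost, best_d = np.inf, 0
            for d in range(min(max_disp, x - half) + 1):
                cand = right[y - half:y + half + 1,
                             x - d - half:x - d + half + 1]
                cost = np.sum((ref - cand) ** 2)   # sum of squared differences
                if cost < best_cost:
                    best_cost, best_d = cost, d
            disp[y, x] = best_d
    return disp

rng = np.random.default_rng(3)
right_img = rng.uniform(0.0, 1.0, (16, 16))
left_img = np.roll(right_img, 3, axis=1)   # simulate a 3-pixel parallax shift
print(block_match_disparity(left_img, right_img, max_disp=5)[8, 8])  # ~3
```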

Finally, the already impressive Night Sight is about to get even more capable with the Pixel 4 and Pixel 4 XL. You might have already seen the astrophotography shots taken by the phones, which are made possible by longer exposures and more of them: Specifically, 15 exposures of up to 16 seconds each for the Pixel 4 astrophotography mode.

Do the maths and that means your Pixel 4 has to stay still for four minutes — but the results look worth it. As the stars move and the trees wave over those four minutes, the Pixel 4 algorithms will align and merge the pictures it takes to create one crisp, noise-free end result. If there are people in the frame, you’ll have to tell them to stay very still.
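A toy version of that align-and-merge step, assuming the per-frame drift is already known and is a whole number of pixels (the real mode estimates the motion itself and copes with sub-pixel shifts, hot pixels and swaying trees):

```python
import numpy as np

def align_and_stack(frames, drifts):
    """Toy star-stacking: undo each frame's drift, then average.

    frames: list of HxW long exposures; drifts: per-frame (dy, dx)
    whole-pixel star motion relative to the first frame.
    """
    aligned = [np.roll(f, (-dy, -dx), axis=(0, 1))
               for f, (dy, dx) in zip(frames, drifts)]
    return np.mean(aligned, axis=0)

# The arithmetic behind the four-minute wait:
print(15 * 16, "seconds =", 15 * 16 / 60, "minutes")  # 240 seconds = 4.0 minutes
```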

As with the Pixel 3, expect the Pixel 4’s photo-taking capabilities to get better over time because so much of the process relies on software. Levoy teased a future update that would enable a photo to balance a bright moon and a dark foreground — a brightness difference of about 1,500,000 times, Levoy says, or 19 f-stops.

The competition

Google isn’t the only company working on this computational approach to photography, of course. Its main rivals Samsung and Apple also have multi-lens cameras that combine several shots into one to produce the best results — the number and type of shots in the burst may vary, as may the processing algorithms, but the idea is the same.

As you would expect, these phone makers are keeping a lot of their algorithmic secrets to themselves, but the goal is always to produce the most detail and the least amount of noise in a photo, as well as the most accurate colour reproduction — and to do all of this no matter what the lighting environment.

Apple’s Deep Fusion camera update for the iPhone 11, which is due with iOS 13.2, uses the neural processing power of the A13 Bionic chip to optimise for detail and low noise across nine separate exposures, the same number that Google uses. (It was while describing Deep Fusion that Apple exec Phil Schiller used the “mad science” term rebuffed by Levoy in the Google presentation.)

The iPhone 11, iPhone 11 Pro and iPhone 11 Pro Max all have a 12MP f/2.4 ultra-wide angle lens, which the Pixel 4 lacks — it zooms out to 0.5x, letting you fit more in the frame from the same vantage point. That’s matched with a 12MP f/1.8 wide angle lens on all three phones, plus a 12MP f/2.0 telephoto lens with 2x optical zoom on the Pro and Pro Max.

Samsung’s best phone camera, meanwhile, is currently on the back of the Galaxy Note 10+. You get four lenses: A 16MP f/2.2 ultra-wide one, a 12MP f/1.5-2.4 wide angle one, a 12MP f/2.1 telephoto one (with 2x optical zoom) and a “DepthVision Camera” to measure distances more accurately.

Samsung phones typically do more processing in advance than Apple or Google ones, which is where that adjustable f-stop lens comes in handy: The lighting conditions are analysed and the exposure is adjusted while you’re framing the shot. By capturing more information to begin with (something Samsung has been doing for years), less post-processing is required.

We don’t yet know what the Pixel 4 will be like for taking photos day to day, but we do know the iPhone and Galaxy handsets have caught up with the Pixel this year — whether the Pixel 4 can shift the balance back remains to be seen. More than ever, though, judging a phone camera is less about reading the specs on the page and more about seeing the end results of all that on-board processing and trickery.

