All The Ways Smartphone Cameras Have Improved Over The Years

Year after year, smartphone cameras have become more capable, more versatile, and more of a reason to leave your DSLR at home. So what are the tech innovations that have made the Pixel 2, the iPhone X, the Galaxy S9 and others such good photo takers compared to that old iPhone 6 or Galaxy S5?

Obviously, the technical aspects of photography and cameras can get very nuanced. But in broad strokes, here’s a look at the ways some key technologies have improved over the years to make your ‘grams sharper and your snaps brighter.

More megapixels


Photo: Sam Rutherford (Gizmodo)

At the core of the camera spec is the number of megapixels it captures – simply put, the resolution of the image that gets captured. It was by no means the first smartphone with a camera, but for comparison purposes the original 2007 iPhone came rocking a 2-megapixel rear camera with a fixed focus, capable of capturing images 1600 x 1200 pixels in size. Today’s Galaxy S9 and iPhone X have 12-megapixel cameras.
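
If you want to put rough numbers on that, the arithmetic is simple: megapixels are just the image’s width multiplied by its height, counted in millions of pixels. Here’s a quick sketch; the 4000 x 3000 figure for a 12-megapixel camera is a typical 4:3 layout rather than any particular phone’s official spec.

```python
# Megapixels are just width x height, counted in millions of pixels.
# 1600 x 1200 is the original iPhone's quoted resolution; 4000 x 3000 is an
# assumed, typical 4:3 layout for a modern 12 MP phone camera.
resolutions = {
    "Original iPhone (2 MP)": (1600, 1200),
    "Typical 12 MP phone": (4000, 3000),
}

for name, (width, height) in resolutions.items():
    megapixels = width * height / 1_000_000
    print(f"{name}: {width} x {height} = {megapixels:.1f} MP")
```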

In the early days of smartphone cameras, megapixels were the yardstick that these components were measured by: More megapixels meant a better camera, generally speaking. But that isn’t necessarily true now and it wasn’t necessarily true then, because there are a whole host of other factors that affect image quality, as you can see from the extensive list below.

The problem with cramming more pixels into the same-sized sensor is the pixels get smaller, and let in less light. Remember the HTC UltraPixels introduced in 2013? That was an attempt to reduce megapixels, increase pixel size, and therefore capture more light (and therefore detail) as the camera shutter flashed open for a split second. HTC was on to something, because today the megapixel race has all but ended, with smartphone makers making improvements elsewhere instead.
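
To see that trade-off in numbers, here’s a back-of-the-envelope sketch. The sensor width used below is an assumption (roughly a 1/3-inch sensor), and a 4:3 aspect ratio is assumed throughout; neither figure comes from a particular phone.

```python
# Hold the sensor size fixed and watch the pixel pitch shrink as the
# megapixel count climbs. Sensor width and aspect ratio are assumptions.
SENSOR_WIDTH_MM = 4.8   # roughly a 1/3-inch sensor

for megapixels in (4, 8, 12, 16):
    pixels_wide = (megapixels * 1_000_000 * 4 / 3) ** 0.5   # 4:3 aspect ratio
    pitch_um = SENSOR_WIDTH_MM * 1000 / pixels_wide
    print(f"{megapixels:>2} MP on the same sensor -> ~{pitch_um:.2f} um pixels")
```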

Bigger sensors


Image: Google

It is a truth universally acknowledged that the bigger the image sensor in a camera, the better the end result (essentially, it lets the camera capture more light and more colour detail). With any camera, you’re relying on several components working well together, but the image sensor is a crucial one.

It’s a shame then that there’s not much room inside smartphones — mobile camera sensors tend to be between 1/2.3 and 1/3 inches, much smaller than those inside DSLRs and even quality point-and-shoot cameras, though manufacturers are often coy about the specs in this regard. In fact, sensor size hasn’t changed much over the years in smartphone photography, because of those physical limitations, and it’s usually been in other areas where improvements have been made.

You’ll have a hard time digging down into any phone’s specs to find the image sensor size for the camera advertised, but the Nexus 6P was an exception — its 1/2.3-inch sensor is on the larger end of the scale, particularly for 2015, though sensor size alone isn’t a spec where modern-day devices are all that much better than phones of yesteryear. Note too the 6P’s 1.55 μm (micrometre) pixel size, larger than the 1.4 μm pixels in the Pixel 2, with a 1/2.6-inch sensor.
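
Because the light a pixel gathers scales roughly with its surface area, you can put a ballpark figure on that 1.55 μm versus 1.4 μm difference. This ignores differences in lenses, microlenses and processing, so treat it as a rough comparison only.

```python
# Light gathered per pixel scales roughly with the pixel's area.
nexus_6p_pixel_um = 1.55
pixel_2_pixel_um = 1.40

area_ratio = (nexus_6p_pixel_um / pixel_2_pixel_um) ** 2
print(f"A 1.55 um pixel has ~{area_ratio:.2f}x the area of a 1.4 um pixel")
# -> roughly 1.23x, or about 23 per cent more light per pixel, all else equal
```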

And of course for all the cameras that don’t advertise their sensor size, enterprising teardown artists do the work, and usually reveal that what we’re working with is teensy.

Wider apertures


Image: JerryRigEverything

On smartphone cameras as well as regular cameras, aperture controls the amount of light that gets to the image sensor. In a regular camera, aperture is manipulated to optimise for lighting conditions, blur, and desired depth of field, but in the world of smartphone cameras, in which optics are severely constrained, phone makers tend to optimise for having the widest aperture possible. This allows cameras to capture lots of light in all of those dark settings in which we all love to take photos, while keeping the shutter speed quick enough that your photo doesn’t come out blurry. (Super-wide apertures have their downsides, but we’ll set them aside for now.)

Aperture size is measured in f-stops, and the smaller the f-stop, the wider the aperture (or opening). Last year the LG V30 camera set a new high watermark with an f/1.6 aperture, since surpassed by the dual aperture tech on the Samsung Galaxy S9, which lets you switch between f/1.5 and f/2.4 apertures, depending on what you’re trying to achieve with your pictures. You can get a great close-up look at the mechanism in this JerryRigEverything video.
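
The jump from f/2.4 to f/1.5 is bigger than the numbers suggest, because the light an aperture passes scales roughly with its area, which goes as one over the f-number squared. A quick sanity check on the Galaxy S9’s two settings:

```python
# Light through the aperture scales roughly with 1 / (f-number)^2.
f_wide, f_narrow = 1.5, 2.4   # the Galaxy S9's two aperture settings

light_ratio = (f_narrow / f_wide) ** 2
print(f"f/{f_wide} lets in ~{light_ratio:.1f}x the light of f/{f_narrow}")
# -> about 2.6x, which is why the wide setting matters in dim scenes
```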

Wider apertures have been made possible through the years as lens manufacturing quality has increased – something that’s of paramount importance if you’re letting more light in and want to keep a sharp, focused picture.

Better flash


Photo: Alex Cranz (Gizmodo)

Maybe not as important as some other components, but the on-board camera flash has made strides in the years that smartphones have been with us. Older phones, particularly Nokia and Sony models, made use of Xenon flash — very bright, but bulky and power-hungry too.

Today, phones use LED or dual-LED flash to produce a more subtle effect. In the case of dual-LED, two LEDs are used with slightly different colour temperatures, theoretically producing an end result with a better balance of colours that isn’t completely unnatural. Look closely at the flash on the back of your phone and you may well see the two tiny bulbs.

The most recent iPhones include even more improvements, and show how various smartphone camera specs work together to produce better results than the previous generation. As well as introducing quad-LED flash in 2016, the 2017 models debuted a feature called Slow Sync: It keeps the shutter open longer to capture more light and reduce the light needed from the flash, which can flash less brightly for less time.
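
As a toy model of that trade-off (the numbers below are invented, not Apple’s), think of the exposure as ambient light multiplied by shutter time, plus whatever the flash contributes: hold the total steady and a longer shutter leaves less for the flash to do.

```python
# Toy model: total exposure ~= (ambient level x shutter time) + flash output.
# Holding the target exposure fixed, a longer shutter needs less flash.
TARGET_EXPOSURE = 1.0
AMBIENT_PER_SECOND = 4.0   # made-up ambient light level

for shutter_s in (1 / 60, 1 / 30, 1 / 15):
    ambient_part = AMBIENT_PER_SECOND * shutter_s
    flash_needed = max(TARGET_EXPOSURE - ambient_part, 0.0)
    print(f"shutter 1/{round(1 / shutter_s)} s -> flash must supply {flash_needed:.2f}")
```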


Faster focus


Photo: Sam Rutherford (Gizmodo)

Maybe you’ve never thought much about the focus on your smartphone’s camera if you’re not shooting sports or wildlife, but it’s pretty significant in the overall quality of your shot. It works by moving the camera lens on tiny motors to make the object of your photo nice and clear, but a host of other hardware and software factors are at play – and down the years, phone autofocus has become much more accurate, and much faster.

Before 2015, phone cameras focused solely based on the contrast they could detect in a scene. Starting with the Galaxy S5 and iPhone 6, phase detection was added, built right into the sensor: It uses the information coming in from both sides of the lens to calculate where the perfect focus is (where the points of light should meet). It’s faster than the standard contrast detection method, but it’s still not great in low light.
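
In code, contrast detection is essentially a search: step the lens through its range, score each preview frame for sharpness, and keep the position that scores highest. The sketch below is a simplification, and the capture_at callback is hypothetical, standing in for whatever the camera driver actually exposes.

```python
import numpy as np

def contrast_score(frame: np.ndarray) -> float:
    """Crude sharpness metric: variance of the image gradients."""
    gy, gx = np.gradient(frame.astype(float))
    return float(np.var(gx) + np.var(gy))

def contrast_detect_af(capture_at, lens_positions):
    """Try each lens position and return the one whose preview looks sharpest."""
    best_pos, best_score = lens_positions[0], -1.0
    for pos in lens_positions:
        frame = capture_at(pos)   # hypothetical camera-driver callback
        score = contrast_score(frame)
        if score > best_score:
            best_pos, best_score = pos, score
    return best_pos
```

A real implementation climbs towards the sharpness peak rather than sweeping every position, which is part of why the method struggles when low light makes those contrast scores noisy.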

Enter more smartphone camera tricks. The dual pixels used on the most recent Galaxy phones, for example, turn every pixel into a little phase detection system, improving performance in darker scenes. For its Pixel phones, Google went with a time-of-flight infrared laser to measure distances quickly in any lighting situation. Again, it shows manufacturers getting creative, and in different ways, to improve photos taken on mobile.
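
The laser approach works because it measures time rather than contrast: distance is simply the speed of light multiplied by the pulse’s round-trip time, divided by two. The round-trip times below are made-up examples.

```python
# Time-of-flight in a nutshell: distance = (speed of light x round trip) / 2.
SPEED_OF_LIGHT_M_S = 299_792_458

for round_trip_ns in (1.0, 5.0, 10.0):
    distance_m = SPEED_OF_LIGHT_M_S * round_trip_ns * 1e-9 / 2
    print(f"{round_trip_ns:>4.1f} ns round trip -> object ~{distance_m:.2f} m away")
```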

Optical image stabilisation


Image: Google

Optical image stabilisation is more important than you might think: It doesn’t just keep your shaky videos steady, it also means that when you’re taking a photo, the shutter can stay open for longer without any blur, and again that’s crucial in terms of collecting light. In other words, your phone camera isn’t only relying on image stabilisation when it’s shooting sports.

On the most basic level, optical image stabilisation uses floating lens elements and miniature electromagnetic motors to move them. As the technology has become more advanced, phones have become better able to incorporate other data (from the gyroscope, for example) to further factor out shakiness. In fact there’s a whole host of different ways that manufacturers do this, both mechanical and non-mechanical.
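
Here’s a wildly simplified, single-axis sketch of that idea: work out how far the phone rotated during a time step from the gyro’s angular rate, then push the lens the opposite way. The gain and the gyro readings are made-up numbers, and real OIS control loops are far more sophisticated than this.

```python
# One-axis OIS sketch: read the gyro, work out the rotation this tick, and
# nudge the lens the opposite way. Gain and readings are illustrative only.
def ois_step(lens_offset_um, gyro_rate_dps, dt_s, gain_um_per_deg=50.0):
    shake_deg = gyro_rate_dps * dt_s              # rotation during this tick
    return lens_offset_um - shake_deg * gain_um_per_deg

offset_um = 0.0
for gyro_rate_dps in (0.5, -0.2, 0.8):            # fake gyro readings, deg/s
    offset_um = ois_step(offset_um, gyro_rate_dps, dt_s=0.001)
    print(f"lens offset: {offset_um:+.4f} um")
```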

OIS was actually cut from the original Pixel in favour of software adjustments, though it did appear in the Pixel 2. It’s also one of the small differences between the dual cameras on the iPhone 8 Plus and the iPhone X — the more expensive handset has OIS on both cameras, not just one. It’s a tech that has been refined, rather than revolutionised, in the time that smartphones have been around.

Dual cameras


Photo: Christina Warren (Gizmodo)

What do you do when you can’t increase the size of your camera lens or your image sensor, because your components need to be as compact as possible? You add a second camera. Dual cameras appeared as far back as the HTC One M8, and even before that, though they weren’t used in tandem as they are now.

The key benefit is clearly more data for the camera to work with, whether that’s more data on colour or contrast or being able to make use of a lens with a wider angle. All the restrictions we’ve talked about above can be overcome to some extent if you add another sensor and lens set to the mix. Of course, as phones have become more powerful, they have also become better able to crunch the information coming in from two cameras simultaneously.

Use a telephoto lens for the secondary camera and you can suddenly get 2x optical zoom, as Apple did with the iPhone 7 Plus. Huawei phones, like the Mate 10 Pro, have a monochrome sensor behind the secondary camera, used to gather extra brightness and contrast information. Two cameras also make it easier to assess depth in a scene, because they have slightly differing perspectives — and that opens up the possibility of the blurred bokeh effect that’s available just about everywhere now.
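
The depth part comes from straightforward geometry: an object appears shifted between the two cameras’ views, and that shift (the disparity) shrinks the further away the object is. The focal length and baseline below are invented, phone-scale numbers rather than any handset’s real specs.

```python
# Classic stereo relation: depth = focal length x baseline / disparity.
def depth_from_disparity(disparity_px, focal_length_px=2800.0, baseline_mm=10.0):
    return focal_length_px * baseline_mm / disparity_px

for disparity_px in (140.0, 56.0, 28.0):
    depth_mm = depth_from_disparity(disparity_px)
    print(f"disparity {disparity_px:>5.0f} px -> object ~{depth_mm:.0f} mm away")
```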

Improved processing


Image: Apple

Finally, some of the biggest leaps forward in smartphone camera quality have come not through better optics, but through better software processing made possible by more powerful phones – as is the case with the Pixel 2 and the smart processing chip it has on board, which is now available to other apps.

One of the benefits you can see on a Pixel 2 phone is the way HDR effects can be calculated and applied in real-time as you frame your shot – if you’ve owned a smartphone for a while, you might remember the way HDR used to take a few seconds to process, and only then after you’d snapped the photo. Slowly but surely, processing power and algorithms are overtaking the physical limitations of the smartphone camera.
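
Here’s a bare-bones sketch of the kind of merge that sits underneath HDR modes: weight each pixel by how well exposed it is, then average across the differently exposed frames. Real pipelines such as Google’s HDR+ also align a burst of frames and do plenty more, so treat this as the core idea only; the test frames are synthetic.

```python
import numpy as np

def merge_exposures(frames, exposure_times_s):
    """Weighted merge of differently exposed frames with values in [0, 1]."""
    merged = np.zeros_like(frames[0], dtype=float)
    weight_sum = np.zeros_like(frames[0], dtype=float)
    for frame, t in zip(frames, exposure_times_s):
        weight = 1.0 - np.abs(frame - 0.5) * 2.0   # trust mid-tone pixels most
        merged += weight * (frame / t)             # scale back to linear light
        weight_sum += weight
    return merged / np.maximum(weight_sum, 1e-6)

# Synthetic "short" and "long" exposures of the same scene.
rng = np.random.default_rng(0)
radiance = rng.uniform(0.0, 100.0, (4, 4))        # linear scene brightness
short_exp = np.clip(radiance * 0.005, 0.0, 1.0)   # crushes the shadows
long_exp = np.clip(radiance * 0.02, 0.0, 1.0)     # blows out the highlights
hdr = merge_exposures([short_exp, long_exp], [0.005, 0.02])
```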

Another key area this affects is noise reduction, cleaning up the areas where the phone camera just can’t match a full-sized DSLR in terms of the light it can capture. Improved processing is also evident in something like Portrait Lighting, introduced on the 2017 iPhones: it uses software smarts and, in this case, the grunt of the A11 Bionic chip to match a professional camera setup.
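
Burst averaging is the simplest version of that noise-reduction idea: average several frames of the same static scene and the random sensor noise shrinks by roughly the square root of the number of frames. A toy demonstration with synthetic noise:

```python
import numpy as np

# Averaging a burst of noisy frames of a static scene knocks down random noise.
rng = np.random.default_rng(0)
clean = np.full((100, 100), 0.5)                  # a flat grey "scene"
burst = [clean + rng.normal(0.0, 0.05, clean.shape) for _ in range(8)]

single_frame_noise = np.std(burst[0] - clean)
stacked_noise = np.std(np.mean(burst, axis=0) - clean)
print(f"one frame: ~{single_frame_noise:.3f}, 8-frame average: ~{stacked_noise:.3f}")
# random noise drops by roughly sqrt(8), i.e. about 2.8x
```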

