Early Thoughts on the iPhone 7 Plus Camera
After a week with it, I have some initial impressions of the new dual-camera system in the iPhone 7 Plus and what it brings to the table. I’m going to share these thoughts with you using convoluted analogies because I’m a geek and have had too little coffee this morning.
For starters, the presence of a new lens is worthy of the hype. I don’t zoom digitally, so for me the iPhone has always been a fixed focal length camera. Suddenly it’s a two-lens system, and changing lenses takes a tap instead of a complicated juggling act.
That’s a big deal.
In visual terms, it’s the difference between being able to get this image:
And this image:
If you’ll forgive the banality of the subject matter, we can see that the new 56mm lens really does offer a new perspective, giving you more flexibility in how you can capture a given scene.
I also appreciate the fact that the longer lens seems to come without optical compromises, resolving the full 12MP worth of detail.
Something’s got to give though, and unfortunately we don’t have to go far to find it. The 56mm lens is an ƒ/2.8 design, meaning it’s more than a full stop slower than its 28mm counterpart.
In plain English, this means it gathers roughly two and a half times less light for the sensor. If you’re not a photographer, you might be left wondering how that affects your images and whether you need to care, so let’s unpack this further.
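The arithmetic behind that claim is simple enough to sketch: the light a lens gathers scales with the area of its aperture, which goes as the inverse square of the f-number. Plugging in the two iPhone lens apertures:

```python
import math

def light_ratio(f_slow, f_fast):
    """Relative light gathered: the ratio of aperture areas, (f_slow / f_fast)^2."""
    return (f_slow / f_fast) ** 2

def stops_difference(f_slow, f_fast):
    """Exposure difference in stops: log base 2 of the light ratio."""
    return math.log2(light_ratio(f_slow, f_fast))

# iPhone 7 Plus: 28mm lens at f/1.8 vs 56mm lens at f/2.8
ratio = light_ratio(2.8, 1.8)
stops = stops_difference(2.8, 1.8)
print(f"The f/1.8 lens gathers {ratio:.2f}x more light ({stops:.2f} stops)")
# → The f/1.8 lens gathers 2.42x more light (1.27 stops)
```

That 2.42× figure is where “two and a half times less light” comes from, and 1.27 stops is the “more than a full stop slower” mentioned above.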
Fat Babies and Hungry Sensors
Camera sensors are like babies: the more you feed them, the less noise they make but the fatter they get. We feed light to camera sensors so they can make images for us without noise, but there’s no free lunch. Either you need a huge spoon (a lens with a wide aperture like ƒ/1.2), or you need a fat sensor (the politically correct term is “full frame”).
The Apple sensor can’t afford to be fat because it has to fit into increasingly skinny jeans like your new iPhone 7. Jet Black™ is slimming, you know.
So here comes Apple with its new 56mm lens that gathers less light (tiny spoon), trickling puny portions into the starving sensor-on-a-diet. The sensor starts crying, and lo—we get noisy images!
Except…there’s actually not much noise in the 56mm images. Not much at all.
What’s going on?
Allow me to invite another critter into the frame to illustrate: meet Millie. She’s a derp train, sometimes known as a millipede.
If you’re reading this on a phone, then you’re probably looking at my portrait of Millie and thinking “that looks pretty good, chief. What’s the problem?” The problem, if you look closely, is that Apple is on an anti-noise rampage. Their new noise reduction algorithm turns the structure of an image into a watercolour painting.
Here’s a cropped view of that image seen at 100% magnification:
On the one hand, it’s impressive that there’s no noise even in an image taken in the low light of a forest in the afternoon, but on the other hand…yikes. Fine detail has been obliterated.
For most people, this is actually the correct compromise: the noise reduction kills noise but replaces it with a structure that looks just fine when viewed at normal sizes on small screens by the general public. That’s 90% of image viewing circumstances right there.
Here’s another shot of Millie:
The cool thing about this processing is that it gives the illusion of crisp detail being maintained (notice the fine pattern of the leaf skeleton she’s about to trample). The illusion only falls apart if you zoom in to a 1:1 ratio, which non-photographers typically don’t do.
Especially on photos of a millipede.
By the way, fun fact about millipedes: they’re slow-moving and can’t bite, so millions of years of evolution have given them time to prepare a potent defence against predators. When threatened, they retract their dozens of legs and tighten into a firm coil, then they secrete noxious, corrosive chemicals through tiny gaps in their exoskeletal armour.
In other words, they curl up into a ball and fart.
Pretty raw, right?
Speaking of Raw…
Apple has opened up RAW photo capture to third-party camera apps. I’m proud of that segue, I won’t lie.
If you’re rocking an iOS device with the new 12MP sensor (iPhone SE, 6s/+, 7/+) and have updated it to iOS 10, you can use your favourite third-party camera app to capture the full range of data from the sensor and process it however you see fit.
This is important in light of what I wrote above because it means that if you don’t like Apple’s heavy-handed noise reduction and would prefer to keep some noise (and thus detail) in your images, or reduce it yourself using a better algorithm, you finally can.
I could show you some of my experiments in processing iPhone RAW files, but quite frankly I haven’t spent enough time working this way yet to give you any meaningful information, so keep an eye out for another article where I focus on that aspect of the new camera system.
Suffice it to say that, as a photographer, I deeply appreciate this flexibility.
If you’re not a photographer and you’re somewhat concerned about the noise stuff I was talking about: don’t be. Again, Apple made the right call for 90% of circumstances, and the noise reduction is way less aggressive on the normal 28mm lens.
Not only that, but because the 28mm lens has a new ƒ/1.8 aperture (bigger spoon for the baby sensor), it’s gathering more light to begin with, meaning less noise to deal with in the first place.
I was at a wedding last weekend and was able to grab shots like this:
Normally I would expect to get unusable garbage out of a smartphone camera in those kinds of circumstances, but thanks to the combination of wider aperture, optical image stabilization, and smart noise reduction, Apple has made it possible to capture these moments to a satisfactory level of image quality.
Another nail in the coffin of dedicated compact cameras.
Portrait Mode

The strangely delayed portrait mode that Apple showed off during the iPhone 7 unveiling has now made its way into the public beta of iOS 10.1. Naturally, I have it installed and have been testing it out.
Seeing the technology in person makes it that much more impressive, and for the most part it works extremely well. Here’s an example image pair (it always saves both the unaffected shot and the processed one for you):
Those are untouched, straight from the phone. To my eyes, the bokeh is extremely convincing. Not so much in terms of optical accuracy, but in terms of conveying the effect of bokeh as a means to isolate the subject in a pleasing manner.
I have to do a lot more testing before I can offer any conclusions, but first impressions are very good and so far the only major caveat is that you need to be in brightly lit situations to make use of this feature. Because it uses the 56mm lens for the portrait (the 28mm lens is used to aid in depth mapping and other auxiliary things), you want to give it lots of light to work with so you don’t get that aggressive noise reduction kicking in.
It’s also worth mentioning that in the current beta, the live preview is not always accurate. The degree of bokeh and even the specific edge-detection results often looked wrong in the live view, but the captured photo was perfect, suggesting that more processing happens after the shutter release to produce the final image.
In Summary

- The new 56mm lens greatly expands the shooting envelope of the iPhone 7 Plus
- Images captured using the 56mm lens have heavy noise reduction unless you’re in good light or shoot RAW using a third-party app
- RAW capture and editing is amazing to have, but I need to test it more before I offer any thoughts
- Portrait mode is shaping up to be a huge feature, and the current beta version is already producing excellent results—I need to do more testing on this front as well
- Camera sensors are like babies and millipedes defend themselves using yoga and flatulence