iPhone 7 Plus – first looks


I got the iPhone 7 Plus a few weeks back, and I must say that, from a photographer’s perspective, I am very impressed. It’s particularly pleasing that the phone uses its two lenses to simulate a most passable bokeh.

Briefly, the iPhone 7 Plus has two lenses, with full-frame 35mm/DSLR equivalent focal lengths of 28mm and 56mm respectively. The 28mm is the same effective focal length as the current iPhone 6. On both the 7 and 7 Plus, depth of field has been improved to provide a more realistic simulation of depth than the 6; however, only the 7 Plus has two lenses. The 12MP sensor delivers files of around 14MB when shot as DNG raw files. The 28mm lens has an aperture of ƒ/1.8, and the 56mm an aperture of ƒ/2.8. The real focal lengths, by the way, are 3.99mm and 6.6mm respectively.
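
For the developers among you, the two modules show up as separate capture devices. Here is a minimal sketch, assuming the AVFoundation device types Apple introduced alongside the 7 Plus:

```swift
import AVFoundation

// A minimal sketch, assuming the AVFoundation device types introduced
// alongside the iPhone 7 Plus. The 28mm-equivalent module is the
// wide-angle camera; the 56mm-equivalent module is the telephoto.
let wide = AVCaptureDevice.default(.builtInWideAngleCamera,
                                   for: .video, position: .back)
let tele = AVCaptureDevice.default(.builtInTelephotoCamera,
                                   for: .video, position: .back)

print(wide?.localizedName ?? "no wide-angle camera found")
print(tele?.localizedName ?? "no telephoto camera found")
```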

Anyway, enough of the tech. How does it shoot? The Portrait mode on the camera triggers the use of both lenses. You have to focus within about 2.5 metres for the system to work: the 56mm handles the subject, and the 28mm covers the background.

Here’s a street shot from yesterday. I think the camera did a neat job, even capturing some of the hair against the blurred background.

[Image: justin-towell-1]

So what is actually happening? From Digital Trends: “The iPhone 7 Plus doesn’t produce ‘real’ bokeh, in the traditional camera sense. Apple is actually using a combination of software, distance measurements, and depth of field data to calculate what the bokeh should look like, and then processing to create the bokeh digitally.”
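
For illustration only (Apple’s actual pipeline is private, and iOS 10.1 does not expose the depth data to third-party apps), the basic idea of compositing sharp and blurred versions of a photo through a depth mask looks something like this in Core Image:

```swift
import CoreImage

// Illustrative sketch only, not Apple's pipeline. Given a photo and a
// hypothetical per-pixel depth mask (white = far, black = near), blur
// the whole frame, then blend the blurred version back in wherever the
// mask says "far".
func fakeBokeh(image: CIImage, depthMask: CIImage) -> CIImage? {
    let blur = CIFilter(name: "CIGaussianBlur")
    blur?.setValue(image, forKey: kCIInputImageKey)
    blur?.setValue(12.0, forKey: kCIInputRadiusKey)
    guard let blurred = blur?.outputImage else { return nil }

    // CIBlendWithMask: white mask areas take the input image (blurred),
    // black mask areas take the background image (sharp original).
    let blend = CIFilter(name: "CIBlendWithMask")
    blend?.setValue(blurred, forKey: kCIInputImageKey)
    blend?.setValue(image, forKey: kCIInputBackgroundImageKey)
    blend?.setValue(depthMask, forKey: kCIInputMaskImageKey)
    return blend?.outputImage?.cropped(to: image.extent)
}
```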

And it does it blindingly fast, with very little lag after taking the image. The other cool thing is that Apple is not just using a generic “Gaussian” blur to create that background. In fact, Apple have clarified that it is instead a custom disc blur, a blur with a more defined, circular shape than a Gaussian blur. Thus the iPhone is using a mathematical “kernel” to mimic the light shapes that you find in quality lenses when their aperture is opened wide. If that is so, then in theory Apple might in future be able to mimic the bokeh of specific lenses. iPhone Noctilux, anyone?
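
Interestingly, Core Image ships a disc-shaped blur alongside the familiar Gaussian one, which makes the difference easy to see. Here is a sketch using the built-in CIGaussianBlur and CIDiscBlur filters (assuming both are available on your OS version); this is not Apple’s Portrait mode code, just the same two kernel shapes:

```swift
import CoreImage

// Sketch: the same photo put through a Gaussian kernel and a disc
// kernel. A disc kernel spreads each point of light into a hard-edged
// circle, like the round highlights of real wide-aperture bokeh, where
// a Gaussian kernel just fades it out smoothly.
func gaussianVersusDisc(image: CIImage, radius: Double) -> (gaussian: CIImage?, disc: CIImage?) {
    let gaussian = CIFilter(name: "CIGaussianBlur")
    gaussian?.setValue(image, forKey: kCIInputImageKey)
    gaussian?.setValue(radius, forKey: kCIInputRadiusKey)

    let disc = CIFilter(name: "CIDiscBlur")
    disc?.setValue(image, forKey: kCIInputImageKey)
    disc?.setValue(radius, forKey: kCIInputRadiusKey)

    return (gaussian?.outputImage, disc?.outputImage)
}
```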

Here’s another test. I like the way that the software correctly identifies almost all of the detailed edges of the close-in object. Not quite all, if you look closely at the flower stems, but good enough for many images, and better than any other camera phone.

[Image: red-bells-blur-1]

And there is impressive lens sharpness when the image is cropped by 50%.

[Image: reds-bells-blur-2]

That said, there are limits.

At first glance, this is a very good example, especially with the “blur” gradually building as the railings disappear into the distance.

[Image: railings-1]

However, let’s really zoom in. You can see, between the railings, that the algorithms have missed some of the bokeh effect.

So there is a mixture of “sharp” and “blur”. Still, as this is only the very first iteration of Apple’s approach (iOS 10.1), I think we can expect ever more accurate results in future releases.

[Image: railings-2]
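
That distance-driven build-up of blur can be mimicked with a variable blur controlled by a brightness mask. Here is a sketch, assuming Core Image’s CIMaskedVariableBlur filter, with a simple linear gradient standing in (purely hypothetically) for a real depth estimate:

```swift
import CoreImage

// Sketch: blur that builds with distance. CIMaskedVariableBlur blurs
// each pixel in proportion to the brightness of a mask, so a gradient
// running from black (near) to white (far) gives exactly the gradual
// build-up seen along the railings.
func distanceBlur(image: CIImage) -> CIImage? {
    // Purely hypothetical stand-in for a real depth estimate: a simple
    // left-to-right linear gradient across the frame.
    let gradient = CIFilter(name: "CISmoothLinearGradient")
    gradient?.setValue(CIVector(x: 0, y: 0), forKey: "inputPoint0")
    gradient?.setValue(CIVector(x: image.extent.width, y: 0), forKey: "inputPoint1")
    gradient?.setValue(CIColor.black, forKey: "inputColor0")
    gradient?.setValue(CIColor.white, forKey: "inputColor1")
    guard let mask = gradient?.outputImage else { return nil }

    let blur = CIFilter(name: "CIMaskedVariableBlur")
    blur?.setValue(image, forKey: kCIInputImageKey)
    blur?.setValue(mask, forKey: "inputMask")
    blur?.setValue(15.0, forKey: kCIInputRadiusKey)
    return blur?.outputImage?.cropped(to: image.extent)
}
```

In the real camera, of course, the mask would come from the stereo depth estimate rather than a synthetic gradient.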

Stepping back from the “bokeh” effect, the camera of course delivers great results overall.

Hard for anyone to have an excuse now to take a really bad image!

[Image: bath-abbey-1]

Here’s a more detailed review and discussion of “bokeh” from Stu Maschwitz.