Hands on with the iPhone 7 Plus’ crazy new Portrait mode

One of the most talked-about features of the iPhone 7 Plus at launch was the new Portrait mode.

It’s a software feature that uses the two lenses of the iPhone 7 Plus to create the look and feel of an image shot with portrait settings on a camera with a telephoto lens.

Simply put: a pleasing blur that separates the foreground (person) from the background (other junk). I’m going to get a bit wonky in this piece because I feel the context will be sorely lacking once this feature hits widely, and there are some who are interested.

If you’ve ever had a portrait taken in a park or seen a wedding picture and wondered why those images looked so much better than the ones from your phone, the answer is really a three-parter:

  1. A relatively wide aperture is being used, which causes (among other effects) the “field of focus,” or the bit of the picture that is sharp, to be very narrow. This means face in focus, background not in focus.
  2. It was likely, but not always, shot with a telephoto lens. This enhances that ‘separation’ between subject and background because tele elements in a lens cause telephoto compression, thinning out the apparent field of focus and putting faces into proper proportion. This is why a nose looks the right size in a proper portrait and looks too big with a wide angle lens.
  3. But mostly, the photographer took the time to learn how to use her equipment, positioned the subject appropriately and used her artistic judgment to provide a proper composition.

Apple can’t yet do anything about the last one for you. That’s your job. But it could tackle the first two, and that’s what it’s done with Portrait mode. Before we get into how well it works, let’s break down how it does what it does.

How does it work?

I’ll just refer back to my iPhone review to set the scene for how Apple is making Portrait mode work:
The depth mapping that this feature uses is a byproduct of there being two cameras on the device. It uses technology from LinX, a company Apple acquired, to create data the image processor can use to craft a 3D terrain map of its surroundings.

This does not include the full capabilities of the PrimeSense chip Apple purchased back in 2013 (we have yet to see this stuff fully implemented), but it’s coming.

For now, we’re getting a whole host of other benefits from the two cameras, including “Fusion,” Apple’s method of taking image data from both the wide angle and telephoto lenses and mixing them together to get the best possible image.

We’re also getting Portrait mode, which launches today in developer beta and later this week in public beta.

The Portrait mode, which prominently displays a beta notification on first launch, resides to the right of the standard photo mode in your camera app. There is no zooming, digital or otherwise, in Portrait mode. Instead, it exclusively uses the 56mm lens to shoot the image and the wide-angle lens to gather the depth data that allows it to generate a 9-layer map.

[Image: depth map]

If you want to get a feel for how this works, hold your hand up in front of your face and close one eye. Then open that eye and close the other. Do you see how you can see “around” your hand? That’s how Apple’s camera system works. The wide-angle and telephoto lenses “see” slightly different angles on the scene, allowing the system to separate and ‘slice’ the image into nine different layers of distance from the camera.
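If you want to see the arithmetic behind that two-eyes trick, here’s a rough sketch in Swift. To be clear, this is my own simplification, not Apple’s pipeline: the stereo relation is the textbook pinhole one, and the focal length, baseline and layer boundaries are placeholder values.

```swift
import Foundation

// Classic pinhole-stereo relation: the bigger the shift (disparity) of a point
// between the two lenses' views, the closer that point is to the camera.
// All numbers here are illustrative, not Apple's.
func estimateDepthMeters(disparityPixels: Double,
                         focalLengthPixels: Double,
                         baselineMeters: Double) -> Double {
    guard disparityPixels > 0 else { return .infinity } // no shift means very far away
    return (focalLengthPixels * baselineMeters) / disparityPixels
}

// Bucket a depth into one of nine slices, echoing the 9-layer map described above.
func layerIndex(forDepthMeters depth: Double,
                nearest: Double = 0.3,
                farthest: Double = 10.0) -> Int {
    let clamped = min(max(depth, nearest), farthest)
    let fraction = (clamped - nearest) / (farthest - nearest)
    return min(Int(fraction * 9), 8) // 0 = closest layer, 8 = farthest
}
```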

Once it has these nine slices, it can then pick and choose which layers are sharp and which get a Gaussian blur effect applied to them. Update: On additional inquiry, Apple clarified for me that it is in fact not a Gaussian blur but instead a custom disc blur. This is a blur with a more defined, circular shape than a Gaussian blur. So if you’re one of the few who was hankering to know exactly what kind of blur was applied here, now you know.

The preview blur effect is coming right from Apple’s Core Image API.
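For the curious, both blur types ship as stock Core Image filters (CIGaussianBlur and CIDiscBlur), so you can compare the two looks yourself. Here is a minimal sketch, not a reconstruction of Apple’s shipping Portrait effect:

```swift
import CoreImage
import UIKit

// Apply one of Core Image's stock blurs to a whole photo, just to compare the
// soft Gaussian look with the rounder, more lens-like disc blur mentioned above.
func blurred(_ image: UIImage, useDiscBlur: Bool, radius: Double) -> UIImage? {
    guard let input = CIImage(image: image) else { return nil }

    let filterName = useDiscBlur ? "CIDiscBlur" : "CIGaussianBlur"
    guard let filter = CIFilter(name: filterName) else { return nil }
    filter.setValue(input, forKey: kCIInputImageKey)
    filter.setValue(radius, forKey: kCIInputRadiusKey)

    guard let output = filter.outputImage else { return nil }
    let context = CIContext()
    // Blur filters expand the image's extent, so crop back to the original frame.
    guard let cgImage = context.createCGImage(output, from: input.extent) else { return nil }
    return UIImage(cgImage: cgImage)
}
```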

Once the telephoto lens detects the subject, using autofocus and other stuff we’ll talk about in a second, the image processor inside the iPhone 7 will then apply blur in greater and greater amounts to the layers that are further away from that subject.

So, for instance, if the camera analyzes the scene and pins your subject at 8 feet away, it will slice the image and apply a blur effect on a progressive gradient scale across the other layers. Things that are very close to your subject may be sharp — included in that variable-width slice of the in-focus area. Once they get further away they get a little blur, then more, then more — until things in the far foreground or far background are blurred to the “maximum” level.
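As a toy illustration of that gradient idea, again my sketch rather than Apple’s code, you could map each layer’s distance from the subject’s layer to a blur radius like this:

```swift
// Toy mapping from "how many layers away from the subject" to a blur radius.
// The maximum radius and layer count are placeholders, not Apple's real values.
func blurRadius(forLayer layer: Int,
                subjectLayer: Int,
                layerCount: Int = 9,
                maxRadius: Double = 20.0) -> Double {
    let separation = Double(abs(layer - subjectLayer))  // 0 for the subject's own layer
    let step = maxRadius / Double(layerCount - 1)       // extra radius per layer of separation
    return min(separation * step, maxRadius)            // clamps at the "maximum" blur
}

// Example: subject pinned on layer 3; layer 3 stays sharp, layers 0 and 8 get heavy blur.
// (0..<9).forEach { print($0, blurRadius(forLayer: $0, subjectLayer: 3)) }
```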
