Here’s why Apple’s ‘Portrait Mode’ feature only works on some iPhones and not others (AAPL)

[Image: iPhone 8 Plus Portrait Lighting]

When Apple introduced the iPhone 7 Plus in 2016, it contained a new camera feature that quickly became one of the most talked about — and copied — in Apple’s lineup: Portrait mode. 

Portrait mode uses the phone’s dual cameras and Apple’s software to mimic the look of a DSLR photo: the subject of the photo stays in sharp focus while the background is slightly blurred.

These days, other smartphone makers like Samsung and Google have their own version of portrait mode, and even Instagram has added a similar feature to its app.

But Apple was the first to popularize portrait mode, and now offers it on the iPhone 7 Plus, iPhone 8 Plus, and iPhone X. 

Here’s how portrait mode works, how you can use it, and why it’s only available on some iPhones and not others:


Portrait mode is only available on Apple’s recent dual-camera iPhones — the iPhone 7 Plus, iPhone 8 Plus, and iPhone X — for a simple reason: Apple’s version of portrait mode requires dual cameras.

Soon after Apple introduced portrait mode in 2016, the feature started popping up on other flagship phones, like Samsung’s Galaxy Note 8 (where it’s called Live Focus) and the Google Pixel (Lens Blur).

In the case of the Pixel and Pixel 2 phones, which only have one lens, Google relies on software to achieve a portrait-mode quality. Apple’s iPhones require two lenses to make it happen — at least for now.

So if you buy the iPhone 8, for instance, it will not have the ability to take portrait mode photos.  

Apple’s portrait mode requires two lenses because each lens is different: One is a 12-megapixel wide-angle lens, while the other is a 12-megapixel telephoto lens.

When taking a portrait mode photo, the two lenses serve different purposes. 

The telephoto lens is what actually captures the image. While it’s doing that, the wide-angle lens is busy capturing data about how far away the subject is, which the software then uses to create a nine-layer depth map.

That depth map created by the wide-angle lens is crucial to the end result, because it helps Apple’s image signal processor figure out what should be sharp and what should be blurred.
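To make the idea of a nine-layer depth map concrete, here is a minimal sketch of how continuous depth readings could be bucketed into nine discrete layers. This is purely illustrative — Apple has not published its actual algorithm, and the function name, layer count handling, and toy depth values here are assumptions.

```python
import numpy as np

def quantize_depth(depth_map, n_layers=9):
    """Bucket a continuous depth map into n_layers discrete layers.

    Layer 0 is nearest the camera; layer n_layers - 1 is farthest.
    Hypothetical sketch -- not Apple's published algorithm.
    """
    near, far = depth_map.min(), depth_map.max()
    # Normalize depths to [0, 1], then scale into integer layer indices,
    # clamping the farthest point into the last layer.
    norm = (depth_map - near) / max(far - near, 1e-9)
    layers = np.minimum((norm * n_layers).astype(int), n_layers - 1)
    return layers

# Toy 3x3 "depth map" in meters: subject at 1 m, background at 10 m.
depth = np.array([[1.0, 1.0, 10.0],
                  [1.0, 1.0, 10.0],
                  [5.0, 5.0, 10.0]])
layers = quantize_depth(depth)
# Subject pixels land in layer 0, the distant background in layer 8.
```

Once every pixel is assigned a layer, the processor can treat each layer as a separate slab of the scene and decide how much blur it deserves.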

The image above demonstrates what a standard iPhone photo looks like (left) and what a portrait mode photo looks like (right).

At a quick glance, the image on the right seems like it just has a totally blurry background, but this is where the depth map comes into play. 

In order to make the photo look natural and as close to a real DSLR photo as possible, Apple’s image processor goes through the layers one by one and blurs each by a varying amount, producing the soft out-of-focus effect known as “bokeh.”

The layers closer to the subject will be slightly sharper than the layers farthest away, and if you look closely at the above photo of my colleague Melia, you can tell: the stuff that’s close to her in the photo — like the long grass and the slab of wood on the ground — is a lot easier to make out than the cliff in the distance, which is just a dark, blurry form.

See the rest of the story at Business Insider