Ever since the first camera phone launched in 2002, a race of sorts to beat professional cameras has been on. Over the years we have seen many smartphones marketed as having cameras as good as DSLRs, if not better. Yes, smartphones have overtaken stand-alone professional cameras in adoption and use, but they have never really matched the quality of DSLRs. While not claiming to trump DSLRs in all respects, Apple has launched a new feature for the dual-camera-toting iPhone 7 Plus, one that it believes brings one of the most sought-after features of DSLRs to the iPhone.
Apple showcased what it calls “Portrait Mode” at the iPhone 7 launch event. For background, the mode uses the two cameras of the iPhone 7 Plus to create a depth map. The iPhone 7 Plus pairs the regular wide-angle lens from the iPhone 7 with a telephoto lens, which allows it to keep the subject in focus while blurring out the background to produce what is known as the bokeh effect. Portrait Mode is available only on the bigger iPhone 7 Plus (it needs the dual-camera setup) and has been rolled out in the recent iOS 10.1 update. It is still in beta and has its fair share of bugs.
Dual cameras with depth sensing are not new to smartphones; HTC introduced something similar in the One M8, launched in 2014. However, the implementation was a bit clunky: users would take a photo and later refocus it by selecting a layer of the image, and it was hit or miss; rarely did you get a good-quality photo. Others have tried it too, but what sets the iPhone 7 Plus apart is that everything happens automatically, without the user having to make any selection.
Portrait Mode sits between the “Photo” and “Square” modes in the camera UI. All you have to do is ensure the distance between you and the subject is just right, so that the depth effect indicator glows yellow. Once it does, you can see the bokeh effect directly on the display in real time, and you can adjust the angle or distance to get exactly the effect you want. The trick is to keep ample distance between your subject and the background for the best background blur.
The camera does everything else — you don’t have to focus manually or select a layer afterwards. Apple says it uses artificial intelligence and machine learning to recognise the subject in the frame and create the cutout and effect in real time. This is made possible by the dedicated image signal processor, which scans the scene and identifies the subjects in the shot. It creates a depth map using the dual cameras and keeps those subjects in focus.
The mode works in most normal conditions and gives fantastic results, provided there is ample light and a subject that does not move much. I was able to get plenty of stunning shots that I never thought possible straight off a phone’s camera. Check out some of the sample shots.
That is not to say everything is perfect. Because it is AI-driven, the camera struggles with glass and other shiny surfaces, which tend to get blurred out of the photo. When used indoors, there also seems to be more noise in Portrait Mode photos than you would get in the regular Photo mode, and it does not work even in moderately low light. That said, Portrait Mode is still in beta, and we can only expect it to get better over time.
I have been using Portrait Mode for a few weeks now (thanks to the iOS 10.1 Public Beta), and it has added a new dimension to the photos I take. Everyone I have photographed so far has loved how they pop out of the frame, with the focus on them and not on the distraction that most backdrops are. Even with its current flaws, it is by far the best implementation of depth sensing in a camera phone. If you are on the fence about whether to go for the iPhone 7 or the iPhone 7 Plus, Portrait Mode alone is reason enough to pick the larger iPhone.