Andy Rubin unveiled the Essential Phone towards the end of May, and following the current flagship trend, the smartphone comes with dual rear cameras. One of the sensors is a standard RGB sensor, while the other is a monochrome sensor; the two work together to create the final image. The Essential Phone was supposed to ship within 30 days of launch, but there was no word from the company. June went by and so has July (well, almost), and the Essential Phone is still not available. The company has now come forward and attributed the delay to refinements to the camera.
After Andy Rubin confirmed that the Essential Phone will ship very soon, the company posted a detailed blog explaining how the camera works and what the key elements of the system are. The camera does not rely on a single sensor for photos; the RGB and monochrome sensors work together to produce better, sharper images. The technology, the company claims, also improves low-light photography. The technology used in the Essential Phone is not new, though; the likes of the Honor 8 and the Moto Z2 Force use similar tech.
The camera technology used in the Essential Phone is detailed by Yazhu Ling, the image quality engineer working on improving the end results. Some of the images she shared on the blog are impressive and show dramatically improved low-light results. When an image is captured using the color camera alone, each pixel must be assigned a color value; where no exact value is available, the result can be blur or loss of true color. The monochrome camera assigns each pixel a true black-and-white value, so when the two images are merged, every pixel has a definite value and the final image looks better. The blog post explains the technology in detail, along with sample images: "When taking a still picture, Essential Phone activates both cameras at once. The monochrome and color images are then fused to create a final photograph with rich, deep clarity."
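To make the idea concrete, here is a minimal sketch of how a monochrome capture can sharpen a color photo. This is a simplified illustration only, not Essential's actual fusion pipeline: it blends the mono sensor's cleaner luminance into the color image while preserving the original color ratios. The function name and blend weight are our own assumptions.

```python
import numpy as np

def fuse_color_mono(color_rgb, mono, mono_weight=0.5):
    """Blend a monochrome capture into an RGB image's luminance.

    Simplified sketch of dual-sensor fusion (NOT Essential's pipeline):
    the mono sensor, having no color filter, captures more light per
    pixel, so its signal is mixed into the color image's brightness.
    """
    color_rgb = color_rgb.astype(np.float64)
    mono = mono.astype(np.float64)
    # Approximate luminance of the color image (Rec. 601 weights).
    luma = (0.299 * color_rgb[..., 0]
            + 0.587 * color_rgb[..., 1]
            + 0.114 * color_rgb[..., 2])
    # Weighted blend of color-derived luminance and the mono capture.
    fused_luma = (1 - mono_weight) * luma + mono_weight * mono
    # Rescale every channel toward the fused luminance, keeping the
    # original chroma ratios intact.
    scale = fused_luma / np.maximum(luma, 1e-6)
    fused = np.clip(color_rgb * scale[..., None], 0, 255)
    return fused.astype(np.uint8)
```

A real pipeline would also align the two images (the lenses sit a few millimeters apart) and fuse per-region rather than globally, but the luminance-blend idea is the core of why the merged image holds up better in low light.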
The company claims that over 20,000 images were captured and tested in order to understand the exact clarity and result required. The post also mentions another tweak the company made to the Essential Phone: objective tuning. As she explains, "Objective tuning is meant to ensure that each camera module sent to production is operating at an acceptable baseline level. It began with picking the correct golden and limit samples from the factory." In simple terms, this means that each module's focus, exposure, white balance, lens shading correction, and more are calibrated against a reference so that the image signal processor (ISP) delivers consistent results from unit to unit.
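The "golden and limit samples" idea can be sketched as a simple production check: a golden reference unit defines the target value for each image-quality metric, and limit samples define how far a passing module may deviate. The metric names and thresholds below are made up for illustration; they are not Essential's actual test criteria.

```python
# Hypothetical baseline check for camera modules on a production line.
# GOLDEN holds the reference unit's measured values; LIMITS holds the
# maximum allowed deviation for each metric (all values invented).
GOLDEN = {"sharpness": 1.00, "color_error": 0.02, "lens_shading": 0.05}
LIMITS = {"sharpness": 0.15, "color_error": 0.03, "lens_shading": 0.04}

def module_passes(measured):
    """Return True if every metric is within its limit of the golden value."""
    return all(
        abs(measured[name] - GOLDEN[name]) <= LIMITS[name]
        for name in GOLDEN
    )
```

A module whose sharpness has drifted too far from the golden sample would fail this check and never ship, which is what "operating at an acceptable baseline level" amounts to in practice.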
This tuning process lasted three months, and the camera was tested not just in a lab but also in real-life conditions until the final tuning was achieved and the images looked satisfactory. As she says on the blog, "Our subjective tuning process began in January 2017, and during that time, we have gone through 15 major tuning iterations, along with countless smaller tuning patches and bug fixes. We have captured and reviewed more than 20,000 pictures and videos, and are adding more of them to our database every day. We're almost there, but I'm not going to stop tuning the camera on our phone until the last possible minute to provide the best photographic experience possible."