The Google Pixel 3 and Pixel 3 XL were launched early this month in New York. Google started the event by showing videos of YouTubers trashing the leaked design, and then went ahead and revealed that same design. The best part: it neither offered any justification for the design nor suggested that it regretted it.
Google also did not speak about the specifications of its new devices, not even once. A few features, such as Call Screen, got some attention. In case you missed reading up on it, it's a new feature where Google Assistant takes calls on your behalf. While that feature spoke for itself in a loud way, another feature, called Night Sight, inside the camera spoke volumes, but silently.
With Night Sight, Google wants to make photography in extreme low light conditions a reality. At the launch event, Google compared low-light shots taken with the iPhone XS and Pixel 3. The difference between the two images was equivalent to that of “day and night.”
At BGR India, most of us were excited by the demo, but disappointed at the same time when Google confirmed that Night Sight won't make it to the final software until November. But we are talking Android, and you don't necessarily need to wait for Google's engineers to push the update. A modded version of the Google Camera app with Night Sight was recently posted by XDA Developers user cstark27, and I've been testing it on a Google Pixel 3 XL. The results are surprising, to say the least. Before we get started with the feature, I'd like you to take a look at two pictures shot indoors with minimal natural light.
Editor's Note: The picture on the left is clicked with default settings. The image on the right is clicked with Night Sight mode enabled.
As you can see in the first picture, the only visible thing is the back of the white Pixel 3 XL with which I am trying to take the picture. With Night Sight enabled, you can see that I am trying to capture my Huawei P20 Pro placed on top of a laptop keyboard. Though Night Sight mode has over-sharpened the image to brighten it, and lost valuable detail in the process, it is immediately clear that the software here is nothing short of magic.
The second picture is an even better representation of what Night Sight mode can do. The first picture, as one would expect, captures the screen well, though there is a bit of an orange tone to it, and it focuses mainly on preserving details in my laptop's background image. Night Sight mode lowers the ISO to 55 and uses a half-second exposure to produce a more soothing image. This will easily pass any camera test on Instagram.
How does Night Sight work?
For most of the past 12 months, I've been using a Pixel 2 as my primary camera. I've come to understand that its camera adapts to you and your use patterns rather than the other way around. With the Pixel 2 and Pixel 2 XL, Google impressed us with its brilliant portraits. This time around, Google seems to have set its attention on low-light photography.
This year's Pixel 3, as well as last year's Pixel 2, uses dual pixel autofocus to instantly lock on to the subject or scene. Both also use a memory buffer for zero shutter lag: when you press the shutter button, the camera captures the image instantaneously. Night Sight uses this same technology along with Google's HDR+ burst mode to create a long-exposure image. How it produces that long exposure is where the real magic lies.
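The zero-shutter-lag buffer described above can be sketched as a simple ring buffer: the camera continuously writes frames into a fixed-size buffer, so a shutter press can hand over frames that were captured just before the press. This is a simplified illustration of the concept only, not the actual camera pipeline (real camera HALs manage raw sensor frames in native memory); the class and method names here are hypothetical.

```python
from collections import deque

class FrameBuffer:
    """Sketch of a zero-shutter-lag ring buffer.

    The sensor keeps pushing frames in; the buffer silently drops
    the oldest one once it is full. When the shutter is pressed,
    the most recent frames are already available, so there is no
    capture delay.
    """
    def __init__(self, capacity=15):
        self.frames = deque(maxlen=capacity)

    def on_new_frame(self, frame):
        # Oldest frame is dropped automatically when capacity is reached.
        self.frames.append(frame)

    def on_shutter_press(self):
        # Hand the whole burst over for merging (e.g. HDR+ style).
        return list(self.frames)

buf = FrameBuffer(capacity=3)
for t in range(5):              # frames 0..4 arrive from the sensor
    buf.on_new_frame(f"frame-{t}")
burst = buf.on_shutter_press()
print(burst)                    # ['frame-2', 'frame-3', 'frame-4']
```

The key design point is that capture happens continuously, before the user acts; pressing the shutter only selects frames that already exist.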
For starters, Pixel users do not need to bring in a tripod for these long exposures. It can be done by holding the phone still in your hand, and Google will use scene recognition to adjust for any hand shake. To activate Night Sight, you will need to go to Settings inside the camera app and tap on Night, which replaces the previously seen Lens option. Now, when you press the shutter, Google will selectively expose the image, described on the viewfinder as "collecting light," and the Pixel 3 can combine up to 15 frames to produce an image equivalent to a 5-second long exposure.
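The frame-combining idea can be illustrated with a minimal sketch: averaging many short, noisy exposures of the same scene gathers the same light as one long exposure while dramatically reducing random sensor noise. This is only a toy model of the principle, assuming already-aligned frames of a static scene; Night Sight's real merge also aligns frames and handles motion.

```python
import numpy as np

def stack_frames(frames):
    """Average a burst of short-exposure frames to simulate one long exposure.

    Averaging N frames reduces zero-mean sensor noise by roughly sqrt(N),
    which is the core idea behind burst-based low-light photography.
    (Assumes the frames are already aligned; real pipelines must also
    register frames and reject motion.)
    """
    stack = np.stack([f.astype(np.float64) for f in frames])
    return stack.mean(axis=0)

# Simulate a dim, static scene captured 15 times with heavy sensor noise.
rng = np.random.default_rng(42)
scene = np.full((64, 64), 20.0)                       # true dim signal
frames = [scene + rng.normal(0, 10, scene.shape) for _ in range(15)]

merged = stack_frames(frames)
single_noise = np.std(frames[0] - scene)
merged_noise = np.std(merged - scene)
print(f"single-frame noise: {single_noise:.2f}, merged noise: {merged_noise:.2f}")
```

With 15 frames, the residual noise drops by roughly a factor of sqrt(15), close to 4x, which is why a handheld burst can stand in for a tripod-mounted long exposure.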
Once the image is captured, Google further enhances it using its machine learning algorithms and adjusts the white balance to retain details and highlights in the image. In my experience, I found that it does remarkably well for a software feature, but there are times when the pictures look artificial.
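To give a sense of what "adjusting white balance" means in practice, here is a classical gray-world baseline: each color channel is scaled so its average matches the overall average, which removes a uniform color cast such as the orange tone mentioned earlier. To be clear, this is not Google's method; Night Sight reportedly uses a learned white-balance model, and this sketch only illustrates the general operation.

```python
import numpy as np

def gray_world_awb(img):
    """Classical gray-world auto white balance.

    Scales each RGB channel so its mean equals the image's overall mean,
    neutralizing a uniform color cast. A simple baseline only; learned
    AWB models (as in Night Sight) are far more scene-aware.
    """
    img = img.astype(np.float64)
    channel_means = img.reshape(-1, 3).mean(axis=0)
    gain = channel_means.mean() / channel_means   # per-channel correction
    return np.clip(img * gain, 0, 255)

# A scene with an orange cast: red boosted, blue suppressed.
rng = np.random.default_rng(0)
neutral = rng.uniform(60, 180, (32, 32, 3))
tinted = neutral * np.array([1.3, 1.0, 0.7])

balanced = gray_world_awb(tinted)
means = balanced.reshape(-1, 3).mean(axis=0)
print("channel means after AWB:", means.round(1))
```

After correction the three channel means coincide, i.e. the cast is gone; the trade-off is that gray-world fails when the scene legitimately is dominated by one color, which is one reason learned models do better.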
Google is not the first to try software-based night mode
Huawei was, in fact, the first to introduce a night mode with multiple exposures for highlighting different parts of a scene, with its P20 Pro early this year. But Google gains where Huawei lacks: data. With access to abundant pools of data in the form of search results, maps, street view and satellite images, Google knows more about a scene than anyone else. With Night Sight, it's bringing that advantage to the forefront.
At the outset, the difference between the normal mode and Night Sight isn't exactly as stark as day and night. Night Sight tends to produce brighter images and manages to keep noise at an acceptable level. Most pictures coming out of Night Sight mode can be described as moderately noisy images with good background light, and that is, for a smartphone, nothing short of an achievement.
What you see above is a scene of Mumbai after sunset. There is a lot of street light to brighten up the environment, but what Night Sight did to this scene is plainly visible. The first image is basically what you would describe as passable, and the second image shows how a long exposure can bring the spot from where I am clicking the image into play. While it seems to have blown out most of the other areas, there is still an ample amount of detail captured here.
This second example is another scenario where Night Sight manages to pull more detail out of the buildings and the highlights than is possible in the standard mode. There is aggressive software processing at play, which leads to blurry images at a 100 percent crop, but the result is still something most other flagships cannot manage.
The four images above were shot outdoors with guide lights on the pavement as the only source of light. Here, the Pixel 3 XL's camera not only manages to pull in more light but also succeeds in highlighting details that were lost in the normal mode. If you look at the third sample image in particular, you can see how much data Google can process, as it reveals not only the green light with grass around it but also the under-construction area at the top. There is visible noise here too, but we are talking about a smartphone camera, and it gets the benefit of the doubt.
I have been using the Night Sight mode for a few days now, and a friend, who happens to be a photographer, asked me how it fares with moving subjects. So, to test that scenario, I went to a railway station, and the image above is a representation of how Google does long exposure. As you can see, this is not one single timed exposure but rather an exposure built frame by frame. Hence, the building shines with lots of light while the trains are blurred, conveying their motion without distorting it. I would say this is my favorite of all the Night Sight images that I have clicked.
At this point, it's important to state that this is early software, and hence, not perfect. The picture above of the building shows how a smartphone camera can still struggle in low light. It has done remarkably well as far as exposure is concerned, but you cannot read the text at the top. The Huawei P20 Pro with its Night Mode also could not make the text readable, but it produced a much sharper image with a striking balance between the background and foreground.
Since this is a modded version of Google's upcoming software, it would be premature to pass judgement. However, from what I have seen so far, it definitely seems promising, and if there is one thing I've learnt about Pixel cameras over the past year, it's how these features continue to astound you over time. Google did it with Portrait Mode last year. This year, it seems determined to make a mark with Night Sight. The performance of this feature will likely only get better until something new comes along with the launch of the Pixel 4 in 2019.