Announced alongside the Pixel 3 devices in October, Night Sight is now rolling out to Pixel phone cameras (going all the way back to the original 2016 handsets). The mode is designed to take night-time shots to the next level, and based on our testing, it does exactly that.
That this comes as a software rather than a hardware update is evidence of how mobile photography is changing. For the last couple of years – since the first Pixel phones, in fact – we've seen a greater emphasis on image processing than on the actual hardware specs of the cameras themselves.
We took a trip out at night to test the Pixel 2 camera with and without Night Sight. We also took comparison shots with the iPhone XS for good measure. Check out the gallery for all the snaps.
Night Sight works by combining several frames into one, adapting to the amount of movement in the scene and the level of hand shake. Not only that, it can use machine learning – automatic decisions based on training over millions of other images – to adjust colors to match their natural hues even when there is very little light available.
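Google hasn't published Night Sight's decision logic as code, but the gist of that motion-based adaptation can be pictured with a simple, entirely hypothetical heuristic in Python: the shakier the scene, the shorter each individual exposure needs to be, and so the more frames get captured within the same overall time budget. The function name and all the numbers below are illustrative, not Google's.

```python
def plan_burst(motion_score, total_budget_ms=6000, max_frames=15):
    """Hypothetical sketch: pick frame count and per-frame exposure from motion.

    motion_score: 0.0 (tripod-still scene) up to 1.0 (lots of hand shake or
    subject movement). More motion means shorter individual exposures to
    avoid blur, and therefore more frames within the same time budget.
    """
    # Shortest exposure ~66 ms (about 1/15 s); up to ~1 s when perfectly still.
    per_frame_ms = max(66, int(1000 * (1.0 - motion_score)))
    frames = min(max_frames, max(1, total_budget_ms // per_frame_ms))
    return frames, per_frame_ms

print(plan_burst(0.1))  # nearly still: a few long exposures
print(plan_burst(0.9))  # handheld and shaky: many short exposures
```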
Even with this fancy technology behind it, Night Sight can't work miracles. What it can do is use the little bits of light and color in a scene to figure out how it should look if it were well lit – all thanks to combining several long exposures (to maximize light intake) and applying some smart image algorithms on top (which is where Google's AI engines come in).
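The noise-reduction half of that idea is the easier part to sketch in code. A minimal illustration (assuming the frames have already been aligned, which the real pipeline also has to handle) is simply to average the burst: the scene signal stays put while the per-frame sensor noise, being roughly independent, shrinks by about the square root of the number of frames. This uses numpy and simulated data rather than anything from Google's pipeline.

```python
import numpy as np

def merge_frames(frames):
    """Average an aligned burst of frames to suppress noise (illustrative only)."""
    stack = np.stack([f.astype(np.float32) for f in frames])
    return np.clip(stack.mean(axis=0), 0, 255).astype(np.uint8)

# Simulated burst: the same dim, flat scene captured 8 times with sensor noise.
rng = np.random.default_rng(0)
scene = np.full((120, 160), 40.0, dtype=np.float32)
burst = [scene + rng.normal(0, 15, scene.shape) for _ in range(8)]

single = np.clip(burst[0], 0, 255).astype(np.uint8)
merged = merge_frames(burst)
print("noise (single frame):", round(float(single.std()), 1))   # roughly 15
print("noise (8-frame merge):", round(float(merged.std()), 1))  # roughly 15 / sqrt(8)
```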
When these algorithms have gone about their business, what you're left with are photos that are more brightly lit, have less noise in them, and exhibit more natural colors. Not bad for a software update.
The underlying idea of taking several exposures, and then intelligently analyzing and combining them, is one that other phone makers are busy adopting too. The latest 2018 iPhones, for example, use Apple's own algorithms and the on-board Neural Engine to smartly combine several exposures together when necessary.
Then there's the OnePlus 6T, which also uses image processing algorithms to better adjust to low light situations – again, the exposure time is intelligently adjusted on the fly. Based on the test shots we've seen since these phones launched, though, it looks like Google is still out in front when it comes to enhancing photos as if by magic.
The results can really be stupendous, but as we've said, this isn't going to transform your night photos every single time. If the camera and the subjects are still, though, and there's at least a little light available, dark and noisy photos can come out looking much improved.
At the same time, photos that mix bright and dark areas don't see much of a difference. Another issue is that certain shots come across as if they weren't taken at night at all – Google's algorithms are that good – though most users would probably rather have a picture where the night-time atmosphere has gone than one where you can't see anything at all.
Night Sight doesn't change the fact that the iPhones and the Galaxy phones take very good low light photos themselves – but there's a certain level of light, right around the threshold of what the human eye can actually see, where Night Sight can do wonders ... and that keeps the Pixel cameras ahead of the pack.
There's more to come here from Google and all of its competitors as camera sensors and AI algorithms get more and more advanced. Remember, though, that the next time you compare phone cameras, you need to take both the hardware and the software sides of the equation into consideration.
Source: Google