Phone cameras still can't match dedicated snappers for quality, but they are rapidly improving. The current Google Pixel has been widely lauded as one of the best phone cameras on the market, and previous efforts like the Nexus 6P also pushed the game forward. Now a software engineer at Google is looking to improve low-light performance in Android phones with an experimental app and a post-production process.
Having taken a gorgeous photo (published below) of the San Francisco skyline at night with a professional DSLR camera, researcher Florian Kainz was challenged by members of his team at Google to replicate it using a smartphone camera. According to Kainz, even the most sophisticated HDR+ systems – which take a burst of short exposures and combine them into one image – struggle in very low light, putting late-night landscapes beyond the capabilities of most phones.
Google isn't the first to try to make smartphone cameras work better in almost complete darkness. Along with engineers at rivals like Apple and Samsung, who are trying to one-up each other with their built-in camera apps, third-party creations like SeeInTheDark aim to let smartphone users take images lit only by the moon. Although usable, the images from such apps are generally far too grainy to really wow like a DSLR photo.
The way to take quality low-light images on a DSLR is with a tripod for stable long exposures. Although the Google Pixel supports exposures of up to two seconds, and the now-discontinued Nexus 6P can handle four-second exposures, focus is tricky because contrast-detection and phase-detection autofocus are largely useless in blackout conditions.
To work around these issues, Kainz developed an experimental app for the Pixel that takes a burst of up to 64 images, which are saved as DNG files for processing on a computer later on. To test the app, he headed to the Californian coast on a bright, full-moon night and pointed the phone at a lighthouse. Having shot 32 four-second exposures at ISO 1600, Kainz then took an additional burst of 32 black frames by covering the lens with opaque adhesive tape.
Although the individual frames were grainy, stacking them in Photoshop averaged out the noise, while the 32 black frames were used to subtract the faint, grid-shaped pattern caused by variations in the sensor's black level. The result (published below) is very impressive, especially when you compare it to an image of the same scene taken using HDR+.
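For the curious, here is a minimal sketch of what that stacking and dark-frame subtraction might look like once the DNG files are on a computer, written in Python with the rawpy and numpy libraries. The file names are hypothetical, and the article says Kainz worked in Photoshop, so treat this as an illustrative reconstruction of the technique rather than his actual pipeline.

```python
# Sketch: average 32 light frames, subtract the averaged dark frames.
# File names are hypothetical; rawpy/numpy stand in for Photoshop here.
import glob
import numpy as np
import rawpy

def average_raw(paths):
    """Load each DNG's raw sensor data and return the per-pixel mean."""
    acc = None
    for path in paths:
        with rawpy.imread(path) as raw:
            frame = raw.raw_image.astype(np.float64)
            acc = frame if acc is None else acc + frame
    return acc / len(paths)

# 32 four-second exposures of the scene, 32 shot with the lens taped over.
light = average_raw(sorted(glob.glob("light_*.dng")))  # hypothetical names
dark = average_raw(sorted(glob.glob("dark_*.dng")))

# Averaging the light frames suppresses random sensor noise; subtracting
# the averaged dark frames removes the fixed, grid-shaped black-level
# pattern described above.
stacked = np.clip(light - dark, 0, None)
np.save("stacked_raw.npy", stacked)  # hand off for demosaicing and grading
```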
Of course, this process isn't practical for most users. It involves a lot of post-production and a better-than-average knowledge of Photoshop, not to mention that the app is still experimental. But the process, which also worked in environments with little or no natural light, hints at a future where our smartphones can take passable photos at night.
Kainz believes his process could be handled by a single app in the future, although there's no word on when that app might show up on our phones.
Source: Google Research Blog