- Google has explained how its Live HDR feature works.
- The feature delivers HDR in the viewfinder of the Pixel camera app.
Google debuted its Live HDR and Dual Exposure Controls features on the Pixel 4 last year, and it's since brought them to the Pixel 4a. Now, the company has given us a breakdown of how the former feature works on its AI blog.
The vast majority of smartphones show you a scene in the camera viewfinder that isn't quite representative of an HDR or HDR+ photo. In other words, what you see in the viewfinder and the final photo might be wildly different.
That's where Google's Live HDR comes in, giving the viewfinder the HDR treatment as well so you have a better idea of what to expect before hitting the shutter. Google says running its HDR+ algorithm in the viewfinder would be way too slow, so it had to use another approach.
Google says it essentially divides the input image into a number of smaller tiles and then conducts HDR+ on a per-tile basis.
"Compared to HDR+, this algorithm is particularly well suited for GPUs. Since the tone mapping of each pixel can be computed independently, the algorithm can also be parallelized," the firm notes. It then uses a machine learning algorithm called HDRNet for better, more accurate results. The two results end up very similar, as the comparison below shows.
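To make the idea concrete, here's a minimal sketch of per-tile tone mapping in Python. This is not Google's actual HDRNet pipeline; the tile size and the simple gamma curve are assumptions chosen purely for illustration. The point is just that each tile gets its own tone curve and each pixel is then mapped independently, which is what makes this kind of approach easy to parallelize on a GPU.

```python
import numpy as np

def tile_tone_map(image, tile_size=64):
    """Toy per-tile tone mapping sketch (illustrative, not Google's code).

    image: float32 luminance channel in [0, 1], shape (H, W).
    Each tile gets its own tone curve (here a gamma picked from the tile's
    mean brightness, standing in for the per-tile curves HDR+ would produce);
    every pixel is then mapped independently of all others.
    """
    h, w = image.shape
    out = np.empty_like(image)
    for y in range(0, h, tile_size):
        for x in range(0, w, tile_size):
            tile = image[y:y + tile_size, x:x + tile_size]
            # Brighten dark tiles more aggressively than bright ones.
            gamma = np.interp(tile.mean(), [0.0, 1.0], [0.45, 1.0])
            out[y:y + tile_size, x:x + tile_size] = np.power(tile, gamma)
    return out

# Example: a synthetic gradient frame standing in for viewfinder luminance.
frame = np.tile(np.linspace(0.05, 0.95, 640, dtype=np.float32), (480, 1))
preview = tile_tone_map(frame)
```

In practice the per-pixel independence means each tile (or even each pixel) can be handled by its own GPU thread, which is why the blog post calls the algorithm particularly well suited to GPUs.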
It's a pretty interesting way of delivering a better camera experience, as users now have a good idea of what to expect before they hit the camera shutter key. Hopefully, more Android OEMs will implement similar functionality in their camera apps.
Google's explanation also comes after a teardown of the latest Google Camera app yielded evidence of potentially upcoming features. More specifically, it looks like the firm is working on audio zoom functionality, flash intensity adjustments, and a "motion blur" mode.