This Google-developed chip was first enabled when the Pixel 2 and Pixel 2 XL were updated to Android 8.1, but at the time Google said only that it would add HDR+ computing power when taking pictures for better camera performance; the specifics of how to use it were never described in detail.
Only now, with the February system patches rolling out to Pixel and Nexus phones, has the Pixel Visual Core been truly activated.
Google says that through the Pixel Visual Core, applications can make better use of the built-in camera API and directly take better photos. The applications in question are not the Android system's built-in camera app but third-party ones, or more precisely, any application with permission to access the camera.
Google's documentation names three third-party apps that currently support the Pixel Visual Core: Instagram, Snapchat, and WhatsApp, though none of the three is usable for us.
But that does not mean only these three applications can call the Pixel Visual Core. Since both the Google Camera API and the Pixel Visual Core are open source, in theory any developer can have their application call the Pixel Visual Core to directly shoot better photos.
Of course, Google once again spelled out the improvements the Pixel Visual Core brings to photography:
The Pixel Visual Core is designed to handle heavy image processing at very low power consumption, saving battery and, most importantly, using this additional computing power to run HDR+ for better picture quality.
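The central idea behind HDR+-style processing, as Google has described it publicly, is merging a burst of frames: averaging several aligned exposures suppresses noise without the cost of one long exposure. Below is a minimal conceptual sketch of that merge step on synthetic data (frame alignment is omitted, and all names here are illustrative, not Google's actual pipeline):

```python
import numpy as np

def merge_burst(frames):
    """Average a burst of aligned frames; noise shrinks roughly as 1/sqrt(N)."""
    return np.mean(np.stack(frames, axis=0), axis=0)

rng = np.random.default_rng(0)
scene = np.full((64, 64), 0.4)                          # "true" scene luminance
burst = [scene + rng.normal(0, 0.1, scene.shape)        # 8 noisy captures
         for _ in range(8)]

single_noise = np.std(burst[0] - scene)
merged_noise = np.std(merge_burst(burst) - scene)
print(f"single-frame noise: {single_noise:.3f}, merged: {merged_noise:.3f}")
```

The merged frame keeps the same brightness as any single capture but with far less noise, which is why the heavy lifting happens after the shutter click rather than in the optics.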
In addition, Google provided a few sample photos to demonstrate the effect of shooting with the Pixel Visual Core activated.
The first set of photos shows how a shot that lacks light and loses detail can be improved: rather than simply boosting brightness and smearing the noise, the processing raises brightness while recovering more detail, and the difference between the two photos is obvious.
Another set of photos shows the Pixel Visual Core handling underexposure and uneven lighting: the camera sits in a side-backlit scene, which leaves part of the house underexposed and the setting sun overexposed. With the Pixel Visual Core, the brightness and detail of the house, the foliage on the left, and the garden are all improved, the sunset light is more pleasing to the eye, and the details of the backlit tree are brought out as well.
In addition, Google said that with the Pixel Visual Core activated, the Pixel 2's digital zoom is also improved: zoomed photos come out sharper and retain more detail. After all, however good the Pixel 2's camera is, it still lacks a telephoto lens.
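Digital zoom, unlike optical zoom, is just cropping the sensor's central region and upsampling it back to full size, so all of the perceived sharpness comes from how cleverly that upsampling is done. A bare-bones sketch using the crudest possible upsampler (nearest-neighbour repetition; real pipelines use far smarter interpolation, which is where the extra detail Google claims would come from):

```python
import numpy as np

def digital_zoom(img, factor=2):
    """Crop the central 1/factor region, then upsample back to full size.
    np.repeat is nearest-neighbour upscaling, the simplest possible choice."""
    h, w = img.shape
    ch, cw = h // factor, w // factor
    top, left = (h - ch) // 2, (w - cw) // 2
    crop = img[top:top + ch, left:left + cw]
    return np.repeat(np.repeat(crop, factor, axis=0), factor, axis=1)

img = np.arange(64, dtype=float).reshape(8, 8)
zoomed = digital_zoom(img)
print(zoomed.shape)  # -> (8, 8): same pixel count, but only the central scene
```

Because no new scene information enters the frame, better interpolation (and multi-frame detail recovery) is the only lever a phone without a telephoto lens has.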
The benefit of third-party applications calling the Pixel Visual Core is that Pixel 2 users no longer need to take a photo with the camera app first and then open another app to upload it; they can shoot equally good photos directly inside the app.
Notably, the Pixel Visual Core is only used by third-party applications; it is not called by the Pixel's own camera app, whose native HDR+ tuning is already good enough. What third-party apps previously could not use directly, part of the Pixel Visual Core's role is to hand over to them.
And although some people have modified the Pixel 2's native camera app so that other phones can enjoy Google's HDR+, the best results are still achieved with the Pixel 2 itself.
Faced with the Pixel, though, I can only say it really is a case of "buy the system, get the phone for free"... right?