A Closer Look at the Cameras on the Pixel 6 and Pixel 6 Pro


After numerous leaks, Google has finally unveiled the Pixel 6 and Pixel 6 Pro. So what are their cameras capable of?

The camera is one of the biggest changes in this Pixel duo, which is notable because every generation of Pixel has been regarded as one of the best camera phones of its time.


Yet Google had used the same main camera sensor, the 12.2-megapixel Sony IMX363, across the previous four generations of Pixel, from the Pixel 3 through the Pixel 5 and Pixel 5A.



On the Pixel 6 and Pixel 6 Pro, Google has finally replaced that sensor. The new 50-megapixel main camera not only offers higher resolution, but also a larger sensor and a wider aperture.


The sensor has a 1/1.31-inch format, paired with an f/1.85 aperture. Google claims this combination can 'capture' 150% more light than the Pixel 5's main camera.
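A rough sanity check of that claim is possible with a simple model: light gathered scales with sensor area divided by the square of the f-number. The figures below assume the Pixel 5's Sony IMX363 is a 1/2.55-inch format at f/1.7; note that optical "inch" formats are nominal, so this naive estimate will not match Google's marketing number exactly.

```python
# Back-of-envelope comparison of light gathering between the Pixel 5's
# main camera (assumed 1/2.55" format, f/1.7) and the Pixel 6's
# (1/1.31" format, f/1.85). Light gathered scales roughly with
# sensor area / f-number^2. This is an illustrative model only.

def relative_light(format_old, fnum_old, format_new, fnum_new):
    # Sensor linear size scales with the optical format's diagonal,
    # so area scales with the square of the format ratio.
    area_ratio = (format_new / format_old) ** 2
    aperture_ratio = (fnum_old / fnum_new) ** 2
    return area_ratio * aperture_ratio

gain = relative_light(1 / 2.55, 1.7, 1 / 1.31, 1.85)
print(f"~{gain:.1f}x the light of the Pixel 5's main camera")
```

This crude model lands at roughly 3x, in the same general ballpark as Google's 2.5x (i.e. 150% more); the difference comes down to the nominal format sizes and real-world lens and sensor efficiency, which the model ignores.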





The ultrawide camera uses a 12 MP sensor, and the Pixel 6 Pro adds a third camera: a 48 MP telephoto with 4x optical zoom. Up front, the Pixel 6 has an 8 MP camera with an 84-degree field of view, while the Pixel 6 Pro gets an 11.1 MP front camera with a 94-degree field of view.


The updates are not limited to hardware. Pixel is known as one of the pioneers of computational photography, and the Pixel 6 and Pixel 6 Pro bring several new 'weapons' on the software side.


One of them is an update to Portrait Mode that promises to reproduce a more diverse range of human skin tones. Google calls this feature Real Tone.


To develop this feature, Google worked with a number of photographers and videographers to photograph and film people, building a database covering a more diverse range of human skin tones.


Google is also implementing a new algorithm to reduce uneven lighting, which can make skin tone appear different across parts of the face.


Then there is Face Unblur, which promises to keep faces sharp and free of blur in photos, even when shooting a moving subject. It works by capturing multiple images simultaneously each time the shutter button is pressed.


The process starts before the photo is taken: when the system detects a blurry face in the viewfinder, it sets up a second camera to shoot at a faster shutter speed. The two photos are then combined to produce a sharper image of the face.
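The frame-selection part of that idea can be sketched with a standard sharpness metric: the variance of a discrete Laplacian, which scores how much edge detail an image contains. This is an illustrative reconstruction, not Google's actual pipeline, and the function names are hypothetical.

```python
import numpy as np

def laplacian_variance(gray):
    """Sharpness score: variance of a discrete Laplacian over a
    grayscale image. Higher values mean more edge detail (sharper)."""
    lap = (-4 * gray[1:-1, 1:-1]
           + gray[:-2, 1:-1] + gray[2:, 1:-1]
           + gray[1:-1, :-2] + gray[1:-1, 2:])
    return lap.var()

def pick_sharper_face(crop_main, crop_second):
    """Keep whichever camera's face crop scored sharper."""
    if laplacian_variance(crop_second) > laplacian_variance(crop_main):
        return crop_second
    return crop_main

# Demo: a "blurry" crop (smooth gradient, no edges) versus a
# "sharp" crop (checkerboard, maximal edge detail).
blurry = np.tile(np.linspace(0.0, 1.0, 64), (64, 1))
sharp = np.indices((64, 64)).sum(axis=0) % 2.0
chosen = pick_sharper_face(blurry, sharp)
assert chosen is sharp
```

In a real system the face crops would come from a face detector, and the blend would merge only the face region back into the main frame rather than swapping whole images.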



Another interesting feature is Motion Mode, which is designed to capture long-exposure-style photos for creative effects, such as photographing a person with moving objects in the background.



It works by using machine learning to detect subjects in the frame. The camera then records with multiple shutter speeds, using a faster shutter for the person and a slower shutter for the background, and combines the results into a single photo.



Photos like this are usually taken with the help of a tripod to avoid blur, but according to Google, the Pixel 6 and Pixel 6 Pro can achieve the effect while simply shooting handheld.
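The compositing step described above can be sketched in a few lines: average a burst of frames to simulate a slow shutter for the background, then paste the subject back in from a single sharp frame using a segmentation mask. This is a minimal illustration of the technique, not Google's implementation; the mask here is hand-made, whereas the real pipeline derives it with machine learning.

```python
import numpy as np

def motion_mode_composite(frames, subject_mask):
    """Illustrative Motion Mode-style composite: blend a sharp subject
    from one frame over a motion-blurred average of all frames."""
    frames = np.asarray(frames, dtype=float)
    long_exposure = frames.mean(axis=0)   # simulated slow-shutter background
    sharp_frame = frames[0]               # short-exposure subject frame
    mask = subject_mask.astype(float)     # 1.0 where the subject is
    return mask * sharp_frame + (1.0 - mask) * long_exposure

# Demo on tiny 4x4 "images": a bright pixel moves left to right across
# three frames (background motion), while a static subject pixel sits
# at the top-left corner.
frames = np.zeros((3, 4, 4))
for t in range(3):
    frames[t, 2, t] = 1.0   # moving background object
    frames[t, 0, 0] = 1.0   # static subject
mask = np.zeros((4, 4))
mask[0, 0] = 1.0            # hand-made subject mask for this demo
out = motion_mode_composite(frames, mask)
print(out[0, 0])   # subject stays fully sharp: 1.0
print(out[2, 0])   # background motion is smeared across frames
```

The averaging smears the moving pixel's brightness across its path, which is exactly the streaking effect a long exposure produces, while the masked subject keeps the full contrast of a single fast frame.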


There are also other interesting features, such as Magic Eraser, which removes unwanted objects from the background, and Quick Tap to Snap, a collaboration with Snap that lets Pixel 6 users quickly launch the Snapchat camera.
