- These features will remain Pixel 6 exclusives (for now)
- Pixel 6: Sales start from 28 October
With the new Pixels, Google has also introduced its first in-house SoC. Google’s Tensor puts a strong focus on components that handle machine learning and AI processing directly on the device. So which of the new features make use of the new Tensor SoC and consequently won’t be coming to other smartphones?
Below, I’ll show you which features you’ll currently only find on the Pixel 6 and what exactly they do. In my headline, I wrote “for now” because Google is surely planning a Pixel 7, and also because Google told AndroidAuthority that it is currently trying to bring the features to older Pixel models. This would require further “technical investment”.
What exactly Google is alluding to, we can currently only speculate. Theoretically, the extra computing work handled by the Tensor SoC could be offloaded to the cloud on older devices. The images would then be processed not on the slower hardware of older models, but on Google’s servers. That’s still up in the air, though, so let’s focus on the here and now!
Let’s start with Google’s favorite marketing showpiece: the “Magic Eraser”, a feature Samsung users and Photoshop experts already know. Google offers the possibility to remove disturbing objects from pictures. You use it via the Google Photos app: simply draw a frame around the offending object, and the Tensor SoC takes care of the rest.
The challenge here is to reconstruct what was behind the subject when the picture was taken. In image-editing programs, you would usually use the clone stamp for this, which copies a different part of the image onto the area being edited. The process is tedious, and Google’s magic feature unfortunately leaves me with mixed feelings here. On monochrome backgrounds, the eraser is convincing, but on more complex scenes, the tool pretty much flops.
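To make the “reconstruct what was behind the subject” idea concrete, here is a deliberately tiny sketch of the simplest possible inpainting approach: repeatedly averaging each masked pixel with its neighbours so the hole is filled in from its surroundings. This is a toy illustration of the general technique, not Google’s actual method – Magic Eraser uses learned models on the Tensor SoC, not plain diffusion.

```python
import numpy as np

def inpaint_diffusion(image, mask, iterations=200):
    """Fill masked pixels by repeatedly averaging their four neighbours.

    A toy stand-in for what Magic-Eraser-style inpainting does:
    reconstruct the hole from its surroundings. Real systems use
    learned models rather than simple diffusion.
    """
    img = image.astype(float).copy()
    hole = mask.astype(bool)
    for _ in range(iterations):
        # Average of the four neighbours; np.roll wraps at the edges,
        # which is fine for this toy example since the hole is interior.
        avg = (np.roll(img, 1, 0) + np.roll(img, -1, 0) +
               np.roll(img, 1, 1) + np.roll(img, -1, 1)) / 4.0
        img[hole] = avg[hole]
    return img

# A flat grey "background" with a bright "object" to erase
scene = np.full((32, 32), 120.0)
scene[12:20, 12:20] = 255.0           # the disturbing object
mask = np.zeros((32, 32), dtype=bool)
mask[12:20, 12:20] = True             # the frame the user draws

restored = inpaint_diffusion(scene, mask)
```

On a uniform background this converges to the surrounding grey, which is exactly why the real feature also works best on monochrome backgrounds: the harder the texture behind the object, the less a fill-from-the-surroundings strategy can recover.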
Another of Google’s favorite advertising tricks: the “Face Unblur” feature. If you photograph a moving person and the face comes out blurred, Google lets the Tensor SoC work its magic. Algorithms are supposed to sharpen the face as much as possible, and in Google’s video, it looks really handy.
In practice, I couldn’t really tell when Google was applying the feature. I couldn’t find the hint in Google Photos that the colleagues at Pocket-lint describe. In any case, you won’t see the feature on older Pixel smartphones for now. In the meantime, I’ll let Ezequiel keep jumping up and down – until the neighbors ring the bell!
Ever seen a photographer pan left or right like an idiot with the camera in front of his eye? He was probably trying to create an image with perfect motion blur. According to the old school of photography, you set the shutter speed a little slower than usual and pan the camera to match the speed of the moving subject. The subject stays sharp while the background blurs.
With the new Pixels, you no longer have to pull off that trick yourself. Exclusively on the new devices, there’s a “Motion” tab in the camera app that lets you choose between long exposure and motion blur. The results are really nice: in the example above, the tram is crisp while the background is nicely blurred.
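The panning trick described above can be simulated in a few lines: during the exposure, the camera follows the subject, so the subject stays in the same spot in the frame while the background slides past; averaging the intermediate frames reproduces the “sharp subject, streaked background” look. This is a toy numpy sketch of the effect, not Google’s actual Motion-mode pipeline.

```python
import numpy as np

# Toy simulation of a panning shot: across the exposure the subject
# stays put in the frame while the background drifts sideways.
H, W, STEPS = 24, 48, 8

# Striped "background" so the blur is visible as averaged stripes
background = np.tile(np.arange(W) % 12 < 6, (H, 1)).astype(float)

frames = []
for t in range(STEPS):
    frame = np.roll(background, t, axis=1)   # background drifts each step
    frame[10:14, 20:24] = 2.0                # subject pinned in place
    frames.append(frame)

# Averaging the frames is the simulated long exposure: the pinned
# subject stays perfectly sharp, the shifted stripes smear out.
exposure = np.mean(frames, axis=0)
```

In the result, every pixel inside the subject keeps its exact value, while background pixels end up as a mix of stripe and gap – the same sharp-tram, blurred-street effect the Motion tab computes for you.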
The next feature caused some confusion while writing this article, because a live translator is already available in the Gboard keyboard on other phones. What’s new on the Pixel 6, however, is the “Live translations” menu item. Here, the phone automatically translates foreign-language content without an Internet connection, as long as you download the required language pack. These are each about 250 megabytes in size.
Once installed and activated, here’s how it works: Camila sends me a WhatsApp message in Portuguese. Without me having to do anything, the message is displayed translated into German. I can type my reply in German, and Camila receives it in Portuguese. Unfortunately, the feature is currently limited to WhatsApp and Snapchat. If more integrations follow, I’m sure it will be very handy.
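The flow just described boils down to translating in both directions around the messaging app: incoming text is translated before it is shown, outgoing text before it is sent. The sketch below illustrates that flow with a hypothetical lookup table standing in for the downloaded language pack – Google’s real feature runs a neural model on the Tensor SoC, and all function names here are made up for illustration.

```python
# Hypothetical stand-in for a downloaded pt<->de language pack.
# The real pack is a ~250 MB neural model, not a phrase table.
OFFLINE_MODEL = {
    ("pt", "de"): {"Bom dia": "Guten Morgen"},
    ("de", "pt"): {"Guten Morgen": "Bom dia"},
}

def translate(text, src, dst):
    # Works entirely offline: everything needed is already on the device.
    return OFFLINE_MODEL[(src, dst)].get(text, text)

def on_message_received(text):
    # Incoming Portuguese is translated before it is displayed.
    return translate(text, "pt", "de")

def on_message_sent(text):
    # The German reply is translated before it leaves the phone.
    return translate(text, "de", "pt")
```

The point of the sketch is the placement of the two hooks: because translation happens at display time and at send time, neither chat partner has to do anything – which matches the hands-off experience described above.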
Google’s Pixel smartphones are known as camera phones – video recording, in my experience, has been a weak point. In three years with the Pixel 3, I eventually gave up on capturing good video with the phone. However, Google has now integrated its HDRnet algorithm into the Tensor SoC, and it even works at 4K and 60 frames per second.
The dynamic range on the test video above is really good. Most importantly, Google somehow manages to maintain a reasonably natural look despite glaring differences in brightness. In addition to HDR mode, the Pixel 6 also offers several video stabilization features. Check out Antoine’s review of the Pixel 6 for that!
Do you fancy the Pixel 6? Then you can buy the new Google smartphone one day after the publication of this article at all major retailers. If you read this article in time, you might still snag the Pixel Buds that Google is offering to pre-orderers. You can find all further information in the linked article.
Which feature are you most excited to try? Let us know in the comments below!