
Features of the most popular Google Camera (GCam APK)

Never miss a moment with Google Camera, and take fantastic pictures and videos using features such as Portrait, Night Sight, and the video stabilization modes.

By hashan tagari

The camera app relies on hardware accelerators to perform its image processing. The first generation of Pixel phones used Qualcomm's Hexagon DSPs and Adreno GPUs to accelerate image processing. The Pixel 2 and Pixel 3 (but not the Pixel 3a) include the Pixel Visual Core to aid with image processing. The Pixel 4 introduced the Pixel Neural Core.

HDR+

Unlike earlier versions of high-dynamic-range (HDR) imaging, HDR+, also known as HDR+ on, uses computational photography techniques to achieve a higher dynamic range. HDR+ takes continuous burst shots with short exposures. When the shutter is pressed, the last 5–15 frames are analyzed to pick the sharpest shots (using lucky imaging), which are selectively aligned and combined with image averaging. HDR+ also uses semantic segmentation to detect faces, brightening them with synthetic fill flash, and to darken and denoise skies. HDR+ also reduces shot noise and improves colors while avoiding blown-out highlights and motion blur. HDR+ was introduced on the Nexus 6 and brought back to the Nexus 5. On Redmi phones, you can use a Redmi Note 9 Google Camera port to get the HDR+ mode.
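Under the hood, the merge step can be thought of as "pick the sharpest frames from the burst and average them." Below is a minimal Python sketch of that idea, assuming gradient variance as a sharpness proxy and skipping the tile alignment; it illustrates the averaging principle only and is not Google's actual HDR+ pipeline.

```python
import numpy as np

def sharpness(frame):
    """Proxy for sharpness: variance of the gradient magnitude.
    (The real pipeline uses more sophisticated lucky-imaging metrics.)"""
    gy, gx = np.gradient(frame.astype(np.float32))
    return np.var(np.hypot(gx, gy))

def hdr_plus_style_merge(burst, keep=8):
    """Toy HDR+-style merge: keep the sharpest frames and average them.
    `burst` is a list of grayscale frames (2-D arrays) shot with identical
    short exposures; real HDR+ also aligns image tiles before merging."""
    ranked = sorted(burst, key=sharpness, reverse=True)[:keep]
    return np.mean(ranked, axis=0)

# Usage with synthetic noisy frames of the same scene
rng = np.random.default_rng(0)
scene = rng.uniform(0, 255, (480, 640))
burst = [scene + rng.normal(0, 20, scene.shape) for _ in range(15)]
merged = hdr_plus_style_merge(burst)   # noise drops roughly with sqrt(keep)
```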

HDR+ enhanced

Unlike HDR+/HDR+ On, 'HDR+ enhanced' mode does not use Zero Shutter Lag (ZSL). Like Night Sight, HDR+ enhanced features positive-shutter-lag (PSL): it captures images after the shutter is pressed. HDR+ enhanced is similar to HDR+ from the Nexus 5, Nexus 6, Nexus 5X, and Nexus 6P. It is believed to use underexposed and overexposed frames like Smart HDR from Apple. HDR+ enhanced captures increase the dynamic range compared to HDR+ on. HDR+ enhanced on the Pixel 3 uses the learning-based AWB algorithm from Night Sight.

Live HDR+

On the Pixel 4, Live HDR+ replaced HDR+ on, featuring a WYSIWYG viewfinder with a real-time preview of HDR+. Live HDR+ uses the learning-based AWB algorithm from Night Sight and averages up to nine underexposed pictures.

Dual Exposure Controls

'Live HDR+' mode uses Dual Exposure Controls, with separate sliders for brightness (capture exposure) and shadows (tone mapping). This feature was made available for the Pixel 4 and has not been retrofitted on older Pixel devices due to hardware limitations.
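Conceptually, the two sliders act at different stages of the pipeline: a gain applied at capture versus a tone curve applied when rendering shadows. The following is a rough Python sketch of that split; the parameter names and the gamma-style shadow curve are assumptions for illustration, not Google's implementation.

```python
import numpy as np

def dual_exposure(image, brightness=1.0, shadows=1.0):
    """Toy model of the two controls: `brightness` scales capture exposure
    (a linear gain), while `shadows` lifts or crushes dark regions via a
    simple tone-mapping gamma curve. Both names are assumed, not Google's."""
    img = np.clip(image.astype(np.float32) / 255.0 * brightness, 0.0, 1.0)
    toned = img ** (1.0 / shadows)        # shadows > 1 brightens dark tones
    return (toned * 255.0).astype(np.uint8)

frame = np.random.default_rng(1).integers(0, 256, (480, 640), dtype=np.uint8)
preview = dual_exposure(frame, brightness=0.8, shadows=1.5)
```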

With Bracketing

In April 2021, Google Camera v8.2 introduced HDR+ with Bracketing, Night Sight with Bracketing, and Portrait Mode with Bracketing. Google updated their exposure bracketing algorithm for HDR+ to include an additional long-exposure frame, and Night Sight to include three long-exposure frames. The spatial merge algorithm was also redesigned to decide whether or not to merge on a per-pixel basis (like Super Res Zoom) and updated to handle long exposures (clipped highlights, more motion blur, and different noise characteristics). Bracketing enables further reduced read noise, improved details/texture, and more natural colors. Bracketing is automatically enabled depending on the dynamic range and motion. With Bracketing is supported in all modes for the Pixel 4a (5G) and 5. With Bracketing is supported in Night Sight for the Pixel 4 and 4a.
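The key idea behind the redesigned merge is that a long exposure is only trusted where it is neither clipped nor inconsistent with the short frames. A toy per-pixel version in Python might look like this; the thresholds and the simple two-frame setup are invented for illustration and are not Google's algorithm.

```python
import numpy as np

def bracketed_merge(short_frame, long_frame, clip=0.98, motion_thresh=0.08):
    """Per-pixel merge of a short and a long exposure (values in [0, 1]).
    The long exposure contributes only where it is not clipped and does not
    disagree with the short frame (a crude motion test)."""
    clipped = long_frame >= clip
    motion = np.abs(long_frame - short_frame) > motion_thresh
    use_long = ~(clipped | motion)
    return np.where(use_long, (short_frame + long_frame) / 2.0, short_frame)

rng = np.random.default_rng(2)
short_frame = rng.uniform(0.0, 1.0, (480, 640))
long_frame = np.clip(short_frame + rng.normal(0.0, 0.02, short_frame.shape) + 0.05, 0.0, 1.0)
merged = bracketed_merge(short_frame, long_frame)
```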

Motion Photos

Google Camera's Motion photo mode is similar to HTC's Zoe and iOS' Live Photo. A short, silent video clip of relatively low resolution is paired with the original photo when enabled. If RAW is enabled, only a 0.8MP DNG file is created, not the non-motion 12.2MP DNG. Motion Photos was introduced on the Pixel 2. Motion Photo is disabled in HDR+ enhanced mode.

Video Stabilization

Fused Video Stabilization, a technique that combines Optical Image Stabilization and Electronic/Digital image stabilization, can be enabled for significantly smoother video. This technique also corrects Rolling shutter distortion and Focus breathing, amongst various other problems. Fused Video Stabilization was introduced on the Pixel 2.
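One way to picture the electronic half of the pipeline is as smoothing the camera's motion path and then warping each frame toward the smoothed path. The Python sketch below shows only that smoothing step, with a plain moving average standing in for the real motion filtering; fusing in the OIS lens-shift telemetry is merely noted in a comment.

```python
import numpy as np

def stabilization_correction(angles, window=15):
    """Toy electronic-stabilization step: smooth a per-frame camera rotation
    path (e.g. integrated from the gyroscope) with a moving average and
    return the correction to warp each frame by. A fused pipeline would
    also account for the OIS lens movements reported by the hardware."""
    kernel = np.ones(window) / window
    smoothed = np.convolve(angles, kernel, mode="same")
    return smoothed - angles              # per-frame rotation correction

shaky_path = np.cumsum(np.random.default_rng(3).normal(0, 0.01, 300))
corrections = stabilization_correction(shaky_path)
```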

Super Res Zoom

Super Res Zoom is a multi-frame super-resolution technique introduced with the Pixel 3 that shifts the image sensor to achieve higher resolution, which Google claims is equivalent to 2-3x optical zoom. It is similar to drizzle image processing. Super Res Zoom can also be used with a telephoto lens. For example, Google claims the Pixel 4 can capture 8x zoom at near-optical quality.
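The drizzle-like idea is to scatter each low-resolution frame onto a finer grid according to its sub-pixel shift and average the accumulated samples. The Python sketch below illustrates this with shifts that are simply given; real Super Res Zoom estimates the shifts from natural hand shake and merges far more robustly per pixel.

```python
import numpy as np

def super_res_merge(frames, shifts, scale=2):
    """Toy multi-frame super-resolution: place each low-res frame onto a
    grid `scale` times finer at its known (dy, dx) sub-pixel shift and
    average whatever lands in each high-res cell."""
    h, w = frames[0].shape
    acc = np.zeros((h * scale, w * scale))
    weight = np.zeros_like(acc)
    ys, xs = np.mgrid[0:h, 0:w]
    for frame, (dy, dx) in zip(frames, shifts):
        yy = np.clip(np.round((ys + dy) * scale).astype(int), 0, h * scale - 1)
        xx = np.clip(np.round((xs + dx) * scale).astype(int), 0, w * scale - 1)
        np.add.at(acc, (yy, xx), frame)
        np.add.at(weight, (yy, xx), 1.0)
    return acc / np.maximum(weight, 1.0)

rng = np.random.default_rng(4)
frames = [rng.uniform(0, 1, (120, 160)) for _ in range(8)]
shifts = [(rng.uniform(-0.5, 0.5), rng.uniform(-0.5, 0.5)) for _ in frames]
hi_res = super_res_merge(frames, shifts)   # shape (240, 320)
```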

Smart burst

Smart burst is activated by holding the shutter button down. While the button is held down, up to 10 shots per second are captured. Once released, the best pictures captured are automatically highlighted.

Different 'creations' can be produced from the captured pictures:

Moving GIF - an animated GIF to capture action or images containing a high amount of movement.

'All-smile' - a single photo in which everyone is smiling and not blinking; produced by taking different parts of every photo in the burst.

Collage - when taking 'selfies,' a collage similar to that of a Photobooth is generated.

Top Shot

When Motion Photos is enabled, Top Shot analyzes up to 90 additional frames from 1.5 seconds before and after the shutter is pressed. The Pixel Visual Core accelerates the analysis using computer vision techniques and ranks them based on object motion, motion blur, auto exposure, autofocus, and auto white balance. About ten additional photos are saved, including an additional HDR+ photo up to 3 MP. Top Shot was introduced on the Pixel 3.
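At its core, Top Shot is a ranking problem: score every candidate frame on several criteria and keep the best few. The Python toy below makes that explicit; the three criteria and their weights are made up for illustration, whereas Google ranks frames with an on-device model trained on those signals.

```python
import numpy as np

def rank_top_shots(scores, k=3):
    """Rank candidate frames by a weighted sum of per-criterion scores in
    [0, 1] (e.g. sharpness, exposure quality, open eyes). The weights are
    illustrative only, not Google's trained ranking model."""
    weights = np.array([0.4, 0.3, 0.3])
    total = scores @ weights
    return np.argsort(total)[::-1][:k]    # indices of the best frames

frame_scores = np.random.default_rng(5).uniform(0, 1, (90, 3))  # up to 90 frames
best_frames = rank_top_shots(frame_scores)
```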

Other features

Computational Raw - Google Camera supports capturing JPEG and DNG files simultaneously. The DNG files are also processed with Google's HDR+ Computational Photography. Computational Raw was introduced on the Pixel 3.

Motion Auto Focus - maintains focus on any subject/object in the frame. Motion Auto Focus was introduced on the Pixel 3.

Frequent Faces - allows the camera to remember faces. The camera will try to ensure those faces are in focus, smiling and not blinking.

Location - Location information obtained via GPS or Google's location service can be added to pictures and videos when enabled.

Functions

Like most camera applications, Google Camera offers different usage modes allowing the user to take different photos or videos.

Slow Motion - Slow-motion video can be captured in Google Camera at either 120 or, on supported devices, 240 frames per second.

Panorama - Panoramic photography is also possible with Google Camera. Four types of panoramic photos are supported: Horizontal, Vertical, Wide-angle, and Fisheye. Once the Panorama function is selected, one of these four modes can be selected from a row of icons at the top of the screen.

Photo Sphere - Google Camera allows the user to create a 'Photo Sphere,' a 360-degree panorama photo, initially added in Android 4.2 in 2012. These photos can then be embedded in a web page with custom HTML code or uploaded to various Google services.

Portrait - Portrait mode (called Lens Blur prior to the release of the Pixel line) offers an easy way for users to take 'selfies' or portraits with a bokeh effect, in which the subject of the photo is in focus and the background is slightly blurred. This effect is achieved via the parallax information from dual-pixel sensors when available (such as on the Pixel 2 and Pixel 3) and the application of machine learning to identify what should be kept in focus and what should be blurred out. Portrait mode was introduced on the Pixel 2.
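The effect itself boils down to compositing a sharp subject over a blurred background using a mask. Here is a minimal Python sketch that assumes the subject mask is already given; real Portrait mode derives it from dual-pixel depth and learned segmentation, and uses a more bokeh-like blur than a box filter.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def portrait_blur(image, subject_mask, radius=10):
    """Toy portrait-mode bokeh: box-blur the whole grayscale frame, then
    keep the original pixels wherever the subject mask is set."""
    blurred = uniform_filter(image.astype(np.float32), size=2 * radius + 1)
    return np.where(subject_mask, image.astype(np.float32), blurred).astype(np.uint8)

rng = np.random.default_rng(6)
frame = rng.integers(0, 256, (480, 640), dtype=np.uint8)
mask = np.zeros((480, 640), dtype=bool)
mask[120:360, 200:440] = True          # pretend this region is the subject
result = portrait_blur(frame, mask)
```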

Playground - In late 2017, with the debut of the Pixel 2 and Pixel 2 XL, Google introduced AR Stickers, a feature that, using Google's new ARCore platform, allowed the user to superimpose animated augmented-reality objects on their photos and videos. With the release of the Pixel 3, AR Stickers was rebranded as Playground.

Google Lens - The camera offers functionality powered by Google Lens, which allows the camera to copy text it sees, identify products, books, and movies and search for similar ones, identify animals and plants, and scan barcodes and QR codes, among other things.

Photobooth - The Photobooth mode allows the user to automate the capture of selfies. The AI can detect the user's smile or funny faces and shoot the picture at the best time without any action from the user, similar to Google Clips. This mode also features two levels of AI-based skin softening for the subject's face that can be enabled or disabled. Motion Photos functionality is also available in this mode. The white balance is also adjustable to defined presets. In October 2019, Photobooth was removed as a standalone mode, becoming an 'Auto' option in the shutter options, and was later removed altogether.

This is why Google Camera is so famous nowadays. You can download it from the Play Store.

Source: https://en.wikipedia.org/wiki/Google_Camera


About the Creator

hashan tagari

I am a blogger. I love to write content on new technology, the latest tech news, gaming, gadget reviews, and Android. I also love to write about pets, health, business, finance, and the latest tips and tricks.
