Wednesday, March 3, 2021

Night Sight on Google's Pixel shows how phone cameras became fake

The tiny camera on this phone has a remarkable power: it can see things our eyes cannot.

For the past few weeks, I've been prowling around dark places at night, shooting photos with a new mode on Google's $800 Pixel called Night Sight. Friends at a candlelit dinner look as if they brought along a lighting crew. Dark streets pop with reds and greens. A city square at midnight glows as if it were late afternoon. It's like seeing the world through an Instagram filter.

Night Sight is a big step forward for smartphone photography – and an example of how our photos are becoming, well, super fake.

It's true: you don't actually look like your photos. Photography was never just a neutral capture of reality, but the latest phones are pushing pictures into uncharted territory.

For now, Night Sight is just a mode that appears for dark scenes on Google's Pixel phones. But it's hardly alone: phones of all kinds now boast about how great their photos look, not how real they are. In Portrait Mode, the iPhone simulates background blur and detects facial features to reduce red-eye. Selfie phones popular in Asia automatically slim faces, brighten eyes and smooth skin. And the latest phones use a technique called HDR that combines multiple shots into a hyper-toned version of reality.
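The multi-shot HDR idea can be sketched in a few lines: blend several bracketed exposures of the same scene, weighting each pixel by how well-exposed it is. This is a toy illustration of the general technique, not any phone maker's actual pipeline; the function name and the Gaussian weighting are my own choices.

```python
import numpy as np

def fuse_exposures(frames, sigma=0.2):
    """Blend aligned exposures, favoring well-exposed pixels.

    frames: list of aligned grayscale images with values in [0, 1].
    Pixels near mid-gray (0.5) get the highest weight, so each region
    of the result comes mostly from the frame that exposed it best.
    A toy version of exposure fusion, not a real HDR pipeline.
    """
    stack = np.stack(frames)                      # shape (n, h, w)
    weights = np.exp(-((stack - 0.5) ** 2) / (2 * sigma ** 2))
    weights /= weights.sum(axis=0) + 1e-12        # normalize per pixel
    return (weights * stack).sum(axis=0)

# Toy example: a dark and a bright exposure of the same tiny "scene".
dark = np.array([[0.05, 0.10], [0.45, 0.50]])
bright = np.array([[0.40, 0.55], [0.90, 0.95]])
fused = fuse_exposures([dark, bright])
```

In the shadows, the fused result leans toward the brighter frame; in the highlights, toward the darker one – which is roughly why HDR images look more evenly lit than any single exposure.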

When I recently shot the same sunset with a 2014 iPhone 6 and this year's iPhone XR, I was taken aback by the difference – the new iPhone's image looked as if it had been painted in watercolors.

What's happening? Smartphones democratized photography for 2.5 billion people, replacing a craft that once required special hardware and instruction manuals.

Now, artificial intelligence and other software advances are democratizing the creation of beauty. Yes, beauty. Making photos pop no longer requires Photoshop skills. When presented with a picturesque view or a smiling face, phone cameras tap algorithms trained on what people like to see and serve up tuned images.

Your phone is wearing some very high-tech beer goggles. Think of your camera less as a reflection of reality and more as an AI trying to make you happy. It's faketastic.

Taking a photo on your phone has become about much more than light passing through a lens onto a sensor. That hardware still matters, of course, and it has improved over the past decade.

But increasingly, it's software – not hardware – that makes our photos better. "It sounds like hyperbole, but it's true," said Marc Levoy, an emeritus Stanford computer science professor who once taught Google's founders Larry Page and Sergey Brin and now works on the company's camera projects, including Night Sight.

Levoy's work is rooted in the physical limits of a smartphone. Phones can't fit the large lenses (and the sensors beneath them) of traditional cameras, so their makers have had to find creative ways to compensate. Enter techniques that substitute software for optics, such as digitally combining multiple shots into one.

New phones from Apple, Samsung and Huawei use such techniques too, but "we bet the farm on software and AI," Levoy said. That has freed Google to develop images in new ways.

"From the software point of view, Google has the edge," said Nicolas Touchard, vice president of marketing at DxOMark Image Labs, which produces independent benchmark ratings for cameras. (Whether any of this is enough to help the Pixel win converts from Apple and Samsung is a separate question.)

With Night Sight, Google's software goes to its greatest extreme yet, capturing up to 15 underexposed photos and combining them to brighten faces, sharpen details and saturate colors in eye-pleasing ways. No flash goes off – it artificially amplifies the light that's already there.

Anyone who has tried low-light shooting on a traditional camera knows how hard it is to avoid blurry photos. With Night Sight, before you even press the button, the phone measures hand shake and motion in the scene to determine how many shots to take and how long to leave the shutter open. When you press the shutter button, it warns you to "hold still" and keeps capturing for up to six seconds.
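The trade-off described above – shorter individual exposures when anything is moving, and therefore more frames to gather enough light – can be sketched as pseudologic. Every number, threshold and name below is invented for illustration; Google has not published its exact values.

```python
def plan_night_capture(shake, scene_motion, budget_s=6.0, max_frames=15):
    """Pick a frame count and per-frame shutter time from measured motion.

    shake, scene_motion: rough motion estimates in [0, 1] (0 = perfectly
    still). More motion means shorter exposures per frame (to avoid blur)
    and more frames to compensate for the lost light. All constants here
    are made up for illustration, not Google's actual parameters.
    """
    motion = max(shake, scene_motion)
    # Shorter shutter when anything moves; a long shutter on a steady scene.
    per_frame_s = 0.067 if motion > 0.5 else (0.2 if motion > 0.1 else 1.0)
    frames = min(max_frames, max(1, int(budget_s / per_frame_s)))
    return frames, per_frame_s
```

On a tripod-steady scene this plan takes a few long exposures; in shaky hands it takes many short ones, relying on the merge step to add the light back up.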

Over the next second or two, Night Sight splits all of those frames into a grid of small tiles, then aligns and combines the best bits into a complete picture. Finally, AI and other software analyze the image to choose its colors and tones.
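That tile-based merge can be illustrated crudely. Real pipelines align tiles across frames to cancel motion and average them to reduce noise; the sketch below simply keeps, for each tile, the frame with the most local detail (variance as a rough sharpness proxy). It's a simplification of the idea, not Google's algorithm.

```python
import numpy as np

def merge_burst(frames, tile=2):
    """Merge a burst by choosing, per tile, the frame with the most detail.

    A crude stand-in for an align-and-merge step: for every tile-sized
    patch, keep the version with the highest variance, a rough proxy for
    sharpness. Real pipelines align and average tiles instead; this just
    shows the divide-into-tiles, pick-the-best-bits structure.
    """
    h, w = frames[0].shape
    out = np.zeros((h, w))
    for y in range(0, h, tile):
        for x in range(0, w, tile):
            tiles = [f[y:y + tile, x:x + tile] for f in frames]
            best = max(tiles, key=lambda t: t.var())
            out[y:y + tile, x:x + tile] = best
    return out

# Toy burst: one featureless frame, one with detail in the top-left tile.
flat = np.zeros((4, 4))
sharp = np.zeros((4, 4))
sharp[0, 0] = 1.0
merged = merge_burst([flat, sharp])
```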

Night Sight did have some trouble with focus and with scenes that were nearly pitch black. You and your subject really do have to hold still. But in most of my test shots, the results were fantastic. Portraits smooth skin while keeping eyes sharp. Night scenes reveal hidden details, colored like Willy Wonka's chocolate factory.

Here's the catch: how should a computer choose the tones and colors of things we experience in the dark? Should a starry sky look dark at all?

"If we can't see it, we don't know what it looks like," Levoy said. "There are a lot of aesthetic decisions. We made them one way; they could be made differently. Perhaps in the end, these phones will need a 'what I see' versus 'what's really there' button."

So if our phones are inventing colors and lighting to please us, does the result still count as a photograph? Or is it a computer-generated work of art?

Some purists would argue the latter. "That always happens with a disruptive technology," Levoy said.

What "fake" even means is up for debate. Pro photographers have long retouched images in Photoshop or the darkroom. Before that, film manufacturers tweaked their colors for particular looks. This might be an academic concern if we weren't talking about the hobby – not to mention the memories – of a third of humanity.

How far will phones take our photos from reality? What does the software decide looks better for us? What parts of images should we let computers edit? In a photo I took of the White House (no Night Sight involved), I noticed that Pixel 3 algorithms trained to smooth out irregularities had actually removed architectural details that were still visible in an iPhone XS image.

At DxOMark, the camera-benchmarking firm, the question is how to even evaluate images when software reinterprets them for features like face beautification.

"Sometimes manufacturers push it too far. We usually say it's okay as long as they don't destroy information – if you want to be objective, you have to consider the camera as a device that collects data," Touchard said.

For another perspective, I called Kenan Aktulun, founder of the annual iPhone Photography Awards, a contest that discourages participants from editing their entries. Over the past decade, he has reviewed more than a million photos taken with iPhones.

The line between digital art and photography "at a certain point becomes really blurry," Aktulun said. Ultimately, though, he welcomes technological improvements that make the tools and process of photo creation invisible. The magic of smartphone photography is its accessibility – one button, and there you are. AI is an evolution of that.

"As the technical quality of the images has improved, we look for emotional connection," Aktulun said. "The ones that get the most attention aren't technically perfect. They're photographs that offer a glimpse into a person's life or experience."
