Sunday 23 September 2018

Could the new iPhone Xs be a real alternative to a camera?

Apple's latest range of mobile phones, the iPhone Xr, Xs and Xs Max
Adrian Weckler

Is the new iPhone Xs a real alternative to a camera?

At the launch of the phone at the Steve Jobs Theatre in Cupertino, Apple marketing chief Phil Schiller suggested it was, pointing to a cover of Time Magazine that had been shot on an iPhone. This, he said, showed that iPhones were worth their place even in a professional photographer’s bag.

More than that, Schiller said that Apple is at the vanguard of a new era of “computational photography”, where raw processing power matched with cutting-edge software can push a device’s photographic capability beyond the physical limitations of such a tiny piece of glass.

Is it?

Apple’s updates to its 12-megapixel camera system are impressive.

Both the Xs and Xs Max have two rear 12-megapixel cameras (at 28mm f1.8 and 52mm f2.4) and a 7-megapixel front ‘selfie’ camera. Apple has also slightly increased the size of its sensor, with individual pixels growing from 1.22 to 1.4 microns, roughly a 30pc increase in light-gathering area per pixel. Generally speaking, this makes it more capable in low light than iPhones of two or three years ago.

And Apple says that it now has “deeper” pixels of 3.5 microns, compared to 3.1 microns for the last model, leading to a better accommodation of light and dark shades in the same photo.
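
For readers who like to check the numbers, light-gathering area scales with the square of pixel width, which is where that roughly 30pc figure comes from. A quick back-of-the-envelope calculation, purely as an illustration:

    # Back-of-the-envelope check on the pixel-size figures (illustration only)
    old_pixel = 1.22   # pixel width in microns, earlier iPhones
    new_pixel = 1.40   # pixel width in microns, iPhone Xs

    linear_increase = new_pixel / old_pixel - 1        # about 15pc wider
    area_increase = (new_pixel / old_pixel) ** 2 - 1   # about 32pc more area

    print(f"{linear_increase:.0%} wider, {area_increase:.0%} more light-gathering area")
    # Output: 15% wider, 32% more light-gathering area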

But the real upgrade lies in the monster chip Apple has carefully placed under the hood of the iPhone Xs, its so-called A12 ‘Bionic’ processor.

This is an absolute beast. And it’s not just there for games and quick multitasking. Apple’s camera design teams — both hardware and software — have mined it to deliver some very powerful extra capability for the lenses.

For instance, one of the most arresting features on the new iPhone Xs (and on the Xs Max and Xr, which share the same A12 Bionic chip) is the ability to create and alter depth of field or ‘bokeh’ in photos already taken.

In plain English, ‘bokeh’ is the pleasing blurriness behind the sharp main subject of a photo.

It is a common, beloved feature of portrait photographers. But it can only usually be achieved with expensive, large lenses on thousand-euro camera bodies.

Normally, there’s no way that a sensor as small as the iPhone’s should be able to do this. But Apple’s application of ‘computational photography’, backed up by a powerhouse of a new chip, lets it augment photos in multiple ways.

You can see this in the ‘bokeh’ feature. Not only does the iPhone Xs offer selective bokeh on photos, but it lets you adjust the background blur factor after the shot is taken — this is done with a user-friendly slider that appears at the bottom of the photo.

Does it match the creamy bokeh of a professional f1.2 DSLR camera lens? Not quite. But it looks pretty terrific nonetheless.
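
For the technically minded, the principle is roughly this: the phone records a depth map alongside the photo, and the software then blurs each pixel according to how far it sits from the in-focus subject. Below is a heavily simplified sketch in Python of how that kind of after-the-fact blur could work; it is my own illustration using OpenCV, not Apple’s actual pipeline, and the function and parameter names are invented for the example.

    import cv2
    import numpy as np

    def adjustable_bokeh(image, depth, focus_depth, strength):
        """Blur the background of a photo using a stored depth map.

        image:       HxWx3 uint8 photo
        depth:       HxW float array, 0 (near) to 1 (far)
        focus_depth: depth value of the subject to keep sharp
        strength:    0..1, the value a blur slider would control
        """
        # How much each pixel should be blurred: zero at the plane of focus,
        # rising the further a pixel is from the subject.
        blur_amount = np.clip(np.abs(depth - focus_depth) * 4.0, 0.0, 1.0) * strength

        # One heavily blurred copy of the whole frame.
        blurred = cv2.GaussianBlur(image, (0, 0), sigmaX=15)

        # Blend the sharp and blurred frames per pixel.
        alpha = blur_amount[..., np.newaxis]
        out = image.astype(np.float32) * (1 - alpha) + blurred.astype(np.float32) * alpha
        return out.astype(np.uint8)

    # Because the depth map is saved with the photo, `strength` can be changed
    # at any time after the shot is taken, which is what the on-screen slider does.

Because the blur is recomputed from the stored depth data rather than baked into the file, moving the slider simply re-runs this kind of blend with a different strength.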

Another good example of this extra engine-oomph is the phone’s high dynamic range (HDR) photography.

Because of the horsepower behind Apple’s A12 Bionic chip, the camera can take multiple frames in one ‘shot’, applying different settings to each frame.

In other words, it takes several photos, ranging from underexposed (too dark) to overexposed (too bright). By doing this, it makes sure that the single photo has the benefit of picking up dark and light shades (‘shadows’ and ‘highlights’) so that the combined picture gives a much more complete photo without anything being too dark or bright.
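
The merging step itself is conceptually simple, even if Apple’s version is far more sophisticated. As a rough sketch of the general idea of exposure merging (my own simplified illustration in Python, not Apple’s algorithm):

    import numpy as np

    def merge_exposures(frames):
        """Combine differently exposed frames of the same scene.

        frames: list of HxWx3 float arrays in [0, 1], ranging from
        underexposed (dark) to overexposed (bright).
        """
        merged = np.zeros_like(frames[0])
        total_weight = np.zeros(frames[0].shape[:2] + (1,))

        for frame in frames:
            # Pixels near mid-grey are well exposed and get a high weight;
            # blown-out highlights and crushed shadows get a low weight.
            weight = 1.0 - 2.0 * np.abs(frame.mean(axis=2, keepdims=True) - 0.5)
            merged += frame * weight
            total_weight += weight

        # Normalise so every pixel is a weighted average of its best exposures.
        return merged / np.maximum(total_weight, 1e-6)

In a real camera pipeline the frames also have to be aligned to cancel out hand shake, and the chip does all of this in the fraction of a second between pressing the shutter and seeing the result.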

Traditionally taken digital photographs can be treated to the same surgery, but usually only in post-processing using an editing program like Adobe Lightroom or an app like Snapseed.

Even then, they arguably don’t do as good a job, since they’re working from a single shot rather than multiple photos combined.

Again, Apple is doing this almost entirely with clever software and raw processing power.

So is ‘computational photography’ the future of all photography? If an HDR photo from an iPhone Xs ends up looking as good, or almost as good, as one from a professional DSLR, what’s the point of opting for the latter?

There are still some big physical differences that separate standalone cameras from an iPhone.

Cameraphones, in general, can’t match a proper zoom on a standalone camera. The second telephoto lens on the iPhone Xs is a 52mm equivalent.

But most decent zooms on cameras go to at least 150mm. (I have one that goes to 600mm.) You’ll see the difference if you’re trying to shoot a sunset or a sunrise — with a phone, you’ll get a wider sky shot. With a zoom, you’ll get a much more idyllic-looking red or orange ball, possibly with the outline of someone looking small against it.

Phones also have limited ability as wide-angle lenses. The iPhone’s main lens shoots at 28mm, quite a wide view. But there is a very different quality that a real wide-angle lens gives you, from GoPros to professional 16-35mm lenses. (To some extent, the recent iPhones have made up for this with an astonishingly good panoramic photo mode, which acts as an effective substitute for a wide-angle lens.)

But most people will never use one of these lenses.

So while professionals and enthusiasts (your humble reporter regards himself as existing somewhere between these two categories) will go on using professional camera gear, the rest of society is starting to get a glimpse of the quality that really good systems provide. And it’s coming via the phones in their pockets.

To be clear, Apple isn’t alone in pushing this. Right now, there’s an arms race going on between Apple, Samsung and Huawei on who can offer the most powerful camera features.

The lead often changes hands: 2018 arguably saw Huawei briefly go in front with its highly impressive P20 Pro phone. But Apple looks like it might be back on top with the iPhone Xs.
