Thursday, September 14, 2017

iPhone X and Digital Lighting

There's a feature tucked away in the new iPhones that doesn't seem to be getting a lot of traction, but it represents a massive sea change in photography. It's the "Portrait Lighting" mode, and it's the second shot across the bows of traditional photography, from the world of computational photography.

The first shot was "fake bokeh", in which the 3D map generated by a dual (or multiple) camera setup is used to blur the background, simulating the look of contemporary fast-lens portraiture. This was widely derided for a few minutes, and then improved, and now it's pretty much accepted. A few holdouts still mock it, but normal people can't really tell the difference.
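
To make the idea concrete, here is a minimal sketch of depth-matted background blur. This is not Apple's pipeline; fake_bokeh, the normalized depth convention, and the single blur level are all illustrative assumptions (a real implementation would vary the blur radius with depth).

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def fake_bokeh(image, depth, subject_depth, tolerance=0.1, max_sigma=8.0):
    """Toy depth-matted blur: sharp near the subject plane, blurred elsewhere.

    image: HxWx3 float array in [0, 1]
    depth: HxW float array, normalized depth map (0 = near, 1 = far)
    subject_depth: depth value of the in-focus subject
    """
    # One blur level, faded in by a soft depth matte; real pipelines
    # scale the blur radius continuously with distance from the subject.
    blurred = np.stack(
        [gaussian_filter(image[..., c], sigma=max_sigma) for c in range(3)],
        axis=-1,
    )
    # Matte is 0 within `tolerance` of the subject plane, ramping to 1 beyond it.
    distance = np.abs(depth - subject_depth)
    matte = np.clip((distance - tolerance) / tolerance, 0.0, 1.0)[..., None]
    return image * (1.0 - matte) + blurred * matte
```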

This next shot is a much, much bigger one. Once you have the 3D map, the only thing standing between you and photographic lighting applied in post is compute power. This is exactly what Portrait Lighting mode does: in effect, it digitally alters the lighting of a portrait to bring it closer to a professional lighting style. It's not perfect, and I am sure the internet will mock it roundly when it gets around to it: "looks so fake", "lame", "a professional would totally do it better", all of which may be true. This is not a technology that is going to get worse over time, though. It's going to get better.
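
Here is the core trick in sketch form, assuming a Lambertian surface and a depth map from which normals can be estimated; the function name, the depth convention, and the ambient term are my own illustrative choices, not anything Apple has published.

```python
import numpy as np

def relight(image, depth, light_dir, ambient=0.4):
    """Toy portrait relighting: re-shade the image under a new light direction.

    image: HxWx3 float array in [0, 1]
    depth: HxW float array (larger = closer to the camera, by assumption)
    light_dir: 3-vector pointing from the surface toward the new light
    """
    # Estimate surface normals from the depth gradients.
    dz_dy, dz_dx = np.gradient(depth)
    normals = np.dstack([-dz_dx, -dz_dy, np.ones_like(depth)])
    normals /= np.linalg.norm(normals, axis=-1, keepdims=True)

    # Lambertian shading: brightness follows the angle between normal and light.
    light = np.asarray(light_dir, dtype=float)
    light /= np.linalg.norm(light)
    shading = np.clip(normals @ light, 0.0, 1.0)

    # Mix an ambient floor with the directional term and re-apply to the image.
    return np.clip(image * (ambient + (1.0 - ambient) * shading)[..., None], 0.0, 1.0)
```

Move light_dir and you have moved the light, after the fact. Everything a real system adds on top (skin models, specular highlights, machine-learned cleanup) is refinement of this basic move.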

When I wrote about this two years ago, I imagined a virtual studio for the professional photographer, with virtual lights placed and moved as needed after the shot was taken, and the final results rendered as a standard 2D image for retouching. Apple has done me one better and worked out how to consumerize it. Rather than moving virtual lights around, Apple simply offers a handful of styles, treating it like an Instagram filter. Pick the lighting style that makes you look best! Click click, "that one, yeah."

There's room for both, though; it's just software.

It's not even hard! This isn't even an iPhone; this is me and my rough knowledge of how my own ugly mug is shaped. Original; catchlights removed; shade added on the shadow side; shade added to the highlights; and finally new catchlights dropped in.
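
The same hand retouch could be scripted. A toy sketch, assuming hand-painted masks for the shadow and highlight regions and (row, col) eye positions for the catchlights; none of these names come from any real tool.

```python
import numpy as np

def shade_with_mask(image, mask, gain):
    """Darken (gain < 1) or lift (gain > 1) only where the mask is painted.

    image: HxWx3 float array in [0, 1]
    mask:  HxW float array in [0, 1], e.g. a hand-painted shadow-side region
    """
    return np.clip(image * (1.0 + (gain - 1.0) * mask[..., None]), 0.0, 1.0)

def add_catchlights(image, centers, radius=3.0, strength=0.8):
    """Paint small Gaussian bright spots at the given (row, col) eye positions."""
    out = image.copy()
    rows, cols = np.ogrid[: image.shape[0], : image.shape[1]]
    for r, c in centers:
        spot = np.exp(-((rows - r) ** 2 + (cols - c) ** 2) / (2.0 * radius**2))
        out = np.clip(out + strength * spot[..., None], 0.0, 1.0)
    return out

# The sequence from the mock-up above, in order (masks and coordinates
# are hypothetical):
# portrait = shade_with_mask(portrait, catchlight_mask, gain=0.2)  # drop catchlights
# portrait = shade_with_mask(portrait, shadow_mask, gain=0.7)      # shade shadow side
# portrait = shade_with_mask(portrait, highlight_mask, gain=0.85)  # shade highlights
# portrait = add_catchlights(portrait, [(120, 90), (120, 150)])    # new catchlights
```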


What does this mean for photographers? For the amateur it means more power, more flexibility, and potentially more fun. It simply gets easier to take ever nicer-looking pictures.

For the professional it means that your skill at positioning lights is gradually going to vanish as a differentiator. If you can't direct your models well, learn how, because that's about to become the only skill that isn't being replaced by a robot.

For the photojournalist, and more importantly for the news editor, it means one more layer of potential falsehood inserted between reality and the printed page or the digital news feed. Think about what features you're going to want to disallow in future.

I can't even imagine what it means for Fine Art photographers.

It seems like a stupid little "selfie-mode" feature, but it's not. It's our second hint of a radically different future.

2 comments:

  1. In my opinion, the problem is just one: this will be relegated to the category of "just another gimmick" precisely because it is implemented on an iPhone. Why? Because Apple's strategy in recent years has been to position its products as "cool fashionable gadgets for cool fashionable people". This works extremely well for selling to people who want to be perceived as cool, but it hurts other possible categories of product placement; see the iPhone/iPad struggle to be accepted as an enterprise tool, which so far has failed completely. At least here in Europe, Apple is the victim of a self-fulfilling prophecy...

  2. I would like to add two comments:

    First, only the front camera of the iPhone X has the depth-map technology; the back camera does not. I think that shows that the most important camera, as far as portraits are concerned, is the one designed for selfies. Apple has pretty accurate estimates of the proportion of selfies among photographs, so I take their choice as telling.

    Second: there is a third use for the depth map beyond background defocus and programmatic lighting. You can replace your face with something else, even in real time for video if need be. Apple demonstrated swapping a face for a talking emoticon. I suspect there will be a lot of interest in that function once one can swap in another believable human face, for example to appear younger on dating sites.
