Photography Science-Fiction

#1, iPhone XR, Adobe PS Express: HDR

Software is where the photography action is. We've got stabilization, super-ISO, HDR+, stacking, eye-tracking, super-resolution, night-shot, de-blur, portrait-mode, deep-fusion... By adding a CPU between the lens and the image, we're entering the world of computational photography.
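Stacking is the easiest of these tricks to see in action: average several aligned frames of the same scene and the random sensor noise cancels while the scene stays put. A minimal sketch, assuming the frames are already perfectly aligned (the hard part real cameras solve in software):

```python
import numpy as np

rng = np.random.default_rng(0)
scene = np.full((64, 64), 0.5)  # a flat gray "scene"

# 16 noisy exposures of the same scene
frames = [scene + rng.normal(0, 0.1, scene.shape) for _ in range(16)]

single_noise = np.std(frames[0] - scene)   # noise in one frame
stacked = np.mean(frames, axis=0)          # average the stack
stacked_noise = np.std(stacked - scene)    # drops roughly 1/sqrt(16)
```

Averaging N frames cuts the noise standard deviation by about the square root of N, which is why night modes hold the shutter open and grab a burst.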

None of this mad-science need be limited to smartphones. I've been watching the big camera companies react to this new competition - and they've taken some hits. However, all this software should work even better in a big camera. NASA does it, right?

Software can compensate for physical limitations in the lens or sensor - making photos that wouldn't have been possible without repainting the thing in Photoshop. Ultimately, computational photography will make better photos than expensive optical equipment alone.


#2, iPhone XR, ON1 Photo RAW: HDR, LUTs, Blur

This photo was taken in terrible light. Noise reduction did fine - this photo is ISO 640 and perfectly usable. Newer smartphones do even better. The image is also scaled down, which further reduces noise and artifacts, but I wouldn't call it clean. This is one place where having a ton more pixels can help.
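Scaling down helps because each output pixel averages several noisy input pixels - the same statistics as stacking, applied spatially. A toy demonstration of a 2x2 downscale on a synthetic noisy image:

```python
import numpy as np

rng = np.random.default_rng(1)
# flat gray image plus sensor-style gaussian noise
img = np.full((128, 128), 0.5) + rng.normal(0, 0.1, (128, 128))

# 2x2 block average: each output pixel mixes four noisy samples,
# cutting the noise standard deviation roughly in half
small = img.reshape(64, 2, 64, 2).mean(axis=(1, 3))

noise_before = np.std(img - 0.5)
noise_after = np.std(small - 0.5)
```

This is why a scaled-down 12 MP web export can look clean while the 100% crop is gritty.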

A LUT, or color grade, was used to match a particular film look. I used "Aachen" - so I guess I'm color-matching a 16th-century German painter. My goal was to better contrast the shirt against the busy background.
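Under the hood a 1D LUT is just a lookup table per channel: each old pixel value indexes into a 256-entry table of new values. I don't know what's inside ON1's "Aachen" preset, so the curves below are a made-up warm grade purely for illustration:

```python
import numpy as np

# Hypothetical 256-entry 1D LUTs: warm the reds, cool the blues
x = np.arange(256)
lut_r = np.clip(x * 1.05 + 8, 0, 255).astype(np.uint8)  # push reds up
lut_g = x.astype(np.uint8)                              # greens unchanged
lut_b = np.clip(x * 0.95, 0, 255).astype(np.uint8)      # pull blues down

def apply_lut(rgb):
    """Look up each channel's new value by indexing with the old one."""
    out = rgb.copy()
    out[..., 0] = lut_r[rgb[..., 0]]
    out[..., 1] = lut_g[rgb[..., 1]]
    out[..., 2] = lut_b[rgb[..., 2]]
    return out

img = np.full((4, 4, 3), 128, dtype=np.uint8)  # mid-gray test patch
graded = apply_lut(img)
```

Real film-simulation LUTs are usually 3D (each output channel depends on all three inputs), but the indexing idea is the same.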

I subtly darkened and blurred non-subject areas to pop the subject. We can get blur with a big wide-open lens, but here I used a tilt-shift effect instead. Our eyes are drawn to the sharpest and brightest object in the frame.
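The tilt-shift trick boils down to blending between a sharp and a blurred copy of the image, with the blend weight (and a darkening factor) growing with distance from a focal band. A rough sketch, assuming a horizontal band through the middle and a crude 5-tap blur:

```python
import numpy as np

def box_blur(img):
    """Crude blur: average each pixel with its four neighbors."""
    return (img + np.roll(img, 1, 0) + np.roll(img, -1, 0)
                + np.roll(img, 1, 1) + np.roll(img, -1, 1)) / 5

h, w = 64, 64
rng = np.random.default_rng(2)
img = rng.random((h, w))          # stand-in for a grayscale photo

blurred = box_blur(img)
rows = np.arange(h)
# 0 at the focal band (middle row), ramping to 1 at top and bottom edges
dist = np.clip(np.abs(rows - h // 2) / (h / 2), 0, 1)[:, None]

out = img * (1 - dist) + blurred * dist  # more blur away from the band
out = out * (1 - 0.3 * dist)             # and subtly darken it too
```

The focal band stays untouched while the edges get both softer and darker - the same "look here" cue a fast lens and a vignette would give.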


Associated Press code of ethics does not approve.

Despite the edits, it still reads as non-fiction - albeit more cinematic. The Associated Press code of ethics for photojournalists does not approve of added blur, or of color-grading beyond conversion to grayscale. And photojournalists do buy expensive cameras, so manufacturers may resist science-fiction features that news agencies frown on.

I've got Nikon and Fuji lenses, so I look for this stuff in their marketing. There's new hotness like face-tracking, Bluetooth, and electronic shutters. Fujifilm heavily markets its color film simulations. An HDR mode for JPEGs exists, but they never talk about it and I can't make it work.


Photo science-fiction viral-markets itself.

For smartphones and Instagram, the code of ethics on science-fiction is: yes, please. There are no rules in the wild west. The craziest thing will always win. Swap all the faces. Eventually one of the crazy tools is going to be useful for professionals - and hit the front page.

I'm curious what news agencies ultimately decide about lens simulations in smartphones - given how popular these devices are. Computational photography is going to sell tons of smartphones. Will this force the big camera makers to implement more science-fiction, or will they move in the opposite direction?


#3, iPhone XR, Skylum Luminar: AI Enhance, LUTs

UPDATE: I just saw the very big, very expensive ($4k) Olympus OM-D E-M1X, with multi-shot super-resolution and subject-recognition autofocus. Those features might not be aimed at me - I'm not looking for more pixels or more specific autofocus. However, Olympus is also marketing "Live ND", a long-exposure simulation, and focus stacking.


References: 
1. https://vas3k.com/blog/computational_photography/
2. https://asia.olympus-imaging.com/product/dslr/em1x/feature4.html

