Antelope Canyon

Leica LX100, Skylum Luminar

I think everybody on that tour bus got the super flu.

Antelope Canyon is one of those natural wonders on everybody's bucket list, so you know it's going to be an expensive ordeal. You board the bus at 5am, drive four hours, switch to off-road trucks, and get packed like cattle into a surprisingly narrow canyon.

They watch the weather super closely, because the canyon flash-floods - the whole beautiful thing is being carved by water. And our guides assured us there's just enough time to escape with our lives if we stay with the group.

Leica LX100, Skylum Luminar

Certainly there are places, destroyed or disappeared, that now exist only in photographs. But popular spots have already been documented by ten thousand talented people with better equipment. And commercial tourism leaves fingerprints on the experience that aren't in the pictures. For example, to me these photos represent the super-flu.


The Robot Heart

Leica LX100, f/16 1/15s, Skylum Luminar

Albert stood in the cold watching the sunset. The brochure said his new robot heart was powerful as a mountain lion. He focused his mind on the bigness, the majesty of nature, himself far below it all. Then the robot heart kicked into gear, and he felt the surge. Euphoria washed and warmed in waves from his chest. He started to run, launched forward by this unstoppable locomotive.

The brochure warned the robot heart formed addiction. First Albert was addicted to staring at his wife of fifty years. Side-effects included sighs and occasional teardrops. Chased out of the house, he became addicted to sunsets.

On Wednesdays he attended the robot heart support group. There were families struggling with excessive volunteering and terminal philanthropy. One woman had adopted a dozen children. Hearing these stories, Albert's heart audibly whirred.

Photography Science-Fiction

#1, iPhone XR, Adobe PS Express: HDR

Software is where the photography action is. We've got stabilization, super-ISO, HDR+, stacking, eye-tracking, super-resolution, night-shot, de-blur, portrait-mode, deep-fusion... By adding a CPU between the lens and the image, we're entering the world of computational photography.
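
To make one of those buzzwords concrete: "stacking" is just averaging a burst of frames so that random sensor noise cancels out. Here's a minimal sketch in Python with NumPy and Pillow - the filenames are hypothetical, and real pipelines also align the frames first:

```python
import numpy as np
from PIL import Image

def stack_frames(paths):
    """Average an already-aligned burst of same-size frames to suppress random noise."""
    frames = [np.asarray(Image.open(p), dtype=np.float32) for p in paths]
    mean = np.mean(frames, axis=0)  # noise shrinks roughly with the square root of the frame count
    return Image.fromarray(np.clip(mean, 0, 255).astype(np.uint8))

# Hypothetical usage:
# stack_frames(["burst_0.jpg", "burst_1.jpg", "burst_2.jpg"]).save("stacked.jpg")
```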

None of this mad-science need be limited to smartphones. I've been watching the big camera companies react to this new competition - and they've taken some hits. However, all this software should work even better in a big camera. NASA does it, right?

We can overcome physical limitations in the lens or sensor - making photos that wouldn't have been possible without re-painting the thing in Photoshop. Computational photography will ultimately make better photos than expensive optical equipment alone.


#2, iPhone XR, ON1 Photo RAW: HDR, LUTs, Blur

This photo was taken in terrible light. Noise reduction is fine - this photo is ISO 640 and perfectly usable. Newer smartphones do much better. The image is scaled down, further reducing noise and artifacts, but I wouldn't call it clean. It's one place where having a ton more pixels can help.
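
The noise win from scaling down isn't magic, either - resampling averages neighbouring pixels. A tiny Pillow sketch of the idea (filename hypothetical):

```python
from PIL import Image

def downscale(path, factor=2):
    """Shrink the image; the resampling filter averages neighbouring pixels, smoothing noise."""
    img = Image.open(path)
    w, h = img.size
    return img.resize((w // factor, h // factor), Image.LANCZOS)

# downscale("iso640_shot.jpg").save("iso640_small.jpg")  # hypothetical filename
```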

A LUT, or color-grading, was used to match a particular film look. I used "Aachen" - so I guess I'm color-matching a 16th-century German painter. My goal was to better contrast the shirt against the busy background.
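
For the curious, a LUT is less exotic than it sounds: a lookup table that remaps each pixel value to a new one. The sketch below is not the "Aachen" look - just a toy per-channel curve in Pillow to show the mechanism, with made-up numbers and a hypothetical filename:

```python
from PIL import Image

def film_curve(gamma=0.9, scale=1.0):
    """Build a 256-entry lookup table: a gentle tone lift plus a per-channel scale."""
    return [min(255, int(255 * ((v / 255.0) ** gamma) * scale)) for v in range(256)]

def apply_look(path):
    # one table per channel; pulling blue down slightly warms the image
    table = film_curve() + film_curve() + film_curve(scale=0.95)
    return Image.open(path).convert("RGB").point(table)

# apply_look("portrait.jpg").save("graded.jpg")  # hypothetical filename
```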

I subtly darkened and blurred the non-subject areas to make the subject pop. We can get blur with a big, wide-open lens, but I used a tilt-shift effect instead. Our eyes are drawn to the sharpest and brightest object.
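
The blur-and-darken trick is easy to fake in code. Below is one way to sketch a tilt-shift band with Pillow - the band geometry, blur radius, and 15% darkening are my own guesses, not what the editor actually does:

```python
import numpy as np
from PIL import Image, ImageFilter

def fake_tilt_shift(path, center=0.5, band=0.2):
    """Keep a horizontal band sharp; blend in a blurred, slightly darkened copy elsewhere."""
    sharp = Image.open(path).convert("RGB")
    soft = sharp.filter(ImageFilter.GaussianBlur(8)).point(lambda v: int(v * 0.85))
    w, h = sharp.size
    y = np.arange(h) / h
    blend = np.clip((np.abs(y - center) - band / 2) / band, 0, 1)  # 0 inside the sharp band
    mask = Image.fromarray((blend * 255).astype(np.uint8)[:, None].repeat(w, axis=1))
    return Image.composite(soft, sharp, mask)  # where the mask is 255, the soft copy wins

# fake_tilt_shift("street.jpg", center=0.55).save("tilt_shift.jpg")  # hypothetical filename
```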


Associated Press code of ethics does not approve.

Despite the edits, it still reads as non-fiction - albeit more cinematic. The Associated Press code of ethics for photojournalists does not approve of added blur, or of color-grading other than conversion to grayscale. And photojournalists do buy expensive cameras, so manufacturers may resist science-fiction features that news agencies frown on.

I've got Nikon and Fuji lenses, so I look for this stuff in their marketing. There's new hotness like face-tracking, Bluetooth, and electronic shutters. Fujifilm heavily markets its color film-simulations. An HDR mode for JPEGs exists, but they never talk about it and I can't make it work.


Photo science-fiction viral-markets itself.

For smartphones and Instagram, the code of ethics on science-fiction is: yes, please. There are no rules in the wild west. The craziest thing will always win. Swap all the faces. Eventually one of the crazy tools is going to be useful for professionals - and hit the front page.

I'm curious what news agencies will ultimately decide about lens simulations in smartphones, given how popular these devices are. Computational photography is going to sell tons of smartphones. Will this force the big camera makers to implement more science-fiction, or will they move in the opposite direction?


#3, iPhone XR, Skylum Luminar: AI Enhance, LUTs

UPDATE: I just saw the very expensive and very big $4k Olympus OM-D E-M1X, with multi-shot super-resolution and subject-recognition autofocus. These features might not be aimed at me - I'm not looking for more pixels or more specific autofocus. However, they're also marketing "Live ND", a long-exposure simulation, and focus-stacking.
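
Focus-stacking, at least, is simple enough to sketch: shoot the same scene at several focus distances, then keep whichever frame is locally sharpest at each pixel. This is only the bare idea in Pillow/NumPy - no frame alignment, no seam blending, and certainly not Olympus's implementation; the filenames are hypothetical:

```python
import numpy as np
from PIL import Image, ImageFilter

def focus_stack(paths):
    """Per pixel, take the frame with the strongest local edge response (a crude sharpness measure)."""
    frames = [Image.open(p).convert("RGB") for p in paths]
    color = np.stack([np.asarray(f) for f in frames])                  # (N, H, W, 3)
    edges = np.stack([np.asarray(f.convert("L").filter(ImageFilter.FIND_EDGES),
                                 dtype=np.float32) for f in frames])   # (N, H, W)
    best = edges.argmax(axis=0)                                        # index of the sharpest frame per pixel
    rows, cols = np.indices(best.shape)
    return Image.fromarray(color[best, rows, cols])

# focus_stack(["macro_near.jpg", "macro_mid.jpg", "macro_far.jpg"]).save("focus_stacked.jpg")
```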


References: 
1. https://vas3k.com/blog/computational_photography/
2. https://asia.olympus-imaging.com/product/dslr/em1x/feature4.html