(This post is part of an ongoing review of the iPhone 12 Pro Max.)
It was dark and getting darker as I drove away from the Teutonia Peak trailhead in the Mojave National Preserve. I turned right, heading away from my Airbnb in Baker, situated along Interstate 15, and drove south toward Kelso. I wasn’t lost—I had spent four weeks last year in the Preserve as the artist-in-residence and for two of those weeks I had stayed with National Park Service employees in dorm-style housing just down the road from Kelso Station. I wasn’t lost, I just wanted to drive through the park, to feel it again.
In the light from my FJ Cruiser’s headlamps I could see the burned Joshua trees on my right, mile after mile of them, the bark charcoaled strangely smooth, tens of thousands of black silhouettes, even with the FJ’s high-beams full upon them, passing me in the night.
The next morning I didn’t have the heart to go back to the Joshua trees, so I drove instead to the Hole-in-the-Wall campground, one of only two established campgrounds in the Preserve (the other is right next door to Hole-in-the-Wall). There’s a short trail there, the Ring Trail, that I thought would offer good subject matter for my iPhone comparison images.
The campground is nestled alongside a large rock outcropping—it must have a name but I don’t know what it is—and the trail starts at the south end of the parking lot and circles the outcrop. It’s an easy trail, the only minor difficulty coming as it crosses through Banshee Canyon—named for the noises the wind, which can be very strong here, makes through the rocks. In the small canyon the trail reaches a point where you have to ascend a “slot” using metal rings that have been installed in the rock wall. I’d hiked this trail twice before and enjoyed it both times.
As I rounded the outcropping I found myself in full shade and I was curious how the two phones would interpret the scene. From the previous day’s images of the burned Joshua tree forest I suspected that the iPhone 12 Pro Max would go for a contrastier look (which, by the way, gives the illusion of greater sharpness) while the iPhone 11 Pro Max would take a somewhat more realistic approach in its tones, more accurate albeit not as exciting on a primal level.
This pair, both shot with the cameras’ longest lenses, shows very little difference at all. In both, the highlights are turned up too high for my taste—you might almost think these were shot in full sunlight at first glance save for the lack of sharp-edged shadows—but the colors are very similar. The iPhone 12 image, which is closer-in due to the longer focal length of its portrait lens, looks almost like a crop of the iPhone 11 image, it’s that similar.
Now look at these next two images. It’s the same scene, the tripod hasn’t moved, only now we are using the widest lenses on the cameras (I refer to the devices as “iPhones” and “cameras” interchangeably). Look in the center and you can see the same cactus and rocks as before, now a small part of the whole, the outcropping itself dominating the scene.
Notice anything different?
The iPhone 11 has gone for a more-dramatic-than-life look—remember, we are in full shade here—with a contrasty brown outcropping and a blue-bathed foreground. In person, the scene was much lower in contrast. While the iPhone 11 went for a little drama, the iPhone 12 doesn’t hold back at all; it is interested in full-out melodrama, the bigger the better. The iPhone 11 has cliffs? Ha! The 12 has cliffs with deeper browns and even more contrast. Foreground in shadow? Not anymore, it’s local-contrast time, the iPhone 11’s blue tint corrected away. And the sky? The iPhone 12 makes it bluer.
That’s quite an interpretation.
Now, two points to make here before we go on.
First, the “look” the phones are producing isn’t in any way an error. There’s nothing wrong with that look—it’s the same look you see from many Internet-forum photographers (indeed, from many “professional” photo gear reviewers): pop the colors as high as good taste allows—and, screw that, just pop those colors all the way. People love it. If you were writing the software what would you do—make the colors that please almost everyone? Or craft image-processing algorithms that will make the guy standing alone near an outcropping in the middle of the Mojave desert happy?
Second, Apple is working on making me smile. At the Apple iPhone event they not only announced the new phones (the new cameras) but also Apple ProRAW, a new image format. Why does it matter? Normally, when you take a photo with a camera, including with a phone, the image is captured and then processed by the camera. All cameras do this, making decisions about color and contrast for you, and then “baking in” the result. Some cameras, like the Fujis, have extraordinarily good colors straight out of the camera whereas others, like the Sonys, tend to have less pleasing colors, necessitating some post-shoot tweaking.
Now, you might wonder why you can’t undo whatever the camera did, turn off its choices and make your own? Well, you could undo it all in theory, except that the camera, when it cooks the image, throws away all of the data that it thinks isn’t useful anymore, only keeping the parts that you can see in the image—even using some of the principles of perceptual psychology to throw away stuff you can see but that you probably won’t notice missing. This sounds bad but it is, in fact, what you want to happen. Unprocessed images look dull and unexciting, and the files—with all of that un-thrown-away data—are much larger than the cooked ones and therefore more difficult to share with others and to store. The uncooked images live in a “raw” file and can be a pain to deal with.
However, since they are uncooked you can do things to them that are impossible with the processed images. You can make the colors much brighter and cleaner, make that sky much darker without unsightly bands of tone appearing, set the artificial sharpening to a lower setting to avoid some of the artifacts that those techniques introduce. Not stuff that most people will want to get involved with but stuff that many serious photographers will be eager to get involved with.
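The idea of “baking in” can be sketched in a few lines of code. This is a toy illustration of my own, not Apple’s actual pipeline, and the numbers are made up: once a tone adjustment is applied and the result is clipped and quantized down to 8 bits, distinct highlight values from the sensor collapse into the same white and can never be pulled apart again, while the raw file keeps them distinct.

```python
# Toy sketch of "cooking" an image (my own illustration, not Apple's
# pipeline): apply a tone adjustment, then clip and quantize 12-bit
# sensor values down to 8-bit ones, as a finished JPEG would.

def bake(raw_values, exposure=2.0):
    """Cook 12-bit sensor values into clipped, quantized 8-bit values."""
    cooked = []
    for v in raw_values:
        scaled = (v / 4095) * exposure                 # hypothetical tone adjustment
        cooked.append(min(255, round(scaled * 255)))   # clip and quantize to 8 bits
    return cooked

sensor = [1000, 3000, 3500, 4000]   # the three brightest values are distinct in the raw data
jpeg_like = bake(sensor)            # [125, 255, 255, 255] -- the highlight detail is gone
```

Editing the cooked list can never recover the difference between 3000, 3500, and 4000; editing the raw list can. That, in one toy example, is the case for a raw file.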
Apple ProRAW is Apple’s version of a raw file but it is still in beta and the details of how it will work are not yet fully clear. But the prospect of having a solid raw file for the iPhone, which incorporates some of the computational magic, holds great promise for creative work. Many of the issues I’m noting here may well be eliminated if the right controls over the raw file are made available.
Of course, to be fair, I didn’t even try to massage these files to fit my taste. I wanted to share them with you unadulterated—what would be the point otherwise? But Apple ProRAW will nonetheless be better, as I will show you when I am able.
Let’s look at more photographs.
Two images of petroglyphs. It’s hard to say with any certainty which of these drawings were truly made by the ancestral members of the local Native American tribe that once lived here and which were made later, which were done for some religious purpose and which were done by kids just messing around, but it’s not hard at all at this point to say with great certainty which image was shot by which phone. I think I have the iPhone 12 Pro Max “look” down now, and the identity of the phone just jumps out at me from the images alone. You see it too, right? Higher contrast, cooler colors, even heading a tad towards green sometimes?
The rock here is pocked with “holes” all across its surface, big enough to comfortably put your fist into, little caves about six to eight inches deep. I thought this a good subject to test out the LiDAR of the iPhone 12, the radar-like technology that is supposed to measure the distance of every part of the subject from the camera and then use that information to create fake blur, mimicking the depth-of-field effects of a large-aperture lens on a full-sized camera. Small-sensor iPhones can’t do attractive blurry backgrounds—bokeh—due to physics, but they can do an algorithmic version—faux-keh. At least they can try.
The faux-keh mode of the iPhone is called Portrait Mode and I suppose it is optimized for photos of people’s faces, with hair and eyes, and not much optimized for Joshua trees or for weird, lichen-covered rock holes that recede inward while the closer parts of the image surround their edges. In other words, a backward depth map, where the head-shaped thing is farther away than the area around the head-shaped thing, which would normally be the distant background.
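The basic recipe behind depth-driven fake blur can be sketched very simply. This is a toy one-dimensional illustration of my own, not Apple’s algorithm: each pixel is averaged with a neighborhood whose size grows with its distance from the focus plane, so in-focus detail stays sharp and out-of-focus detail smears—and you can see why a depth map that reads “backward” would smear exactly the wrong parts of the picture.

```python
# Toy 1-D sketch of depth-driven synthetic blur ("faux-keh"); my own
# illustration, not Apple's algorithm. Pixels whose depth is far from
# the focus plane get averaged with more of their neighbors.

def fauxkeh(pixels, depths, focus_depth, strength=2.0):
    out = []
    for i, d in enumerate(depths):
        radius = int(abs(d - focus_depth) * strength)      # farther from focus -> bigger blur
        lo, hi = max(0, i - radius), min(len(pixels), i + radius + 1)
        window = pixels[lo:hi]
        out.append(sum(window) / len(window))              # simple box blur over the window
    return out

# A sharp edge at the focus depth survives; the same edge away from it smears.
edge = [0, 0, 0, 100, 100, 100]
in_focus  = fauxkeh(edge, [1] * 6, focus_depth=1)   # stays a hard edge
out_focus = fauxkeh(edge, [3] * 6, focus_depth=1)   # smeared toward uniform gray
```

If the depth map were inverted—the rock holes reported as near and their rims as far—the sharp and smeared regions would simply swap, which is more or less the mess the photographs below show.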
Let’s see how the iPhones and their advanced technology handle this difficult assignment.
Ugh. That is a mess. A regular camera, with a dumb lens simply obeying, as it must, the laws of physics, would have had no problem with this subject, even at f/2 (iPhone 12) and f/2.2 (iPhone 11), where I had the fake f/stops set. The iPhones, especially the LiDAR-equipped iPhone 12 Pro Max, were lost, producing a strange, ugly, flat faux-keh that looks like the image file was corrupted in some unfortunate, irrevocable way and not at all like a purposeful creative choice. Things will get better, I’m sure, eventually.
After seeing those results I thought I would try again with a three-dimensional shape that was a little more like a human head, to put the software on more familiar ground.
Now, sometimes when I made these comparison images I would forget to turn on the self-timer. On both iPhones, as soon as the phone goes into sleep mode, it forgets the self-timer setting. You can set the phone to remember most everything else during sleep, but for some reason you have to reset the self-timer—a multi-tap process—every time. As a result I’d sometimes shoot an image without a self-timer and then reshoot it again with a self-timer.
With all of that in mind, have a look at the faux-keh in this example. I’ve made it into a looping video so you can see the edges of the cactus, where the faux-keh transitions from focused to blurry, more clearly.
And here’s another example of the same thing.
Note again the differences in the transition and also the weirdness at the top of the left cactus where the dark thing is in focus in one shot and not in the other. Not good.
What you are seeing in these videos is not the iPhone 11 vs the 12 but the same phone with images taken seconds apart. The only difference is that on one I set the self-timer, on the other I didn’t. The first video is both iPhone 11 images, the second video is two iPhone 12 images.
Portrait mode and its faux-keh seem to be highly unpredictable, at least when not shooting human heads. In my experiments here and in my shooting the day before, the Portrait Mode was not useful, the technology didn’t work well enough to rely upon it.
The trail around the rock outcropping passes through a fence and it seems that you are on private property. There are cow patties everywhere and a ranch off in the distance a bit. The ranch really is private property, one of the many holdings still within the Preserve boundaries and my understanding is that they have a lease agreement in place with the National Park Service to graze their cows on this land.
They also graze their horses. I came across these two as I rounded a turn and they looked at me quizzically and I likewise at them. Then they began to walk toward me. I was torn. Though I know nothing about horses, I did, I confess, have a John Grady Cole moment, a Cormac McCarthy-land vision of me petting the horses, stroking their long faces and shooting a selfie doing it to show my kids. I glanced around for the ranchers but saw no one, then I walked on along the trail, my pretty horses stopping as they watched me go.
As I write this I regret not petting them then. I don’t know why I chose to walk on.
To my right was Banshee Canyon but to my left was another rock outcropping, much smaller than the one I was walking around on the trail but also much more picturesque. I hadn’t made any Sun-in-the-frame images yet today so I shot several here, moving around, purposefully looking to see to what degree the Sun would cause flare. Here’s a small gallery of images of that outcropping.
I enjoy the flare in the images but I find the Sun all blobby and misshapen, and far too large due to bloom. I wish I could have the flare on the iPhone images and have the Sun look like the Sun.
As I rounded into the canyon the walls closed in and I started up a rocky incline, not steep. About one hundred feet from the canyon entrance I felt my heart beating in my chest. I was surprised. The hike hadn’t been difficult at all and the approach into the canyon was hardly challenging. But I knew right away what was going on. I was dehydrated, perhaps dangerously so.
The year before, when I was finishing up my residency in the Preserve, I took the last day off and decided just to have fun. I looked in my off-road guidebook (the Preserve is almost entirely off-road) and picked out a hike not far from Hole-in-the-Wall. The trailhead required me to find my way with a GPS down various tracks in the sand and then to hike into a canyon along a dried riverbed.
The hike in was more challenging than I expected. There was no trail of any sort and it was hot and I was required to clamber over rocks and boulders along the river’s path to make progress. There were cow patties here, too—the ranchers, according to my guidebook, used the canyon as a natural corral, penning up their herds with a minimum of effort. There were cow patties and also cow bones, entire cow skulls, white, sitting on the ground looking store-bought new.
The guidebook promised a multi-colored rock wall as the payoff for all of this effort and I hiked for a while and then I thought I’d take a break. Not only were the rocks a little difficult but they gave rattlesnakes wonderful hiding places and so I had to be alert all of the time. When I stopped I noticed right away that my heart was beating much harder than normal. Then, immediately after, I noticed that it wasn’t slowing down. I could see it on my Apple Watch, my heart going and going yet I was just standing there.
There is no cell signal of any kind in this part of the Preserve, I hadn’t told anyone about my plans for the hike, my Smurf Blue FJ Cruiser was too far from the main dirt road to be seen—and even if someone happened upon it nothing would have seemed amiss—and so it occurred to me that I might be in trouble. My little water bottle held only sixteen ounces (it wasn’t that long of a hike) and it was already half gone.
I drank half of the remaining water, looked down the canyon wistfully, and sat there on a rock, waiting a long time until my heart rate had lowered enough to head back. I drank the rest of the water as I started, on the theory that the water was better inside my body than in my bottle. It was a slow walk back, stopping often to rest, to slow my heart.
And so here I was again, just a couple of miles from my hike last year and I was in a similar situation. Yes, I should have brought more water—I was carrying the same sixteen-ounce bottle, and it was again already half gone. My heart rate was at one hundred and twenty-seven beats per minute and it took about twenty minutes for it to drop to eighty-five beats per minute, the number I had chosen to signal my start out of the canyon and back along the trail. At first as I waited I was standing in the Sun. I tried to sit on a rock to rest but it was uncomfortable, so I photographed the rock instead.
Look at the iPhone 12 photograph and imagine I’m holding a large white sheet of foamcore, reflecting light back into the scene. That’s what it looks like to me. Notice especially the shadowed area to the right of the rock. Isn’t that strange? In the iPhone 11 version my eye is drawn to the top, sunlit surface of the stone. Everything else frames it, adds to it. In the iPhone 12 image I’m caught by the vertical surface facing us, all alight, the rest of the scene subservient to that surface. These are not just two different pictures, they are two different aesthetics.
The Sun was on me and I wondered whether the heat from it, though it was not a hot day, was a contributing factor to my elevated heart rate, and so I moved a few feet deeper into the canyon, into the shade. Here I did find a comfortable rock to wait out my twenty minutes before I started the return trip. I made a photograph of the canyon entrance, looking out, joking to myself that if they found my body here they would find this photograph, my last image which would, in some way, sum up my life’s work.
These are not the same photographs. In the iPhone 11 image the trail out is the subject of our attention. In the iPhone 12 the canyon walls speak loudest. It looks like I fired a powerful strobe onto the canyon walls and into the foreground. I do like the foreground colors—so full of life. What I need is something halfway between the two, halfway between drab semi-realism and exaggerated pop.
There were more horses on the reverse hike and the horses were on the trail this time, blocking it. I made a phone vs. phone comparison image using the longest lenses.
More contrast on the iPhone 12, yes, of course, but the sky is identical, which hasn’t always been the case. In this second example we see the contrast increase, plus a shift to warmer colors on the iPhone 12 (opposite of what we’ve seen many times in the past), and the sky in the iPhone 12 goes darker and perhaps bluer. With iPhone photography, at least using the standard Photos app, you never know quite what you are going to get until you get it.
One more horse photograph. This guy was behind me as I made the last pair of images and when I heard him approaching I turned and took his photo, too. Here the iPhone 12 seems most realistic, capturing that sense of sunlight in the desert, offering slightly better detail in the shadowed areas of the horse’s side and better rendering the sheen on its lighted flank.
As he approached I thought I was getting another All the Pretty Horses opportunity but then he stopped his approach and just stared at me. I waited a few seconds and then raised my arm, outstretched, fist down as if I had some treat, to encourage him to approach. He reacted immediately, ears back, shaking his head violently and exhaling strongly with a sputtering horse-sound. I got the clear impression that he wanted me to leave and to leave now. So I did.
Back at the FJ I took a long break, drinking long draughts of water. I sat in the driver’s seat, door open, for half an hour before I decided to go back to Banshee Canyon but from the other direction along the trail. I wouldn’t go as far as the rings but I would almost close the circle, at least, almost cover the entire hike. Plus the views from this side, I remembered, were worth it all by themselves, indeed they were the payoff for the entire hike going the other way.
At an overlook in the canyon the Park Service has built a safety rail over a dangerous portion where there is a steep drop and no way out from below. The Sun was peeking in through a “V” in the rocks and it gave me a perfect opportunity to shoot flare images against a dark background, maximizing the visibility of the flare with the Sun almost level with the camera. It’s maximum flare time.
On the way to the parking lot and my waiting FJ I made another image that yet again shows how different the feeling of the images made by the two phones can be.
The iPhone 12 has gone for a much bluer and greener look in the shaded areas than the iPhone 11, which looks much more accurate. The newer phone sure does love those blue blue skies.
As I pulled out onto the road I looked out over the desert at the mountains in the distance, orange-purple lit by the setting Sun, and I stopped the car, got the tripod and the phones back out, and ran a dozen yards out into the desert to shoot one last image. The phones produced pictures that are so very different from each other.
Ignore the tilt of the iPhone 11 image—I bumped the camera on its holder in my haste—and ignore the somewhat closer view of the iPhone 12’s long lens, and just look at the image, look at the colors, feel what there is to feel.
Both cameras are saying something here but they are speaking different languages.