iPhone 11 vs 12 In the Burned Joshua Tree Forest

(This post is part of an ongoing review of the iPhone 12 Pro Max.)

When I first turned down Cima Road from I-15, I was relieved. The news reports in August had been depressing: nighttime videos of firefighters battling a forest of Joshua trees aflame, one clip showing what appeared to be a firefighter, fully geared up, crying at the loss.

The Joshua Tree forest in the Mojave National Preserve is the largest in the world, larger than that in Joshua Tree National Park, and the most visually stunning part of that forest is upon the volcanic Cima Dome and the surrounding area.

There is no lodging within the Mojave National Preserve, and in Baker, the only real town along its edge, there is just one private rental, an Airbnb in a small house in the tiny residential area at the back of town. When I was the artist-in-residence in the Preserve last year I stayed two weeks at the research center in Zzyzx, about seven miles west of Baker on the shore of the dry Soda Lake, and two additional weeks in the National Park Service employee dorms in Kelso, down in the middle of the park next to the Kelso Dunes (which everyone has seen as the desktop image of Apple’s macOS Mojave).

No lodging means no tour buses, no hordes of flip-flop wearing, t-shirt buying day-trippers, not many people of any kind, in fact. For all practical purposes you are alone in a national park.

When I saw the fires on cable news I wanted to go back to the Mojave to see what had really happened. There’s a “Weather Channel effect” we are all familiar with, where the television news shows just the worst of the damage from a storm and you get the impression that a major disaster has occurred, only to discover that the actual damage was limited and localized.

That was the source of my relief, as I entered the park and as I drove, the desert pristine and lush on both sides of the road.

I say “lush” and I really mean to use that word. If you haven’t been to the desert you might be under the impression that it’s all sand and dead cow skulls. That’s the Weather Channel effect at work. But, of course, deserts vary greatly. Some really are bare, blowing sand and cow skulls, and some aren’t even in the hot parts of the world, the word “desert” referring to rainfall, not temperature. The Joshua tree forest in the Mojave is lush not just from the abundant Joshua trees but from the thick ground cover. It’s hard to walk through the desert there without stepping on some small growing thing, and it’s almost impossible to walk in a straight line, various bushy plants forcing you to snake your way forward. I wish I were a botanist and knew more of what I was looking at, because there is so much there.

When I saw the first burned tree on the right I thought, “Oh, there’s one,” an expected curiosity, and then I saw another, then another. Then they were in clusters, and abruptly all of the trees on the right-hand side of the road were black.

I had brought my new iPhone with me and I still had my old one, so I fixed them to a bracket that could be mounted on my tripod to shoot them side by side and compare the photographic capabilities of the iPhone 11 Pro Max with its successor.

There are only a handful of established hiking trails in the Preserve—three come to mind—so to properly experience the desert here you need an off-road or at least high-clearance vehicle in order to turn down one of the sand tracks that cut through the land in various directions. For this trip, though, given I had only two days, I limited myself to paved roads and what I could hike to quickly. The Teutonia Peak trail is one of the three established hiking trails, and I’d already hiked it several times, both during my residency and just after it ended last October, when my wife met me in the desert and I took her around to show her the highlights.

The trail starts at the small parking area along Cima Road and heads more or less straight toward the peak.

This is what you see as you get out of the car. Teutonia Peak is in the distance, the gradual rise of Cima Dome off to the left, the hiking trail at the right edge of the frame. The first thing I noticed about the images is how well the cameras handled having the sun in the frame. Despite the intense light hitting the lens full-on, the ground is well exposed and detailed, not backlit or washed out as you might have expected.

This is shot on full automatic, as are most of the photographs I made here, in order to test the camera as part of the system. It’s the system I really care about, not each link in the image chain considered in isolation, and when testing new lenses or cameras (or, in this case, a multi-lens, multi-camera system) I want to see how well it does making its own choices. If it makes good choices—and computerized cameras are making better choices all the time—I can focus my attention on adjusting those choices or thinking more deeply about the non-technical aspects of image-making. It’s like having staff. If they are good you let them do more and more of the work while you spend your limited time on high-value tasks, the stuff that you are good at, the stuff that makes you special.

In the images you can see the desert floor in front of you, thick with life. And then, about thirty feet away it all empties out and the Joshua trees look dark and silhouetted.

In the close-up of the lower middle of the images (above) you can see some of the differences in processing. The dark areas of the iPhone 12 image are blacker than those of the 11, and the highlights may be a tad brighter, both of which result in greater contrast. Astounding detail is visible in both images, and if you look carefully you will start to see some smoothing and other processing compromises, but of course it is silly to view the image this closely and then criticize artifacts that are wholly invisible under real-world conditions. But take note of that contrast. We will see it again in other images.

In this next image pair, taken a few yards up the trail, we can see the fire damage more clearly: the blackened trees and, for the first time, the missing undergrowth. Compare the desert floor here with the prior images to get a better sense of what is gone.

The sky in these is nearly identical, yet the trees and ground in the iPhone 12 image are snappier. This is the increased contrast again, the same as we saw before, apparently applied only to the non-sky areas of the image. The iPhone is not just a camera with settings but a camera integrated with a computer, and so the programmers have done what you and I would have done—they wrote code to identify different parts of the image, because it would be cool to process each part in a customized way rather than simply subject the entire frame to the same processing. So the sky is treated one way, as we see here, the trees and ground another. No doubt they can pick out water, faces, animals, cars, smooth areas versus textured areas, and on and on. Think of it as having an expert printer at a custom lab processing your picture. You may not agree with all of their choices, but they are very smart, have lots of experience, and are masters of their craft.
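If you want to picture what that might look like in code, here’s a minimal sketch in Python with NumPy. It’s only my guess at the general shape of the thing, not Apple’s actual pipeline; the segmentation mask, the curves, and the numbers are all invented for illustration.

```python
import numpy as np

def process_by_region(image, sky_mask):
    # image: float array in [0, 1], shape (H, W, 3)
    # sky_mask: boolean array, shape (H, W), True wherever the segmenter decided "sky"
    out = image.copy()

    # Treat the sky gently: compress the highlights so the glow keeps its gradations.
    sky = out[sky_mask]
    out[sky_mask] = 1.0 - (1.0 - sky) ** 0.8

    # Treat the trees and ground with more punch: a contrast boost around middle gray.
    ground = out[~sky_mask]
    out[~sky_mask] = np.clip(0.5 + (ground - 0.5) * 1.3, 0.0, 1.0)

    return out
```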

But, to contradict the analogy, they are incredibly dumb sometimes. Take this example from a burned Joshua tree a few feet away.

Both iPhones have portrait mode but the new iPhone 12 promised a greatly improved Portrait Mode due to LiDAR. What is Portrait Mode and what is LiDAR?

Portrait mode is essentially the camera set to its longest lens, but with additional software options. Some of the options involve “looks”—stylized effects that mimic different kinds of lighting. The other option is a fake background blur, fake bokeh, “faux-keh” if you will, that tries to mimic the shallow depth of field of real lenses on real cameras.

LiDAR is a sort of radar for cameras. It can tell you how far away things are, even small parts of the image. Imagine a grid laid over the screen, with a number inside each cell representing the distance of whatever is in that cell. That gives the right idea. If you can make the cells very small you can make a high-resolution map of distances, which is very useful when deciding how blurry each bit of the image should be. Do it right and you could have a sharp area of a photo (say, a person’s eyes) and, as you look further back along their head to their ears to their shoulders, the image gets progressively (and artificially, due to the processing) blurrier, looking just like an image produced with a lens on a traditional film or digital camera.
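Here’s a toy sketch, again in Python, of that grid idea: a coarse map of distances and a blur radius that grows as each cell moves away from the plane of focus. The depths, the focus distance, and the scale factor are all made up; only the shape of the computation matters.

```python
import numpy as np

# A coarse grid over the frame, each cell holding a distance in meters.
depth = np.array([
    [4.0, 4.0, 3.5, 3.5],
    [3.0, 1.2, 1.2, 3.0],   # the subject (the eyes, the branch) is about 1.2 m away
    [3.0, 1.2, 1.3, 3.0],
    [5.0, 5.0, 5.0, 5.0],   # far background
])

focus_distance = 1.2   # meters: whatever sits here should stay sharp

# Blur radius grows with distance from the plane of focus (arbitrary scale of 2 px per meter).
blur_radius = np.abs(depth - focus_distance) * 2.0

print(np.round(blur_radius, 1))
# Cells at the focus distance get 0 (sharp); the far background gets the most blur,
# which is what a fast lens on a traditional camera would do optically.
```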

Easy to do with a big sensor or a piece of film, hard to achieve with a tiny sensor. LiDAR should make a big difference here.

But it doesn’t. In fact, it’s a mess. The iPhone 11 without LiDAR is greatly handicapped: it is essentially looking at an 8×10 flat print and trying to decide how to blur things, while the iPhone 12, with its newfangled LiDAR, has its fancy depth map. But where the iPhone 11 image looks pleasing and plausible, the iPhone 12 image is just gross, as if some big blurred thing were behind that branch of the tree, something that happened to line up just right. The LiDAR-assisted camera has screwed up here, big time. To be fair, the feature is called “Portrait Mode,” not “Joshua tree branch mode,” but I was hoping for a little computational magic.

Here’s another try. For this example I made three images: one each from the iPhone 11 and the iPhone 12 from the same position, and a third from the 12 after moving the tripod. Because of the longer focal length of the lens on the 12, its image is a bit more “zoomed in,” so I backed the tripod up a bit, changing the perspective slightly but making the Joshua tree branch roughly the same size as on the 11. Again I expected the iPhone 12 with its LiDAR magic to make short work of the iPhone 11, and although the 12’s images aren’t complete failures this time, it still doesn’t do as well as its older cousin.

The blurred background effect is one of the big things traditional cameras do much better than iPhones and closing this gap would mean that I could carry the iPhone more and drag around the Fuji less. Not yet.

In this image of a tree trunk I was curious how well the cameras would handle the whites of the inner body of the tree in close proximity to the charred parts, especially the charred parts in shadow. This is again with the 2x and 2.5x lenses, and the images show a slight color shift, warmer on the iPhone 11, but are otherwise remarkably similar. The iPhone 12 doesn’t have the increased contrast we have seen in earlier examples. If I hadn’t seen the camera info (and, of course, if I had not made the images) I could easily have been convinced that these were made with the same camera.

The white of the trees was unexpected. I don’t know if this is the natural color of the interior of a Joshua tree or whether the color comes from having been cooked. In any event, it was striking.

Lens flare is normally something lens makers brag about eliminating. Apple’s box art for the iPhone 12 Pro, while probably really trying to show off the surfaces of its lenses, seems to show flare, big time, and almost to brag about having it.

Maybe Apple is quite aware that character in a lens counts and interesting flare is a way to add some of that character?

Here are a few examples. Note that even small changes in the position of the lenses resulted in large differences in the amount and placement of flare within the image, and it was difficult for me to get the flare on each iPhone to match. Except for the bright ball of colored light, which is far too attention-grabbing, the flare from these lenses is attractive and very similar between the two phones.

All cameras do well in good light. Their limitations are revealed in adversity.

The Sun had almost set when I made this pair of images (above) using the main camera of each phone. There’s a pronounced difference in white balance, with the iPhone 11 producing the more realistic colors and the iPhone 12 again increasing the contrast in odd ways.

In this pair, just after the Sun fell behind Teutonia Peak, the iPhone 11 goes for a darker look, more brooding, while the iPhone 12 bumps both the exposure and the contrast, and changes the glow of the Sun behind the peak from a soft radiance to what looks like some sort of massive detonation, with bands of lightness and color drawing attention to itself. These are very different images.

But the choices made by each camera were erratic. Here is another pair of images (above) made just a few minutes later, along the hiking trail, which the firefighters must have used for a defensive line during the forest fire. For most of the length of the trail, until you get near the peak itself, the desert to the left is burned while the desert to the right is nearly pristine. You have to imagine on your own the left-hand side of this photo in flames, smoke pouring upwards. My photographs here look nearly the same, with the exception of the iPhone 12 blowing out the sky more than the iPhone 11.

What seems to have happened in the above images is that the iPhone 12 increased the exposure and then, to get nice blacks, either raised the black point or increased the overall contrast. The foreground of the iPhone 12 image is unusual but not altogether unattractive; the sky, though, is again destroyed.
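If that guess is right, it’s easy to see why the sky suffers. Here’s a tiny sketch, with invented numbers, of what pushing exposure and then raising the black point does to a row of pixel values: the shadows come back down to black, but whatever clipped in the sky is gone for good.

```python
import numpy as np

pixels = np.array([0.02, 0.2, 0.5, 0.85, 0.95])   # deep shadow ... bright sky

exposed = np.clip(pixels * 1.5, 0.0, 1.0)          # exposure push: the sky values hit 1.0 and clip

black_point = 0.08                                 # raise the black point to get "nice blacks"
final = np.clip((exposed - black_point) / (1.0 - black_point), 0.0, 1.0)

print(np.round(final, 3))   # shadows pulled back to black, but the clipped sky detail is lost
```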

In these images the foreground has been processed in the same way as we have seen before, with the iPhone 12 preferring a colder, more contrasty look, even in full shadow, while the iPhone 11’s look is more natural. There is less difference in the mountains.

It is worth noting that what you see on the screen is not always an accurate representation of what you will get in your file. This is especially true when you are shooting in low light and the phone’s computational algorithms begin to dominate. In these images (below) you can see two post-sunset pictures, both having the same sort of differences we’ve seen in the above examples, except now it is the iPhone 11 blowing out the sky.

I took screenshots of both previews. Both treat the sky much more gently, with smooth colors and gradations. The problem causing the sky to blow out is clearly not some barrier of physics; it must instead be due to misguided choices on the part of the phones.

As it became darker still, the iPhones found themselves on surer footing again. They seem to struggle most in the middle, twilight period. In these two images below, both at the same shutter speed, both phones render different yet attractive images, the iPhone 11 perhaps a tad too dark, yet capturing the mood of the moment, the iPhone 12 perhaps a tad too contrasty, yet a plausible and beautiful result, at least until you remember that this forest was lush and alive just a few weeks back and wonder how many generations it will take to restore it to its full health and glory.
