The puzzling, blurry iPhone 14 pro
My wife got an iPhone 14 Pro when it first came out, and she has told me ever since that it's just "blurry" — for a while we even wondered whether the hardware was bad. It took us a long time to figure out exactly why it made so many soft photos, and I thought you'd like to know why.
The short version
Pictures of our puppy and jewelry demo videos — these might not sound like things that would trigger the same camera problem, but they are. What both have in common is a subject about 8" across. When you hold the new Pro iPhones close enough that the frame is about 6–12" across, they can't focus with their huge 48MP "1x" lens, and that turns out to be an important range.
To work around this, the Camera app switches to the "ultrawide" macro lens (which has a much lower-quality sensor, but can focus close) and then leans heavily on digital zoom, cropping down to roughly a 3MP portion of that sensor. So, short answer: at this distance it's kind of like using an iPhone 4 from 2010. It's blurry, and it's worse than older phones.
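To put a number on that 3MP figure: digital zoom discards pixels with the square of the linear crop factor. Here's a quick illustrative sketch — the ~1.85x crop factor is my assumption, from simulating the 1x (~24mm-equivalent) frame on the ~13mm-equivalent ultrawide:

```python
# Rough sketch: megapixels left after a digital (crop) zoom.
# Pixel count falls off with the square of the linear crop factor.

def cropped_mp(sensor_mp: float, crop_factor: float) -> float:
    return sensor_mp / crop_factor ** 2

# Assumption: matching the 1x (~24mm-equiv) framing from the
# ~13mm-equiv ultrawide is roughly a 24/13 ≈ 1.85x crop.
print(round(cropped_mp(12, 24 / 13), 1))  # ~3.5 MP left of the 12MP sensor
```

So the "about 3MP" in the short version is just this crop arithmetic applied to the 12MP ultrawide.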
The much longer version, some workarounds, and what Apple is doing to make it better
Of course, a modern iPhone makes a fantastic camera at the right distance — all the test charts are sharp. The 12/48MP sensor is quite amazing: it collects a lot of light, and it's even used for 2x (48mm-equivalent) exposures by cropping a 12MP region from its center, with some very smart signal processing to handle the quad-Bayer sensor. So it sounds like a huge upgrade.
But in exchange for this huge sensor that can capture better photos at night, the phone has quite an Achilles’ heel, which is that you can’t use this 1x optic close up, and the workarounds aren’t very good either.
Close focus numbers
Recently, Apple added minimumFocusDistance to their camera API, which lets us see what has happened over the years. People who use pro cameras know that lenses vary in how close they can focus, and spec sheets usually report a second number, "maximum magnification," to describe close focus. Maximum magnification tells you how small an object can fill the frame; focus distance tells you how far back you have to stand. Get too close and everything goes blurry. Most people care a little more about the "magnification" number Apple didn't put in the API, which in simple terms answers: can you take a great picture of the puppy's head, or not?
The difference between the two is subtle, but it matters as the focal length changes — as the iPhone's has over the years, from 28mm to 26mm to 24mm. With each of those moves, a constant minimumFocusDistance translates into a smaller maximum magnification. So the way Apple describes their optics does show the change, just not as dramatically as it actually happened. What we want is a measure of the smallest object you can photograph — and for that you need maximum magnification. Anyway, let's look at what's happened:
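To see why a constant minimumFocusDistance still means shrinking magnification as the lens widens, here's a small illustration. The fixed 120mm focus distance is hypothetical, chosen only to isolate the focal-length effect:

```python
# Illustration: with minimum focus distance held constant, a wider
# equivalent focal length frames a wider area at that distance, so
# the maximum magnification (36mm / frame width) shrinks.

FIXED_MIN_FOCUS_MM = 120  # hypothetical constant, for illustration only

for f_equiv in (28, 26, 24):
    frame_width = FIXED_MIN_FOCUS_MM * 36 / f_equiv
    print(f"{f_equiv}mm: frame {frame_width:.0f}mm wide, "
          f"magnification {36 / frame_width:.2f}x")
```

Same reported distance, three different magnifications — which is why the API number understates the change.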
iPhone XS (2018), 28mm, 100mm minimum focus
iPhone 11 pro (2019), 26mm, 120mm minimum
iPhone 14–16 pro (2022–2024), 24mm, 200mm minimum
So you have to stand twice as far back with the 14 Pro as with the XS. However, when we convert these to "maximum magnification" the way we would for pro cameras, we get these numbers:
iPhone XS: 0.28x
iPhone 11 pro: 0.18x
iPhone 14 pro: 0.12x
The smallest object that "fills the frame" is now about 230% as big as it was in 2018 — an even larger change than the focus distances alone suggest, because the focal length widened at the same time.
Just so you know, these numbers are the width of 35mm film (36mm) divided by the width of the smallest ruler you can actually focus on. So if the closest you can focus fills the frame with a 10" ruler (254mm), the lens has about 0.14x magnification.
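That conversion is easy to script. This is an illustrative sketch, not Apple's math: it assumes the frame width at the minimum focus distance is roughly distance × (36 / equivalent focal length), which makes magnification ≈ focal length ÷ focus distance:

```python
# Illustrative sketch: convert a 35mm-equivalent focal length and a
# minimum focus distance into "maximum magnification", assuming the
# frame at distance d is about d * (36 / f_equiv) wide.

def magnification(f_equiv_mm: float, min_focus_mm: float) -> float:
    frame_width_mm = min_focus_mm * (36.0 / f_equiv_mm)
    return 36.0 / frame_width_mm  # simplifies to f_equiv_mm / min_focus_mm

# Ruler example from the text: a 10" (254mm) ruler filling the frame
print(round(36.0 / 254.0, 2))            # 0.14

# API-reported numbers for two of the phones:
print(round(magnification(28, 100), 2))  # iPhone XS: 0.28
print(round(magnification(24, 200), 2))  # iPhone 14 Pro: 0.12
```

Note the naive formula gives about 0.22x for the iPhone 11 Pro's 26mm/120mm, a bit higher than the 0.18x above — real lenses breathe at close focus, so treat the approximation as a ballpark, not a spec.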
Adding it up: how does the iPhone 14 Pro compare with the 2019 model, the iPhone 11 Pro?
Well, of course the new camera is better, except at this magic distance around 8" away, where the iPhone 11 Pro is so much better than either of the options on the new phone. I didn’t believe it until I zoomed in on both of them in Lightroom.
Comparing with pro cameras
To use some examples from actual pro cameras, a Canon 24–70L lens has a maximum magnification of 0.3x, which is pretty high and gives you a lot of flexibility. A Sony 24mm GM prime achieves about 0.17x — typical for a prime, and similar to the iPhone 11 Pro.
But 0.12x, like the 14 Pro's 1x optic, is genuinely low, and you'll only find it on some pro portrait lenses (like an 85mm), which make you step back for a head & shoulders shot and allow nothing closer. Photographers treat these as less-convenient "specialty lenses," because you have to remember that they don't focus close. For the iPhone's main lens to match them isn't great. You'd be happier if it were in the 0.2–0.3x range instead.
With the iPhone 14 Pro, that means the close-in shot of your puppy's head will be blurry due to the optics — or the phone will switch to the ultrawide lens and zoom it up into also-blurry "digital zoom with some AI" instead. There's no way to take the shot that isn't blurry one way or the other.
They've devised an automatic switch that kind of works, but it's sticky. If you trip the "too close" macro flip (around 8–10" from the subject), the phone turns on the ultrawide lens — but to get back to the sharp 1x lens, you have to pull back much farther, to around 12". Since macro mode is easy to trigger, you can be at a distance the 1x lens could handle and still be stuck with worse quality — down to only about 1.5MP at 11". All of this makes it really easy to get a blurry picture at these distances.
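That sticky behavior reads like simple hysteresis: two different thresholds depending on which lens is currently active. A toy sketch — the 9" and 12" trip points are my reading of the behavior above, not documented values:

```python
# Toy model of the sticky macro switch: two thresholds (hysteresis),
# so the lens that switches in at one distance doesn't switch back
# out until you retreat noticeably farther.

SWITCH_TO_MACRO_IN = 9.0   # inches; assumed trip point moving closer
SWITCH_TO_1X_OUT = 12.0    # inches; assumed release point pulling back

def next_lens(current: str, subject_in: float) -> str:
    if current == "1x" and subject_in < SWITCH_TO_MACRO_IN:
        return "ultrawide-macro"
    if current == "ultrawide-macro" and subject_in > SWITCH_TO_1X_OUT:
        return "1x"
    return current  # inside the dead band: stay put

lens = "1x"
for d in [14, 10, 8, 10, 11, 13]:  # move in close, then slowly pull back
    lens = next_lens(lens, d)
    print(d, lens)
```

Note what happens at 10–11" on the way back out: you're at a distance the 1x lens handled fine on the way in, but you're still on the macro camera.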
Some tricks they use to fix it
iPhone 14 Pro has two tricks up its sleeve.
As already described, the first is the "macro" digital-zoom mode, with a lot of digital sharpening, and it works mainly in the Camera app. When you get too close, it activates macro mode (a different lens entirely), and the image is pixel-zoomed and therefore soft at the pixel level. What's interesting is that quite a few third-party apps don't use this trick (some developers have reported they can't figure out how), so those apps simply go blurry at close range, and complaints about close photos have been frequent for some of them.
But there's also a clever workaround: the 14 Pro can make a cropped 2x picture from this 12/48MP super sensor:
iPhone 14 Pro at 2x: 0.24x!
And now we're back to a useful magnification range — it's convenient and awesome, even though it stretches the limits of the quad-Bayer sensor. But it is really hard to discover that you can do this. By far the most discoverable path is the pixel-zoomed "macro" ultrawide lens, which just isn't very good. So IYKYK, but otherwise you get blurry shots.
To avoid blurry pictures on the 14 pro, you have to have at least a vague understanding of all this loaded into your head, and then you have to step back and make sure to use the longer 2x (48mm) lens.
Slightly better on the 15 Pro, and better again on the 16 pro
While these optical challenges haven't gone away (the lenses seem quite similar on the newer models from 2023–2024), the 15 Pro added "lenses" that are basically crops of the 48MP sensor. Using one of these 28mm or 35mm presets gets you partway to the 2x lens, without having to back away from your subject as much, and without all the quality loss from the quad-Bayer layout. Let's write out those magnifications so we can see the numbers:
iPhone 15 Pro at “28mm”: 0.14x
iPhone 15 Pro at “35mm”: 0.175x
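These presets all crop the same 1x optic, so — my assumption — they share its 200mm minimum focus distance, and the magnifications fall straight out of dividing the equivalent focal length by that distance:

```python
# Sketch: each preset crops the 1x lens, so it keeps (assumed) the
# same 200mm minimum focus distance; magnification then scales
# linearly with the equivalent focal length.

MIN_FOCUS_MM = 200  # the 1x optic's minimum focus distance

for f_equiv in (24, 28, 35, 48):  # 1x, "28mm", "35mm", 2x
    print(f"{f_equiv}mm -> {f_equiv / MIN_FOCUS_MM:.3f}x")
```

This reproduces the whole ladder in one line of arithmetic: 0.12x at 1x, 0.14x and 0.175x for the presets, 0.24x at 2x.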
So we're not quite back to just using an old iPhone 11 Pro, but you can kind of get there with these modes, and maybe a video won't flip between lenses this way. Stand back a little, crop a little — at least it isn't blurry. Also, as far as I can tell, third-party apps don't get access to these modes.
What’s new in the 2024 “16 Pro”? Is it fixed?
Well, partly. Two things have changed, and one is better.
Most importantly, the ultrawide camera is now 48MP, which means tripping into the "macro" camera is much less bad than before. You get a much better sensor, and your photos retain around 12MP even when they're zoomed up by 2x.
There are some caveats — video shooters will notice the switch. If you pull back after it happens, it doesn't flip back right away, so footage can stay at lower resolution until the phone switches out to the 1x sensor again, though you may not notice. Most importantly, this trick doesn't work for a lot of third-party apps, which still suffer from close-focus issues. So in all three ways there are situations where an old iPhone 11 Pro will shoot less jumpy video, occasionally sharper stills, and let you work closer in third-party apps like barcode readers. It's not a fantastic result after five years. But it is an improvement.
Also, telephoto for portraits
The second change is that the telephoto side is now longer, but it also leaves a big…gap. Previously the telephoto lens was around 75mm (3x); now it's 120mm-equivalent (5x). But what if you really, really like the 3x–4x range?
A gap from 24mm to 120mm is fairly large, and it creates a new problem for portraits (which, yes, was already there last year on the 15 Pro Max). If you zoom in for a flattering portrait — the classic head and shoulders shot — you usually want to be between 85–100mm, and those focal lengths are now cropped from the 48MP 1x sensor (24mm). That means ignoring and cropping out 90–95% of the sensor information and getting a lower-quality shot. At 100mm (about 4x) you're only using 3MP of the 48MP sensor. It makes for kind of a blurry portrait — daring you not to zoom in.
From experience, focal lengths at 120mm and up mean you can't talk very easily to the person you're shooting (because you have to stand back pretty far), whereas at 75mm you still can. But it might be fine for most people to shoot at 50mm (~2x), where the camera still functions as a 12MP "full resolution" sensor, so this blurriness at traditional portrait focal lengths might not affect as many people as the close-up case does.
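The megapixel figures above follow from the same crop arithmetic used earlier: pixels drop with the square of the focal-length ratio. A sketch, treating the 1x as a 24mm-equivalent, 48MP sensor:

```python
# Sketch: effective megapixels when digitally cropping the 48MP,
# 24mm-equivalent 1x sensor out to a longer equivalent focal length.

def effective_mp(target_f_equiv: float, base_f_equiv: float = 24,
                 base_mp: float = 48) -> float:
    return base_mp / (target_f_equiv / base_f_equiv) ** 2

print(round(effective_mp(48)))   # 2x crop: 12 MP
print(round(effective_mp(100)))  # ~4x crop: about 3 MP
```

(At exactly 2x the quad-Bayer sensor also bins natively to 12MP, so the crop arithmetic and the hardware happen to agree there.)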
This is not especially new, but upgraders and puppy & baby photographers still should care
You can find posts about the close-up blurriness issue going back to 2022 with the iPhone 13 Pro Max, along with posts from videographers who struggle with the lens switching (which can be disabled), and from people whose third-party apps don't work as well on a new phone.
Low light or close focus? You can have only one
To me, a doubling of low-light performance isn't quite as important as the optics and the convenience of easy close focus, so I hope a future iteration improves some of these optical issues without all the tricks.
For quite a few uses in this 5–12" range, an older design with a smaller sensor (like the iPhone 11 Pro or iPhone X) is just simpler, easier, and better than the newest Apple iPhone Pro models. I think the base-model iPhones are still a bit better than the Pro in this regard, at least if you don't care about low-light exposures.
It’s clear they’re trying, it’s just a real tradeoff, and probably a new optical formula is needed if they want to be sharp at all distances.