With its superior telephoto capabilities and an interesting 200-megapixel main camera, I wanted Samsung’s Galaxy S23 Ultra to teach Apple a lesson in how a flagship phone should handle digital photography. Not because I want Apple to lose, but because I want competition to improve everybody’s smartphone photography.
But even though the S23 Ultra camera specs look better than the iPhone 14 Pro’s in many ways, I’m not going to declare it the better option for photographers. Nor is Apple the clear leader. After shooting hundreds of photos and pixel peeping for hours, this race comes down to what you value most in smartphone cameras.
If telephoto reach is important to you, the $1,200 Samsung Galaxy S23 Ultra is the better camera, as you can tell from the photo below, and it handles night shots better. If you’re sticking with daytime images and are OK with a maximum 3x zoom, I prefer Apple’s iPhone 14 Pro and Pro Max, which start at $999 and $1,099, respectively, for more natural photos and a cleaner, faster camera app.
Read on to dig into the details.
Apple and Samsung have capable wide-angle main cameras
No matter your phone, chances are good that you’ll spend most of your time using the main, wide-angle camera, the best one for photographing nearby people. These are the cameras with the best sensors and optics, for good reason, and Samsung and Apple didn’t skimp on the hardware.
Both phones’ cameras can capture abundant detail and, with today’s computational photography technology, a reasonable range of bright and dark tones. Under ordinary circumstances, neither phone is head and shoulders above the other.
I prefer the iPhone, though. Frankly, both companies are guilty of overprocessing, but Apple less so. Samsung’s instinct is to crank up the saturation until colors are super rich, leaving skies unnaturally blue and grass unnaturally green. Samsung often made the clouds of overcast skies too gloomy and gray. The Galaxy phone sharpens edges so aggressively that tree leaves, blades of grass and other areas with lots of detail become a crispy, jittery mess. In the photo of the boats below, the S23 Ultra suffers from artificial-looking edges around the palm trees and jittery road details in the foreground.
If you’re looking at photos on a phone screen, these vivid, punchy images look more eye-catching. But I often look on laptops too, where the shots seem like an exaggerated version of reality. I prefer a bit more truth to my photos.
Samsung’s face detection on the Galaxy S23 Ultra zeroed in on subjects well for good focus. The iPhone seemed to handle some tricky shots better, though. In one comparison, it correctly focused on a toy block construction, not the child sitting in the background who built it. In a variety of similar scenes, the Samsung went for the faces every time.
Another AI-boosted technology is portrait mode, and here I preferred Samsung’s technology for better separation of foreground and background. The iPhone sometimes would blur parts of the subject while keeping other parts in focus.
I like the S23 Ultra’s ability to take 50-megapixel photos, not just the 12-megapixel default. Apple shoots only 12-megapixel shots that are easier on your iPhone and iCloud storage limits but that don’t offer as much detail.
The S23 Ultra has a significantly wider field of view, which is nice for indoor photos and some landscapes. However, I prefer Apple’s tighter field of view; both phones have ultrawide cameras for when you need to go wider, after all. In this courtyard photo, it’s nice to have a wide field of view, but Apple offered a more realistic rendering of the scene.
Low light: The edge goes to Samsung
When the going gets dark, though, I preferred Samsung’s computational photography skills. The Apple shots suffered more from muddiness in dark areas, and Samsung generally did better at boosting shadows and preserving details that might otherwise have been lost to shaky hands. Note the sharper stenciled text and brighter shadows in the Samsung photo below.
Below, the iPhone 14 Pro brightened the scene more, but the details are far muddier than with the S23 Ultra. Usually the iPhone had more realistic photos, but not this time.
Both phones failed at the challenging task of shooting at night toward a bright streetlight, with enormous, multicolored lens flare artifacts distracting from anything in the photo. Sadly, this is the state of the art for smartphone photography these days, and not just with these two manufacturers.
Telephoto shots: Samsung is way better, obviously
Not everyone ventures far beyond the main camera, but in my mind, giving people more photographic freedom is important. It’s the main reason these flagship phones cost so much more. And I just love the S23 Ultra’s 3x and 10x telephoto cameras. Telephoto shots can tightly frame distant objects, spotlighting details and perspectives that can be more surprising and engaging.
I took dozens of photos to compare the iPhone 14 Pro’s shots upscaled to 10x with the Galaxy S23 Ultra’s native 10x shots. Of course there’s no comparison. Over and over I got shots of my dog off the beaten track in deep snow, my kid running away from waves on the beach and other subjects I couldn’t just walk closer to photograph, like this speaker at a conference.
Given how hard Apple promotes its products as creative tools, the lack of a big telephoto camera on iPhones is embarrassing at this point.
Samsung, like Google and other rivals, gets the long reach with a “periscope” lens approach that uses a prism to angle light 90 degrees inside the phone body. That sideways bend accommodates the otherwise impractically long telephoto optics. It’s a smart approach, and one Apple should mimic.
I used the Samsung at 10x a lot and less commonly with its digital upscaling technology, at 30x and 100x. The 30x shots can be useful when viewed small and can help with jobs like identifying birds. 100x is mostly a novelty unless you’re limiting your photos to postage stamp size.
S23 Ultra’s 200-megapixel photos are a gimmick
One of the S23 Ultra’s headline features is a 200-megapixel mode. Sadly, as implemented, I see it as less than useful. Through an image sensor technology called pixel binning, Samsung assigns each 4×4 group of pixels to capture a single color in, effectively, one larger and more sensitive pixel. The approach works well when taking low-light photos at 12.5 megapixels. But that doesn’t capture as much detail when shooting at 50 megapixels or 200 megapixels, which requires the phone to extrapolate color information.
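The arithmetic behind binning is simple to sketch. Here’s a toy, single-channel example in Python: averaging each 4×4 block of “photosites” into one value, the same 16-to-1 reduction that takes 200 megapixels down to 12.5. (This is only an illustration of the averaging step; real sensors bin same-color photosites on a color filter array, and the function and values below are made up for the example.)

```python
def bin_pixels(pixels, factor):
    """Average each factor x factor block of a 2D pixel grid into one value.

    A simplified, single-channel stand-in for sensor pixel binning:
    combining neighboring photosites into one larger, more
    light-sensitive effective pixel.
    """
    height, width = len(pixels), len(pixels[0])
    binned = []
    for by in range(0, height, factor):
        row = []
        for bx in range(0, width, factor):
            block = [pixels[y][x]
                     for y in range(by, by + factor)
                     for x in range(bx, bx + factor)]
            row.append(sum(block) / len(block))
        binned.append(row)
    return binned

# A 4x4 "sensor" binned 4:1 collapses into a single averaged pixel.
sensor = [[10, 20, 30, 40],
          [50, 60, 70, 80],
          [10, 20, 30, 40],
          [50, 60, 70, 80]]
print(bin_pixels(sensor, 4))  # [[45.0]]
```

The averaged pixel is less noisy than any of its 16 constituents, which is why binned low-light shots look cleaner; un-binning to shoot at the full resolution gives up that advantage and forces the phone to interpolate color instead.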
Indeed, at 200 megapixels, the sensor uses AI technology to try to reconstruct the color data. That, combined with sharpness limits of tiny lenses, means that 200-megapixel photos really didn’t seem to be any better. In one test, I couldn’t tell much difference between a 200-megapixel shot and a 50-megapixel shot blown up to 200 megapixels in Photoshop.
The 200-megapixel photos are also humongous. For a shot of a snow-covered cedar, the 200-megapixel JPEG is 57MB, compared with 21MB for the 50-megapixel shot and 7.3MB for the 12-megapixel shot.
The iPhone uses only 2×2 pixel binning for 48-megapixel photos, and with today’s phone tech, that’s a good maximum. I’d love for iPhones to offer 48-megapixel JPEG or HEIC images, though, to catch up to Samsung.
Selfie time: iPhone takes the prize
For selfies, it was a close matchup. I preferred the iPhone for its more realistic rendering of colors and details like hair and beards. Both phones struggled with my shadowed face in the example below, but the Samsung turned the sky a weird, flat gray.
When shooting portrait mode selfies — where artificial bokeh blurs the background thanks to computational photography — neither selfie camera handled flyaway hair well. But I thought Apple’s iPhone did better overall.
Ultrawide: Apple wins on sharpness
I give the edge to Apple here for its more natural rendering of colors and for considerably better sharpness toward the periphery of the frame, but mostly the cameras are a close match. As usual, the Galaxy S23 Ultra made the skies too blue and the grass too green, and both phones struggled with extreme lens flare shooting toward the sun.
The ultrawide lens is also used for macro shots. I’m glad macro photography has arrived on smartphones, but on both cameras, image quality degrades significantly beyond the central portion of the photo, so don’t get your hopes up. I thought Samsung kept its sharpness better across the frame, though, as in the birthday pie photo below. Both cameras overprocessed the fine structure of the flower below, but the Samsung went further overboard.
Shoot raw if you want the best image quality
Raw photos preserve more of the original photo information for better editing flexibility when it comes to color and exposure, and I recommend shooting raw if you want the best photos. Both the Apple and Samsung raw photos — taking a page from Google’s camera playbook — employ a “computational raw” approach that actually merges multiple photos into one shot, taking advantage of the image processing methods that underpin conventional JPEG and HEIC photos.
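The core idea of that multi-frame merge can be sketched very roughly: average a burst of noisy exposures pixel by pixel, and the random noise cancels while the scene stays put. (This is a drastic simplification for illustration only; real computational-raw pipelines also align frames and reject ghosting, and the function and noise values below are invented for the example.)

```python
import random

def merge_frames(frames):
    """Average a stack of same-size exposures pixel by pixel.

    A toy stand-in for the multi-frame merge behind "computational raw":
    simple averaging already shows why combining shots reduces noise.
    """
    n = len(frames)
    height, width = len(frames[0]), len(frames[0][0])
    return [[sum(frame[y][x] for frame in frames) / n for x in range(width)]
            for y in range(height)]

# Simulate 8 noisy captures of a flat gray scene whose true value is 100.
random.seed(0)
frames = [[[100 + random.gauss(0, 10) for _ in range(4)] for _ in range(4)]
          for _ in range(8)]
merged = merge_frames(frames)
# The merged frame's values cluster much closer to 100 than any single
# frame's do, because independent noise shrinks as frames are averaged.
```

Averaging eight frames cuts random noise by roughly the square root of eight, which is the headroom that lets these phones brighten shadows without the result turning to mush.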
Shooting raw sidesteps most of my problems with conventional Apple and Samsung photos, especially oversaturation, plasticky skin and too much clarity — the contrast boost to a shot’s middle brightness tones. I prefer less sharpening too, and the raw photos let me dial it back, though not as far as I’d like with the S23 Ultra. I suspect Samsung is doing some sharpening in its computational raw processing.
When it comes to shooting raw, Apple’s experience is much better. You need to enable the raw option in the camera app’s formats setting and, optionally, enable the full 48-megapixel resolution option for shots taken with the main camera.
Samsung, in contrast, treats raw photos as something of an afterthought. You have to download its separate Expert Raw app from its Galaxy Store. It’s a useful app if you want to fiddle with exposure, shutter speed and other settings while shooting, but it’s more work for those of us who prefer to just take the photo and fiddle with it later in Lightroom or other editing software. There’s no quick launch option for Expert Raw, either.
You can set Expert Raw to save both JPEG and raw versions of your photos, but the JPEG’s rudimentary processing is far worse than the shots from the regular camera app, with tear-inducing oversaturation and obvious halos ringing subjects in contrasting bright and dark areas. Stick to raw only.
More annoying are the resolution settings in the app. If you set the camera to shoot in 50-megapixel mode, you can only use the main camera. If you want to use the telephoto or ultrawide cameras, you have to step back down to 12 megapixels before they’re available. There’s no option just to shoot at the highest resolution available with each camera.
There’s also no 200-megapixel raw option. Given its marginal image quality benefits, I’m not sure that’s a big problem.
Conclusion: No clear winner
The Galaxy S23 Ultra’s telephoto cameras are expensive components, but worth it if you want to express yourself creatively. Apple’s more natural photos and faster, more capable camera app are a major asset, though.
Either smartphone is a capable photographic tool, but neither is such a clear leader that it’s worth moving from Android to iOS or vice versa. If you want to get the most out of your phone’s camera, the easier decision is to start shooting raw.