Apple’s new iPhone 16 series smartphones are the company’s best-ever handsets for mobile photographers. Among the biggest reasons is the suite of next-generation Photographic Styles, which intelligently and precisely adjust tones and colors in images, nondestructively and in real time, while using the iPhone’s camera.
Next-Gen Photographic Styles Are Computationally Demanding
Whenever a company introduces a new feature or function in its product, whether a camera, smartphone, or software, one of the first questions should be, “Why now?” In the case of the new Photographic Styles in the iPhone 16 family, the answer is relatively straightforward: Because it was only possible to do this now.
Part of why Apple could do this now is that its new processors, the A18 and A18 Pro, are performant enough to handle next-gen Photographic Styles in the way that Apple’s engineers wanted them to be done, including with real-time previews within the camera app and nondestructive and editable looks that can be tweaked or removed after capture.
“If you look at computational photography, we really pioneered this in smartphones starting with HDR photos in iPhone 4,” says Jeremy Hendricks, iPhone Senior Product Manager. “Then we took another huge step forward when we introduced the Neural Engine in 2017. And since then, we’ve been improving the capability with hardware, software, and powerful silicon all combined together.”
In contrast to something like a filter, which applies globally to an image, or even a precise mask, which enables localized editing, the next-gen Photographic Styles apply to varying degrees across different tones and areas of an image, in response to semantic analysis of the subject, situation, and prevailing lighting conditions.
This is a massive computational demand, and performing these calculations while the user composes an image is no easy feat. It is only possible because of a combination of factors: the A18 chips, Apple’s increasingly sophisticated image analysis engine, and a drive to make it work.
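To make that concept concrete, here is a minimal sketch in Swift of the general idea of a locally weighted tonal adjustment: the strength of the effect varies with where a pixel sits in the tonal range and with a semantic mask, such as a skin-likelihood map. Every type, name, and number below is an illustrative assumption, not Apple’s actual pipeline.

```swift
// A rough sketch (not Apple's implementation) of a tonal adjustment whose
// strength varies per pixel, driven by tonal position and a semantic mask.

struct StylePixel {
    var luminance: Double       // 0.0 (black) ... 1.0 (white)
    var skinLikelihood: Double  // 0.0 ... 1.0, from a hypothetical segmentation pass
}

struct LocalToneStyle {
    var shadowLift: Double        // how much to brighten shadows
    var highlightRollOff: Double  // how much to pull down highlights
    var skinProtection: Double    // 0 = ignore skin, 1 = leave skin untouched

    func apply(to pixel: StylePixel) -> Double {
        // Weight the adjustment by tonal position: shadows get lifted,
        // highlights get compressed, midtones are mostly left alone.
        let shadowWeight = max(0, 1 - pixel.luminance * 2)     // strongest near black
        let highlightWeight = max(0, pixel.luminance * 2 - 1)  // strongest near white
        var adjusted = pixel.luminance
            + shadowLift * shadowWeight
            - highlightRollOff * highlightWeight

        // Blend back toward the original value on skin, so the style never
        // pushes skin tones as hard as the rest of the frame.
        let protection = skinProtection * pixel.skinLikelihood
        adjusted = adjusted * (1 - protection) + pixel.luminance * protection
        return min(max(adjusted, 0), 1)
    }
}

// Example: a style that lifts shadows slightly but largely leaves skin alone.
let style = LocalToneStyle(shadowLift: 0.15, highlightRollOff: 0.1, skinProtection: 0.8)
let backgroundPixel = StylePixel(luminance: 0.2, skinLikelihood: 0.0)
let facePixel = StylePixel(luminance: 0.2, skinLikelihood: 1.0)
print(style.apply(to: backgroundPixel)) // lifted noticeably (~0.29)
print(style.apply(to: facePixel))       // lifted far less (~0.22)
```

The real system runs far more sophisticated analysis than a single mask and does it for every frame of the live preview, which is where the computational demand comes from.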
Photographic Styles are also interesting because, unlike some of Apple’s other computational photography features, the new styles come with robust user controls and customization. For many users, the name of the game is getting a good shot with as little fuss as possible. However, for experienced photographers or those with an experimental and creative nature, having hands-on input over the look of their photos matters a lot.
“With our latest generation Photographic Styles, we’re taking that image pipeline further and now we can locally adjust color, highlights, and shadows of a photo all in real-time, and we have a much better understanding of skin tones,” Hendricks explains.
Getting Skin Tones Right Is Very Difficult and Even More Important
Skin tones are as important to get right as they are personal to an individual. Through extensive research led by Pamela Chen, Chief Aesthetics Scientist for Camera and Photos, Apple learned that people have very different preferences regarding their skin tones, so providing everyone with a look they liked for themselves was a significant focus. Apple also knew that toning an image without making people look strange would prove difficult and rely heavily upon Apple’s nearly two decades of iPhone imaging expertise.
As seasoned photographers know, every person’s skin tone is slightly different and challenging to “get right.” Further, what’s “right” is not necessarily what is most accurate — more on that later.
“We dove super deep into the entire history of photography, which is almost 200 years at this point,” Chen says. “But we also talked to people both in front of and behind the camera today. [We thought about] why people take pictures and what they like about them.”
The iPhone 16 models have six undertone styles (Cool Rose, Neutral, Rose Gold, Gold, Amber, and Standard), which Apple built in response to what it learned from real people.
“When we developed the undertone styles, we found that even if two people have the exact same skin tone, even if that was possible, they can have genuinely different preferred renderings of themselves in pictures,” Chen says.
“We found that preference is actually not only because of the color of your skin or where you live, but also where and when you grew up. But also how you’re feeling that day and the situation you’re in. Basically, pictures that you like of yourself are deeply personal.”
This touches on something so powerful about the new Photographic Styles. As Apple has said repeatedly since the iPhone 16 launch event, the latest styles understand and respect a person’s skin tone. This makes it easy to adjust the colors of an image without turning someone into a wacky technicolor rendition of themselves. More importantly, it gives users the tools they need to control how they appear and are represented in photos. While some people never have any issues with how they are shown in pictures, others have long struggled to appear the way they want to in photos. At least for iPhone 16 users, that is no longer a problem.
Precision Meets Artistry
As should be clear by now, Photographic Styles are not necessarily about achieving objectively accurate color rendering. But Apple knows photography has always been a medium concerned with precision in representation, whether getting perfectly accurate skin tones or capturing the exact shades of blue and green in a verdant landscape.
However, just like photography has always been, at least in part, capturing reality as it truly exists, it has also been a medium rich with experimentation and creativity, often at the expense of objective accuracy. The subjective nature of all art, photography included, is a significant aspect of what makes it so powerful both as a viewer and a creator.
“Photography has always been a medium that requires precision and artistry, precision around the physics of rendering skin tone [and other colors] accurately in cameras, and artistry to capture the realm of your preference,” Chen explains. “This principle is at the core of how we built Photographic Styles.”
“Photographic Styles allow users to control the way that the iPhone camera renders light, shadow, and color relationships deep within the pipeline.”
Jon McCormack, Vice President, Camera and Photos Software Engineering, explains that Photographic Styles are all about the human input built upon objective, hard science. From a physics perspective, there is, at some level, right and wrong when it comes to colors and rendering.
“The last 17 years of iPhone has been about deeply figuring that out,” McCormack says. Machine learning has played a vital role in cracking the code, which has required improved cameras, more processing power, and training on millions of images. Apple has done all that work to measure tones accurately. Photographic Styles is letting users mess with colors, shadows, and highlights however they want, even at the expense of the accuracy Apple’s engineers have strived for.
However, it is precisely this deep and accurate understanding of color and tone that enables Photographic Styles to work.
“A lot of the art of photography is in the developing or editing step. You capture, and then the photographer applies their interpretation.”
“Without the last 17 years of work, we couldn’t accomplish any of our goals [for Photographic Styles],” McCormack adds.
Photographic Styles Rest on a Foundation of iPhone Camera Advances
“An important thing to think of with Photographic Styles is it sits on top of all of the photographic pipeline work we’ve done over the years,” remarks McCormack. “We have this huge tonal latitude within the scene now, and this gives us the raw materials we need to really go and create a beautiful personalized final image.”
McCormack says that when Apple talks about “personalized,” it means going from the idea of a physically accurate rendering of a scene to how a user wants it to appear, whether it’s how someone wants the photo to look overall, or something more precise like how they want the light to appear or what they want themselves to look like in an image.
“Photographic Styles are designed to give you a [high] level of optionality so you can render the scene and capture the moment how you want,” McCormack says. “Because that’s what photography is all about.”
To achieve this, Apple had to rearrange and rework its entire imaging pipeline. As is often the case, there is an inverse relationship between how straightforward something sounds and the difficulty of achieving it.
“We had to go and take all of the adjustments to tone and color and move those later in the pipeline, which gives us the ability to much more easily reinterpret those,” McCormack explains.
He adds that an essential part of Photographic Styles is that the user effectively has the benefits of a RAW file, so users can reverse or change their Photographic Styles later without losing image quality. This plays into the creative and experimental spirit that has been a part of photography since its origins in the 19th century.
This nondestructive nature of Photographic Styles required Apple to develop a new extension to its image file formats.
“We keep all of the data that is effectively going to get compressed or moved around when you apply a style so that when [you] want to go back and say, ‘Hey, I want to go back on this style or use another style,’ we can go back to ground zero without the added storage cost of having something like a RAW file,” McCormack says.
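The sketch below illustrates the idea in that quote under deliberately simplified assumptions: because the style is applied late in the pipeline, the styled image can be stored with only a compact sidecar record of the style parameters plus whatever values the style would otherwise destroy (here, clipped pixels), so the photo can return to “ground zero” or be restyled without keeping a RAW-sized copy. None of these types or structures reflect Apple’s actual file-format extension.

```swift
// Illustrative style: a simple additive tone shift clipped to [0, 1].
struct StyleParameters {
    var name: String
    var toneShift: Double
}

struct StyledPhoto {
    var rendered: [Double]              // pixels with the style baked in
    var style: StyleParameters          // compact record of what was applied
    var clippedOriginals: [Int: Double] // only the values the style actually destroyed

    // Apply a style to a base render, stashing just the data that clipping loses.
    init(base: [Double], style: StyleParameters) {
        self.style = style
        var clipped: [Int: Double] = [:]
        self.rendered = base.enumerated().map { index, value in
            let shifted = value + style.toneShift
            if shifted < 0 || shifted > 1 { clipped[index] = value }
            return min(max(shifted, 0), 1)
        }
        self.clippedOriginals = clipped
    }

    // "Ground zero": undo the shift, restoring clipped pixels from the sidecar data.
    func base() -> [Double] {
        rendered.enumerated().map { index, value in
            clippedOriginals[index] ?? (value - style.toneShift)
        }
    }
}

let photo = StyledPhoto(base: [0.05, 0.4, 0.95],
                        style: StyleParameters(name: "Dramatic", toneShift: -0.2))
print(photo.rendered) // roughly [0.0, 0.2, 0.75] -- the first pixel was clipped
print(photo.base())   // [0.05, 0.4, 0.95] -- fully recovered without a RAW copy
```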
“We understand things like skin, light, hair, clothing, et cetera. We’re able to control all these independently in the context of the style. So, for example, if the style is brighter, what does a brighter style mean on skin versus hair versus the background?
This is a massive part of the intelligence used in Photographic Styles, understanding the color relationships within the scene, understanding the intent of the shot, and adjusting those in unison.”
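A rough sketch of that per-subject idea: a single “brighter” request is translated into different strengths for skin, hair, clothing, and background. The classes, weights, and math here are assumptions for illustration, not Apple’s segmentation model or tuning.

```swift
// Hypothetical semantic classes a segmentation pass might label.
enum SceneClass {
    case skin, hair, clothing, background
}

struct BrightnessStyle {
    var amount: Double  // the user's single "brighter" request, 0 ... 1

    // How strongly the style is allowed to act on each class.
    private func weight(for sceneClass: SceneClass) -> Double {
        switch sceneClass {
        case .skin:       return 0.3  // gentle: protect skin rendering
        case .hair:       return 0.5  // moderate: keep texture and depth
        case .clothing:   return 0.7
        case .background: return 1.0  // full effect away from the subject
        }
    }

    func brightened(_ luminance: Double, for sceneClass: SceneClass) -> Double {
        min(luminance + amount * 0.4 * weight(for: sceneClass), 1.0)
    }
}

let brighter = BrightnessStyle(amount: 0.5)
print(brighter.brightened(0.4, for: .skin))        // nudged slightly (~0.46)
print(brighter.brightened(0.4, for: .background))  // brightened much more (~0.6)
```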
McCormack notes a specific example that many photographers can relate to. When shooting in golden hour light, he says the iPhone understands that, and Photographic Styles respects the prevailing light and tones. Photographic Styles aren’t about entirely overriding the natural tones in a photo but rather about providing subtle variations in response to user preference.
“This is in stark contrast to traditional styles that are either just globally applied or are applied with some level of understanding but not a deep understanding of what’s actually going on in the scene.”
Giving Choice Back to the Photographer
“[Photographic Styles] are all about using our computational photography history to allow people to be seen the way they choose to be seen,” McCormack explains. “This is about giving choice back to the photographer.”
It’s also about fun. Photographers are familiar with sliders from apps like Adobe Lightroom, but needing to combine different sliders to achieve particular looks can be daunting for less experienced users. So Apple came up with a control pad for Photographic Styles, which uses color cues and visual feedback to ensure people can figure it out without any prior knowledge.
“This idea of the control pad adds a lot of power, a lot of fun, and a lot of flexibility,” McCormack notes.
“The [control pad] is where you can see the power of Photographic Styles come to life. By sliding your finger up and down on the control pad, you can control the tone of the image while protecting the skin tones,” Chen adds.
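As a rough illustration of how such a pad could work, the sketch below maps a 2D pad position to tone and color parameters and scales the tonal change back on skin. The axis assignments, ranges, and protection math are assumptions for illustration, not the camera app’s actual behavior.

```swift
struct ControlPadPosition {
    var x: Double  // -1 (left) ... +1 (right)
    var y: Double  // -1 (bottom) ... +1 (top)
}

struct PadDrivenStyle {
    var tone: Double   // driven by the vertical axis
    var color: Double  // driven by the horizontal axis

    init(pad: ControlPadPosition) {
        tone = pad.y * 0.5   // vertical swipe adjusts overall tone
        color = pad.x * 0.5  // horizontal swipe adjusts color intensity
    }

    // Apply the tone change to a pixel, scaling it back where skin is likely.
    func appliedLuminance(_ luminance: Double, skinLikelihood: Double) -> Double {
        let effectiveTone = tone * (1 - 0.8 * skinLikelihood)
        return min(max(luminance + effectiveTone, 0), 1)
    }
}

// Dragging upward brightens the scene but barely moves skin tones.
let padStyle = PadDrivenStyle(pad: ControlPadPosition(x: 0.0, y: 0.6))
print(padStyle.appliedLuminance(0.5, skinLikelihood: 0.0)) // background shifts by ~0.3
print(padStyle.appliedLuminance(0.5, skinLikelihood: 1.0)) // skin shifts by only ~0.06
```

One finger gesture drives two parameters at once, which is the design trade-off: less granular than a bank of sliders, but far easier to explore without prior editing knowledge.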
The Spirit of Photography’s Past Looks Toward a Dramatically Different Future
When creating Photographic Styles, Apple’s research, led by Chen, prioritized understanding how aesthetic preferences evolved alongside technological achievements. These distinct appearances, tied to eras in photography’s history, are found in the nine “Mood” Photographic Styles: Vibrant, Natural, Luminous, Dramatic, Quiet, Cozy, Ethereal, Muted B&W, and Stark B&W.
While some looks have come and gone, brief blips on photography’s visual radar, others have persisted.
When Apple was creating its new Photographic Styles, the team needed to understand how photography had changed in response to technological improvements, but even more importantly, how it hadn’t changed. What aesthetics have stood the test of time? Which looks are timeless?
“These palettes are inspired by photography’s most powerful and relevant styles over time,” Chen explains. “We went all the way back to the first stylized photographs of the late 1800s, as this was during the Pictorialism movement when photographers were trying to be taken seriously as artists. Stylizing images was kind of this new playground, and they were capturing evocative, dreamy, impressionist scenes with dramatic light and lifted shadows. They were heavily influenced by Romantic painters of the era. We tracked that look over our timelines and realized it never actually went out of style.”
Chen says one of her favorites is Luminous, a Mood that draws inspiration from the early days of the digital era. “This palette shifts hues toward a softer rainbow of colors, a lavender-mint, dreamy look with warm bronzes, cool pinks, and a robin’s egg blue.”
“We wanted to celebrate all the things that people love about photography while also advancing the medium in a way that is meaningful. We designed the look and feel of each style to serve contemporary personal preferences and honor classical photographic techniques,” says Chen.
Photographic Styles Encourage Creative Play and Experimentation
Much like other image editing tools, whether sliders, filters, or masks, next-gen Photographic Styles allow photographers to be more creative and share the moments and people they care about most in new, personalized ways.
“We want people to be able to play when they’re capturing and when editing photos later,” McCormack says.
“Whether you’re an experienced photographer and editor, or just someone trying to get a great-looking photo, both sets of people get what they’re looking for,” adds Hendricks. “It enables full creative control and basically unlimited options.”
With their nondestructive nature, Photographic Styles have absolutely no downside. They don’t permanently change an image, take up extra space, or send skin tones into crazy territory. While this sounds simple — and for the user, it is — it has been an extremely complicated effort for everyone involved. The work has undoubtedly paid off.
Image credits: Apple