
-
Madrid and Barcelona: housing price rises spread across their metropolitan areas
In most municipalities on the periphery of these two capitals, property prices rose more than 10% in the third quarter, yet they keep attracting buyers because they remain cheaper than the city center.
How the March family manages its fortune
With a fortune of €5.1 billion, the March family has spent generations cementing its leadership in the private sector through Banca March and Corporación Alba. They manage Torrenova, the second-largest SICAV in Spain.
A disease that does not directly affect human health, but does affect the economy
Numerous scientific and health organizations insist that African swine fever cannot be transmitted to humans through contact with pigs or wild boar, or through consumption of products from these animals. But the economic and food-security consequences can be very serious.
The Generalitat of Catalonia requests deployment of the UME in response to the swine fever outbreak
Taiwan banned imports of pork and pork products from Spain after several cases of African swine fever (ASF) were detected in wild boar found dead in the Collserola hills near Barcelona, the first outbreak recorded in the country since 1994.
Why China's unreliable data is cause for concern
Growing statistical opacity and political control in China raise doubts about its official figures and prevent us from knowing the true extent of the slowdown in the world's second-largest economy.
Physicality: the new age of UI
It’s an exciting time to be a designer on iOS. My professional universe is trembling and rumbling with a deep sense of mystery. There are a lot of rumors and whispers of a huge redesign coming to the iPhone’s operating system — one that is set to be ‘the biggest in a long time’.
There’s only been one moment that was similar to this: the spring of 2013. On June 10th, Apple showed off what would be the greatest paradigm shift in user interface design ever: iOS 7. I remember exactly where I was and how I felt. It was a shock.

If there is indeed a big redesign happening this year, it’ll be consequential and impactful in ways that dwarf the iOS 7 overhaul, for a multitude of reasons. The redesign is rumored to be comprehensive: a restyling of iOS, macOS, iPadOS, tvOS, watchOS and visionOS. In the years between iOS 7’s announcement and today, iPhones have gone from simply a popular device to the single most important object in people’s lives. The design of iOS has affected and inspired most things around it, from the web to graphic design and virtually every other computer interface.
That’s why I figured I’d take this moment of obscurity, this precious moment in time while the changes are still shrouded in fog, to savor something: wholesale naivety about where things are going, so I can let my imagination run wild.
What would I do if I were Apple’s design team? What changes would I like to see, and what do I think is likely? Considering where technology is going, how do I think interface design should change to accommodate? Let’s take a look at what’s (or what could be) next.
Smart people study history to understand the future. If we were to categorize the epochs of iOS design, we could roughly separate them into the Shaded Age, the Flat Age, and the Age of Physicality.
The Shaded Age
iOS started out as iPhone OS, an entirely new operating system that had very similar styling to the design language of the Mac OS X Tiger Dashboard feature:

via https://numericcitizen.me/what-widgets-on-macos-big-sur-should-have-been/ 
Early iPhone prototypes with Dashboard widget icons for apps. The icon layout on iPhone OS 1 was a clear skeuomorph.
You might’ve heard that word being thrown around. It might surprise you that it doesn’t mean a design with lots of visual effects like gradients, gloss and shadows. A skeuomorph retains cues from a familiar, older form to ease users’ transition from what they were used to (in this case, phones typically being slabs with a grid of buttons on them) to what things had become: all-screen phones that could show any kind of button or interface imaginable.


At the time of iPhone 1’s launch, a cartoon of a ‘phone’ would still be drawn as the image on the left. A grid of buttons defined its interaction model and comfort zone.
And yes, there were a whole lot of visual effects in user interfaces from iPhone OS 1 to iOS 6. In this age, we saw everything from detailed gradients and shadows in simple interface elements to realistically rendered reel-to-reel tape decks and microphones for audio apps.



The Facebook share sheet had a paperclip on it! The texture of road signs on iOS maps was composed of hundreds of tiny hexagons! Having actually worked on some of the more fun manifestations of this during my time at Apple, I can tell you from experience that the work we did in this era was heavily grounded in creating familiarity through thoughtful, extensive visual effects. We spent a lot of time in Photoshop drawing realistically shaded buttons, virtual wood, leather and other materials.
That became known as ‘skeuomorphic design’, which I find a bit of a misnomer, but the general idea stands.

Of course, the metal of the microphone was not, in fact, metal — it didn’t reflect anything like metal objects do. It never behaved like the object it mimicked. It was just an effect; a purely visual lacquer to help users understand the Voice Memos app worked like a microphone. The entire interface worked like this to be as approachable as possible.
Notably, this philosophy extended even to the smallest elements of the UI: buttons were styled to visually resemble a button by being convex and raised or recessed; disabled items often had reduced treatments to make them look less interactive. All of this was made to work with lots of static bitmap images.
The first signs of something more dynamic did begin to show: on iPad, some metal sliders’ sheen could respond to the device orientation. Deleting a note or email did not simply make it vanish off-screen, but pulled it into a recycling bin icon that went as far as to open its lid and close it as the document got sucked in.

If Benjamin Mayo had not published this video, no trace of this would even be findable online. Our brand new, rich, retina-density (2×) screens were about to see a radical transformation in the way apps and information were presented, however…
The Flat Age
iOS 7 introduced an entirely new design language for iOS. Much was written on it at the time, and as with any dramatic change the emotions in the community ran quite high. I’ll leave my own opinions out of it (mostly), but whichever way you feel about it, you can’t deny it was a fundamental rethinking of visual treatment of iOS.
iOS 7 largely did away with visual effects for suggesting interactivity. It went back to quite possibly the most primitive method of suggesting interactivity on a computer: some ‘buttons’ were nothing more than blue text on a white background.

The styling of this age is often referred to as ‘flat design’. You can see why it is called that: even the buttons in the calculator app visually indicate no level of protuberance:

The Home Screen, once a clear reference to the buttons on phones of yesteryear, now looked much flatter, owing partly to simpler visual treatment and partly to a distinct lack of shadows.

But why did shadows have to go? They had an important function in defining depth in the interface, after all. The screenshot above does it no justice: the new iOS 7 home screen was anything but flat. The reason was that the old shadows were static.

iOS 7 embraced a notion of distinct visual layers and using adaptive or dynamic effects to distinguish depth and separation. Why render flat highlights and shadows that are unresponsive to the actual environment of the user when you can separate the icons by rendering them on a separate plane from the background? Parallax made the icons ‘float’ distinctly above the wallpaper. The notification center sheet could simply be a frosted pane above the content which blurred its background for context.
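The parallax effect can be thought of as a simple mapping from device tilt to a clamped per-layer offset, with foreground layers drifting further than the background. A minimal sketch of that idea; the constants and the mapping are hypothetical illustrations, not Apple's implementation (a real app would feed in attitude data from Core Motion):

```swift
import Foundation

/// A hypothetical parallax model: map device tilt (radians) to a clamped
/// 2D offset for a UI layer, so "closer" layers drift further than the
/// background as the phone moves.
struct ParallaxLayer {
    /// Maximum drift in points; layers meant to feel closer get larger values.
    var maxOffset: Double
    /// Sensitivity: points of drift per radian of tilt.
    var gain: Double = 40

    func offset(pitch: Double, roll: Double) -> (x: Double, y: Double) {
        // Tilting left/right (roll) drifts the layer horizontally,
        // tilting forward/back (pitch) drifts it vertically.
        let x = min(max(roll * gain, -maxOffset), maxOffset)
        let y = min(max(pitch * gain, -maxOffset), maxOffset)
        return (x, y)
    }
}

let icons = ParallaxLayer(maxOffset: 12)     // foreground: drifts most
let wallpaper = ParallaxLayer(maxOffset: 4)  // background: drifts least

// Example tilt; in a real app this would come from CMDeviceMotion.attitude.
let tilt = (pitch: 0.10, roll: -0.30)
let iconOffset = icons.offset(pitch: tilt.pitch, roll: tilt.roll)   // ≈ (-12, 4)
let wallOffset = wallpaper.offset(pitch: tilt.pitch, roll: tilt.roll) // ≈ (-4, 4)
```

Because each layer clamps its own drift, the icons and wallpaper respond to the same motion at different magnitudes, which is what sells the depth.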
Jony Ive spoke proudly at the iOS 7 introduction about how ‘the simple act of setting a different wallpaper’ affected the appearance of many things. This was a very new thing.


Also new was that the UI chrome could have the same dynamics: things like the header and keyboard could show some of the content they obscured shining through, as if fashioned out of frosted glass.

While it was arguably an overcorrection in some places, iOS 7’s radical changes were here to stay — with some of its dynamic ‘effects’ getting greatly reduced (parallax is now barely noticeable). Over time, its UI regained a lot more static effects.



Type and icons grew thicker, shadows came back, and colors became a lot less neon.
One of the major changes over time was that iOS got rounder. With newly curved screen corners and ever-rounder iPhones, the user interface matched the hardware it ran on in lockstep, and it even adapted dynamically to the device it was running on.
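One way such device-matched rounding can work is ‘concentric’ corners, where a nested element’s radius equals its container’s radius minus the inset between them, so the two curves share a center and read as optically parallel. A minimal sketch of that rule; the 55pt display radius below is an illustrative assumption, not a published value:

```swift
import Foundation

/// Hypothetical "concentric corners" helper: a nested element's corner
/// radius is its container's radius minus the inset between them.
func concentricCornerRadius(containerRadius: Double, inset: Double) -> Double {
    max(containerRadius - inset, 0)   // clamp: radii never go negative
}

// A card inset 8pt inside a screen with an (assumed) 55pt display corner:
let cardRadius = concentricCornerRadius(containerRadius: 55, inset: 8)             // 47
// A button inset a further 12pt inside that card:
let buttonRadius = concentricCornerRadius(containerRadius: cardRadius, inset: 12)  // 35
```

Change the outermost radius — say, for a different device — and every nested element re-derives its own rounding automatically.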

More interface elements started to blend with content through different types of blur like the new progressive blur, and button shapes were slowly starting to make a comeback. It settled into a stable state — but it was also somewhat stagnant. For bigger changes, there would have to be a rethink.
What would come next couldn’t simply be a static bitmap again: it would have to continue the trend of increasingly adaptive interfaces.
The Age of Physicality
When Apple’s designers imagined the interface of visionOS, they had a mandate to essentially start from scratch. What does an ‘app’ look like in augmented reality?
What appears to be a key foundational tenet of the visionOS design language is that elements are always composed of ‘real’ materials. No flat panels of color and shapes exist as part of the interface.
This even applies to app icons: while they do have gradients of color, they occupy discrete layers of their own, with a clear intention from their introduction video of feeling like actual ‘materials’:


Alan Dye, upon introducing the visionOS interface, stated that every element was crafted to have a sense of physicality: they have dimension, respond dynamically to light, and cast shadows.
This is essential on Vision Pro, because app interfaces should feel like they can naturally occupy the world around you, with as much richness and texture as any of the objects that inhabit that space. Compared to the interfaces we are familiar with, that paradigm shift is profound, and it makes older interfaces not infused with physicality feel archaic.
If I were to position a regular interface in the Vision Pro context, the result looks almost comically bad:

I find it likely, then, that there will be more than a mere static visual style from visionOS brought to iPhone, iPad and Mac (and potential new platforms) — it seems likely that a set of new fundamental principles will underpin all of Apple’s styling across products and expressions of its brand.
It would have to be more subtle than on Vision Pro – after all, interfaces do not have to fit in with the ‘real world’ quite as much – but dynamic effects and behavior essentially make the interface come to life.
Sound familiar? Existing aspects of the iPhone user interface already do this:
Apple’s additions to the iOS interface over the last few years stand out as materially different compared to the rest of the interface.


They are completely dynamic, exhibiting characteristics akin to actual materials and objects. We’ve come back, in a sense, to skeuomorphic interfaces, but this time not with a lacquer resembling a material. Instead, the interface is clear and graphic, and behaves like things we know from the real world, or that might exist in it. This is the new skeuomorphism. It, too, is physicality.
The Dynamic Island is a stark, graphic interface that behaves like an interactive, viscous liquid:




You can see it exhibiting qualities unique to its liquid material, like surface tension, as parts of it come into contact and meld together.
When it gains speed it has momentum, much like the scrolling lists of the very first iteration of iPhone OS, but now it reads as more realistic because it also has directional motion blur and a plane of focus as items move on their plane:



Similarly, the new Siri animation behaves a bit more like a physically embodied glow – like a fiery gas or mist that is attracted to the edges of the device and is emitted by the user’s button press or voice.

What could be the next step?
My take on the New Age: Living Glass

I’d like to imagine what could come next. Both by rendering some UI design of my own, and by thinking out what the philosophy of the New Age could be.
A logical next step could be extending physicality to the entirety of the interface. We do not have to go overboard with such treatments, but we can now have the interface inhabit a sense of tactile realism.
Philosophically, if I were Apple, I’d describe this as finally having an interface that matches the beautiful material properties of its devices. All of your devices have glass screens; this brings an interface of a matching material, giving the user a feeling of the glass itself coming alive.


visionOS details from the excellent Wallpaper interview with Apple’s design team.
Buttons and other UI elements themselves can get a system-handled treatment much like visionOS handles window treatments.*
*visionOS is an exceptionally interesting platform, visual-effect-wise, as the operating system gets very little data from the device cameras to ensure privacy and security. I would imagine that the “R1” chip, which handles passthrough and camera feeds, composes the glass-like visual effects on the UI chrome. All Apple devices can do this: they already do system-level effect composition for things like background blurs.
I took some time to design and theorize what this would look like, and how it would work. For the New Design Language, it makes sense that, just like on visionOS, the material of interactivity is glass:

My mockup of a dynamic glass effect on UI controls. Glass is affected by its environment: your content, its UI context, and more.

Since it is reflective, it can reflect what is around it; very bright highlights in content like photos and videos can even be rendered as HDR highlights on Glass elements:

Note the exhaust flare being reflected in the video playback bar; an interactive element like the close button in the top left has its highlights dynamically adjusted by the scene, too. Glass elements visibly occupy a place in a distinct spatial hierarchy; if an element does not, it can be ‘inlaid’: in essence, part of the plane of glass that is your display, or a glass layer of the UI:

Much like the rear of your iPhone being a frosted pane of glass with a glossy Apple logo, controls or elements can get a different material treatment or color. Perhaps that treatment is even reactive to other elements in the interface emitting light or the device orientation — with the light on it slightly shifting, the way the elements do on VisionOS when looked at.
Controls may transition as they begin to overlay content. One can imagine animated states for buttons lifting and emerging from their backdrop as they transition the hierarchy:

These effects can be rendered subtly and dynamically by the system. In comparison, it makes ‘regular’ static interfaces look and feel inert and devoid of life.
Glass has distinct qualities that are wonderful in separating it from content. It can blur the material below it, as we already see in modern iOS controls. It can have distinct, dynamic specular highlights from its surroundings:

A little drag control for exposure, a spin on our modern EV adjustment in Halide; note the material itself responding to the light the adjustment emits. Glass can also have caustics, which is to say it separates itself from the backdrop by interacting with the light in its environment, casting light rather than shadow:

… and it can also get infused with the color and theme of the interface around it. Glass does not just blur or refract its background: it reflects, too. This isn’t totally out of left field: this is seen in the WWDC25 graphics, as well:

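These light responses are ordinary shading math underneath. As a hypothetical sketch (not how Apple renders anything), a specular glint on a glass element reacting to a bright point in the content can be modeled with a Blinn-Phong-style term; real implementations would live in shaders, but the math is simple:

```swift
import Foundation

/// Minimal 3D vector for the shading math below.
struct Vec3 {
    var x, y, z: Double
    func dot(_ o: Vec3) -> Double { x * o.x + y * o.y + z * o.z }
    var normalized: Vec3 {
        let len = (x * x + y * y + z * z).squareRoot()
        return Vec3(x: x / len, y: y / len, z: z / len)
    }
    static func + (a: Vec3, b: Vec3) -> Vec3 { Vec3(x: a.x + b.x, y: a.y + b.y, z: a.z + b.z) }
}

/// Blinn-Phong-style specular intensity in [0, 1];
/// higher `shininess` means a tighter, glassier glint.
func specular(normal: Vec3, toLight: Vec3, toViewer: Vec3, shininess: Double) -> Double {
    let h = (toLight + toViewer).normalized  // half vector between light and viewer
    return pow(max(normal.normalized.dot(h), 0), shininess)
}

let n = Vec3(x: 0, y: 0, z: 1)  // surface facing the viewer

// A bright highlight directly "behind" the glass, viewer head-on: maximal glint.
let bright = specular(normal: n, toLight: Vec3(x: 0, y: 0, z: 1),
                      toViewer: Vec3(x: 0, y: 0, z: 1), shininess: 32)
// The same highlight grazing in from the side: the glint falls off sharply.
let dim = specular(normal: n, toLight: Vec3(x: 1, y: 0, z: 0.2).normalized,
                   toViewer: Vec3(x: 0, y: 0, z: 1), shininess: 32)
```

Feed the term per-pixel with the content beneath the glass as the light source, and the control’s highlights track whatever is playing under it — exactly the behavior described above.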
Elements of Style
Having a set of treatments established, let’s look at the elements of the New iOS Design.
Tab Bar
I would imagine that the era of ‘closed tab bars’ (the type that outright masks the content) is ending. In fact, I wouldn’t be surprised if almost all UI that masks content like that, as a max-width bar, were gone.
These types of static interface panels are a legacy element from the early days of iOS. The new type can float over content:

Controls like this are better suited to rise from their underlying ‘pane’ as you scroll it out of view, and can similarly hide themselves so they aren’t obscuring content all the time.
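That hide-on-scroll behavior reduces to a small state machine: hide once the user has scrolled down past a threshold, reappear on a scroll up. A hypothetical sketch of the logic (names and the 24pt threshold are made up for illustration):

```swift
/// Hypothetical hide-on-scroll state for a floating bar.
struct FloatingBarState {
    private(set) var isVisible = true
    var threshold: Double = 24  // points of scroll travel before toggling

    mutating func update(scrollDelta: Double) {
        if scrollDelta > threshold {           // scrolling down: get out of the way
            isVisible = false
        } else if scrollDelta < -threshold {   // scrolling up: come back
            isVisible = true
        }
        // Small deltas inside the threshold leave the bar alone,
        // so it doesn't flicker during slow or jittery scrolling.
    }
}

var bar = FloatingBarState()
bar.update(scrollDelta: 40)     // user flicks down to read
let hiddenWhileReading = !bar.isVisible
bar.update(scrollDelta: -40)    // user scrolls back up
let visibleAgain = bar.isVisible
```

In a real app, the deltas would come from the scroll view, and visibility changes would drive the rise-from-pane transition rather than a plain toggle.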
Controls
It can be overwhelming for all elements in the interface to have a particularly rich treatment, so as I mentioned before, I would expect there to be various levels of this ‘depth’ applied. Core actions like the email sending button in Mail can be elevated:

Whereas other actions that are part of the same surface — like the ‘Cancel’ action here — can get more subtle treatments.
Elevated controls can be biased slightly towards warmer color balance and background elements towards cool to emphasize depth.
App Icons
Apple put considerable work into automatic masking for icons in iOS 18, and I doubt it was only for Dark Mode or for tinted icons on an identical black-gradient backdrop. The simple, dark treatment of icon backgrounds makes me think it was preparation for a more dynamic material backdrop.

Dynamic icon backdrops in Dark Mode – note the variable specular highlights based on their environment. Not to mention, app icons are exactly the type of interactive, raised element I spoke of before that would be suited to a living glass treatment.

Dynamic rendering of icons with a ‘content layer’, glassy effects and an overall polishing of existing designs; the corners are also slightly rounder. I’d also imagine some app icons that are due for a redesign would get updates — many haven’t been touched since iOS 7. This would be a major change to some of Apple’s ‘core brands’, so I expect it to be significant, yet consistent with the outgoing icons to maintain continuity while embracing the new visual language, kind of like the Safari icon above.

On the note of icons, I also wouldn’t be surprised if the system icons themselves got a little rounder.
Home Screen
It seems likely the Home Screen as a whole will be rethought for the first time. Its complexity has ballooned since the early days of iOS; I find myself spending a lot of time searching my App Library.
I think there’s a great AI-first, contextual slide-over screen that can co-exist with the regular grid of apps we are used to. I was a bit too short on time to mock this up.
Sliders and Platters
Basic interactive building blocks of the iOS interface will get system-provided treatments that are responsive to their environment:

Note how the Contact platter has some environment interaction with the green ‘Now’ light. Overall, one can imagine a rounding and softening of the interface through translucent materials looking pretty great.
Beyond
This ‘simple’ change in treatment — to a dynamic, glassy look — has far-reaching consequences.
Apple is unique: since the 2000s, its predominant user interface style has always been linked to its branding. Its icons are also logos, and its treatments form a motif that stretches far beyond the platforms they live on. Consider the navigation of Apple.com:

The navigation of Apple’s website has changed in step with major UI design eras: the introduction and maturation of Aqua in 2000 and beyond; iPhone and Mac OS X’s softer gradients in 2008; and finally, a flatter look after 2014. It is not a stretch to assume that this, too, would adopt some kind of dynamic new style. Therein lie some of the challenges.
I love products with innovative, novel interfaces, but modern iOS isn’t simply a product; it’s a platform. Its designers bear responsibility for making the system look good even in uncontrolled situations, where third-party developers like myself come up with new, unforeseen ideas. That leaves us with the question of how we can embrace a new, more complex design paradigm for interfaces.
A great thing that could come from this is new design tools for an era of interfaces that go far beyond placing series of rounded rectangles and applying a highly limited set of effects.
When I spoke of designing fun, odd interfaces in the ‘old days’, this was mostly done in Photoshop. Not because it was made for UI design — quite the contrary. It just allowed enough creative freedom to design anything from a collection of simple buttons to a green felt craps table.

Green felt, rich mahogany, shiny gold and linen in the span of about 450 pixels. If what is announced is similar to what I just theorized, it’s the beginning of a big shift. As interfaces evolve with our more ambient sense of computing and are infused with more dynamic elements, they can finally feel grounded in the world we are familiar with. Opaque, inert and obstructive elements might come to occupy the same place as full-screen command-line interfaces: a powerful niche UI that marked a moment in history, passed by the windowed environment of the multi-tasking graphical user interface revolution.
Science Fiction and Glass Fiction
The interfaces of computers of the future are often surprisingly easy to imagine. We often think of them and feature them in fiction ahead of their existence: our iPhone resembles a modern Star Trek tricorder; many modern AI applications resemble the devices in sci-fi movies like ‘Her’ and (depressingly) Blade Runner 2049. It’s not surprising, then, that concept interfaces from the likes of Microsoft often feature ‘glass fiction’:




It’s a beautiful, whimsical UI. Unfortunately, it only exists in the fictional universe of Microsoft’s ads.
The actual interface, unfortunately, is not nearly as infused with such life and behavioral qualities. The reason is simple: not only is the cool living glass of the video way over the top in some places, but few companies can actually dedicate significant effort to creating a hardware-to-software integrated rendering pipeline to enable such UI innovations.
Regardless, we like to imagine our interfaces being infused with this much life and joy. The world around us is — but our software interfaces have remained essentially lifeless.
And that brings us to Apple. There have been occasions when Apple announced something particularly special and took a beat on stage to pause and explain that only Apple could do something like this. It is a special marriage of hardware and software, of design and engineering, of technology and the liberal arts.

And that still happens today. Only Apple could integrate subpixel antialiasing and never-interrupted animations at the hardware level to enable the Dynamic Island and gestural multitasking; only Apple can integrate two operating systems on two chips in Vision Pro so it can composite the dynamic materials of the visionOS UI. And perhaps only Apple can push the state of the art to a new interface that brings the glass of your screen to life.
We’ll see at WWDC. But personally, I am hoping for the kind of well-thought-out, inspired design and engineering that only Apple can deliver.
All writing, conceptual UI design and iconography in this post was made by hand by me. No artificial intelligence was used in authoring any of it.
-
Rewrites and Rollouts
iOS 26 is here. Every year we release major updates to our flagship apps alongside the new version of iOS, but not this year. Rather than stay silent and risk Silkposts, let’s share our thoughts and plans.
Deciding When It’s Time to Move On
In 2017 we launched the first version of our pro camera, Halide. In those days, the days of the iPhone 7, you just wanted manual control over focus, shutter speed, white balance… controls you expect in a classic camera.
Today, algorithms matter as much as a camera’s lens. Halide kept up with changing times by offering control over these algorithms, and it became one of our most popular features, but we have to admit we aren’t happy with how things evolved, with too many controls tucked away in settings.

This is getting busy. How did things get so complicated?
Our app grew organically from its 1.0, and while we still love its design, we believe it has hit a bit of an evolutionary dead end. Almost ten years later, cameras and the way we take photos have changed a lot. We have big plans, and if we’re going to build the best camera for 2025 and beyond, we need to rethink things from the ground up.
For example, rather than bury the controls from earlier in settings, what if we put them right next to the shutter?

A change like this may sound simple, but it has ripple effects across our entire interface and product. I’ll spare you a few thousand words and let Sebastiaan walk you through our big new design sometime soon.
If our visuals show cobwebs, let’s just say the code hosts a family of possums. Since 2017, Apple’s SDKs have changed faster than we could keep up. Refreshing our codebase should improve speed, reliability and polish, and cut down the time it takes to ship new features.
It sure sounds like we should rewrite Halide.
If you’ve ever taken part in a rewrite, I know your first reaction is, “Oh no,” and as someone who has lived through a few big rewrites, I get it. Big rewrites kill companies. It would be irresponsible to do this in the middle of iPhone season, the time we update our apps to support the latest and greatest cameras.
So we are not rewriting Halide right now.
We rewrote it two years ago.
In summer 2023, we began our investigation into a modern codebase. We built a fun iPad monitor app, Orion, to test the maturity of Apple’s new frameworks and experiment with our own new architecture. We were delighted by the results, and so were you! We were surprised Orion only took 45 days.
This gave us the confidence to test our platform on a bigger, higher-stakes project: our long-awaited filmmaking app Kino. We began work in Fall 2023, shipped in under six months, and won 2024 iPhone App of the Year.
record scratch Yep, that’s me. You’re probably wondering… This signaled our new architecture was ready for prime time, so earlier this year we drew a line in the sand: in our code, we renamed everything Halide 2.x and earlier “Legacy Halide.” Mark III will be a clean break.

A few files in Xcode. After a few weeks of knocking out new features faster than ever, it was clear this was the right decision. Kino let us skip over the hard and uncertain part; now all that’s left is speed-running the boring part of translating the old bits to the new system.
Through The Liquid Glass
In June, Apple unveiled the new design language of iOS 26, Liquid Glass, and it threw a monkey wrench into all of our plans. As someone who worked on a big app during the iOS 7 transition, I know platform rewrites are fraught with surprises all the way up to launch.
Before deciding how to proceed with our flagship apps, and what this meant for Mark III, we needed to investigate. So we returned to Orion, our low-stakes app with fewer moving parts. Updating Orion’s main screen for Liquid Glass took about a day, but it was not without snags, like when I spent an hour in the simulator fine-tuning the glass treatment of our toolbar, only to discover it rendered differently on the actual device.
We moved on to Kino, which already aligned with the iOS 26 design system pretty well. Sebastiaan updated its icon treatment, which looks great when previewed in Apple’s tools.

The version previewed in Icon Composer. However, when we loaded it on the device…

The version on a real device. This issue still persists in the final version of iOS 26, and we filed a bug report with Apple (FB20283658). We’ll hold off on our Kino update until it’s sorted out.
None of these issues are insurmountable, but troubleshooting iOS bugs for Apple can be its own part-time job. As a team with only one developer, this left us with three options for Halide:
Option 1: Embrace Liquid Glass in Legacy Halide. Liquid Glass paradigms go beyond the special effects, such as its embrace of nested menus. Reducing the new design system to a stylistic change (a glorified Winamp skin) is a recipe for disappointment. Unfortunately, a deep rethinking of Legacy Halide would force us to halt Mark III development for months, just to update a codebase on its way out.
Option 2: Rush Mark III with Liquid Glass to make the iOS 26 launch. Even before Apple unveiled the Liquid Glass treatment, Mark III was arriving at similar concepts, and we’re confident the two design systems will fit well together. So what if we tackled both challenges at once and targeted an immovable iOS 26 deadline? Nope. A late app is eventually good, but a rushed app is forever bad.
Option 3: Wait to launch a full Liquid Glass redesign alongside a rock-solid Mark III. This is what we did, and we think it paid off big time. Earlier this week we released an early preview of our new UI (without any Liquid Glass) to Halide subscribers via our Discord. The results were overwhelmingly positive.
The Rollout (and early upgrade perks)
That’s not to say we have nothing to show for iOS 26. Today we’re launching Orion 1.1. It retains most of its retro aesthetics, but we’re also digging how the Liquid Glass treatment interacts with our custom CRT effect.

We’ve also added a long-requested feature: fit and fill aspect ratio modes. You can finally play your virtual console games in full-screen glory!
For Kino, we’re holding off on our update until we sort out the iOS bugs. Maybe things will be fixed in an iOS 26.1 update.
We have an update ready for our award winning long exposure app, Spectre. Unfortunately, it appears the App Store portal is broken at the moment, and won’t allow us to submit the update.

Luckily, we submitted an update to Halide before running into this issue. It updates the icon, fixes a few glitches, and includes basic stylistic updates. We just released this update, moments ago.
Earlier today, we received our new phones and we’ve begun putting them through their paces. We’ll submit an update to support the new hardware and fix any bugs, assuming the App Store lets us.
These updates to Halide are a swan song for the legacy codebase. After this month, all of our energy goes to Mark III, which includes the real Liquid Glass alongside a redesigned camera for a new age.
If you’d like a peek at things to come, we’ve opened another thousand spots in TestFlight to Halide subscribers. It’s got tons of bugs, and parts are incomplete, but it will give you an idea of where things are headed. If you’d rather wait for a polished experience, or prefer a one-time upgrade, no problem. As we announced last winter, everyone who bought Mark II eventually gets Mark III for free.
It feels bittersweet to move on. Hopping into Legacy Halide to crank out updates feels like a bit of a slog, while the new Mark III design and codebase is a joy. It makes me wish I’d gutted Halide years ago. At the same time, there are moments I feel warmth for a project I spent almost a decade of my life on. It helps you understand why ‘nostalgia’ means “a pain from an old wound.”
In Summary
- We have an Orion update out, today
- We have a Spectre update, soon
- We might have a Kino update, soon?
- We have a Halide update, today
- Halide Subscribers can sign up for the Mark III TestFlight, today
- We’ll have a wider Mark III preview, this Fall
- If everything goes according to plan, we expect to launch Mark III, this Winter
This won’t be the last you’ll hear from us this Fall. Stay tuned for a post from Sebastiaan on our new design, along with our annual iPhone reviews.
-
iPhone 17 Pro Camera Review: Rule of Three
Every year, as I watch the Apple Event where Apple announces the latest iPhones, I can’t help but sympathize with the Camera team at Apple. They have to deliver something big, new, even ground-shaking, on a regular annual cadence.
And every year, people ask us the same thing: is it really as big of a deal as they say?

iPhone 17 Pro in Silver — shot on iPhone Air
iPhone 17 Pro looks very different at first glance. It’s the biggest departure in camera module and overall Pro iPhone style since iPhone 11 Pro. It still packs three cameras on the back and one on the front. It has an actual camera button (even its svelte sibling, the iPhone Air, gets one of those, albeit smaller) and a few notable spec changes, like a longer telephoto zoom. But is that really all there is to it?





To find out, I took iPhone 17 Pro to New York, London and Iceland in just 5 days.









We do not get early access like the press: this is a phone we bought, to give you an unfiltered, real review of the camera. All the photos in this review were taken on iPhone 17 Pro, with the Apple Camera app or an in-development version of Halide Mark III with color grades.
Let’s dig in — because shooting with iPhone 17 Pro, I was surprised by quite a few things.

What’s New
iPhone 17 Pro packs what Apple calls the new ‘ultimate Pro camera system’. This is the last upgrade the camera bump — er, I mean, plateau — was arguably still lacking.
Since the introduction of the triple-camera system with iPhone 11 Pro, all cameras shot at a fairly standard 12 megapixels. After the ultra wide camera was upgraded to 48 megapixels in iPhone 16 Pro, Apple finally upgraded the telephoto camera sensor to a 56% larger, 48-megapixel unit. Not only does this allow for sharper shots, but Apple is so confident in its center-crop imaging pipeline that it argues it allows for a 12-megapixel 8× zoom of ‘optical quality’. More on that in its own, detailed section: I am a big telephoto fan, and this announcement had me immediately excited to test it out.
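The arithmetic behind that center-crop zoom is simple enough to sketch. The snippet below is an illustration, not Apple’s pipeline; the ~100mm equivalent for the 4× lens is inferred from the 200mm/8× figures discussed later in this review. The idea: keeping the central quarter of the pixels doubles the effective focal length, since linear dimensions scale with the square root of area.

```python
def crop_zoom(native_mp: float, native_focal_mm: float, crop_mp: float) -> float:
    """Effective focal length after a center crop.

    A center crop keeps the pixel pitch but narrows the field of view:
    keeping 1/4 of the pixels doubles the effective focal length.
    """
    crop_factor = (native_mp / crop_mp) ** 0.5
    return native_focal_mm * crop_factor

# A 48 MP telephoto at a ~100mm (4x) equivalent, cropped to its central 12 MP:
print(crop_zoom(48, 100, 12))  # → 200.0, i.e. the 8x "lens"
```

Whether the result deserves the ‘optical quality’ label then comes down entirely to how well the processing holds up at the pixel level.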
One of the biggest upgrades this year actually comes to the front camera, though its impact on quality will be far less noticeable to most people than most tech pundits initially predicted. In a classic Apple move, the company replaced the bog-standard selfie camera with a much larger, square-sensor camera. But instead of simply shooting 24-megapixel square shots, it added a very clever Center Stage system that reframes your selfies to automatically include more people, or to save you from twisting your arm to take a landscape selfie.

Apple’s square sensor makes it part of a small, elite lineup of square-sensor cameras like the latest Hasselblad 907X.

This is a very impressive piece of engineering, and a classic Apple innovation in that the hardware change is essentially invisible. We camera geeks love the idea of a square sensor, but in the Camera app you will not find a way to take images with the full square image area; it simply puts the sensor’s square 24 MP area to use for 18 MP crops in landscape or portrait orientation, depending on the subject matter.

Apple (in my opinion, correctly) figured that if this artistic choice being made by your iPhone offends you as an artist, you are free to use one of the better cameras on the rear of the iPhone or disable the automatic framing feature altogether, returning its behavior to a ‘normal’ front-facing selfie camera.
Finally, there are some notable changes to processing: «More detail at every zoom range and light level». Apple stated in its keynote that deep learning is used for demosaicking the raw data from the sensor’s quad pixels to get more natural (and actual, existing) detail and color in every image. Apple also went on to point out that this means the AI upscaling used to make those ‘2×’ and ‘8×’ ‘lenses’ (which are actually the center portion of the 48MP Main and Telephoto cameras) is significantly improved.

And not insignificantly, the entire phone has gotten a total design overhaul. Its interface and exterior are both composed of all-new materials, and some big changes under the hood (or ceramic back panel, if you will) allow for even more performance out of the latest-generation Apple Silicon chip inside.
What’s Not New
While the entire iPhone looks brand new, the cameras have some familiar parts. The Main camera sensor and lens are identical to the iPhone 16 Pro’s, which in turn are identical to the iPhone 15 Pro’s. The ultra wide camera, too, is the same as last year’s 48 megapixel snapper.

The Ultra Wide camera returns to continue making wide, sweeping compositions.

The Camera Control from iPhone 16 Pro returns on all iPhone 17s and iPhone Air. No significant updates here, but I still find it a fantastic addition to the iPhone for opening my camera app of choice and taking a photo. The adjustments, on the other hand, still seem fiddly to me a year later. I was hoping for some more changes to it, perhaps even a facelift along with iOS 26 — but it has remained essentially the same, save for some additional settings to fine-tune it to your liking.
Party in the Front, Business in the Back
This is, without a doubt, a great back camera system. With all cameras at 48MP, your creative choices are tremendous. I find Apple’s quip of it being ‘like having eight lenses in your pocket’ a bit much, but it does genuinely feel like having at least five or six: Macro, 0.5×, 1×, 2×, 4× and 8×.




The ultra wide and main cameras — unchanged save for processing tweaks — are still great. I find the focal lengths ideal for day-to-day use, and the main camera especially is sharp and responsive. Its image quality isn’t getting old (yet).
What’s beginning to get very old is its lack of close focusing. Its new sibling camera in iPhone Air focuses a whole 5 cm (that’s basically 2 inches) closer, and it’s very noticeable. For most users, arm’s-length photography is an extremely common use case: think objects you hold, a dish of food or an iced matcha, your pet; you probably take photos at this distance every day. And if you do, you’ll have encountered your iPhone switching, at times rapidly, between the ultra wide ‘macro’ lens and the regular main camera — the latter of which produces nice natural bokeh and has far higher image quality. It’s been several years of this now, and it’s time to call it out as a serious user experience annoyance that I hope can be fixed in the future. This is, incidentally, one of the reasons why our app Halide does not auto-switch lenses.


We love a good 2× lens
Shooting at 2× on iPhone 17 Pro did produce noticeably better shots; I believe this can be chalked up to significantly better processing for these ‘crop shots’. Many people think Apple is dishonest in calling this an ‘optical quality’ zoom, but it’s certainly not a regular digital zoom either. I am very content with it, and I was a serious doubter when it was introduced.

The entire camera array continues to impress in how it works in unison: this year, more than ever, my shots were very well matched in color and color temperature, and zooming between lenses was smoother than I’d ever seen.
It’s wild that they pull this off with 3 different camera sensors and lenses. It’s essentially invisible to the average user, and that’s a real feat. No other company does this as well: pick up an Android phone and go through their copy of the iOS Camera zoom wheel to see for yourself sometime.
4× the Charm
I have previously written perhaps one too many love letters to the 3× camera lens that the iPhone 13 Pro and 14 Pro had. While it had a small sensor, its focal length was just such a delight; 75mm is one of my favorite go-to lenses. Shooting with longer lenses is a careful balance of framing, and it gets harder the longer the focal length is.


I had to actually think about achieving a nice scene here with the telephoto lens, rather than the ultra-wide’s ‘shoot my view’ approach.
Creative compositions are much easier when you get to select what not to include than when you have to focus all attention on one thing; the devil is in the detail.

Satisfying compositions are everywhere if you start looking for them. Here’s a bridge vs. bridge shot.

The previous iPhone traded some image quality in the common zoom range (2–4×) for reach. I found the 16 Pro’s 5× lens reach spectacular, but creatively challenging at times for that reason. There was also a tremendous gap in image quality between a 3×–4× equivalent crop of the Main camera and the telephoto, which made missing an optical lens at that range even more painful.
4× is an elegant solution; while I do still miss 3× — 3.5× would’ve been perfect, but admittedly not nearly as numerically satisfying as 1-2-4-8× — the lens’ focal length is fantastic for portraiture and details alike, and its larger sensor renders impressive detail:

Even in low light, the lens performs admirably — due to a multitude of factors: excellent top-tier sensor stabilization in 3D space, software stabilization, good processing and a larger sensor.
It is still very much reliant on processing and Night Mode compared to the Main camera, however — expect those nighttime shots to require ProRAW and / or Night Mode to get the most out of a shot.


This is a pretty significant improvement over the way the previous 5× lens handled a dark scene.
Even then, things will look fairly ‘smoothed over’:


Detail in the buildings here is entirely smoothed over by noise reduction. The sensor is larger, but a long lens on a still relatively small sensor simply means noisy images at night, which shows up as heavy noise reduction in the shadows.
Regardless, this is a tremendous telephoto upgrade, and if you’re as much of a telephoto lover as I am, it might well be reason alone to upgrade.
Are the 48 megapixel details truly visible? Well, judge for yourself:


I find that the resolution is great, though the lens is a bit soft.


I found that the softness of the lens, and its lack of heavy-handed sharpening in post (at least in Halide’s Process Zero or Camera’s ProRAW capture mode), felt downright atmospheric. You can always add sharpening later if you want that effect.
I like this softness, myself; it is to Apple’s great credit that there isn’t some heavy-handed sharpening algorithm pushing these images to look artificially sharper.
It renders very naturally, extremely flattering for portraiture, and showcases processing restraint that I haven’t seen from many modern phone makers. Bravo.



It also has another trick up its sleeve thanks to those extra pixels and processing: an additional lens unlocked by cropping the center 12 MP area of the image, along with some magical processing. Does it really work?
8× Feature’s a Feat
The overall experience of shooting a lens this long should not be this good. I’ve not seen it mentioned in reviews, but the fact that a 200mm lens somehow stays steady, instead of being an exercise in tremendous frustration, is astonishing. Apple is using both its very best hardware stabilization on this camera and software stabilization, as seen in features like Action Mode.
You will notice this while using the camera at this zoom level: the image will at times appear to warp in areas of your viewfinder, or lag behind your movement a little bit. The only way to truly communicate how impressive this is is to grab a 200mm lens and hand-hold it: you’ll find that it magnifies the small movements of your hand so much that it is really hard to frame a shot unless you brace it.
And then there’s the images from this new, optimized center-crop zoom.


To say I’ve been impressed with the output would be an understatement.


Sometimes you get a little bit of a comedic effect as you realize you are seeing things through the telephoto lens you hadn’t even noticed or can’t quite make out with your own eyes:


And other times you become that stereotypical bird photographer (or in my case, a wanna-be). I will note that even with its magical stabilization, getting a picture of something in rapid motion is a bit of a challenge…


… but the results are truly magical if you do nail it. Is this tack sharp?

Well, no, but this is a 500% crop from a phone sensor shooting a fast-moving bird at 200mm on a cloudy day. I am pretty impressed.
It allows for some astonishing zoom range through the entire system.





I mentioned it before, but I want to reiterate it because it’s such a fun creative exercise for anyone with this phone: I believe that the longer the lens, the more of your skills in creating beautiful compositions and photos will be challenged. It’s just not that easy — but it also means you suddenly find different beautiful photos in what was previously a single frame:



The details are often prettier than the whole thing. Now I get to choose what story my image tells. What caught my eye, or what made the moment so magical. In video this is also lots of fun; I will post some Kino shorts on our Instagram to highlight the fun of moving video details of a scene.
Another example: here, Big Ben can take center stage. As I shoot at 4×, I get an ‘obvious’ composition:


At 8×, I am presented with a choice: I can capture the tower, or the throng of people crossing the bridge as the evening sun lights up the dust in the air:


I like what this does for you as a photographer. Creativity, like many things, functions as a muscle. Training it constantly, and stimulating yourself by forcing creative thought, is what helps you become better at the craft.
This is a little artistic composition gym in your pocket. Use it.

Trust the Process
As we mentioned in our last post, algorithms are about as important as — perhaps more important than — the lens on your camera today. There’s a word for that: processing. We’re keenly aware of just how many people are at times frustrated with the processing an iPhone does to its imagery. It’s a phenomenon that comes from a place of exceptional luxury: without its mighty, advanced processing, an iPhone would produce a far less usable image for most people in many conditions.
I believe the frustration often lies in the ‘intelligence’ of processing making decisions in image editing that you might consider heavy handed. Other times, it might be simply reducing noise that makes an image look smudgy in low light.

Processing makes curious mistakes at times. Here, a telephoto image came out looking a bit mangled.

Image processing is the one area where phones handily beat dedicated cameras, for the simple reason that phones have far more processing power at their disposal and need to do more to get a great image from an exceptionally small image sensor. We review it as intensely, then, as a new bit of hardware. How does it stack up this year?
Well, it’s somewhat different.

iPhone 17 Pro above, iPhone 16 Pro below

On the Main camera, don’t expect huge changes. I found detail to be somewhat more natural from the Ultra Wide camera, but even there it seemed somewhat random whether the results were truly, consistently better. Image processing pipelines are so complex now that it’s hard to get a great sense of the changes in just a week. The images overall felt a bit more natural to me, though — although I still prefer shooting native RAW and Process Zero shots when I have the option.
As I mentioned in the earlier section, it is truly noticeable that the 2× mode on the Main camera is a lot better. Not only is the result sharper, it also just looks less visibly ‘processed’; a real win considering Apple claims this is actually due to more processing!
Finally, you might wonder: if these images are a bit better processed and all this being software, why isn’t this simply being rolled out to the older iPhones just the same? Is Apple purposefully limiting the best image quality to just the latest iPhones?
The answer is yes, though not through inaction or some kind of malevolent, crooked capitalist lever to force you to upgrade. Software in itself might be easily ported across devices, but image pipelines like the one we see on iPhone 17 Pro are immensely integrated and optimized. It’s quite likely that the chip itself, along with the hardware between the chip and sensor, is specifically designed to handle this series’ unique image processing. Porting it over to an older phone is likely impossible for that reason alone.
Video for Pros
This is mainly a photography review, but I also increasingly shoot video and make an app for it. iPhone 17 Pro has some absolutely wild features for pro video. They put the capital P in Pro; things like Genlock and ProRes RAW are far beyond what even advanced amateur users will likely use.




Video stills of graded Log footage from our app Kino
That being said, these features aren’t just for Hollywood. While it’s true that some of these latest ultra-powerful video pro features will allow the iPhone to become even more of a pro workhorse in terms of capturing shots and become usable in significant productions, the introduction of Apple Log with iPhone 15 Pro and other technologies are really just fuel for developers to run with.
When we built Kino, we wanted to make it so you can actually use things like Apple Log and the Pro iPhone’s video making advancements without an education in the fine art of color grading in desktop software and learning what shutter angle is.
Adding technologies like this not only makes the iPhone a truly ‘serious camera’; because it’s a platform for development, it also creates use cases for these technologies that have not been possible in traditional cameras used for photography and videography.
This is super exciting stuff, and I think we’ll see the entire field evolve significantly as a result. With this set of new features — Open Gate recording, ProRes RAW, Apple Log 2 — Apple is continuing to build an impressive set of technologies that let it rival dedicated cinematic cameras without compromising on the best part of the iPhone: that it’s really a smartphone, which can be anything you want it to be.
A Material Change
Everything’s new on this phone, appearance-wise: a return to aluminum is welcome. The new design cools itself much better, and that’s noticeable when you shoot a lot. It feels great in the hand and will hopefully age as nicely as my other aluminum workhorses from Apple. Apple even markets it as being especially rugged:



On the other hand, its other user-facing aspect — iOS itself — has also gotten a new material shift.
Liquid Glass is here with iOS 26, and it brings an entirely new Camera app design, some much-desired improvements to the Photos app, and a general facelift for the OS. While this isn’t an iOS review, I will say that it’s beautiful, and I’m a fan of Liquid Glass. iOS 26, however, has been off to a bit of a rough start: I ran into a lot of bugs even with the latest updates installed on the iPhone 17 Pro, from bad performance (OK) to photos not showing up for a long time, to distorted images and the camera app freezing or being unusable (not so OK).

It seems all telephoto images shot in native RAW have this light band artifact on the left side of the frame. Not great.

Big releases are ambitious, and difficult to pull off. I give tremendous credit to the teams at Apple for shipping iOS 26 along with these new devices, but in everyday use it truly felt like using a beta release. The constant issues I ran into did not make me feel like I was using a release candidate of an operating system.*
*Feedback reports on these issues have been sent to Apple.
Conclusion
I think the iPhone Air serves a very important purpose: it allows Apple to make one phone a jewelry-like, beautiful device that is like a pane of glass, and another that is decidedly like the Apple Watch Ultra: bigger, bulkier and more rugged.

For years, I was a bit annoyed at the shininess and jewel-like qualities of the Pro, and to be entirely honest, I do now miss it a little bit. This is a beast in both performance and appearance, and it feels almost a little unlike Apple. I think, however, that the direction is correct and significant.
Our phones are such a central part of our lives now that it feels significant to be able to choose a product that prioritizes the true ‘pro’ — much like MacBook Pro did, in a fantastic way, with the thicker, bulkier M1 series.
This, then, might be the first ‘workhorse SLR’ of the iPhone family, if the regular iPhone is a simple Kodak Brownie. In that, some of the simplicity that delighted in the first iPhone may have been lost — but the acknowledgement that complexity is not the enemy is a significant and good step. As a camera, it is first and foremost a tool of creative expression: gaining permission to become more fine-tuned for that purpose makes it truly powerful.
It’s left as an exercise to the user to excel at their purpose as much as the phone does.
All images in this review were taken on iPhone 17 Pro (unless otherwise noted). Most photos were taken with an in-development version of Halide Mark III with built-in grades for adjustment, with a smaller portion taken with Apple Camera in ProRAW, plus stills from the Kino app using built-in grades.
-
Requiem for the Rangefinder: An iPhone Air Review
Last week I set out to write a few thousand words on the iPhone Air, but it turns out I only need three: the lesser iPhone. Compared to the Pro and baseline models, it has fewer cameras and a smaller battery. For an extra $100, you can upgrade to an iPhone Pro and get power features like ProRAW and LiDAR. What was Apple thinking?
Every few years, Apple tests a new product category with a «wildcard» iPhone. In 2015, that was a Plus-sized screen. In 2017, the iPhone X ditched the home button and gained a notch. In 2019, the Pro introduced bleeding-edge technology at a premium price point.
Some experiments flop. For years, people begged for a smaller iPhone, so Apple delivered the iPhone Mini in 2020 to lukewarm sales. I’d wager it was because, 13 years after the iPhone’s debut, we now use our phones like we used to use computers. The era of small screens is over.
From the mini’s ashes comes the Air, a phone as easy on your hands as it is on your eyes. It may be as droppable as any modern iPhone, but the double Ceramic Shield and titanium frame make it as durable as ever.
Last week I set out to write a few thousand words on the iPhone Air, but found my mind pulled in another direction, to an iconic camera design. You may not know its name, but you know its work.



Guerrillero Heroico, V-J Day in Times Square, Raising a Flag over the Reichstag
Invented by Oskar Barnack in 1913, the compact 35mm rangefinder may be the most influential camera of the 20th century.

M6 Titanium

By modern standards, early rangefinders were lesser cameras, lacking auto focus and auto exposure.


In many ways, the rangefinder is outright flawed. It’s hard to frame close shots, it doesn’t do macro, and zoom lenses don’t exist. This isn’t a camera for National Geographic. Yet thanks to its compact size, durability and stealth, the 35mm rangefinder excelled at candid portraiture, street photography and journalism.

D-Day, from Robert Capa’s The Magnificent Eleven, shot on a Contax II

SLR cameras addressed the flaws, winning the hearts and wallets of consumers by trading size and noise for convenience. Still, there’s something about the rangefinder that feels perfect. When compact digital cameras removed the need for film or mirrors, a decade of experimentation converged on designs that resembled 35mm rangefinders, minus one important feature: taste.

Fujifilm FinePix F10, 2006, via Wikipedia

In a world of consumer electronics made of cheap plastic and garish logos, the iPod proved people would pay a premium for consumer electronics with beautiful aesthetics. So in 2010, Fujifilm tried a bold experiment. They designed a camera with the conveniences of a modern point-and-shoot and a fixed 35mm lens, and wrapped it in the aesthetics of the classic rangefinders.

Fujifilm X100VI

Their X100 should have been a swan song for a bygone era. Within a few years, the point-and-shoot market collapsed as normal people realized smartphones were good enough. Yet even though the X100 debuted at $1,199 (twice the price of an unlocked iPhone 4), it proved a smash hit, defining a new camera category.
15 years later, Fujifilm just launched their high-end, $6,000 variant, the GFX100RF. The RF stands for rangefinder, but it refers to the design language, not the hardware. Today, «rangefinder style» means, «a beautiful, rugged point-and-shoot with a fixed, wide angle lens.» It’s a device that functions as both camera and fashion accessory. Does this sound familiar?

The Air distills an iPhone to its spirit. While the iPhone Pro’s bevy of lenses makes it perfect for a trip to the Galapagos, the Air seems perfect for street photography, journalism, and candid portraits.
Is one lens really enough? Will you miss ProRAW and LiDAR? To put this to a test, I took to New York with an iPhone Air and an M6.





A mixture of iPhone and Film photos. Which is which? Keep reading.
The Natural Focal Lengths
Before we dig into the iPhone, let’s talk about lenses in general. Why are 50mm and 35mm the most popular focal lengths for documentary work? There’s a myth that 50mm approximates human vision. In fact, our entire field of view corresponds to something like a 17mm lens, but visual perception is more nuanced than a single number.
Humans actually see on two levels. Our peripheral vision is very wide, but low detail. It probably evolved to spot predators out of the corner of our eye. We also have a narrow but high detail central vision, which you’re using right now to read these words. Central vision is about 43mm, which sits between 50mm and 35mm.
I’m not saying scientists met with lens makers to arrive at those numbers. Photographers probably just bought more of those lenses because they felt right. Still, it’s interesting there’s physiology to back it up.
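That 43mm figure lines up with a bit of lens lore worth checking: the classic definition of a ‘normal’ lens is one whose focal length equals the sensor’s diagonal. A quick sketch (the 36×24mm full-frame dimensions are standard; the rest is just the Pythagorean theorem):

```python
import math

def normal_focal_length(width_mm: float, height_mm: float) -> float:
    """The 'normal' focal length for a sensor format is its diagonal,
    which gives a diagonal field of view of roughly 53 degrees."""
    return math.hypot(width_mm, height_mm)

# Full-frame (36x24mm) sensor:
print(normal_focal_length(36, 24))  # ≈ 43.3mm
```

So a 43mm lens isn’t just comfortable between 50mm and 35mm; it is, by the textbook definition, the normal lens for full frame.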
Anyway, if you go from 35mm to 28mm, you get a little extra breathing room. It comes in handy in close quarters or wide expanses.

Shot on film. 28mm focal length.

Of course you have to deal with more unwanted stuff in your shots.

Shot on film, 28mm focal length

But you can always crop to 35mm.

Shot on film. 28mm, cropped.

If you don’t know what lens you’ll need for the day, there’s a simple rule of thumb. Can you only carry one lens? Make it a 35mm. Can you carry two? Make them 50mm and 28mm.
I made the mistake of only packing my 50mm for my trip to Grand Central, but the 26mm on the iPhone Air came to the rescue.


50mm shot on film vs 26mm on the iPhone Air.
Will you miss the ultra-wide lens, a staple of almost every iPhone for the last six years? There’s an easy way to check. In the Photos app on a Mac, create a new Smart Album.

Focal length is native sensor size, not full-frame equivalent

I found only three photos from the last year that made me go, «I’m glad I had that ultra-wide!» The first was the 7-mile-wide Hubbard Glacier in Alaska.

Glacier at Disenchantment Bay, Alaska, shot on the iPhone 16 Pro Ultra-Wide Lens

The second was the exterior of the Oculus:

Shot on the iPhone 16 Pro Ultra Wide

The third wasn’t wide at all! Don’t forget that lens doubles as a macro.

Shot on iPhone 16 Pro

I bet I could get away with the panorama mode in Apple’s camera, but it’s a bit disappointing to lose macro. Halide may have a macro feature that works on every iPhone, but we’re the first to warn users that software cannot match a true macro lens.
If you love bug shots, the Air is not for you. But the available focal lengths are more than enough for the rangefinder crowd.
Computational Photography and (Lack of) ProRAW
Now that we’ve gotten composition out of the way, let’s talk about image quality. By that I mean algorithms.
Camera algorithms are a Faustian deal. Sure, they «fix» photos, raising shadows and taming highlights, but it costs you control. Compare the earlier shot of the Oculus on film to the default shot out of the first-party camera.

I know this comes down to taste, but after seeing the dramatic contrast of the black and white film earlier, this all-too-perfect lighting feels wrong. It makes me as uncomfortable as staring into the cold dead eyes of generative AI.
Let me get this out of the way: I am not one of those elitists who resent how the iPhone has become Gen-Z’s gateway to photography. I’m glad we’re at the point where beginners don’t need to get bogged down in technical details like film ISO and f-stops before they can get a decent photo, let alone something you’d hang on your wall.
The issue is that «fixing» the lighting in photos means wrestling contrast from the hand of the photographer. Contrast is one of the photographer’s most powerful tools!
Apple addressed this in 2020 when they released the image format they call ProRAW. If you’re interested in its tradeoffs, we wrote a few thousand words about them, but in short, ProRAWs are not RAWs in the traditional sense. They are a semi-baked version of Apple’s computational photography, with methods to turn down effects like tone-mapping and sharpening. That’s all moot in the case of the Air, as Apple restricts ProRAW to its Pro models.
ProRAW hasn’t changed much since its introduction in 2020. Instead, Apple has focused its resources on a new feature called «photographic styles.» In addition to color presets, you have access to a new «tone» control. Maybe you won’t get the latitude of ProRAW, but maybe we can match the film look?

Photographic Style

Not bad, but there are two problems. One, unlike ProRAW, Apple has limited this control to their Photos app. You can’t tweak tone in third-party apps like Lightroom or Halide. The second problem occurs when you zoom in.

Notice a lack of texture. That’s because this photo was not generated from a single capture. My iPhone took a series of captures, and merged them together to improve dynamic range and reduce noise. There’s nothing you can do about this with Photographic Styles. Even ProRAWs have limited control over this, because noise reduction is a byproduct of Apple’s algorithms.
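The merge step described above is easy to demonstrate in miniature. This is a toy sketch of the core idea only — plain averaging of pre-aligned frames; a real pipeline adds alignment, ghost rejection and tone mapping — but it shows why a burst beats a single exposure:

```python
import numpy as np

def merge_frames(frames):
    """Average a burst of aligned frames.

    With N frames of independent, zero-mean noise, averaging cuts the
    noise standard deviation by a factor of sqrt(N): the core idea
    behind multi-frame noise reduction.
    """
    stack = np.stack(frames).astype(np.float64)
    return stack.mean(axis=0)

# Simulate a burst: a flat gray scene plus fresh sensor noise per frame.
rng = np.random.default_rng(0)
scene = np.full((64, 64), 0.5)
burst = [scene + rng.normal(0, 0.1, scene.shape) for _ in range(16)]
merged = merge_frames(burst)
print(burst[0].std(), merged.std())  # merged noise is roughly 4x lower
```

That noise reduction is exactly what eats the texture: the grain you see in a single frame is, to the merge, just more noise to average away.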
Whenever people accuse their phone of applying digital makeup to faces, or textured objects turning to plastic, this is what they’re talking about. When your annoying hipster friend goes on and on about «the warmth of analog,» they’re talking about film grain, the extra texture caused by the random activation of silver halides as light strikes emulsion.

Film grain

Digital cameras may act differently than film, but many people (myself included) find that the noise from a digital camera sensor adds an organic quality. The good news is that you can get that texture back by capturing a traditional Bayer (a.k.a. «Native», a.k.a. «Real») RAW. Every iPhone since the iPhone 6S supports Bayer RAW capture.

Bayer RAW noise, shot on the iPhone Air

Thanks to the binning on the 48 MP quad-Bayer sensor, the noise is soft and subtle. Maybe too subtle! We’ve gotten requests on our Discord for more texture, so I whipped up some synthetic grain.
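For the curious, the gist of synthetic grain fits in a few lines. This is a hypothetical sketch of the general idea, not the implementation we actually shipped: overlay zero-mean noise, weighted toward the midtones where film grain is most visible.

```python
import numpy as np

def add_grain(image, strength=0.04, seed=None):
    """Overlay synthetic grain on a grayscale float image in [0, 1].

    Grain is zero-mean Gaussian noise, scaled so midtones receive more
    visible grain than deep shadows or blown highlights, a rough nod
    to how silver halide responds to exposure.
    """
    rng = np.random.default_rng(seed)
    noise = rng.normal(0.0, strength, image.shape)
    weight = 1.0 - np.abs(image - 0.5) * 2.0  # peaks at middle gray
    return np.clip(image + noise * weight, 0.0, 1.0)
```

A production version would work per-channel, vary grain size with simulated film stock, and respect the image’s tone curve, but the midtone weighting above is the part that makes it read as grain rather than static.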

Anyway, let’s compare film, photographic styles, and Bayer RAW.



M6, Photographic Styles, and Halide Mark III
One thing you’ll miss from ProRAW is editing latitude. When shooting high dynamic range scenes, you can bring out details in the shadows that you didn’t even know exist. Bayer RAWs can push and pull exposure a few stops, but they can’t work miracles. For many people, that’s a serious drawback. For me? It makes things more fun.
Like every mid-century camera, classic rangefinders lacked auto focus and auto exposure, forcing you to think through every shot. They were technically obsolete by the 1970s, when SLRs like the Canon AE-1 tackled automatic exposure. By the 1980s, we had auto focus.
Yet the fully manual nature of classic rangefinders still captivates camera nerds 40 years later. There’s just something about knowing that you, not the machine, took the photo. If you feel the same way, the lack of ProRAW makes the Air more of a camera-camera than the iPhone Pro.
A Camera for the Present Moment

Billionaire’s Row, Shot on the iPhone If I could pinpoint the moment the iPhone became the definitive camera for breaking news, it was January 15, 2009.

By 2012, you’d see iPhone 4S photos on the cover of Time Magazine.

The iPhone is so important for capturing once-in-a-lifetime moments that every iPhone now ships with a dedicated capture button. But how do we test an iPhone’s ability to capture history?
Luckily, I live in a crumbling empire. Shortly before I started this review, America’s mad king assaulted the first amendment.

Film 
iPhone 
iPhone 
Film 

Film, iPhone
One hundred years later, black and white remains the best look for a nation’s spiral into fascism.
This march didn’t start as a protest for Jimmy Kimmel. Officially, this was the Make Billionaires Pay March, a protest against climate abuse by billionaires. One highlight was the papier-mâché effigies of Elon and Bezos.


Film


iPhone Air
The centerpiece of the march was the 160-foot-long Climate Polluter’s Bill, detailing $5 trillion of damage caused by climate change in the last ten years.


Film and iPhone
I think the reason the rangefinder captured so many great candid moments came down to its humble presentation. It didn’t scream «Camera!» like its contemporaries.

Via Wikipedia Today, carrying any sort of dedicated camera draws attention. In the past, this might have worked to a reporter’s advantage, but today it makes you feel like a target.
If our country continues its descent into authoritarianism, the most important feature of our cameras will be security. At the moment, the iPhone is the most secure camera in the world. At the moment, you can download third-party apps like Signal for anonymous, end-to-end encryption. How long will this last? As long as we keep talking about it.
Film Intermission


Shot on Film The Lost Art of Building Things That Last
If I treat my decades-old cameras right, they’ll last decades more. They never beg for software updates. I never wake up one morning to find the dials changed size and shape. It makes me happy thinking of a world before software.
Yes, I’m a developer, and I can’t look away from the version of iOS that shipped on these phones. To be clear, I’m not talking about aesthetics.

Moments after launching iOS 26 for the first time I don’t think the problem rests with their designers or engineers. These small bugs look like the same mistakes I’ve made myself countless times. Whenever they’ve slipped into a release, it’s generally because I ran out of time to find and fix them.
It feels like Apple rushed things out the door to make a Fall 2025 release. With another year of work— maybe just another few months— this could have been a smash hit. Instead we read stories about battery drain, accessibility, and other unforced errors.
The irony is that if you hold off on buying a new iPhone, you can also hold off on upgrading iOS until the bugs get worked out. The people who will have the worst experience paid $1,000 at launch for a device running a beta OS.

Shot on Film Whatever Happened to Leitz Camera?
The M6, launched in 1984, is widely regarded as Leica at its peak. It added a light meter for convenience, but if you don’t like it, just remove the battery. The camera remained fully functional without power.
In 2002, Leica launched the M7, their first model with semi-automatic exposure. It drew backlash for adding electronics, which left you with limited control if the battery died. They responded with the Leica MP («Mechanical Perfection») in 2003, which dropped the electronics and essentially backtracked to the M6.
Leica was in a bad position. While the rest of the camera industry transitioned from film to digital, Leica was stuck serving a niche fan base of analog purists. Their first consumer digital camera was nothing more than a reskinned Fujifilm point-and-shoot. They later partnered with Panasonic on the compact Digilux 1 point-and-shoot, which failed to pay the bills.
By 2004, Leica was on the verge of financial collapse. It was saved by Andreas Kaufmann, heir to a 1.5 billion euro inheritance from his aunt. Kaufmann bought a major stake in the company and set out to return them to profitability. Two years later, they launched their first digital rangefinder, the infamous M8. The infrared filter on its sensor failed to do its job, causing ugly IR interference, a problem mitigated by recalls.
Meanwhile, the company juiced revenue by slapping its logo on everything from Panasonic point-and-shoots to Fujifilm instant cameras, and now Android phones and silly iPhone accessories. I guess the real money is in merchandising.

The Leica Supreme Collab Let’s be honest, Leica was a status symbol long before its pivot into pure-brand. While war photographers went with Contax, artists took to Leica.

Stanley Kubrick Even if the classic M was more status symbol than tool, at least the engineering justified its price tag. Every device felt like a work of art, hand-assembled in their factory in Wetzlar. Today, they crank out many products on Chinese assembly lines, if the tariff-driven price hikes didn’t already tip you off.
Leica’s optics used to be unparalleled, but today’s Voigtländer glass is every bit as good for a fraction of the price. In fact, every film photo in this post shot at 28mm was taken with a Voigtländer.
Influencers aside, I don’t know any working photographers shooting on Leica digital cameras. That doesn’t seem to worry the company. In their own words, they make «jewelry.»

Today, 55% of Leica continues to be owned by Kaufmann’s investment firm, and the other 45% is owned by the Blackstone private equity group. Maybe the company will continue to print money for decades to come, like Hermès and De Beers. Or maybe brand saturation will make it lose its cool, like Supreme.
Regardless, Leitz Camera, the company where Oskar Barnack invented the 35mm camera 112 years ago, is dead.
Leica earned its reputation from stellar engineering. Precise, hand-assembled cameras require a high price, which accidentally made them a status symbol. It also put them in a precarious position as technology marched on.
Apple’s greatest strength in the new millennium was its lack of nostalgia or reverence. Had another company invented such iconic products as the iMac or iPod, they would have milked those designs for decades— I remember rumors that the first iPhone would feature a click wheel! Yet time and again, Apple has discontinued successful products years before they outstay their welcome, so they can make room for the next big thing.
Apple’s engineering and taste earned it a spot alongside Leica or Porsche, but this proved both a blessing and a distraction. They tried to get into high fashion with a $10,000 solid gold Apple Watch, and it flopped because they went about things backwards. At launch, Apple didn’t fully understand why the Apple Watch should exist, and they hid that with marketing until customers told them, «This is for fitness.» It’s ironic that if they hadn’t shot their shot at launch, I bet they could release a gold Apple Watch today.
Apple is known for beautiful, well engineered products, and I worry they damaged that reputation to hit an arbitrary deadline. I worry about Apple losing its sense of taste, as they send tacky push notifications to our Wallets to promote a movie, and sacrifice valuable screen real estate to promote paid services.


Apple still makes the best products in the world, and I still buy them, but I hope someone in Cupertino is minding the course. Their biggest threat isn’t an Android as good as the iPhone, any more than Per Se should worry about Gray’s Papaya. The only threat to Apple is Apple.


Wall Street, Shot on iPhone
The Verdict
Since it doesn’t have a rangefinder, I won’t call it the modern rangefinder. The iPhone Air is the spiritual successor to the Leica M6.
It isn’t a camera for beginners, and you won’t take it on a safari, but the Air’s small size, discreet operation, and unmatched durability make it ideal for street photography, journalism, and candid portraits. You can buy phones with similar specs for half the price, but the premium pays for a beautiful piece of kit that is one-part tool, and one-part fashion accessory.
It’s a camera that distills photography to its essence. It may have less, but that’s what makes it fun. When you tap the capture button, you know that you, not the machine, took the photo.

This article may contain affiliate links.
No AI was used in this article’s production.
All product photos were shot on an iPhone 16 Pro with Halide. All street photography was captured on an M6 or iPhone Air running a pre-release build of Halide Mark III and its built-in grades.
-
What is HDR, anyway?
It’s not you. HDR confuses tons of people.
Last year we announced HDR or «High Dynamic Range» photography was coming to our popular photography app, Halide. While most customers celebrated, some were confused, and others showed downright concern. That’s because HDR can mean two different, but related, things.
The first HDR is the «HDR mode» introduced to the iPhone camera in 2010.

September, 2010 The second HDR involves new screens that display more vibrant, detailed images. Shopped for a TV recently? No doubt you’ve seen stickers like this:

This post finally explains what HDR actually means, the problems it presents, and three ways to solve them.
What is Dynamic Range?
Let’s start with a real-world problem. Before smartphones, it was impossible to capture great sunsets with point-and-shoot cameras. No matter how you fiddled with the dials, everything came out too bright or too dark.


The result of trying to capture a sunset with an old-school camera.
In that photo, the problem comes down to the different light levels coming from the sky and the buildings in shadow; the former emits thousands of times more light than the latter. Our eyes can see both just fine. Cameras? They can deal with overall bright lighting or overall dim lighting, but they struggle with scenes that contain both really dark and really bright spots.


Dynamic range simply means, «the difference between the darkest and brightest bits of a scene.» For example, this foggy morning is an example of a low dynamic range scene, because everything is sort of gray.

Screens have no trouble showing this low-contrast photo. Shot with Halide in Osaka. Most of our photos aren’t as extreme as bright sunsets or foggy mornings. We’ll just call those «standard dynamic range» or SDR scenes.
Before we move on, we need to highlight that the HDR problem isn’t limited to cameras. Even if you had a perfect camera that could match human vision, most screens cannot produce enough contrast to match the real world.
Regardless of your bottleneck, when a scene contains more dynamic range than your camera can capture or your screen can pump out, you lose highlights, shadows, or both.
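Photographers usually measure dynamic range in stops, where each stop is a doubling of light. To make the definition concrete, here’s a quick sketch; the luminance numbers are illustrative, not measurements:

```python
import math

def dynamic_range_stops(brightest, darkest):
    """Dynamic range of a scene in stops (doublings of light),
    given its brightest and darkest luminance values."""
    return math.log2(brightest / darkest)

# A sunset sky can be thousands of times brighter than a shadowed
# building, while a foggy morning is mostly gray:
sunset = dynamic_range_stops(10_000, 1)  # roughly 13 stops
fog = dynamic_range_stops(200, 50)       # only 2 stops
```

A typical SDR screen manages somewhere under 10 stops of contrast, which is why the sunset scene above overwhelms it while the foggy one doesn’t.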
Solution 1: «HDR Mode»
In the 1990s researchers came up with algorithms to tackle the dynamic range problem. The algorithms started by taking a bunch of photos with different settings to capture more highlights and shadows:



This full sequence has 16 photos. Via Paul Debevec.
Then the algorithms combined everything into a single «photo» that matches human vision… a photo that was useless, since computer screens couldn’t display HDR. So these researchers also came up with algorithms to squeeze HDR values onto an SDR screen, which they called «Tone Mapping.»

The Reinhard Tone Mapper, invented in 2002. It is one of many. These algorithms soon found their way into commercial software for camera nerds.
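The curve in the figure, in its simplest global form, is compact enough to write out. This sketch implements the classic Reinhard mapping L / (1 + L), which squeezes any scene luminance into the [0, 1) range an SDR screen can show; the full operator from the 2002 paper adds more controls on top of this.

```python
def reinhard(luminance):
    """Simplest global Reinhard operator: L_out = L / (1 + L).

    Input is scene luminance scaled so 1.0 is middle brightness;
    output always lands in [0, 1), so it fits on an SDR screen.
    Dark values pass through nearly unchanged, while very bright
    values compress toward (but never reach) pure white.
    """
    return luminance / (1.0 + luminance)

for L in (0.1, 1.0, 10.0, 1000.0):
    print(L, "->", round(reinhard(L), 3))
```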

Photomatix Circa 2008 Unfortunately, these packages required a lot of fiddling, and too many photographers in the mid-2000s… lacked restraint.

The Ed Hardy t-shirt of photography. Via Wikipedia. Taste aside, average people don’t like fiddling with sliders. Most people want to tap a button and get a photo that looks closer to what they see without thinking about it. So Google and Apple went an extra step in their camera apps.
Your modern phone’s camera first captures a series of photos at various brightness levels, like we showed a moment ago. From this burst of photos, the app calculates an HDR image, but unlike that commercial software from earlier, it uses complex logic and AI to make the tone mapping choices for you.
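Apple’s and Google’s merge pipelines are proprietary, but the gist of combining a bracket resembles classic exposure fusion: trust each frame where it is well exposed, and scale everything back to a common brightness. This is a toy sketch of that idea; the mid-tone weighting function and the EV bookkeeping are simplified assumptions, not anyone’s shipping algorithm.

```python
def merge_bracket(exposures, ev_offsets):
    """Merge bracketed shots into one HDR estimate, pixel by pixel.

    exposures: frames of [0, 1] pixel values, same scene.
    ev_offsets: exposure compensation of each frame, in stops.
    Pixels near 0.5 get the most weight (a "well-exposedness"
    heuristic); each value is scaled back to scene brightness
    by undoing its exposure shift.
    """
    merged = []
    for i in range(len(exposures[0])):
        total, weight_sum = 0.0, 0.0
        for frame, ev in zip(exposures, ev_offsets):
            v = frame[i]
            w = max(1e-6, 1.0 - abs(v - 0.5) * 2)  # favor mid-tones
            total += w * v / (2.0 ** ev)           # undo the EV shift
            weight_sum += w
        merged.append(total / weight_sum)
    return merged

# Two frames of the same scene; the second was shot +1 EV brighter,
# so its highlights clip while its shadows are cleaner:
dark = [0.1, 0.5, 0.45]
bright = [0.2, 1.0, 0.9]
hdr = merge_bracket([dark, bright], [0.0, 1.0])
```

Note how the clipped pixel (1.0 in the bright frame) gets almost no weight, so the merge falls back on the darker frame there — the same reason real merges recover blown-out skies.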

Phil Schiller at the iPhone XS introduction showing off a newer Smart HDR Apple and Google called this stuff «HDR» because «HDR Construction Followed By Automatic Tone Mapping» doesn’t exactly roll off the tongue. But just to be clear, the HDR added to the iPhone in 2010 was not HDR. The final JPEG was an SDR image that tries to replicate what you saw with your eyes. Maybe they should have called it «Fake HDR Mode.»
I know quibbling over names feels as pedantic as going, «Well actually, ‘Frankenstein’ was the doctor, you’re thinking of ‘Frankenstein’s Monster,’» but if you’re going to say you hate HDR, remember that it’s bad tone mapping that is the actual monster. That brings us to…
The First HDR Backlash
Over the years, Apple touted better and better algorithms in their camera, like Smart HDR and Deep Fusion. As this happened, we worried that our flagship photography app, Halide, would become irrelevant. Who needs manual controls when AI can do a better job?
We were surprised to watch the opposite play out. As phone cameras got smarter, users asked us to turn off these features. One issue was how the algorithms make mistakes, like this weird edge along my son Ethan’s face.

When life gives you lemons, you… eat them. That’s because Smart HDR and Deep Fusion require that the iPhone camera capture a burst of photos and stitch them together to preserve the best parts. Sometimes it goofs. Even when the algorithms behave, they come with tradeoffs.
Consider these photos I took from a boat in the Galapagos: the ProRAW version, which uses multi-photo algorithms, looks smudgier than the single-shot capture I took moments later.


Left: Merged from Multiple Photos. Right: A single exposure.
What’s likely happening? When things move in the middle of a burst capture— which always happens when shooting handheld— these algorithms have to nudge pixels around to make things line up. This sacrifices detail.
Since 2020, we’ve offered users the option of disabling Smart HDR and Deep Fusion, and it quickly became one of our most popular features.

This led us to Process Zero, our completely AI-free camera mode, which became a smash hit when we launched it last year. However, without any algorithms, HDR scenes end up over- and underexposed. Some people actually prefer the look — more on that later — but many were bummed. They just accepted this as a tradeoff for the natural aesthetic of AI-free photos.
But what if we don’t need that tradeoff? What if I told you that analog photographers captured HDR as far back as 1857?
The Great Wave by Gustave Le Gray, via The Met Ansel Adams, one of the most revered photographers of the 20th century, was a master at capturing dramatic, high dynamic range scenes.

The Tetons and the Snake River via Wikipedia It’s even more incredible that this was done on paper, which has even less dynamic range than computer screens!
From studying these analog methods, we’ve arrived at a single-shot process for handling HDR.

Halide’s new, optional tone-mapping. How do we accomplish this from a single capture? Let’s step back in time.
Learning From Analog
In the age of film negatives, photography was a three step process.
- Capture a scene on film
- Develop the film in a lab
- Transfer the film to paper
It’s important to break down these steps because— plot twist— film is actually a high dynamic range medium. You just lose the dynamic range when you transfer your photo from a negative to paper. So in the age before Photoshop, master photographers would «dodge and burn» photos to preserve details during the transfer.

An excerpt from The Print, the Ansel Adams Photography Series 3 
«Clearing Winter Storm, Yosemite National Park» Via Wikipedia. Is it a lie to dodge and burn a photo? According to Ansel Adams in The Print:
When you are making a fine print you are creating, as well as re-creating. The final image you achieve will, to quote Alfred Stieglitz, reveal what you saw and felt.
I’m inclined to agree. I don’t think people reject processing photos, whether it’s dodging and burning a print or fiddling with multi-exposure algorithms. The problem is that algorithms are not artists.
AI cannot read your mind, so it cannot honor your intent. For example, in this shot, I wanted stark contrast between light and dark. AI thought it was doing me a favor by pulling out detail in the shadow, flattening the whole image in the process. Thanks Clippy.


Same lighting, moments apart. Left: Process Zero. Right: the iPhone camera’s automatic tone mapping.
Even when tone mapping can help a photo, AI may take things too far, creating hyper-realistic images that exist in an uncanny valley. Machines cannot reason their way to your vision, or even good taste.
We think there’s room for a different approach.
A Different Approach: Opt-In, Single Shot Tone Mapping
After considerable research, experimentation, and trial and error, we’ve arrived at a tone mapper that feels true to the dodging and burning of analog photography. What makes it unique? For starters, it’s derived from a single capture, as opposed to the multi-exposure approaches that sacrifice detail. While a single capture can’t reach the dynamic range of human vision, good sensors have dynamic range approaching film.
However, the best feature is that this tone mapping is off by default. If you come across a photo that feels like it could use a little highlight or shadow recovery, you can now hop into Halide’s updated Image Lab.
0:00/0:08
In the Image Lab we have an exposure slider for adjusting overall brightness just like before. But to its right, we have a single dial that tames or boosts dynamic range. We think it’s up to the photographer to decide what feels right.
To be clear, the tone mapper works differently than simply bringing your photo into an editor and dragging the «shadows» and «highlights» sliders. It also does its best to preserve local contrast.



Left and Middle: a shot with simple exposure adjustments. Right: a tone-mapped version.
Don’t worry: adjusting this stuff after the fact won’t sacrifice quality. Since Halide captures DNG or «digital negative» files, your photo contains all of the information that your screen cannot display. The shadow and highlight details are already in there, and the tone mapping simply brings them out selectively.
Solution 2: Genuine HDR Displays
I went to all that trouble explaining the difference between HDR and Tone Mapping because… drum roll please… today’s screens are HIGHer DYNAMIC RANGE!
0:00/0:10
0:00/0:10
0:00/0:10
The atrium of the Hyatt Centric in Cambridge
Ok, today’s best screens still can’t match the high dynamic range of real life, but they’re way higher than in the past. Spend a few minutes watching Apple TV’s mesmerizing screensavers in HDR, and you get why this feels as big as the move from analog TV to HDTV. So… nine years after the introduction of HDR screens, why hasn’t the world moved on?
A big problem is that it costs the TV, Film, and Photography industries billions of dollars (and a bajillion hours of work) to upgrade their infrastructure. For context, it took well over a decade for HDTV to reach critical mass.
Another issue is taste. Much like adding a spice to your meal, you don’t want HDR to overpower everything. The garishness of bad HDR has left many filmmakers lukewarm on the technology. Just recently, cinematographer Steve Yedlin published a two hour lecture on the pitfalls of HDR in the real world.
If you want to see how bad HDR displays can get, look no further than online content creators. At some point, these thirsty influencers realized that if you make your videos uncomfortably bright, people will pause while swiping through their Instagram reels. The abuse of brightness has led to people disabling HDR altogether.

For all these reasons, I think HDR could end up another dead-end technology of the 2010s, alongside 3D televisions. However, Apple turned out to be HDR’s best salesperson, as iPhones have captured and rendered HDR photos for years.
In fact, after we launched Process Zero last year, quite a few users asked us why their photos aren’t as bright as the ones produced by Apple’s camera. The answer was compatibility, which Apple improved with iOS 18. So HDR is coming to Process Zero!

To handle the taste problem, we’re offering three levels of HDR:
- Standard: increases detail in shadows and bumps up brightness, with a tasteful rolloff in the highlights
- Max: HDR that pushes the limits of the iPhone display
- Off: turns HDR off altogether.
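Halide hasn’t published the exact curve behind the Standard setting, but a «tasteful rolloff» generally means a soft shoulder rather than a hard clip: values near the top get compressed so they approach the display’s limit gradually instead of slamming into it. A generic, hypothetical version of such a shoulder, with an illustrative `knee` parameter:

```python
def soft_rolloff(v, knee=0.8):
    """Compress values above `knee` so they approach 1.0
    asymptotically instead of clipping. The knee position and
    curve shape are illustrative, not Halide's actual tuning."""
    if v <= knee:
        return v  # mid-tones and shadows pass through untouched
    overflow = v - knee
    headroom = 1.0 - knee
    # A rational shoulder: monotonic, smooth, never exceeds 1.0.
    return knee + headroom * (overflow / (overflow + headroom))

print(round(soft_rolloff(0.5), 3))  # below the knee: unchanged
print(round(soft_rolloff(1.5), 3))  # would clip; rolls off instead
```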
Compatibility Considerations
Once you’ve got an amazing HDR photo, you’re probably wondering where you can view it today. The good news is that every iPhone shipped in the last several years supports HDR. It just isn’t always available.
As we mentioned earlier, some users turn off HDR because the content hurts their eyes, but even if it’s on, it isn’t always on. Because HDR consumes more power, iOS turns it off in low-power mode. It also turns it off when using your phone in bright sunlight, so it can pump up SDR as bright as it can go.
An even bigger issue is where you can share it online. Unfortunately, most web browsers can’t handle HDR photos. Even if you encode HDR into a JPEG, the browser might butcher the image, either reducing the contrast and making everything look flat, or clipping highlights, which is about as ugly as bad digital camera photos from the 1990s.
But wait… how did I display these HDR examples? If you look carefully, those are short HDR videos that I’ve set to loop. You might need these kinds of silly hacks to get around browser limitations.
Until recently, the best way to view HDR was with Instagram’s native iPhone app. While Instagram is our users’ most popular place to share photos… it’s Instagram. Fortunately, things are changing.
iOS 18 adopted Adobe’s approach to HDR, which Apple calls «Adaptive HDR.» In this system, your photos contain both SDR and HDR information in a single file. If an app doesn’t know what to do with the HDR information, or it can’t render HDR, there’s an SDR fallback. This stuff even works with JPEGs!
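The mechanism behind this is a gain map: a low-resolution layer stating how much brighter each region should get on a capable display. This sketch shows the idea in simplified form; the parameter names and the linear-stops encoding are my assumptions, not the exact math in Apple’s or Google’s specs.

```python
def apply_gain_map(sdr, gain_map, headroom_stops, max_gain_stops=2.0):
    """Reconstruct brighter pixels from an SDR base plus a gain map.

    gain_map values in [0, 1] encode how many extra stops each pixel
    wants, up to max_gain_stops. A display applies only as much gain
    as its headroom allows; an SDR display (0 stops) applies none,
    which is exactly the graceful fallback described above.
    """
    out = []
    for base, g in zip(sdr, gain_map):
        wanted_stops = g * max_gain_stops
        applied = min(wanted_stops, headroom_stops)
        out.append(base * (2.0 ** applied))
    return out

sdr = [0.25, 0.5, 1.0]
gain = [0.0, 0.5, 1.0]  # shadows need no boost; the sky wants 2 stops
print(apply_gain_map(sdr, gain, headroom_stops=0.0))  # [0.25, 0.5, 1.0]
print(apply_gain_map(sdr, gain, headroom_stops=2.0))  # [0.25, 1.0, 4.0]
```

The same file yields both outputs: the first is the SDR fallback, the second is what an HDR display with two stops of headroom shows.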

From Apple’s Adaptive HDR Presentation Browser support is halfway there. Google beat Apple to the punch with their own version of Adaptive HDR, called Ultra HDR, which Android 14 supports. Safari added HDR support to its developer preview, then disabled it due to bugs within iOS.
Speaking of iOS bugs, there’s a reason we aren’t launching the Halide HDR update with today’s post: HDR photos sometimes render wrong in Apple’s own Photos app! Oddly enough, they render just fine in Instagram and other third-party apps. We’ve filed a bug report with Apple, but due to how Apple releases software, we doubt we’ll see a fix until iOS 19.
Rather than inundate customer support with angry emails about how photos don’t look right in Apple’s Photos app, we’ve decided to release HDR support in our Technology Preview beta, which we’re offering to 1,000 Halide subscribers. Why limit it to 1,000? Apple restricts how many people can sign up for TestFlight, so we want to make sure we stay within those limits. This is the start of our preview of some very exciting features that are part of our big Mark III update.
If this stuff excites you and you want to try it out, go to the Members section in Settings right now.

Solution 3: Embrace SDR
As mentioned earlier, some users actually prefer SDR. And that’s OK. I think this is about more than the lo-fi aesthetic; it touches on a paradox of photography. Sometimes a less realistic photo is more engaging.
But aren’t photos about capturing reality? If that were true, we would all use pinhole cameras, ensuring we capture everything in sharp focus. If photos were about realism, nobody would shoot black and white film.

Shot on Ilford HP5, ƒ/1.4 Consider this HDR photo of my dad.
0:00/0:10
Shot in ProRAW
HDR reveals every wrinkle and pore on his face, and the bright whites in his beard draw too much attention. Just as you might use shallow focus to draw attention to your subject, this is one situation where less dynamic range feels better than hyper-realism. Consider the Process Zero version, with HDR disabled.

Process Zero, without Tone Mapping While we have plenty of work before Process Zero achieves all of our ambitions, we think dynamic range is a huge factor in recapturing the beauty of analog photography in the digital age.

Shot on film. We think tone mapping is an invaluable tool that dates back hundreds of years. We think HDR displays have amazing potential to create images we’ve never seen before. We see a future where SDR and HDR live side by side. We want to give you that choice — whether it is tone-mapping, HDR, or any combination thereof. It’s the artists’ choice — and that artist doesn’t have to be an algorithm.
We think the future of sunsets looks bright.