Apple Log Explained
https://www.lux.camera/apple-log-explained/ (Thu, 05 Mar 2026 18:31:19 GMT)

Editor's note: We make photography and video apps, so it's about time we released video versions of our essays. Check out the companion video shot in Kino 1.4, or read on for text!

Today we're launching support for Apple Log 2 in our award winning video app, Kino. Apple Log 2 is a new format that launched alongside the iPhone 17 Pro. To understand what makes it special, let's dig into the concept of Log and RAW video, why every camera maker uses a unique format, and why we took our time to add support to Kino.

A Brief Intro to Camera Processing

Every camera is opinionated, and this pre-dates modern cameras which use AI and computational photography to "fix" your photos for you. Even digital cameras from 20 years ago made creative choices on your behalf.

Both of the following photos came from the exact same sensor data. Which one reflects reality?

In real life, you might have seen either version, because our perception constantly shifts based on our surroundings. If you were looking out the window of a room lit with bright, warm lights, you probably saw the version on the left. If you turned off the lights and let your eyes adjust, you probably saw the image on the right.

That's all to say that a camera is not an objective source of truth. A lot is subject to interpretation. Let's consider a more benign, unambiguous photo, starting with a crude version of the sensor data.

[Image]

Why the green? If we zoom in, we see the separate red, green and blue values living alongside each other.

[Image]

If you look closely, you'll see there are twice as many green pixels as red or blue. That's just how most sensors are designed.

The first step of camera software is to fuse all these values together, and that requires some sophisticated algorithms to avoid introducing weird visual quirks. That "demosaic" step gets us:
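As a toy illustration of that fusing step, here's a crude bilinear-style demosaic sketch. This is not the sophisticated algorithm a real camera uses; the `BAYER_RGGB` layout is the common Bayer arrangement, and all values are invented for the example.

```python
# Toy "demosaic" sketch (illustration only, not a real camera pipeline).
# Each sensor pixel records just one of R, G, or B, laid out in a Bayer
# mosaic (RGGB here, with two greens per 2x2 block). To get a full-color
# pixel, we average each channel's samples in a 3x3 neighborhood.

BAYER_RGGB = [["R", "G"],
              ["G", "B"]]  # pattern repeats across the sensor

def channel_at(row, col):
    """Which color filter sits over this sensor pixel."""
    return BAYER_RGGB[row % 2][col % 2]

def demosaic(mosaic):
    """mosaic: 2-D list of raw values; returns 2-D list of (r, g, b)."""
    h, w = len(mosaic), len(mosaic[0])
    out = []
    for r in range(h):
        row_out = []
        for c in range(w):
            sums = {"R": [0, 0], "G": [0, 0], "B": [0, 0]}  # total, count
            for dr in (-1, 0, 1):
                for dc in (-1, 0, 1):
                    rr, cc = r + dr, c + dc
                    if 0 <= rr < h and 0 <= cc < w:
                        ch = channel_at(rr, cc)
                        sums[ch][0] += mosaic[rr][cc]
                        sums[ch][1] += 1
            row_out.append(tuple(sums[ch][0] / sums[ch][1] for ch in "RGB"))
        out.append(row_out)
    return out

# A uniform gray scene: every filter reads 0.5, so every fused pixel
# comes out neutral.
flat = [[0.5] * 4 for _ in range(4)]
print(demosaic(flat)[1][1])  # -> (0.5, 0.5, 0.5)
```

Naive averaging like this is exactly what introduces the "weird visual quirks" mentioned above (color fringing along edges); real demosaicing algorithms are far more careful around detail.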

[Image]

No, it wasn't that blue the day I captured the photo. That's because our minds adjust what we see so that black, whites, and grays appear neutral. Cameras use an algorithm to guess the right "white balance," and then apply a tint to account for it. That gives us…
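One of the simplest white-balance guesses is the classic "gray world" heuristic: assume the scene averages out to neutral gray, and treat any channel whose mean drifts from that as a color cast. Real cameras use far more sophisticated estimators; the sample scene values below are invented.

```python
# "Gray world" white balance sketch (a classic textbook heuristic,
# not what any modern camera actually ships).

def gray_world_gains(pixels):
    """pixels: list of (r, g, b) in 0..1. Returns per-channel gains."""
    n = len(pixels)
    means = [sum(p[i] for p in pixels) / n for i in range(3)]
    gray = sum(means) / 3
    # boost channels that read low, tame channels that read high
    return tuple(gray / m for m in means)

def apply_gains(pixel, gains):
    return tuple(min(1.0, v * g) for v, g in zip(pixel, gains))

# A scene with a warm (orange) cast: red runs hot, blue runs cold.
scene = [(0.8, 0.5, 0.3), (0.6, 0.4, 0.2), (0.7, 0.6, 0.4)]
gains = gray_world_gains(scene)
print(apply_gains((0.7, 0.5, 0.3), gains))  # -> roughly (0.5, 0.5, 0.5)
```

The heuristic fails exactly where you'd expect: a photo that *should* be dominated by one color (a sunset, a green field) gets incorrectly neutralized, which is why white balance is ultimately a creative call.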

[Image]

It's getting close, so let's dig into details. It looks like the clouds were too bright, blowing out the highlights, but that actually didn't happen. The data for those clouds is still there, but it's too bright to display on screen.

At the same time, the overall image feels a bit flat. That's because we're used to viewing photos with more contrast. Let's make some tonal adjustments.
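Those tonal adjustments amount to applying a tone curve. A minimal contrast "S-curve" can be sketched with the smoothstep polynomial; real tone curves are hand-tuned, so treat this as a toy example only.

```python
# Minimal contrast S-curve sketch: darken shadows, brighten highlights,
# leave the midpoint alone. The smoothstep polynomial 3x^2 - 2x^3 is a
# common toy stand-in for a hand-tuned tone curve.

def s_curve(x, strength=1.0):
    """x in 0..1. strength=0 leaves the image untouched."""
    smooth = x * x * (3 - 2 * x)          # classic smoothstep
    return (1 - strength) * x + strength * smooth

print(s_curve(0.25))  # shadows pushed down   -> 0.15625
print(s_curve(0.5))   # midtones untouched    -> 0.5
print(s_curve(0.75))  # highlights pushed up  -> 0.84375
```

The `strength` parameter is the knob a "contrast" slider would expose: blend between the identity line and the S-curve.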

[Image]

While I like the final look of my photo above, I refuse to call it correct. The amount of contrast and highlight recovery are subjective. Here's a different interpretation:

[Image]

This is just a broad overview of the steps a camera goes through to process an image. I didn't talk about removing dead pixels or correcting for lens imperfections. I just wanted to focus on a few big decisions that can lead to dramatically different photos.

It's challenging for a photographer to make these decisions at the time they capture a photo. It's much easier to sit down at a computer and experiment with different versions before committing. Unfortunately, when your camera saves a JPEG, HEIC, or MP4, you need to commit to your creative choice at the time you capture the image or video. It's really hard to edit those files after the fact.

That's because these "processed" file formats are designed for viewing, not editing. When they're saved, your camera throws away a bunch of data in order to shrink the file, to make it quick to download and display on screen.

What if we saved the image data as soon as it's captured by a sensor, but before all of these creative decisions have been made? That's a "raw" file. It's just a snapshot of sensor data, before a camera interpreted it.

The best analogy I've heard involves cake. When your camera saves a JPEG, that's like buying a pre-made cake from a bakery. It's super convenient, assuming you like how the cake was made. If you don't like raisins, you'll be miserable plucking them out.

When your camera saves a raw file, it saves the ingredients to bake a cake. Now you can bake the cake however you want. The tradeoff is that developing these files, like baking a cake from scratch, requires a little extra work.

Let's take a break for self-promotion! Halide makes developing RAW files super easy. That photo above was developed with one tap in Halide Mark III, which is right now available as a technical preview. Ok, let's move on.

What is Log Video?

RAW files are more unwieldy than JPEGs. Their file sizes make them slow to download. Once downloaded, they take longer to load than JPEGs, because those algorithms I talked about can get pretty sophisticated. These were big challenges, decades ago, but today's phones are so fast that it's easy for photographers to work with RAW files as easily as JPEGs.

Video is a very different matter. Capturing images at 24, 60, or even 120 times per second requires fast storage, and a ton of it.

The solution is Log video, which you could think of as a slightly developed RAW. It keeps most of the important data, but encodes it in a file that's much faster than RAW to read and write. It also uses much less space.

The first trick is to save the image after that first step we showed above, where the red, green, and blue pixels are merged together. In the real world, you get little in the way of creative control during the demosaic step, and skipping it makes things way easier.

The next trick involves how light values are written to the file, which takes advantage of how humans perceive light. It's worth taking a moment to explain.

Consider this ramp of color patches:

[Image]

You might think it gets 10% lighter with every patch.

[Image]

This doesn't reflect the actual light being emitted from your screen. If you measured that with a light meter, the values would look like this:

[Image]

Sorry about this tangent, but it's probably the most important concept to understand if you really want to master advanced photography and video. Humans don't have a straightforward perception of light. It's closer to this curve:

[Image: Gray line: how we see. Black line: actual light.]

This is important because camera sensors record physical light values, not human perception. This means that RAW files usually waste a ton of bits on brightness values that most human beings won't notice!

Log video exploits human perception to save those bits. Before writing the image information to the file, it boosts the shadows using a logarithmic function, hence the "log" in the name. This is why these videos look so low-contrast when you play them back in an app that doesn't know it's playing a Log video.
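The encode/decode round trip can be sketched as a toy log curve. Apple publishes the real Apple Log transfer functions in its white papers; the constant `A` below is made up purely for illustration.

```python
# Toy log encode/decode sketch (NOT Apple's actual transfer function).
# The idea: spend the file's limited code values where human vision is
# most sensitive by boosting shadows on write, then undo it on playback.

import math

A = 8.0  # toy steepness; real log formats tune constants like this

def log_encode(linear):
    """Map linear scene light (0..1) to a log code value (0..1)."""
    return math.log(1 + A * linear) / math.log(1 + A)

def log_decode(code):
    """Invert the curve: code value back to linear light."""
    return ((1 + A) ** code - 1) / A

# Deep shadow detail gets far more of the encoded range...
print(round(log_encode(0.05), 3))  # 5% scene light -> ~15% of code values

# ...and decoding recovers the original value (round trip).
x = 0.2
assert abs(log_decode(log_encode(x)) - x) < 1e-12
```

Viewing the *encoded* values directly, without `log_decode`, is exactly the washed-out, low-contrast image you see when a player doesn't know it's handling Log footage.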

[Image]

Things look more normal in a video editor, once it applies the opposite of that log function.

[Image]

We're getting close, but the images look really desaturated. That has to do with how the colors are saved, which we'll cover in a moment. Luckily, our editor has a toggle to fix this for us.

[Image]

Great! The color looks much closer to what I remember. With this neutral image, we're ready to make creative choices…

[Video: Tyler Stalman's Film 03]

Why did we have to take that extra step to fix color saturation?

Log Squashes Color

Camera sensors record a wider range of color than you can show on-screen. Consider this diagram that represents every color humans see.

[Image: It's approximate. Please don't use it for your class assignments.]

Why does it look like a horseshoe? It's a simplified scientific chart with a bit of a backstory that isn't worth going into.

[Image: Via Wikipedia]

Let's just cut out the clutter and focus on a small region of our chart.

[Image]

That triangle covers the range of colors computer screens could display for decades. Today's screens can display a bit more color, but we're often beholden to this limited range due to backward compatibility.
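Checking whether a color falls inside that triangle is a standard point-in-triangle test on the chromaticity diagram. The sketch below uses the published sRGB/Rec.709 primary chromaticities; it's illustrative, not how any shipping app handles gamut.

```python
# "Inside the triangle?" gamut sketch on the CIE xy chromaticity
# diagram, using the standard sRGB/Rec.709 primaries. The sign test on
# cross products is the usual point-in-triangle trick.

SRGB_PRIMARIES = [(0.64, 0.33), (0.30, 0.60), (0.15, 0.06)]  # R, G, B (x, y)

def _cross(o, a, b):
    """2-D cross product of vectors o->a and o->b."""
    return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])

def in_gamut(xy, triangle=SRGB_PRIMARIES):
    """True if chromaticity (x, y) lies inside the primaries' triangle."""
    r, g, b = triangle
    signs = [_cross(r, g, xy), _cross(g, b, xy), _cross(b, r, xy)]
    # inside means the point sits on the same side of all three edges
    return all(s >= 0 for s in signs) or all(s <= 0 for s in signs)

print(in_gamut((0.3127, 0.3290)))  # D65 white point -> True
print(in_gamut((0.13, 0.08)))      # a deep "LED sign" blue -> False
```

Swapping in a wider triangle (a camera's native gamut, or Apple Log 2's range) is just a different `triangle` argument, which is the whole point of the diagrams that follow.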

What happens if you try to display colors outside that triangle? Things look weird, as in the case of this blue LED sign.

[Image]

Luckily, camera sensors can record colors outside of this range, whether or not you can display them. Professional colorists can use this extra color to ease things into range in a more visually pleasing way.

[Image]

So what is the range of colors a camera can record?

[Image]

The dark region falls outside the camera range, but this is still a huge upgrade.

If you got this far, you now understand the concepts of Log Video. Unfortunately, there's one more quirk that can be confusing: every camera manufacturer comes up with their own log video format. Sony has S-Log, Canon has C-Log, and Panasonic has, wait for it… V-Log. I guess they didn't get the memo.

These companies don't do this to lock you into their ecosystem. In fact, they all publish white papers to make it easy for people to create software that reads and writes their log variants. But why go to all that trouble in the first place? Why doesn't everyone agree on a single standard?

Different cameras use different sensors, and each sensor responds to light in slightly different ways. Every log variant is crafted to only include the most important data from the sensor, to make the most of the available bits.

Ok, we're finally able to talk about the difference between Apple Log and Apple Log 2.

Apple Log vs Apple Log 2

When Apple Log launched alongside the iPhone 15 Pro in 2023, it included a new log function, which meant you had greater control over the exposure and contrast of your videos. Unfortunately, it skimped on color.

Instead of providing ranges specific to the iPhone, it specified a range identical to HDR televisions. Apple Log 2 provides unique real color ranges best suited for Apple's sensors. What's the difference?

[Image: Black outline: Apple Log. Colored region: Apple Log 2.]

The first version of Apple Log is slightly flawed, as part of that black triangle falls outside of the range of colors an Apple sensor will produce. That's wasted bits. Not only does Apple Log 2 fix this, it also includes more color in deep blues and purples.

Apple Log 2 is definitely a step forward, but it doesn't make a big difference in everyday situations. You'll find it useful in niche situations, like neon lights or blue stage lighting. Apple Log 2 is great, but I wouldn't call it a "rush out and buy a new iPhone" improvement, unless you're shooting a John Wick movie.

Kino 1.4: Now With Apple Log 2

Today Kino added support for Apple Log 2, because of course we need to offer the highest quality image. As you can tell by this post, Log video is a bit complex, so adding support took more than flipping a few flags. We had to expand some internals to accommodate the greater color range, detect different versions of Apple Log, and more.

We would have gotten this update out sooner, but if you've followed the company this past year, you know we've been heads-down on Halide Mark III, our flagship photography app. We promised to release a public preview of Halide by the end of January, which we did. We then immediately set to work on this Kino update.

I love working on Kino, our users love it, and I have ambitious plans ahead. However, the final version of Halide Mark III is still our top priority. So what do we do? Probably bounce back and forth between the apps for a bit. Big Halide updates will follow shorter Kino updates.

The long term plan is to unify the rendering system between the apps. As we just showed you, RAW photography and Log video are very similar. By unifying some technology, we can share technical improvements across both apps with less duplicated work.

In fact, my slight detour into Kino already helped with Halide Mark III. Both apps now use the new Apple Gamut colorspace for their image processing, the same colorspace that powers Apple Log 2.

That said, Kino and Halide will always remain separate apps. Photography and motion pictures are two separate art forms that call for very different user experiences.

You can download Kino 1.4 in the App Store today. Stay tuned for more updates!

Inside Looks: A Mark III Preview
https://www.lux.camera/mark-iii-looks/ (Wed, 28 Jan 2026 16:58:49 GMT)

Today we launched a public preview of Halide Mark III, the next generation of our flagship photography app. When you download today's Halide update in the App Store, you'll find a new "" button to switch to Mark III.

Mark III is still a work in progress, and we'll share more details about that at the end, but we're ready to start sharing what we've worked on for the past year through a series of app updates alongside posts like this. Today, we're diving into Looks.

In the early days of digital photography, the look of a photo amounted to little more than contrast and color palette. In the era of smartphone photography, algorithms now alter lighting, local contrast, and much more. Consider these four different interpretations of the exact same data.

We don't point this out to denigrate the first-party camera, which serves a billion people. It does this through simplicity. You tap a button, you capture a decent photo. Most people, most of the time, want to see details in shadows and highlights.

The convenience of smart cameras has the unintended consequence of making everyone's photos feel the same. It's reached a point where folks who want a unique look to their images buy old digital cameras that lack algorithms. Others download advanced apps, only to be overwhelmed by hundreds of presets that all kind of feel the same.

This was my life until a few years ago, when I woke up one morning and found myself bored with digital photography. That's an awful place to be after a decade building camera apps. So I did what every millennial does in a midlife crisis and took a deep dive into analog photography.

It starts innocently enough. You buy a Canon AE-1 off the internet, remove the dead fly from inside the lens, and shoot your first roll. Next thing you know, there's a camera collection.

[Image]

You learn to develop film at home. Then you dig into alternative processes like cyanotype, colloidal chloride, and platinum palladium.

This path ends in Hillsborough, New Jersey, buying a discontinued lab scanner, yellowed from age, trying to explain to the seller what you do for a living.

[Image]

Wait, what was I talking about? Oh, digital photography.

I grew up during the transition from film to digital, so I assumed that convenience and flexibility made digital inherently better. Revisiting film decades later, I appreciate the constraint. An "out of the box" look frees me from worrying about histograms and bit-depths. I found joy popping a roll of film in a camera and just shooting. Some of my favorite results came from accidents.

If there is a single goal of Halide Mark III, it's to marry the simplicity, joy and beauty of analog photography with the power and convenience of digital.

It starts with embracing a film stock metaphor, but it's about much more than film emulation. We're calling them "Looks."

[Image: Artwork by our friends at The Iconfactory]

Typical photo-preset apps simply swap a photo's color palette. Halide's Looks are capable of much more, by virtue of being part of the camera itself. When you select a look, it changes the way the camera captures a photo and interprets the results. For the best results, you should pick the final look at the time of capture.

[Video]

But if you aren't ready to decide, shoot with RAW and you can try different looks in the Image Lab.

[Image]

The Default look matches the iPhone's built-in camera app, algorithms and all. This is great if you want convenience with a little extra control, such as manual focus.

Our next look is Process Zero II, the second generation of our acclaimed look, which we launched a year and a half ago. Process Zero disables all computational photography at the time of capture, for photos with stronger contrast, subtler details, and an overall more natural feeling.

We added a "II" to Process Zero because it's receiving a few big upgrades, starting with High Dynamic Range photography. For the full details on the power of (tasteful) HDR, you can check out a lengthy deep-dive we shared last year. In short, HDR is perfect for dramatic, high-contrast scenes, such as sunsets.

[Video]

HDR also leaves more room in the highlights for vibrant colors.

[Image: Sunflowers in SDR. Color is washed out as it approaches peak white.]
[Video: The same photo in HDR]

At the same time, we recognize that the use of HDR is a creative choice, no different than your choice to shoot in color rather than black and white. Many scenes look better in SDR, such as this foggy, low-contrast morning in Osaka.

[Image]
[Video]

If you can't decide whether HDR is right for your shot, you can just shoot with RAW enabled and play with both versions in Halide's Image Lab. If you ultimately decide HDR isn't your thing, you can turn it off by default in settings.

Process Zero Meets ProRAW

Every iPhone shipped since 2015 has supported RAW photo capture, but RAWs came with a serious drawback: they often look bad before editing. Out of the box, parts of your photo appeared under or over exposed.

In 2020, Apple launched their own variant of RAW called ProRAW. It combined the flexibility of editing in RAW with power of Apple's computational photography. Unlike classic RAWs of yesteryear, ProRAW gives you Apple's default appearance without touching an editor.

But sometimes you might disagree with Apple's algorithms. Consider this vivid sunset; the details in the clouds are lost with the default iPhone look, while the Process Zero look preserves them.

In the past, you had to decide between the flexibility of Apple ProRAW or the natural look of Process Zero. Not anymore. With Halide Mark III, you can quickly experiment with ProRAW and Process Zero looks, after the fact.

Not only does this give you flexibility, you can even apply Process Zero to modes only available in Apple's camera app, such as Night Mode.

But turning off algorithms comes with a tradeoff: you've turned off algorithms! Sometimes you want more details in shadows or highlights. Consider that example above. Sure, we could fiddle with the exposure slider to bring down the brightness of the Family Mart sign, but reducing overall brightness hides details in the rest of the scene.

[Image: A Process Zero shot with exposure turned down.]

This is not a new problem. Analog photographers solved this by "dodging and burning" their prints in the darkroom.

[Image: Ansel Adams, "The Print"]

With Mark III, we're including a single Tone Fusion slider. It brings back details in highlights and shadows, in a style inspired by dodging-and-burning film.

Unlike Apple's algorithms, Tone Fusion is off by default. We recommend applying a light touch, but even at extremes, we find the results feel more natural than letting a machine decide for you.

We're proud to tell you that Tone Fusion does not use any AI. It uses pre-AI techniques to recover values already present in your image.

The Halide Film Engine

A conventional color grade just shifts colors around. Film operates in a much more complicated way, with chemical and optical characteristics that affect fine details. After studying these qualities over the last few years, we're excited to launch our third Look, a medium-contrast black and white film we call Chroma Noir.

Several characteristics give film its look, but let's focus on just one today, halation.

[Image: A menorah with halation applied]

When a very bright light source hits a film camera, the light bounces off the back of the camera, scatters, and passes through the film a second time.

[Image: Example via Wikipedia]

Since color film has the red-sensitive emulsion layer in back, this creates a reddish halo.
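That physical recipe can be sketched in a few lines: isolate the very bright values, spread them into neighboring pixels, then add the glow back with a red-leaning tint. This is a hypothetical toy on a 1-D row of pixels, not Halide's engine, which works on real 2-D images in physically accurate light; every constant below is invented.

```python
# Toy 1-D halation sketch (hypothetical illustration, not Halide's
# engine). Mirrors the physics: only very bright light "punches
# through", scatters into neighbors, and re-exposes with a red bias.

def halation(row, threshold=0.8, spread=0.25, tint=(1.0, 0.4, 0.3)):
    """row: 1-D list of luminance values in 0..1 -> list of (r, g, b)."""
    # only light above the threshold scatters off the film backing
    glow = [max(0.0, v - threshold) for v in row]
    # crude 3-tap blur so the glow bleeds into neighboring pixels
    n = len(glow)
    blurred = [
        spread * (glow[max(i - 1, 0)] + glow[i] + glow[min(i + 1, n - 1)])
        for i in range(n)
    ]
    # add the tinted glow back on top of the original luminance
    return [
        tuple(min(1.0, v + b * t) for t in tint)
        for v, b in zip(row, blurred)
    ]

row = [0.2, 0.2, 1.0, 0.2, 0.2]   # a bright point light on a dim field
halo = halation(row)
print(halo[1])  # the pixel beside the light picks up a red-leaning glow
```

Note the `threshold`: this only works convincingly when you know the real brightness of the light source, which is why the engine's physical accuracy matters.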

In black and white, halation feels magical.

[Image]

Halation serves as more than a visual gimmick. It communicates to your mind that you're looking at a very bright light source rather than, say, a pure-white piece of paper. Like most film characteristics, it's subtle but goes a long way.

To do halation right, we use real-world light values. This was only possible by building our imaging engine from the ground up with physical accuracy in-mind. And this is just one ingredient of Halide's new look system.

Film Meets HDR

We think Halide does something truly unique in the realm of film simulations: support for HDR. It may seem out of place, since HDR is a new display technology and analog film is well over a hundred years old, but film itself is a high-dynamic-range medium. You just never knew this because film was always limited to low-dynamic-range mediums, like print.

[Video]

Film and HDR Are Optional

If you aren't interested in the aesthetics of film, but you love the black and white look, you can always toggle it off in our Image Lab. Even without film simulation, our beta testers love our black and white treatment.

[Image: From Halide Mark III beta tester El Gebón]

What's a Preview?

As we said at the top, today is a preview of Mark III. We hope to launch the final version by summer, but we've reached a point where many testers find themselves using the unfinished version more than Halide Mark II. So if our new image engine excites you and you're comfortable living on the edge, you can opt in to try Mark III by tapping this button in Mark II.

[Image]

That said, it does have bugs, and the new design is a work in progress, which is why we aren't ready to talk about it today. There are many features we haven't gotten around to porting over from the old version. Yes, there will be custom white balance, focus peaking, and more. We're kicking off this preview to gather your feedback (and crash reports) while there's still time to steer the ship. If you'd like to contribute, please join our Discord.

Today's Mark III preview is available to everyone who bought a one-time-purchase for Halide Mark II, along with all Halide subscribers. Likewise, the final version of Mark III will be a free upgrade to all Mark II buyers and subscribers.

What's Next?

We built Chroma Noir to prove out our new imaging engine. (Also, we love black and white photography.) With our new engine in place, now the fun begins. We have a number of exciting looks planned for the final version of Halide Mark III, which you'll be able to try in the coming months.

At the same time, we believe in quality over quantity. Sure, we could bundle hundreds of presets that all feel kind of the same. I bet we'd make good money selling them to you piecemeal. We'd prefer to bundle a handful of perfect looks, each solving a specific problem. There's a reason we're working with The Iconfactory to give each look its own identity.

[Image]

Chroma Noir taught us that this work requires significant time, effort, and artistry. Based on the time it took to develop Chroma Noir, the remaining looks could take years. So we came to the obvious conclusion that we need to partner with an expert.

We're excited to announce that Halide’s full set of looks will be developed by an amazing colorist, image scientist, and educator with professional experience in film and television: Cullen Kelly. Cullen is an honorary member of the American Society of Cinematographers, and his work spans from Academy Award nominated films to high-profile projects for Netflix, HBO, and Apple. He also produces an invaluable YouTube channel that dives deep into the art and science of color grading. We couldn't be more excited to work together.

It's taken a year to get to this point, with many long hours, high highs, and low lows. We thank you for your support throughout this journey, and we can't wait to show you more of what we've been working on in the coming months.

— Ben, Doug, and Katie Rose

]]>
Requiem for the Rangefinder: An iPhone Air Review
https://www.lux.camera/requiem-iphone-air/ (Thu, 02 Oct 2025 15:59:17 GMT)

Last week I set out to write a few thousand words on the iPhone Air, but it turns out I only need three: the lesser iPhone. Compared to the Pro and baseline models, it has fewer cameras and a smaller battery. For an extra $100, you can upgrade to an iPhone Pro and get power features like ProRAW and LiDAR. What was Apple thinking?

Every few years, Apple tests a new product category with a "wildcard" iPhone. In 2015, that was a Plus-sized screen. In 2017, the iPhone X ditched the home button and gained a notch. In 2019, the Pro introduced bleeding-edge technology at a premium price point.

Some experiments flop. For years, people begged for a smaller iPhone, so Apple delivered the iPhone Mini in 2020 to lukewarm sales. I'd wager it was because, 13 years after the iPhone's debut, we now use our phones like we used to use computers. The era of small screens is over.

From the mini's ashes comes the Air, a phone as easy on your hands as it is on your eyes. It may be as droppable as any modern iPhone, but the double Ceramic Shield and titanium frame make it as durable as ever.

Last week I set out to write a few thousand words on the iPhone Air, but found my mind pulled in another direction, to an iconic camera design. You may not know its name, but you know its work.

Invented by Oskar Barnack in 1913, the compact 35mm rangefinder may be the most influential camera of the 20th century.

[Image: M6 Titanium]

By modern standards, early rangefinders were lesser cameras, lacking autofocus and autoexposure.

In many ways, the rangefinder is outright flawed. It's hard to frame close shots, it doesn't do macro, and zoom lenses don't exist. This isn't a camera for National Geographic. Yet thanks to its compact size, durability and stealth, the 35mm rangefinder excelled at candid portraiture, street photography and journalism.

[Image: D-Day, from Robert Capa's The Magnificent Eleven, shot on a Contax II]

SLR cameras addressed these flaws, winning the hearts and wallets of consumers by trading size and noise for convenience. Still, there's something about the rangefinder that feels perfect. When compact digital cameras removed the need for film or mirrors, a decade of experimentation converged on designs that resembled 35mm rangefinders, minus one important feature: taste.

[Image: Fujifilm FinePix F10, 2006, via Wikipedia]

In a world of consumer electronics made of cheap plastic and garish logos, the iPod proved people would pay a premium for consumer electronics with beautiful aesthetics. So in 2010, Fujifilm tried a bold experiment. They designed a camera with the conveniences of a modern point-and-shoot and a fixed 35mm lens, and wrapped it in the aesthetics of the classic rangefinders.

[Image: Fujifilm X100VI]

Their X100 should have been a swan song to a bygone era. Within a few years, the point-and-shoot market collapsed as normal people realized smartphones were good enough. Yet although the X100 debuted at $1,199, twice the price of an unlocked iPhone 4, it proved a smash hit, defining a new camera category.

15 years later, Fujifilm just launched their high-end, $6,000 variant, the GFX100RF. The RF stands for rangefinder, but it refers to the design language, not the hardware. Today, "rangefinder style" means "a beautiful, rugged point-and-shoot with a fixed, wide-angle lens." It's a device that functions as both camera and fashion accessory. Does this sound familiar?

[Image]

The Air distills an iPhone to its spirit. While the iPhone Pro's bevy of lenses makes it perfect for a trip to the Galapagos, the Air seems perfect for street photography, journalism, and candid portraits.

Is one lens really enough? Will you miss ProRAW and LiDAR? To put this to a test, I took to New York with an iPhone Air and an M6.

The Natural Focal Lengths

Before we dig into the iPhone, let's talk about lenses in general. Why are 50mm and 35mm the most popular focal lengths for documentary work? There's a myth that 50mm approximates human vision. In fact, our entire field of view is technically 17mm, but visual perception is more nuanced than a single number.

Humans actually see on two levels. Our peripheral vision is very wide, but low detail. It probably evolved to spot predators out of the corner of our eye. We also have a narrow but high detail central vision, which you're using right now to read these words. Central vision is about 43mm, which sits between 50mm and 35mm.

I'm not saying scientists met with lens makers to arrive at those numbers. Photographers probably just bought more of those lenses because they felt right. Still, it's interesting there's physiology to back it up.

Anyway, if you go from 35mm to 28mm, you get a little extra breathing room. It comes in handy in close quarters or wide expanses.

[Image: Shot on film. 28mm focal length.]

Of course you have to deal with more unwanted stuff in your shots.

[Image: Shot on film. 28mm focal length.]

But you can always crop to 35mm.

[Image: Shot on film. 28mm, cropped.]

If you don't know what lens you'll need for the day, there's a simple rule of thumb. Can you only carry one lens? Make it a 35mm. Can you carry two? Make them 50mm and 28mm.

I made the mistake of only packing my 50mm for my trip to Grand Central, but the 26mm on the iPhone Air came to the rescue.

Will you miss the ultra-wide lens, a staple of almost every iPhone for the last six years? There's an easy way to check. In the Photos app on a Mac, create a new Smart Album.

[Image: Focal length is native sensor size, not full-frame equivalent]

I found only three photos from the last year that make me go, "I'm glad I had that ultra-wide!" The first was the 7-mile wide Hubbard Glacier in Alaska.

[Image: Glacier at Disenchantment Bay, Alaska, shot on the iPhone 16 Pro ultra-wide lens]

The second was the exterior of the Oculus:

[Image: Shot on the iPhone 16 Pro Ultra Wide]

The third wasn't wide at all! Don't forget that lens doubles as a macro.

[Image: Shot on iPhone 16 Pro]

I bet I could get away with the panorama mode in Apple's camera, but it's a bit disappointing to lose macro. Halide may have a macro feature that works on every iPhone, but we're the first to warn users that software cannot match a true macro lens.

If you love bug shots, the Air is not for you. But the available focal lengths are more than enough for the rangefinder crowd.

Computational Photography and (Lack of) ProRAW

Now that we've gotten composition out of the way, let's talk about image quality. By that I mean algorithms.

Camera algorithms are a Faustian deal. Sure, they "fix" photos, raising shadows and taming highlights, but it costs you control. Compare the earlier shot of the Oculus on film to the default shot out of the first-party camera.

Requiem for the Rangefinder: An iPhone Air Review

I know this comes down to taste, but after seeing the dramatic contrast of the black and white film earlier, this all-too-perfect lighting feels wrong. It makes me as uncomfortable as staring into the cold dead eyes of generative AI.

Let me get this out of the way: I am not one of those elitists who resent how the iPhone has become Gen-Z's gateway to photography. I'm glad we're at the point where beginners don't need to get bogged down in technical details like film ISO and f-stops before they can get a decent photo, let alone something you'd hang on your wall.

The issue is that "fixing" the lighting in photos means wrestling contrast from the hand of the photographer. Contrast is one of the photographer's most powerful tools!

Apple addressed this in 2020 when they released the image format they call ProRAW. If you're interested in its tradeoffs, we wrote a few thousand words about them, but in short, ProRAWs are not RAWs in the traditional sense. They're a semi-baked version of Apple's computational photography, with methods to turn down effects like tone-mapping and sharpening. That's all moot in the case of the Air, as Apple restricts ProRAW to its Pro models.

ProRAW hasn't changed much since its introduction in 2020. Instead, Apple has focused its resources on a new feature called "photographic styles." In addition to color presets, you get access to a new "tone" control. You won't get the latitude of ProRAW, but maybe we can match the film look?

Requiem for the Rangefinder: An iPhone Air Review
Photographic Style

Not bad, but there are two problems. One, unlike ProRAW, Apple has limited this control to their Photos app. You can't tweak tone in third-party apps like Lightroom or Halide. The second problem occurs when you zoom in.

Requiem for the Rangefinder: An iPhone Air Review

Notice the lack of texture. That's because this photo was not generated from a single capture. My iPhone took a series of captures and merged them together to improve dynamic range and reduce noise. There's nothing you can do about this with Photographic Styles. Even ProRAWs have limited control over this, because noise reduction is a byproduct of Apple's algorithms.

Whenever people accuse their phone of applying digital makeup to faces, or of turning textured objects to plastic, this is what they're talking about. When your annoying hipster friend goes on and on about "the warmth of analog," they're talking about film grain, the extra texture caused by the random activation of silver halides as light strikes emulsion.

Requiem for the Rangefinder: An iPhone Air Review
Film grain

Digital cameras may act differently than film, but many people (myself included) find that the noise from a digital camera sensor adds an organic quality. The good news is that you can get that texture back by capturing a traditional Bayer (a.k.a. "Native," a.k.a. "Real") RAW. Every iPhone since the iPhone 6S supports Bayer RAW capture.

Requiem for the Rangefinder: An iPhone Air Review
Bayer RAW Noise shot on the iPhone Air

Thanks to the binning on the 48 MP quad-Bayer sensor, the noise is soft and subtle. Maybe too subtle! We've gotten requests on our Discord for more texture, so I whipped up synthetic grain.

Requiem for the Rangefinder: An iPhone Air Review
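
For the curious, the core idea of synthetic grain fits in a few lines: overlay zero-mean random noise on each pixel and clamp to the valid range. This is a minimal illustration, not Halide's shipping algorithm:

```python
import random

def add_grain(pixels, strength=8.0, seed=42):
    """Overlay zero-mean Gaussian noise on 8-bit grayscale values,
    loosely mimicking film grain. Purely illustrative."""
    rng = random.Random(seed)  # seeded so results are repeatable
    return [min(255, max(0, round(p + rng.gauss(0, strength))))
            for p in pixels]

flat = [128] * 16         # a featureless midtone patch
grainy = add_grain(flat)  # same patch, now with texture
```

Real grain simulation also varies with luminance and clumps spatially, which is why film grain looks organic rather than like uniform static.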

Anyway, let's compare film, photographic styles, and Bayer RAW.

One thing you'll miss about ProRAW is editing latitude. When shooting high dynamic range scenes, you can bring out details in the shadows that you didn't even know existed. Bayer RAWs can push and pull exposure a few stops, but they can't work miracles. For many people, that's a serious drawback. For me? It makes things more fun.

Like every mid-century camera, classic rangefinders lacked auto focus and auto exposure, forcing you to think through every shot. They were technically obsolete by the 1970s, when SLRs like the Canon AE-1 tackled automatic exposure. By the 1980s, we had auto focus.

Yet the fully manual nature of classic rangefinders still captivates camera nerds 40 years later. There's just something about knowing that you, not the machine, took the photo. If you feel the same way, the lack of ProRAW makes the Air more of a camera-camera than the iPhone Pro.

A Camera for the Present Moment

Requiem for the Rangefinder: An iPhone Air Review
Billionaire's Row, Shot on the iPhone

If I could pinpoint the moment the iPhone became the definitive camera for breaking news, it was January 15, 2009.

Requiem for the Rangefinder: An iPhone Air Review

By 2012, you'd see iPhone 4S photos on the cover of Time Magazine.

Requiem for the Rangefinder: An iPhone Air Review

The iPhone is so important for capturing once-in-a-lifetime moments that every iPhone now ships with a dedicated capture button. But how do we test an iPhone's ability to capture history?

Luckily, I live in a crumbling empire. Shortly before I started this review, America's mad king assaulted the First Amendment.

Requiem for the Rangefinder: An iPhone Air Review
Film
Requiem for the Rangefinder: An iPhone Air Review
iPhone
Requiem for the Rangefinder: An iPhone Air Review
iPhone
Requiem for the Rangefinder: An iPhone Air Review
Film

One hundred years later, black and white remains the best look for a nation's spiral into fascism.

This march didn't start as a protest for Jimmy Kimmel. Officially, this was the Make Billionaires Pay March, a protest against climate abuse by billionaires. One highlight was the papier-mâché effigies of Elon and Bezos.

The centerpiece of the march was the 160-foot-long Climate Polluter's Bill, detailing $5 trillion of damage caused by climate change in the last ten years.

I think the reason the rangefinder captured so many great candid moments came down to its humble presentation. It didn't scream "Camera!" like its contemporaries.

Requiem for the Rangefinder: An iPhone Air Review
Via Wikipedia

Today, carrying any sort of dedicated camera draws attention. In the past this might have worked to a reporter's advantage, but today it feels like a target.

If our country continues its descent into authoritarianism, the most important feature of our cameras will be security. At the moment, the iPhone is the most secure camera in the world, and you can download third-party apps like Signal for end-to-end encrypted messaging. How long will this last? As long as we keep talking about it.

Film Intermission

Requiem for the Rangefinder: An iPhone Air Review
Requiem for the Rangefinder: An iPhone Air Review
Shot on Film

The Lost Art of Building Things That Last

If I treat my decades-old cameras right, they'll last decades more. They never beg for software updates. I never wake up one morning to find the dials changed size and shape. It makes me happy thinking of a world before software.

Yes, I'm a developer, and I can't look away from the version of iOS that shipped on these phones. To be clear, I'm not talking about aesthetics.

Requiem for the Rangefinder: An iPhone Air Review
Moments after launching iOS 26 for the first time

I don't think the problem rests with their designers or engineers. These small bugs look like the same mistakes I've made myself countless times. Whenever they've slipped into a release, it's generally because I ran out of time to find and fix them.

It feels like Apple rushed things out the door to make a Fall 2025 release. With another year of work — maybe just another few months — this could have been a smash hit. Instead we read stories about battery drain, accessibility problems, and other unforced errors.

It's just a bit ironic that if you hold off on upgrading your iPhone, you can wait to upgrade iOS until the bugs get worked out. The people who will have the worst experience paid $1,000 at launch for a device running a beta OS.

Requiem for the Rangefinder: An iPhone Air Review
Shot on Film

Whatever Happened to Leitz Camera?

The M6, launched in 1984, is widely regarded as Leica at its peak. It added a light meter for convenience, but if you don't like it, just remove the battery. The camera remained fully functional without power.

In 2002, Leica launched the M7, their first model with semi-automatic exposure. It drew backlash for adding electronics, which left you with limited control if the battery died. They responded with the Leica MP ("Mechanical Perfection") in 2003, which dropped the electronics and basically backtracked to the M6.

Leica was in a bad position. While the rest of the camera industry transitioned from film to digital, Leica was stuck serving a niche fan base of analog purists. Their first consumer digital camera was nothing more than a reskinned Fujifilm point-and-shoot. They later partnered with Panasonic for the compact Leica Digilux 1 point-and-shoot, which failed to pay the bills.

By 2004, Leica was on the verge of financial collapse. It was saved by Andreas Kaufmann, heir to a 1.5 billion euro fortune from his aunt. Kaufmann bought a major stake in the company and set out to return it to profitability. Two years later, they launched their first digital rangefinder, the infamous M8. The infrared filter on the sensor failed to do its job, causing ugly IR interference, a problem mitigated by recalls.

Meanwhile, the company juiced revenue by slapping its logo on everything from Panasonic point-and-shoots to Fujifilm instant cameras, and now Android phones and silly iPhone accessories. I guess the real money is in merchandising.

Requiem for the Rangefinder: An iPhone Air Review
The Leica Supreme Collab

Let's be honest, Leica was a status symbol long before its pivot into pure-brand. While war photographers went with Contax, artists took to Leica.

Requiem for the Rangefinder: An iPhone Air Review
Stanley Kubrick

Even if the classic M was more status symbol than tool, at least the engineering justified its price tag. Every device felt like a work of art, hand-assembled in their factory in Wetzlar. Today, they crank out many products on Chinese assembly lines, as the tariff-driven price hikes made plain.

Leica's optics used to be unparalleled, but today's Voigtländer glass is every bit as good for a fraction of the price. In fact, every film photo in this post shot at 28mm was taken with a Voigtländer.

Influencers aside, I don't know any working photographers shooting on Leica digital cameras. That doesn't seem to worry the company. In their own words, they make "jewelry."

Requiem for the Rangefinder: An iPhone Air Review

Today, 55% of Leica continues to be owned by Kaufmann's investment firm, and the other 45% is owned by the Blackstone private equity group. Maybe the company will continue to print money for decades to come, like Hermès and De Beers. Or maybe brand saturation will make it lose its cool, like Supreme.

Regardless, the Leitz Camera, where Oskar Barnack invented the 35mm camera 112 years ago, is dead.


Leica earned its reputation from stellar engineering. Precise, hand assembled cameras require a high price, which accidentally made them a status symbol. It also put them in a precarious position as technology marched on.

Apple's greatest strength in the new millennium was its lack of nostalgia or reverence. Had another company invented such iconic products as the iMac or iPod, they would have milked those designs for decades — I remember rumors that the first iPhone would feature a click wheel! Yet time and again, Apple has discontinued successful products years before they outstay their welcome, so they can make room for the next big thing.

Apple's engineering and taste earned it a spot alongside Leica or Porsche, but this proved both a blessing and distraction. They tried to get into high fashion with a $10,000 solid gold Apple Watch, and it flopped because they went about things backwards. At launch, Apple didn't fully understand why the Apple Watch should exist, and they hid that with marketing until customers told them, "This is for fitness." It's ironic that if they hadn't shot their shot at launch, I bet they could release a gold Apple Watch today.

Apple is known for beautiful, well engineered products, and I worry they damaged that reputation to hit an arbitrary deadline. I worry about Apple losing its sense of taste, as they send tacky push notifications to our Wallets to promote a movie, and sacrifice valuable screen real estate to promote paid services.

Apple still makes the best products in the world, and I still buy them, but I hope someone in Cupertino is minding the course. Their biggest threat isn't an Android as good as the iPhone, any more than Per Se should worry about Gray's Papaya. The only threat to Apple is Apple.

The Verdict

Since it doesn't have a rangefinder, I won't call it the modern rangefinder. The iPhone Air is the spiritual successor to the Leica M6.

It isn't a camera for beginners, and you won't take it on a safari, but the Air's small size, discreet operation, and unmatched durability make it ideal for street photography, journalism, and candid portraits. You can buy phones with similar specs for half the price, but the premium pays for a beautiful piece of kit that is one-part tool, and one-part fashion accessory.

It's a camera that distills photography to its essence. It may have less, but that's what makes it fun. When you tap the capture button, you know that you, not the machine, took the photo.

Requiem for the Rangefinder: An iPhone Air Review

This article may contain affiliate links.

No AI was used in this article's production.

All product photos were shot on an iPhone 16 Pro with Halide. All street photography was captured on an M6 or iPhone Air running a pre-release build of Halide Mark III and its built-in grades.

]]>
<![CDATA[iPhone 17 Pro Camera Review: Rule of Three]]>https://www.lux.camera/iphone-17-pro-camera-review-rule-of-three/68d3cf7f0f437600019c8c6fThu, 25 Sep 2025 18:51:23 GMT

Every year, as I watch the Apple Event where Apple announces the latest iPhones, I can’t help but sympathize with the Camera team at Apple. They have to deliver something big, new, even ground-shaking, on a regular annual cadence.

And every year, people ask us the same thing: is it really as big of a deal as they say?

iPhone 17 Pro looks very different at first glance. It’s the biggest departure from the style of camera module and overall Pro iPhone style since iPhone 11 Pro. It still packs three cameras on the back and one on the front. It has an actual camera button (even its svelte sibling, the iPhone Air gets one of those, albeit smaller) and a few notable spec changes, like a longer telephoto zoom. Or is it? And is that really all there is to it?

To find out, I took iPhone 17 Pro to New York, London and Iceland in just 5 days.


We do not get early access like the press: this is a phone we bought, to give you an unfiltered, real review of the camera. All the photos in this review were taken on iPhone 17 Pro, with the Apple Camera app or an in-development version of Halide Mark III with color grades.

Let’s dig in — because shooting with iPhone 17 Pro, I was surprised by quite a few things.

iPhone 17 Pro Camera Review: Rule of Three

What’s New

iPhone 17 Pro packs what Apple calls the new 'ultimate Pro camera system'. This is the last upgrade the camera bump — er, I mean, plateau — was arguably still lacking.

After its introduction with iPhone 11 Pro, all cameras were shooting at a fairly standard 12 megapixels. After the ultra wide camera was upgraded to 48 megapixels in iPhone 16 Pro, Apple has finally upgraded the telephoto camera sensor to a 56% larger unit with 48 megapixels. Not only does this allow for sharper shots, but Apple is so confident in its center-crop imaging pipeline that it argues the sensor allows for a 12-megapixel 8× zoom of 'optical quality'. More on that one in its own, detailed section: I am a big telephoto fan, and this announcement had me immediately excited to test it out.
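
The arithmetic behind the 8× claim is worth spelling out: cropping to the central half of each dimension of the 48-megapixel telephoto frame doubles the effective focal length (4× becomes 8×) while keeping a quarter of the pixels, i.e. 12 megapixels. A toy calculation:

```python
def center_crop_mp(mp: float, crop_factor: float) -> float:
    """Megapixels left after cropping to the central 1/crop_factor of
    each dimension; effective focal length scales by crop_factor."""
    return mp / crop_factor ** 2

# 48 MP telephoto at 4x: a 2x center crop yields the "8x lens"
remaining = center_crop_mp(48, 2)  # 12.0 megapixels at 8x
```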

One of the biggest upgrades this year actually comes to the front camera — but its quality impact will be far less noticeable to most people than most tech pundits initially predicted. In a classic Apple move, the company replaced the bog-standard selfie camera with a much larger, square-sensor-packing camera. But instead of simply shooting 24 megapixel square shots, it added a very clever Center Stage system that reframes your selfies to automatically include more people, or to save you from twisting your arm for a landscape selfie.

iPhone 17 Pro Camera Review: Rule of Three
Apple's square sensor makes it part of a small elite lineup of square sensor cameras like the latest Hasselblad 907X

This is a very impressive piece of engineering, and a classic Apple innovation in that the hardware change is essentially invisible. Us camera geeks love the idea of a square sensor, but in the Camera app you will not find a way to take images with the full square image area; it just puts the square 24MP sensor area to use for 18MP crops in landscape or portrait, depending on the subject matter.

iPhone 17 Pro Camera Review: Rule of Three

Apple (in my opinion, correctly) figured that if this artistic choice being made by your iPhone offends you as an artist, you are free to use one of the better cameras on the rear of the iPhone or disable the automatic framing feature altogether, returning its behavior to a 'normal' front-facing selfie camera.

Finally, there are some notable changes to processing: "More detail at every zoom range and light level". In particular, Apple stated in its keynote that deep learning is used for demosaicking raw data from the sensor's quad pixels to get more natural (and actual, existing) detail and color in every image. Apple went on to point out this also means that the AI upscaling used to make those '2×' and '8×' 'lenses' (which are actually the center portion of the 48MP Main and Telephoto cameras) is significantly improved.
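
Apple's demosaicking is learned, and its details aren't public, but the problem it solves is classical: each photosite records only one color, and the missing channels must be reconstructed. The crudest possible approach — one RGB pixel per 2×2 RGGB tile, closer to binning than to real demosaicking, and purely for intuition — looks like this:

```python
def naive_demosaic(bayer, width, height):
    """Collapse a row-major RGGB mosaic (even dimensions) into one RGB
    tuple per 2x2 tile, averaging the two green photosites.
    Illustrative only; real pipelines interpolate per-photosite."""
    out = []
    for y in range(0, height, 2):
        row = []
        for x in range(0, width, 2):
            r  = bayer[y * width + x]
            g1 = bayer[y * width + x + 1]
            g2 = bayer[(y + 1) * width + x]
            b  = bayer[(y + 1) * width + x + 1]
            row.append((r, (g1 + g2) / 2, b))
        out.append(row)
    return out

# A single RGGB tile becomes one RGB pixel:
tile = [200, 120, 130, 40]        # R, G1 / G2, B
rgb = naive_demosaic(tile, 2, 2)  # [[(200, 125.0, 40)]]
```

The gap between this and a learned demosaicker — which recovers per-pixel detail instead of throwing it away — is exactly where Apple claims its extra "natural detail" comes from.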

iPhone 17 Pro Camera Review: Rule of Three

And not insignificantly, the entire phone has gotten a total design overhaul. Its interface and exterior are both composed of all-new materials, and some big changes under the hood (or ceramic back panel, if you will) allow for even more performance out of the latest-generation Apple Silicon chip inside.

What’s Not New

While the entire iPhone looks brand new, the cameras have some familiar parts. The Main camera sensor and lens are identical to the iPhone 16 Pro's, which in turn are identical to the iPhone 15 Pro's. The ultra wide camera, too, is the same as last year's 48 megapixel snapper.

iPhone 17 Pro Camera Review: Rule of Three
The Ultra Wide camera returns to continue making wide, sweeping compositions

The Camera Control from iPhone 16 Pro returns on all iPhone 17s and iPhone Air. No significant updates here, but I still find it a fantastic addition to the iPhone for opening my choice of camera app and taking a photo. The adjustments, on the other hand, still seem fiddly to me a year later. I was hoping for some more changes to it, perhaps even a face lift along with iOS 26 — but it has remained essentially the same save for some additional settings to fine-tune it to your liking.

Party in the Front, Business in the Back

This is, without a doubt, a great back camera system. With all cameras at 48MP, your creative choices are tremendous. I find Apple's quip of it being 'like having eight lenses in your pocket' a bit much, but it does genuinely feel like having at least 5 or 6: Macro, 0.5×, 1×, 2×, 4× and 8× .

The — unchanged save for processing tweaks — ultra wide and main camera are still great. I find the focal lengths ideal for day-to-day use and the main camera especially is sharp and responsive. Its image quality isn't getting old (yet).

What's beginning to get very old is its lack of close focusing. Its new sibling camera in iPhone Air focuses a whole 5 cm (that's basically 2 inches) closer, and it's very noticeable. For most users, arms-length photography is an extremely common use case: think objects you hold, a dish of food or an iced matcha, your pet; you probably take photos at this distance every day. And if you do, you'll have encountered your iPhone switching, at times rapidly, between the ultra wide 'macro' lens and the regular main camera — one of which produces nice natural bokeh and has far higher image quality. It's been several years of this now, and it's time to call it out as a serious user experience annoyance that I hope can be fixed in the future. This is, incidentally, one of the reasons why our app Halide does not auto-switch lenses.

Shooting at 2× on iPhone 17 Pro did produce noticeably better shots; I believe this can be chalked up to significantly better processing for these 'crop shots'. Many people think Apple is dishonest in calling this an 'optical quality' zoom, but it's certainly not a regular digital zoom either. I am very content with it, and I was a serious doubter when it was introduced. 

iPhone 17 Pro Camera Review: Rule of Three

The entire camera array continues to impress in how it works in unison: this year, more than ever, my shots were well matched in color and color temperature, and zooming between lenses was smoother than I'd ever seen.

It's wild that they pull this off with 3 different camera sensors and lenses. It's essentially invisible to the average user, and that's a real feat. No other company does this as well: pick up an Android phone and go through their copy of the iOS Camera zoom wheel to see for yourself sometime. 

4× the Charm

I have previously written perhaps one too many love letters to the 3× camera lens that the iPhone 13 Pro and 14 Pro had. While it had a small sensor, its focal length was just such a delight; one of my favorite go-to lenses is 75mm. Shooting with longer lenses is a careful balance of framing, and it gets harder the longer the focal length is.

Creative compositions are much easier when you have to select what not to include rather than to focus attention on one thing; the devil is in the detail. 

iPhone 17 Pro Camera Review: Rule of Three
Satisfying compositions are everywhere if you start looking for them. Here's a bridge vs. bridge shot.

The previous iPhone traded some image quality in the common zoom range (2-4×) for reach. I found the 16 Pro's 5× lens reach spectacular, but creatively challenging at times for that reason. There was also a tremendous gap in image quality between a 3× - 4× equivalent crop of the Main camera and the telephoto, which made missing an optical lens at that range even more painful. 

4× is an elegant solution; while I do still miss 3× — 3.5× would've been perfect, but admittedly not nearly as numerically satisfying as 1-2-4-8× — the lens' focal length is fantastic for portraiture and details alike, and its larger sensor renders impressive detail: 

iPhone 17 Pro Camera Review: Rule of Three

Even in low light, the lens performs admirably, due to a multitude of factors: excellent top-tier stabilization of the sensor in 3D space, software stabilization, good processing and a larger sensor.

It is still very much reliant on processing and Night Mode compared to the Main camera, however — expect those nighttime shots to require ProRAW and / or Night Mode to get the most out of a shot.

Even then, things will look fairly 'smoothed over':

Regardless, this is a tremendous telephoto upgrade, and if you were as much of a telephoto lover as me it might well be reason alone to upgrade.

Are the 48 megapixel details truly visible? Well, judge for yourself:

I find that the resolution is great, though the lens is a bit soft.

I like this softness, myself; it is to Apple's great credit that there isn't some kind of heavy handed sharpening algorithm that pushes these images to look artificially sharper.

It renders very naturally, extremely flattering for portraiture, and showcases processing restraint that I haven't seen from many modern phone makers. Bravo.

It also has an additional trick up its sleeve thanks to those extra pixels and processing: an additional lens unlocked by cropping the center 12 MP area of the image, along with some magical processing. Does it really work?

8× Feature's a Feat

The overall experience of shooting a lens this long should not be this good. I've not seen it mentioned in reviews, but how Apple keeps a 200mm lens steady, rather than letting it become an exercise in tremendous frustration, is astonishing. Apple is using both its very best hardware stabilization on this camera and software stabilization, as seen in features like Action Mode.

You will notice this while using the camera at this zoom level: the image will at times appear to warp in areas of your viewfinder, or lag behind your movement a little bit. The only way to truly appreciate how impressive this is: grab a 200mm lens and hand-hold it. You'll find that it magnifies the small movements of your hand so much that it is really hard to frame a shot unless you brace it.
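
There's a classic rule of thumb behind this: handheld, your slowest safe shutter speed is roughly 1/focal-length seconds, and each stop of stabilization roughly doubles that. A back-of-the-envelope sketch — the reciprocal rule and the stabilization figure are generic photography heuristics, not Apple's numbers:

```python
def slowest_handheld_shutter(focal_mm: float, stab_stops: float = 0.0) -> float:
    """Reciprocal rule: ~1/focal-length seconds handheld, relaxed by
    roughly 2x per stop of stabilization. A rough heuristic only."""
    return (1.0 / focal_mm) * (2.0 ** stab_stops)

bare = slowest_handheld_shutter(200)       # 0.005 s (1/200) at 200mm equiv.
helped = slowest_handheld_shutter(200, 3)  # 0.04 s with 3 stops of help
```

At 200mm-equivalent, even a modest amount of stabilization is the difference between needing bright daylight and getting usable handheld shots at dusk.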

And then there's the images from this new, optimized center-crop zoom.

To say I've been impressed with the output would be an understatement.

Sometimes you get a little bit of a comedic effect as you realize you are seeing things through the telephoto lens you hadn't even noticed or can't quite make out with your own eyes:

And other times you become that stereotypical bird photographer (or in my case, a wanna-be). I will note that even with its magical stabilization, getting a picture of something in rapid motion is a bit of a challenge...

... but the results are truly magical if you do nail it. Is this tack sharp?

iPhone 17 Pro Camera Review: Rule of Three

Well, no, but this is 500% detail of a crop of a phone sensor shooting at 200mm at a fast moving bird on a cloudy day. I am pretty impressed.

It allows for some astonishing zoom range through the entire system. 

I mentioned it before, but I want to reiterate it because it's such a fun creative exercise for anyone with this phone: I believe that the longer the lens, the more your skill at creating beautiful compositions will be challenged. It's just not that easy — but it also means you suddenly find different beautiful photos in what was previously a single frame:

The details are often prettier than the whole thing. Now I get to choose what story my image tells. What caught my eye, or what made the moment so magical. In video this is also lots of fun; I will post some Kino shorts on our Instagram to highlight the fun of moving video details of a scene.

Another example: here, Big Ben can take the center stage. As I shoot at 4×, I get an 'obvious' composition:

At 8×, I am presented with a choice: I can capture the tower, or the throng of people crossing the bridge and note as the evening sun lights up the dust in the air:

I like what this does for you as a photographer. Creativity, like many things, functions as a muscle. Training it constantly, stimulating yourself by forcing creative thought, is what helps you become better at the craft.

This is a little artistic composition gym in your pocket. Use it.

iPhone 17 Pro Camera Review: Rule of Three

Trust the Process 

As we mentioned in our last post, algorithms are about as important as — perhaps more important than — the lens on your camera today. There's a word for that: processing. We're keenly aware of just how many people are at times frustrated with the processing an iPhone does to its imagery. It's a phenomenon that comes from a place of exceptional luxury: without its mighty, advanced processing, an iPhone would produce a far less usable image for most people in many conditions.

I believe the frustration often lies in the 'intelligence' of processing making decisions in image editing that you might consider heavy handed. Other times, it might be simply reducing noise that makes an image look smudgy in low light.

iPhone 17 Pro Camera Review: Rule of Three
Processing makes curious mistakes at times. Here, a telephoto image came out looking a bit mangled.

Image processing is the one area where phones handily beat dedicated cameras, for the simple reason that phones have far more processing power at their disposal and need to do more to get a great image from an exceptionally small image sensor. We review it as intensely, then, as a new bit of hardware. How does it stack up this year?

Well, it's somewhat different.

iPhone 17 Pro Camera Review: Rule of Three
iPhone 17 Pro above, iPhone 16 Pro below

On the Main camera, don't expect huge changes. I found detail to be somewhat more natural from the Ultra Wide camera, but even here it seemed somewhat random whether the results were truly, consistently better. Overall, image processing pipelines are so complex now that it's hard to get a great sense of the changes over just a week. The images overall felt a bit more natural to me, though — although I still prefer shooting native RAW and Process Zero shots when I have the option.

As I mentioned in the earlier section, it is truly noticeable that the 2× mode on the Main camera is a lot better. Not only is the result sharper, it also just looks less visibly 'processed'; a real win considering Apple claims this is actually due to more processing!

Finally, you might wonder: if these images are a bit better processed and all this being software, why isn't this simply being rolled out to the older iPhones just the same? Is Apple purposefully limiting the best image quality to just the latest iPhones?

The answer is yes, though not through inaction or some kind of malevolent and crooked capitalist lever to force you to upgrade. Software in itself might be easily ported across devices, but image pipelines like the ones we see on the iPhones 17 Pro are immensely integrated and optimized. It's quite likely the chip itself, along with hardware between the chip and sensor are specifically designed to handle this series' unique image processing. Porting it over to an older phone is likely impossible for that reason alone.

Video for Pros

This is mainly a photography review, but I also increasingly shoot video and make an app for it. iPhone 17 Pro has some absolutely wild features for pro video. They put the capital P in Pro; things like Genlock and ProRes RAW are far beyond what even advanced amateur users will likely use.

That being said, these features aren't just for Hollywood. While it's true that the latest ultra-powerful pro video features will let the iPhone become even more of a workhorse on significant productions, technologies like Apple Log, introduced with iPhone 15 Pro, are really just fuel for developers to run with.

When we built Kino, we wanted to make it so you can actually use things like Apple Log and the Pro iPhone's video making advancements without an education in the fine art of color grading in desktop software and learning what shutter angle is.

Adding technologies like this not only makes the iPhone a truly 'serious camera', but, since it's a platform for development, also creates use cases for these technologies that have not been possible in traditional cameras used for photography and videography.

This is super exciting stuff, and I think we'll see the entire field evolve significantly as a result. With this set of new features — Open Gate recording, ProRes RAW, Apple Log 2 — Apple is continuing to build an impressive set of technologies that let it rival dedicated cinematic cameras without compromising on the best part of the iPhone: that it's really a smartphone, which can be anything you want it to be.

A Material Change

Everything about this phone is new, appearance-wise, and the return to aluminum is welcome. The new design cools itself much better, which is noticeable when you shoot a lot. It feels great in the hand and will hopefully age as nicely as my other aluminum workhorses from Apple. Apple even markets it as being especially rugged.

On the other hand, its other user-facing aspect — iOS itself — has also gotten a new material shift.

Liquid Glass is here with iOS 26, and it brings an entirely new Camera app design, some much-desired improvements to the Photos app, and a general facelift for the OS. While this isn't an iOS review, I will say that it's beautiful, and I'm a fan of Liquid Glass. iOS 26 has gotten off to a rough start, however: even with the latest updates installed on the iPhone 17 Pro, I ran into a lot of bugs, from poor performance (OK) to photos not showing up for a long time, distorted images, and the camera app freezing or becoming unusable (not so OK).

iPhone 17 Pro Camera Review: Rule of Three
It seems all telephoto images shot in native RAW have this light band artifact on the left side of the frame. Not great.

Big releases are ambitious and difficult to pull off. I give tremendous credit to the teams at Apple for shipping iOS 26 along with these new devices, but the constant issues I ran into made everyday use feel like running a beta, not a release candidate.*

*Feedback reports on these issues have been sent to Apple.

Conclusion 

I think the iPhone Air serves a very important purpose: it lets Apple split the lineup into one phone that is a jewelry-like, beautiful device, almost a pane of glass, and another that is decidedly in the mold of the Apple Watch Ultra: bigger, bulkier and more rugged.

iPhone 17 Pro Camera Review: Rule of Three

For years, I was a bit annoyed at the shininess and jewel-like qualities of the Pro, and to be entirely honest, I do now miss it a little bit. This is a beast in both performance and appearance, and it feels almost a little unlike Apple. I think, however, that the direction is correct and significant.

Our phones are such a central part of our lives now that it feels significant to have the choice of a product that prioritizes the true 'pro', much like the MacBook Pro did so well with the thicker, bulkier M1 series.

This, then, might be the first 'workhorse SLR' of the iPhone family, if the regular iPhone is a simple Kodak Brownie. In that, some of the simplicity that delighted us in the first iPhone may have been lost, but the acknowledgement that complexity is not the enemy is a significant and good step. A camera is first and foremost a tool of creative expression: gaining permission to become more fine-tuned for that purpose makes it truly powerful.

It's left as an exercise to the user to excel at their purpose as much as the phone does.


All images in this review were taken on iPhone 17 Pro (unless otherwise noted). Most photos were taken with an in-development version of Halide Mark III using built-in grades for adjustment, with a smaller portion taken with Apple Camera in ProRAW or as stills from Kino, also using built-in grades.

]]>
<![CDATA[Rewrites and Rollouts]]>https://www.lux.camera/rewrites-and-rollouts/68b1da28cd7623000100f9e2Fri, 19 Sep 2025 15:45:46 GMT

iOS 26 is here. Every year we release major updates to our flagship apps alongside the new version of iOS, but not this year. Rather than stay silent and risk Silkposts, let's share our thoughts and plans.

Deciding When It's Time to Move On

In 2017 we launched the first version of our pro camera, Halide. In those days, the days of the iPhone 7, you just wanted manual control over focus, shutter speed, white balance… controls you expect in a classic camera.

Today, algorithms matter as much as a camera's lens. Halide kept up with the changing times by offering control over these algorithms, and that became one of our most popular features. But we have to admit we aren't happy with how things evolved: too many controls ended up tucked away in settings.

Rewrites and Rollouts
This is getting busy.

How did things get so complicated?

Our app grew organically from its 1.0, and while we still love its design, we believe it will hit a bit of an evolutionary dead end. Almost 10 years later, cameras and the way we take photos have changed a lot. We have big plans, and if we're going to build the best camera for 2025 and beyond, we need to rethink things from the ground up.

For example, rather than bury the controls from earlier in settings, what if we put them right next to the shutter?

Rewrites and Rollouts

A change like this may sound simple, but it has ripple effects across our entire interface and product. I'll spare you a few thousand words and let Sebastiaan walk you through our big new design sometime soon.


If our visuals show cobwebs, let's just say the code hosts a family of possums. Since 2017, Apple's SDKs have changed faster than we could keep up. Refreshing our codebase should improve speed, reliability, and polish, and cut down the time it takes to ship new features.

It sure sounds like we should rewrite Halide.

If you've ever taken part in a rewrite, I know your first reaction is, "Oh no," and as someone who has lived through a few big rewrites, I get it. Big rewrites kill companies. It would be irresponsible to do this in the middle of iPhone season, the time we update our apps to support the latest and greatest cameras.

So we are not rewriting Halide right now.

We rewrote it two years ago.


In Summer 2023, we began investigating a modern codebase. We built a fun iPad monitor app, Orion, to test the maturity of Apple's new frameworks and experiment with our own new architecture. We were delighted by the results, and so were you! We were surprised Orion took only 45 days.

This gave us the confidence to test our platform on a bigger, higher-stakes project: our long-awaited filmmaking app Kino. We began work in Fall 2023, shipped in under six months, and won 2024 iPhone App of the Year.

Rewrites and Rollouts
record scratch yep, that’s me. You’re probably wondering...

This signaled our new architecture was ready for prime time, so earlier this year, we drew a line in the sand. In our code, we renamed everything Halide 2.x and earlier, "Legacy Halide." Mark III will be a clean break.

Rewrites and Rollouts
A few files in Xcode

After a few weeks of knocking out new features faster than ever, it was clear this was the right decision. Kino let us skip over the hard and uncertain part, and now all that's left is speed-running the boring part of translating the old bits to the new system.

Through The Liquid Glass

In June, Apple unveiled the new design language of iOS 26, Liquid Glass, and it threw a monkey wrench into all of our plans. As someone who worked on a big app during the iOS 7 transition, I know platform redesigns are fraught with surprises all the way up to launch.

Before deciding how to proceed with our flagship apps, and how this would affect Mark III, we needed to investigate. So we returned to Orion, our low-stakes app with fewer moving parts. Updating Orion's main screen for Liquid Glass took about a day, but it was not without snags, like when I spent an hour in the simulator fine-tuning the glass treatment of our toolbar only to discover it rendered differently on the actual device.

We moved on to Kino, which already aligned with the iOS 26 design system pretty well. Sebastiaan updated its icon treatment, which looks great when previewed in Apple's tools.

Rewrites and Rollouts
The version previewed on Icon Composer

However, when we loaded it on the device…

Rewrites and Rollouts
The version on a real device

This issue still persists in the final version of iOS 26, and we've filed a bug report with Apple (FB20283658). We'll hold off on our Kino update until it's sorted out.

None of these issues are insurmountable, but troubleshooting iOS bugs for Apple can be its own part-time job. For a team with only one developer, this left three options for Halide:

Option 1: Embrace Liquid Glass in Legacy Halide. The Liquid Glass paradigm goes beyond the special effects, with changes such as its embrace of nested menus. Reducing the new design system to a stylistic change — a glorified Winamp skin — is a recipe for disappointment. Unfortunately, a deep rethinking of Legacy Halide would force us to halt Mark III development for months, just to update a codebase on its way out.

Option 2: Rush Mark III with Liquid Glass to make the iOS 26 launch. Even before Apple unveiled the Liquid Glass treatment, Mark III was arriving at similar concepts. We're confident the two design systems will fit well together. So what if we tackled both challenges at once and targeted an immovable iOS 26 deadline? Nope. A late app is eventually good, but a rushed app is forever bad.

Option 3: Wait to launch a full Liquid Glass redesign alongside a rock-solid Mark III. This is what we did, and we think it paid off big time. Earlier this week we released an early preview of our new UI (without any Liquid Glass) to Halide subscribers via our Discord. The results were overwhelmingly positive.

The Rollout (and early upgrade perks)

That's not to say we have nothing to show for iOS 26. Today we're launching Orion 1.1. It retains most of its retro aesthetics, but we're also digging how the Liquid Glass treatment interacts with our custom CRT effect.

Rewrites and Rollouts

We've also added a long-requested feature: fit and fill display modes. You can finally play your virtual console games in full-screen glory!

For Kino, we're holding off on our update until we sort out the iOS bugs. Maybe things will be fixed in an iOS 26.1 update.

We have an update ready for our award winning long exposure app, Spectre. Unfortunately, it appears the App Store portal is broken at the moment, and won't allow us to submit the update.

Rewrites and Rollouts

Luckily, we submitted an update to Halide before running into this issue. It updates the icon, fixes a few glitches, and includes basic stylistic updates. We just released this update, moments ago.

Earlier today, we received our new phones and have begun putting them through their paces. We'll submit an update to support the new hardware and fix any bugs, assuming the App Store lets us.


These updates to Halide are a swan song for the legacy codebase. After this month, all of our energy goes to Mark III, which brings the real Liquid Glass alongside a redesigned camera for a new age.

If you'd like a peek at things to come, we've opened another thousand spots in TestFlight to Halide subscribers. It's got tons of bugs, and parts are incomplete, but will give you an idea of where things are headed. If you'd rather wait for a polished experience, or prefer a one-time-upgrade, no problem. As we announced last winter, everyone who bought Mark II eventually gets Mark III for free.

It feels bittersweet moving on. Hopping into Legacy Halide to crank out updates feels like a slog, while the new Mark III design and codebase is a joy. It makes me wish I'd gutted Halide years ago. At the same time, there are moments I feel warmth for a project I spent almost a decade of my life on. It helps you understand why nostalgia means "a pain from an old wound."

In Summary

  • We have an Orion update out, today
  • We have a Spectre update, soon
  • We might have a Kino update, soon?
  • We have a Halide update, today
  • Halide Subscribers can sign up for the Mark III TestFlight, today
  • We'll have a wider Mark III preview, this Fall
  • If everything goes according to plan, we expect to launch Mark III, this Winter

This won't be the last you'll hear from us this Fall. Stay tuned for a post from Sebastiaan on our new design, along with our annual iPhone reviews.

]]>
<![CDATA[Physicality: the new age of UI]]>https://www.lux.camera/physicality-the-new-age-of-ui/68278e703bc03b0001380e67Tue, 03 Jun 2025 15:06:02 GMT

It’s an exciting time to be a designer on iOS. My professional universe is trembling and rumbling with a deep sense of mystery. There are lots of rumors and whispers of a huge redesign coming to the iPhone’s operating system — one that is set to be 'the biggest in a long time'.

There’s only been one moment like this before: the spring of 2013. On June 10th, Apple showed off what would be the greatest paradigm shift in user interface design ever: iOS 7. I remember exactly where I was and how I felt. It was a shock.

Physicality: the new age of UI

If there is indeed a big redesign happening this year, it’ll be consequential and impactful in ways that dwarf the iOS 7 overhaul, for a multitude of reasons. The redesign is rumored to be comprehensive: a restyling of iOS, macOS, iPadOS, tvOS, watchOS and visionOS. In the years between iOS 7’s announcement and today, the iPhone has gone from simply a popular device to the single most important object in people’s lives. And the design of iOS has affected and inspired design far beyond it, from the web to graphic design to nearly every other computer interface.

That’s why I figured I'd take this moment of obscurity, this precious moment in time when the changes are still shrouded in fog, to savor something: wholesale naivety about where things are going, so I can let my imagination run wild.

What would I do if I were Apple’s design team? What changes would I like to see, and what do I think is likely? Considering where technology is going, how do I think interface design should change to accommodate? Let’s take a look at what’s (or what could be) next.


Smart people study history to understand the future. If we were to categorize the epochs of iOS design, we could roughly separate them into the Shaded Age, the Flat Age, and the Age of Physicality.

The Shaded Age

iOS started out as iPhone OS, an entirely new operating system that had very similar styling to the design language of the Mac OS X Tiger Dashboard feature:

Physicality: the new age of UI
via https://numericcitizen.me/what-widgets-on-macos-big-sur-should-have-been/
Physicality: the new age of UI
early iPhone prototypes with Dashboard widget icons for apps

The icon layout on iPhone OS 1 was a clear skeuomorph.

You might’ve heard that word thrown around. It may surprise you that it doesn’t mean a design with lots of visual effects like gradients, gloss and shadows. It actually refers to design that mimics something familiar to ease users' transition from what they were used to — in this case, phones as slabs with a grid of physical buttons — to what things had become: all-screen phones that could show any kind of button or interface imaginable.

And yes, there were a whole lot of visual effects in user interfaces from iPhone OS 1 to iOS 6. In this age, we saw everything from detailed gradients and shadows on simple interface elements to realistically rendered reel-to-reel tape decks and microphones in audio apps.

Physicality: the new age of UI
The Facebook share sheet had a paperclip on it! The texture of road signs on iOS maps was composed of hundreds of tiny hexagons!

Having actually worked on some of the more fun manifestations of it during my time working at Apple, I can tell you from experience that the work we did in this era was heavily grounded in creating familiarity through thoughtful, extensive visual effects. We spent a lot of time in Photoshop drawing realistically shaded buttons, virtual wood, leather and more materials.

That became known as ‘skeuomorphic design’, which I find a bit of a misnomer, but the general idea stands.

Physicality: the new age of UI

Of course, the metal of the microphone was not, in fact, metal — it didn’t reflect anything like metal objects do. It never behaved like the object it mimicked. It was just an effect; a purely visual lacquer to help users understand that the Voice Memos app worked like a microphone. The entire interface worked like this to be as approachable as possible.

Notably, this philosophy extended even to the smallest elements of the UI: buttons were styled to visually resemble a button by being convex and raised or recessed; disabled items often had reduced treatments to make them look less interactive. All of this was made to work with lots of static bitmap images.

The first signs of something more dynamic did begin to show: on iPad, some metal sliders’ sheen could respond to the device orientation. Deleting a note or email did not simply make it vanish off-screen, but pulled it into a recycling bin icon that went as far as to open its lid and close it as the document got sucked in.

Physicality: the new age of UI
If it had not been for Benjamin Mayo publishing this video, no trace of this would even be findable online.

Our brand new, rich, retina-density (2×) screens were about to see a radical transformation in the way apps and information were presented, however...

The Flat Age

iOS 7 introduced an entirely new design language for iOS. Much was written on it at the time, and as with any dramatic change the emotions in the community ran quite high. I’ll leave my own opinions out of it (mostly), but whichever way you feel about it, you can’t deny it was a fundamental rethinking of visual treatment of iOS.

iOS 7 largely did away with visual effects for suggesting interactivity. It went back to quite possibly the most primitive method of suggesting interactivity on a computer: some ‘buttons’ were nothing more than blue text on a white background.

Physicality: the new age of UI

The styling of this age is often referred to as ‘flat design’. You can see why it is called that: even the buttons in the calculator app visually indicate no level of protuberance:

Physicality: the new age of UI

The Home Screen, once a clear reference to the buttons on phones of yesteryear, was now much more flat-looking — partly owing to simpler visual treatment, but also to a distinct lack of shadows.

Physicality: the new age of UI

But why did shadows have to go? They had an important function in defining depth in the interface, after all. Looking at the screenshot above actually does it no justice: the new iOS 7 home screen was anything but flat. The reason was that the shadows were static.

Physicality: the new age of UI

iOS 7 embraced a notion of distinct visual layers and using adaptive or dynamic effects to distinguish depth and separation. Why render flat highlights and shadows that are unresponsive to the actual environment of the user when you can separate the icons by rendering them on a separate plane from the background? Parallax made the icons ‘float’ distinctly above the wallpaper. The notification center sheet could simply be a frosted pane above the content which blurred its background for context.

Jony Ive spoke proudly at the iOS 7 introduction about how ‘the simple act of setting a different wallpaper’ affected the appearance of many things. This was a very new thing.

The UI chrome gained the same dynamics, too: things like the header and keyboard could show some of the content they obscured shining through, as if fashioned out of frosted glass.

Physicality: the new age of UI

While it was arguably an overcorrection in some places, iOS 7’s radical changes were here to stay — with some of its dynamic ‘effects’ getting greatly reduced (parallax is now barely noticeable). Over time, its UI regained a lot more static effects.

One of the major changes over time was that iOS got rounder. With newly curved screen corners and ever-rounder iPhones, the user interface matched the hardware it ran on in lock-step. It even adapted dynamically based on the device it was running on.

Physicality: the new age of UI

More interface elements started to blend with content through different types of blur like the new progressive blur, and button shapes were slowly starting to make a comeback. It settled into a stable state — but it was also somewhat stagnant. For bigger changes, there would have to be a rethink.

What would come next couldn’t simply be a static bitmap again: it would have to continue the trend of increasingly adaptive interfaces.

The Age of Physicality

When Apple’s designers imagined the interface of VisionOS, they had a mandate to essentially start from scratch. What does an ‘app’ look like in an augmented reality?

What appears to be a key foundational tenet of the VisionOS design language is how elements are always composed of ‘real’ materials. No flat panels of color and shapes exist as part of the interface.

This even applies to app icons: while they do have gradients of color, they occupy discrete layers of their own, with a clear intention from their introduction video of feeling like actual ‘materials’:

Alan Dye, upon introduction of the VisionOS interface, stated that every element was crafted to have a sense of physicality: they have dimension, respond dynamically to light, and cast shadows.

This is essential on Vision Pro because app interfaces should feel like they can naturally occupy the world around you, with as much richness and texture as any of the objects inhabiting that space. Compared to the interfaces we are familiar with, that paradigm shift is profound, and it makes older interfaces not infused with physicality feel archaic.

If I were to place a regular interface in the Vision Pro context, the result would look almost comically bad:

Physicality: the new age of UI

I find it likely, then, that more than a mere static visual style will be brought from visionOS to iPhone, iPad and Mac (and potential new platforms) — a set of new fundamental principles that will underpin all of Apple’s styling across products and expressions of its brand.

It would have to be more subtle than on Vision Pro — after all, these interfaces do not have to fit in with the ‘real world’ quite as much — but dynamic effects and behavior essentially make the interface come to life.

Sound familiar? Existing aspects of the iPhone user interface already do this:


Apple’s additions to the iOS interface over the last few years stand out as materially different compared to the rest of the interface.

They are completely dynamic, exhibiting characteristics akin to actual materials and objects. We’ve come back, in a sense, to skeuomorphic interfaces — but this time not with a lacquer resembling a material. Instead, the interface is clear and graphic, and it behaves like things we know from the real world, or that might exist in it. This is the new skeuomorphism. It, too, is physicality.

The Dynamic Island is a stark, graphic interface that behaves like an interactive, viscous liquid:

You can see it exhibiting qualities unique to its liquid material, like surface tension, as parts of it come into contact and meld together.

When it gains speed, it has momentum, much like the scrolling lists of the very first iteration of iPhone OS, but now it reads as more realistic because it also has directional motion blur and a plane of focus as items move on their plane:

Similarly, the new Siri animation behaves a bit more like a physically embodied glow — like a fiery gas or mist that is attracted to the edges of the device and is emitted by the user’s button press or voice.

Physicality: the new age of UI

What could be the next step?

My take on the New Age: Living Glass

Physicality: the new age of UI

I'd like to imagine what could come next. Both by rendering some UI design of my own, and by thinking out what the philosophy of the New Age could be.

A logical next step could be extending physicality to the entirety of the interface. We do not have to go overboard in such treatments, but we can now have the interface inhabit a sense of tactile realism.

Philosophically, if I were Apple, I’d describe this as finally having an interface that matches the beautiful material properties of its devices. Every one of your devices has a glass screen; this brings an interface of a matching material, giving the user the feeling of the glass itself coming alive.

Buttons and other UI elements themselves can get a system-handled treatment much like visionOS handles window treatments.*

*visionOS is an exceptionally interesting platform, visual-effect-wise, as the operating system gets very little data from the device cameras to ensure privacy and security. I would imagine that the R1 chip, which handles passthrough and camera feeds, composites the glass-like visual effects onto the UI chrome. All Apple devices can do this: they already do system-level effect composition for things like background blurs.
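That system-level composition is already exposed to developers today. As a minimal sketch (the views and layout here are hypothetical; `Material` backgrounds are a real, shipping SwiftUI API), a 'frosted' toolbar whose blur is composited live by the system might look like this:

```swift
import SwiftUI

// A sketch of today's system-composed "frosted glass": the Material
// background is rendered by the system compositor, live-blurring
// whatever content sits beneath the bar.
struct GlassToolbar: View {
    var body: some View {
        HStack(spacing: 24) {
            Image(systemName: "camera")
            Image(systemName: "bolt.fill")
            Image(systemName: "timer")
        }
        .font(.title2)
        .padding(.horizontal, 20)
        .padding(.vertical, 12)
        .background(.ultraThinMaterial, in: Capsule())
    }
}

struct ContentView: View {
    var body: some View {
        ZStack(alignment: .bottom) {
            // Any content placed here shines through the toolbar's material.
            Color.indigo.ignoresSafeArea()
            GlassToolbar().padding(.bottom, 24)
        }
    }
}
```

The app never sees the blurred pixels; it only declares the material, and the system composites the effect — the same separation of concerns that would let a richer 'living glass' treatment ship without exposing content to apps.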

I took some time to design and theorize what this would look like, and how it would work. For the New Design Language, it makes sense that just like on VisionOS, the material of interactivity is glass:

Physicality: the new age of UI
My mockup of a dynamic glass effect on UI controls

Glass is affected by its environment: your content, its UI context, and more.

Physicality: the new age of UI

Since it is reflective, it can reflect what is around it; very bright highlights in content like photos and videos can even be rendered as HDR highlights on Glass elements:

Physicality: the new age of UI
Note the exhaust flare being reflected in the video playback bar; an interactive element like the close button in the top left has its highlights dynamically adjusted by the scene, too.

Glass elements visibly occupy a place in a distinct spatial hierarchy; if they do not, they can be ‘inlaid’: in essence, part of the plane of glass that is your display, or of a glass layer of the UI:

Physicality: the new age of UI

Much like the rear of your iPhone being a frosted pane of glass with a glossy Apple logo, controls or elements can get a different material treatment or color. Perhaps that treatment is even reactive to other elements in the interface emitting light or the device orientation — with the light on it slightly shifting, the way the elements do on VisionOS when looked at.

Controls may transition as they begin to overlay content. One can imagine animated states for buttons lifting and emerging from their backdrop as they transition the hierarchy:

Physicality: the new age of UI

These effects can be rendered subtly and dynamically by the system. In comparison, it makes ‘regular’ static interfaces look and feel inert and devoid of life.

Glass has distinct qualities that are wonderful in separating it from content. It can blur the material below it, as we already see in modern iOS controls. It can have distinct, dynamic specular highlights from its surroundings:

Physicality: the new age of UI
A little drag control for exposure, as a spin on our modern EV adjustment in Halide. Note the material itself responding to the light the adjustment emits.

It can have caustics, which is to say it separates itself from the backdrop through its interaction with light in its environment, casting light rather than shadow:

Physicality: the new age of UI

... and it can also get infused with the color and theme of the interface around it. Glass does not just blur or refract its background: it reflects, too. This isn’t totally out of left field: this is seen in the WWDC25 graphics, as well:

Physicality: the new age of UI

Elements of Style

Having a set of treatments established, let’s look at the elements of the New iOS Design.

Tab Bar

I would imagine that the era of ‘closed tab bars’, that is, the type that masks content outright, is ending. In fact, I wouldn’t be surprised if almost all UI that masks the interface as a max-width bar were gone.

These types of static interface panels are a legacy element from the early days of iOS. The new type can float over content:

Physicality: the new age of UI

Controls like this are better suited to rising from their underlying ‘pane’ as you scroll them out of view, and they can similarly hide themselves so they're not obscuring content all the time.

Controls

It can be overwhelming for all elements in the interface to have a particularly rich treatment, so as I mentioned before, I would expect there to be various levels of this ‘depth’ applied. Core actions like the email sending button in Mail can be elevated:

Physicality: the new age of UI

Whereas other actions that are part of the same surface — like the ‘Cancel’ action here — can get more subtle treatments.

Elevated controls can be biased slightly towards warmer color balance and background elements towards cool to emphasize depth.

App Icons

Apple put considerable work into automatic masking for icons in iOS 18, and I doubt it was only for Dark Mode or tinted icons on an identical black gradient icon backdrop. The simple, dark treatment of icon backgrounds makes me imagine it was preparation for a more dynamic material backdrop.

Physicality: the new age of UI
Dynamic icon backdrops in Dark Mode - note the variable specular highlights based on their environment.

Not to mention, app icons are exactly the type of interactive, raised element I spoke of before that would be suited to a living glass treatment.

Physicality: the new age of UI
Dynamic rendering of icons with a ‘content layer’, glassy effects and an overall polishing of existing designs. The corners are also slightly rounder.

I’d also imagine some app icons that are due for a redesign would get updates; many haven’t been touched since iOS 7. This would be a major change to some of Apple’s ‘core brands’, so I expect it to be significant, but consistent with the outgoing icons to maintain continuity while embracing the new visual language — kind of like the Safari icon above.

Physicality: the new age of UI

On the note of icons, I also wouldn’t be surprised if the system icons themselves got a little rounder.

Home Screen

It seems likely the Home Screen as a whole will be rethought for the first time. Its complexity has ballooned since the early days of iOS; I find myself spending a lot of time searching my App Library.

I think there’s a great AI-first, contextual slide-over screen that can co-exist with the regular grid of apps we are used to. I was a bit too short on time to mock this up.

Sliders and Platters

Basic interactive building blocks of the iOS interface will get system-provided treatments that are responsive to their environment:

Physicality: the new age of UI
Note how the Contact platter has some environment interaction with the green ‘Now’ light

Overall, one can imagine a rounding and softening of the interface through translucent materials looking pretty great.

Beyond

This ‘simple’ change in treatment — to a dynamic, glassy look — has far-reaching consequences.

Apple is unique: since the 2000s, its predominant user interface style has been linked to its branding. Its icons are also logos, its treatments a motif that stretches far beyond the platforms they live on. Consider the navigation of Apple.com:

Physicality: the new age of UI
The navigation of Apple’s website has changed in step with major UI design eras. The introduction and maturation of Aqua in 2000 and beyond; iPhone and Mac OS X’s softer gradients in 2008; and finally, a flatter look after 2014.

It is not a stretch to assume that this, too, would assume some kind of dynamic, new style. Therein lie some of the challenges.

I love products with innovative, novel interfaces — but modern iOS isn’t simply a product, it’s a platform. Its designers bear responsibility for making the system look good even in uncontrolled situations, where third-party developers like myself come up with new, unforeseen ideas. That leaves us with the question of how we can embrace a new, more complex design paradigm for interfaces.

A great thing that could come from this is new design tools, for an era of designing interfaces that goes far beyond placing series of rounded rectangles and applying highly limited effects.

When I spoke of designing fun, odd interfaces in the ‘old days’, this was mostly done in Photoshop. Not because it was made for UI design — quite the contrary. It just allowed enough creative freedom to design anything from a collection of simple buttons to a green felt craps table.

Physicality: the new age of UI
Green felt, rich mahogany, shiny gold and linen in the span of about 450 pixels.

If what is announced is similar to what I just theorized, it’s the beginning of a big shift. As interfaces evolve with our more ambient sense of computing and are infused with more dynamic elements, they can finally feel grounded in the world we are familiar with. Opaque, inert and obstructive elements might occupy the same place as full-screen command line interfaces — a powerful niche UI that was a marker in history, passed on by the windowed environment of the multi-tasking, graphical user interface revolution.

Science Fiction and Glass Fiction

The interfaces of computers of the future are often surprisingly easy to imagine. We often think of them and feature them in fiction ahead of their existence: our iPhone resembles a modern Star Trek tricorder; many modern AI applications resemble the devices in sci-fi movies like ‘Her’ and (depressingly) Blade Runner 2049. It’s not surprising, then, that concept interfaces from the likes of Microsoft often feature ‘glass fiction’:

The actual interface is unfortunately not nearly as inspired with such life and behavioral qualities. The reason is simple: not only is the cool living glass of the video way over the top in some places, but few companies can actually dedicate significant effort towards creating a hardware-to-software integrated rendering pipeline to enable such UI innovations.

Regardless, we like to imagine our interfaces being infused with this much life and joy. The world around us is — but our software interfaces have remained essentially lifeless.

And that brings us to Apple. There was an occasion or two where Apple announced something particularly special, and they took a beat on stage to pause and explain that only Apple could do something like this. It is a special marriage of hardware, and software — of design, and engineering. Of technology and the liberal arts.

Physicality: the new age of UI

And that still happens today. Only Apple could integrate sub-pixel antialiasing and never-interrupted animations at the hardware level to enable the Dynamic Island and gestural multi-tasking; only Apple can integrate two operating systems on two chips in Vision Pro so they can composite the dynamic materials of the visionOS UI. And perhaps only Apple can push the state of the art to a new interface that brings the glass of your screen to life.

We’ll see at WWDC. But myself, I am hoping for the kind of well-thought-out and inspired design and engineering that only Apple can deliver.


All writing, conceptual UI design and iconography in this post was made by hand by me. No artificial intelligence was used in authoring any of it.

]]>
<![CDATA[What is HDR, anyway?]]>https://www.lux.camera/what-is-hdr/6781592569132a0001999641Tue, 13 May 2025 16:18:16 GMT

It's not you. HDR confuses tons of people.

Last year we announced HDR or "High Dynamic Range" photography was coming to our popular photography app, Halide. While most customers celebrated, some were confused, and others showed downright concern. That's because HDR can mean two different, but related, things.

The first HDR is the "HDR mode" introduced to the iPhone camera in 2010.

What is HDR, anyway?
September, 2010

The second HDR involves new screens that display more vibrant, detailed images. Shopped for a TV recently? No doubt you've seen stickers like this:

What is HDR, anyway?

This post finally explains what HDR actually means, the problems it presents, and three ways to solve them.

What is Dynamic Range?

Let's start with a real-world problem. Before smartphones, it was impossible to capture great sunsets with point-and-shoot cameras. No matter how you fiddled with the dials, everything came out too bright or too dark.

In that photo, the problem has to do with the different light levels coming from the sky and the buildings in shadow, the former emitting thousands of times more light than the latter. Our eyes can see both just fine. Cameras? They can deal with overall bright lighting, or overall dim lighting, but they struggle with scenes that contain both really dark and really bright spots.

Dynamic range simply means, "the difference between the darkest and brightest bits of a scene." This foggy morning, for example, is a low dynamic range scene, because everything is sort of gray.

What is HDR, anyway?
Screens have no trouble showing this low-contrast photo. Shot with Halide in Osaka.
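Photographers usually quantify dynamic range in "stops," where each stop is a doubling of light, so the measure is a base-2 logarithm of the brightness ratio. Here's a small sketch; the luminance values are made up for illustration, not measurements from these photos:

```python
import math

def dynamic_range_stops(brightest, darkest):
    """Dynamic range in photographic 'stops': each stop doubles the light."""
    return math.log2(brightest / darkest)

# Illustrative (made-up) luminance values, in arbitrary linear units:
print(dynamic_range_stops(10_000, 0.1))  # sunset sky vs. deep shadow: ~16.6 stops
print(dynamic_range_stops(200, 50))      # foggy morning: 2 stops
```

A scene spanning sixteen-plus stops is far beyond what a single exposure on most sensors can hold, while a two-stop foggy scene fits easily.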

Most of our photos aren't as extreme as bright sunsets or foggy mornings. We'll just call those "standard dynamic range" or SDR scenes.

Before we move on, we need to highlight that the HDR problem isn't limited to cameras. Even if you had a perfect camera that could match human vision, most screens cannot produce enough contrast to match the real world.

Regardless of your bottleneck, when a scene contains more dynamic range than your camera can capture or your screen can pump out, you lose highlights, shadows, or both.

Solution 1: "HDR Mode"

In the 1990s researchers came up with algorithms to tackle the dynamic range problem. The algorithms started by taking a bunch of photos with different settings to capture more highlights and shadows:

Then the algorithms combined everything into a single "photo" that matched human vision… a photo that was useless, since computer screens couldn't display HDR. So these researchers also came up with algorithms to squeeze HDR values onto an SDR screen, which they called "Tone Mapping."

What is HDR, anyway?
The Reinhard Tone Mapper, invented in 2002. It is one of many.
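In its simplest global form, the Reinhard operator from the caption above is a one-line function that compresses any scene brightness into the displayable range:

```python
def reinhard(luminance):
    """Reinhard global tone mapping: maps [0, infinity) into [0, 1).
    Bright values are squeezed hard; dark values pass nearly unchanged."""
    return luminance / (1.0 + luminance)

# Scene-referred luminances (arbitrary linear units):
for L in [0.05, 1.0, 10.0, 1000.0]:
    print(f"{L:>8} -> {reinhard(L):.3f}")
```

Note how a value of 1000 and a value of 10 land within a tenth of each other near 1.0: that's the "squeezing" that lets a sunset sky and a shadowed building share one SDR image. The full 2002 operator adds per-region ("local") adjustments on top of this global curve.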

These algorithms soon found their way into commercial software for camera nerds.

What is HDR, anyway?
Photomatix Circa 2008

Unfortunately, these packages required a lot of fiddling, and too many photographers in the mid-2000s… lacked restraint.

What is HDR, anyway?
The Ed Hardy t-shirt of photography. Via Wikipedia.

Taste aside, average people don't like fiddling with sliders. Most people want to tap a button and get a photo that looks closer to what they see without thinking about it. So Google and Apple went an extra step in their camera apps.

Your modern phone's camera first captures a series of photos at various brightness levels, like we showed a moment ago. From this burst of photos, the app calculates an HDR image, but unlike that commercial software from earlier, it uses complex logic and AI to make the tone mapping choices for you.

What is HDR, anyway?
Phil Schiller at the iPhone XS introduction showing off a newer Smart HDR

Apple and Google called this stuff "HDR" because "HDR Construction Followed By Automatic Tone Mapping" doesn't exactly roll off the tongue. But just to be clear, the HDR added to the iPhone in 2010 was not true HDR. The final JPEG was an SDR image that tried to replicate what you saw with your eyes. Maybe they should have called it "Fake HDR Mode."

I know quibbling over names feels as pedantic as going, "Well actually, 'Frankenstein' was the doctor, you're thinking of 'Frankenstein's Monster,'" but if you're going to say you hate HDR, remember that it's bad tone mapping that is the actual monster. That brings us to…

The First HDR Backlash

Over the years, Apple touted better and better algorithms in their camera, like Smart HDR and Deep Fusion. As this happened, we worried that our flagship photography app, Halide, would become irrelevant. Who needs manual controls when AI can do a better job?

We were surprised to watch the opposite play out. As phone cameras got smarter, users asked us to turn off these features. One issue was how the algorithms make mistakes, like this weird edge along my son Ethan's face.

What is HDR, anyway?
When life gives you lemons, you... eat them.

That's because Smart HDR and Deep Fusion require that the iPhone camera capture a burst of photos and stitch them together to preserve the best parts. Sometimes it goofs. Even when the algorithms behave, they come with tradeoffs.

Consider these photos I took from a boat in the Galapagos: the ProRAW version, which uses multi-photo algorithms, looks smudgier than the single-shot capture I took moments later.

What's likely happening? When things move in the middle of a burst capture — which always happens when shooting handheld — these algorithms have to nudge pixels around to make things line up. This sacrifices detail.

Since 2020, we've offered users the option of disabling Smart HDR and Deep Fusion, and it quickly became one of our most popular features.

What is HDR, anyway?

This led us to Process Zero, our completely AI-free camera mode, which we launched last year to smash-hit success. However, without any algorithms, HDR scenes end up over- and underexposed. Some people actually prefer the look — more on that later — but many were bummed. They just accepted this as a tradeoff for the natural aesthetic of AI-free photos.

But what if we don't need that tradeoff? What if I told you that analog photographers captured HDR as far back as 1857?

What is HDR, anyway?
The Great Wave by Gustave Le Gray, via The Met

Ansel Adams, one of the most revered photographers of the 20th century, was a master at capturing dramatic, high dynamic range scenes.

What is HDR, anyway?
The Tetons and the Snake River via Wikipedia

It's even more incredible that this was done on paper, which has even less dynamic range than computer screens!

From studying these analog methods, we've arrived at a single-shot process for handling HDR.

What is HDR, anyway?
Halide's new, optional tone-mapping.

How do we accomplish this from a single capture? Let's step back in time.

Learning From Analog

In the age of film negatives, photography was a three step process.

  1. Capture a scene on film
  2. Develop the film in a lab
  3. Transfer the film to paper

It's important to break down these steps because — plot twist — film is actually a high dynamic range medium. You just lose the dynamic range when you transfer your photo from a negative to paper. So in the age before Photoshop, master photographers would "dodge and burn" photos to preserve details during the transfer.

What is HDR, anyway?
An excerpt from The Print, the Ansel Adams Photography Series 3
What is HDR, anyway?
"Clearing Winter Storm, Yosemite National Park" Via Wikipedia.

Is it a lie to dodge and burn a photo? According to Ansel Adams in The Print:

When you are making a fine print you are creating, as well as re-creating. The final image you achieve will, to quote Alfred Stieglitz, reveal what you saw and felt.

I'm inclined to agree. I don't think people reject processing photos, whether it's dodging and burning a print or fiddling with multi-exposure algorithms. The problem is that algorithms are not artists.

AI cannot read your mind, so it cannot honor your intent. For example, in this shot, I wanted stark contrast between light and dark. AI thought it was doing me a favor by pulling out detail in the shadow, flattening the whole image in the process. Thanks Clippy.

Even when tone mapping can help a photo, AI may take things too far, creating hyper-realistic images that exist in an uncanny valley. Machines cannot reason their way to your vision, or even good taste.

We think there's room for a different approach.

A Different Approach: Opt-In, Single Shot Tone Mapping

After considerable research, experimentation, and trial and error, we've arrived at a tone mapper that feels true to the dodging and burning of analog photography. What makes it unique? For starters, it's derived from a single capture, as opposed to the multi-exposure approaches that sacrifice detail. While a single capture can't reach the dynamic range of human vision, good sensors have dynamic range approaching film.

However, the best feature is that this tone mapping is off by default. If you come across a photo that feels like it could use a little highlight or shadow recovery, you can now hop into Halide's updated Image Lab.

0:00
/0:08

In the Image Lab we have an exposure slider for adjusting overall brightness just like before. But to its right, we have a single dial that tames or boosts dynamic range. We think it's up to the photographer to decide what feels right.

To be clear, the tone mapper works differently than simply bringing your photo into an editor and dragging the "shadows" and "highlights" sliders. It also does its best to preserve local contrast.

Don’t worry: adjusting this stuff after the fact won't sacrifice quality. Since Halide captures DNG or "digital negative" files, your photo contains all of the information that your screen cannot display. The shadow and highlight details are already in there, and the tone mapping simply brings them out selectively.

Solution 2: Genuine HDR Displays

I went to all that trouble explaining the difference between HDR and Tone Mapping because… drum roll please… today's screens are HIGHer DYNAMIC RANGE!

0:00
/0:10
0:00
/0:10
0:00
/0:10

The atrium of the Hyatt Centric in Cambridge

Ok, today's best screens still can't match the high dynamic range of real life, but they're way higher than the past. Spend a few minutes watching Apple TV's mesmerizing screensavers in HDR, and you get why this feels as big as the move from analog TV to HDTV. So… nine years after the introduction of HDR screens, why hasn't the world moved on?

A big problem is that it costs the TV, Film, and Photography industries billions of dollars (and a bajillion hours of work) to upgrade their infrastructure. For context, it took well over a decade for HDTV to reach critical mass.

Another issue is taste. Much like adding a spice to your meal, you don't want HDR to overpower everything. The garishness of bad HDR has left many filmmakers lukewarm on the technology. Just recently, cinematographer Steve Yedlin published a two hour lecture on the pitfalls of HDR in the real world.

If you want to see how bad HDR displays can get, look no further than online content creators. At some point these thirsty influencers realized that if you make your videos uncomfortably bright, people will pause while swiping through their Instagram reels. The abuse of brightness has led to people disabling HDR altogether.

What is HDR, anyway?

For all these reasons, I think HDR could end up another dead-end technology of the 2010s, alongside 3D televisions. However, Apple turned out to be HDR's best salesperson, as iPhones have captured and rendered HDR photos for years.

In fact, after we launched Process Zero last year, quite a few users asked us why their photos aren't as bright as the ones produced by Apple's camera. The answer was compatibility, which Apple improved with iOS 18. So HDR is coming to Process Zero!

What is HDR, anyway?

To handle the taste problem, we're offering three levels of HDR:

  • Standard: increases detail in shadows and bumps up highlights, with a tasteful rolloff in the brightest areas
  • Max: HDR that pushes the limits of the iPhone display
  • Off: turns HDR off altogether.

Compatibility Considerations

Once you've got an amazing HDR photo, you're probably wondering where you can view it, today. The good news is that every iPhone that has shipped for the last several years supports HDR. It just isn't always available.

As we mentioned earlier, some users turn off HDR because the content hurts their eyes, but even if it's on, it isn't always on. Because HDR consumes more power, iOS turns it off in low-power mode. It also turns it off when using your phone in bright sunlight, so it can pump up SDR as bright as it can go.

An even bigger issue is where you can share it online. Unfortunately, most web browsers can't handle HDR photos. Even if you encode HDR into a JPEG, the browser might butcher the image, either reducing the contrast and making everything look flat, or clipping highlights, which is about as ugly as bad digital camera photos from the 1990s.

But wait… how did I display these HDR examples? If you look carefully, those are short HDR videos that I've set to loop. You might need these kinds of silly hacks to get around browser limitations.

Until recently, the best way to view HDR was with Instagram's native iPhone app. While Instagram is our users' most popular place to share photos… it's Instagram. Fortunately, things are changing.

iOS 18 adopted Adobe's approach to HDR, which Apple calls "Adaptive HDR." In this system, your photos contain both SDR and HDR information in a single file. If an app doesn't know what to do with the HDR information, or it can't render HDR, there's an SDR fallback. This stuff even works with JPEGs!

What is HDR, anyway?
From Apple's Adaptive HDR Presentation
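Conceptually, these gain-map formats reconstruct the HDR image by multiplying the SDR base image by a per-pixel boost. The sketch below is a simplification for illustration: real formats (Apple's Adaptive HDR, Google's Ultra HDR) store the gain log-encoded alongside metadata, and the function name and values here are hypothetical:

```python
def apply_gain_map(sdr_pixels, gain_map, headroom):
    """Conceptual gain-map reconstruction: HDR = SDR * gain.
    gain_map values lie in [0, 1]; headroom is the maximum brightness boost
    (e.g. 4.0 means highlights can get up to 4x brighter than SDR).
    A gain of 0 leaves the pixel untouched, so SDR-only apps can simply
    ignore the map and show the base image."""
    return [sdr * headroom ** g for sdr, g in zip(sdr_pixels, gain_map)]

sdr = [0.1, 0.5, 0.9]    # linear SDR pixel values
gmap = [0.0, 0.5, 1.0]   # no boost, half boost, full boost
print(apply_gain_map(sdr, gmap, headroom=4.0))  # ~[0.1, 1.0, 3.6]
```

The elegance of the scheme is exactly this fallback behavior: an app that doesn't understand the gain map just displays the SDR base, and nothing breaks.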

Browser support is halfway there. Google beat Apple to the punch with their own version of Adaptive HDR, called Ultra HDR, which Android 14 now supports. Safari added HDR support to its developer preview, then disabled it due to bugs within iOS.

Speaking of iOS bugs, there's a reason we aren't launching the Halide HDR update with today's post: HDR photos sometimes render wrong in Apple's own Photos app! Oddly enough, they render just fine in Instagram and other third-party apps. We've filed a bug report with Apple, but due to how Apple releases software, we doubt we'll see a fix until iOS 19.

Rather than inundate customer support with angry emails about how photos don't look right in Apple's Photos app, we've decided to release HDR support in our Technology Preview beta, which we're offering to 1,000 Halide subscribers. Why limit it to 1,000? Apple restricts how many people can sign up for TestFlight, so we want to make sure we stay within our limits. This is the start of our preview of some very exciting features in Halide that are part of our big Mark III update.

If this stuff excites you and you want to try it out, go to the Members section in Settings right now.

What is HDR, anyway?

Solution 3: Embrace SDR

As mentioned earlier, some users actually prefer SDR. And that’s OK. I think this is about more than just the lo-fi aesthetic, and touches on a paradox of photography. Sometimes a less realistic photo is more engaging.

But aren't photos about capturing reality? If that were true, we would all use pinhole cameras, ensuring we capture everything in sharp focus. If photos were about realism, nobody would shoot black and white film.

What is HDR, anyway?
Shot on Ilford HP5, ƒ/1.4

Consider this HDR photo of my dad.

0:00
/0:10

Shot in ProRAW

HDR reveals every wrinkle and pore on his face, and the bright whites in his beard draw too much attention. Just as you might use shallow focus to draw attention to your subject, this is one situation where less dynamic range feels better than hyper-realism. Consider the Process Zero version, with HDR disabled.

What is HDR, anyway?
Process Zero, without Tone Mapping

While we have plenty of work before Process Zero achieves all of our ambitions, we think dynamic range is a huge factor in recapturing the beauty of analog photography in the digital age.

What is HDR, anyway?
Shot on film.

We think tone mapping is an invaluable tool that dates back hundreds of years. We think HDR displays have amazing potential to create images we've never seen before. We see a future where SDR and HDR live side by side. We want to give you that choice — whether it is tone-mapping, HDR, or any combination thereof. It’s the artists’ choice — and that artist doesn’t have to be an algorithm.

We think the future of sunsets looks bright.

]]>
<![CDATA[iPhone 16e camera review: The Essentials]]>https://www.lux.camera/iphone-16e-camera-review-the-essentials/67ceb6a13f48240001c4c634Wed, 12 Mar 2025 18:35:33 GMT

If there’s one constant in iPhones, it’s that cameras get bigger every year. A peek at the backside of the iPhone over the last few years showcases an almost alarming growth of the camera system, with the latest and — quite literally — greatest iteration in the iPhone 16 Pro taking over more than half of the top of its backside.

Surprisingly, this year Apple released an iPhone that bucked that trend. Last week, the iPhone 16e arrived in stores and at our West Coast headquarters*, and I couldn’t wait to dig in. Was this a huge step back compared to the large-camera cutting edge, or would this delightfully simple package actually hold up?

*(my home. We are just two people, we don’t really have HQs)

Let’s go deep into iPhone 16e’s camera and find out.

What’s in a camera

iPhone gains bigger and better cameras every year for a simple reason: one of the main reasons people upgrade phones is to take better photos and videos. The iPhone was introduced as an iPod and a phone; today it’s more camera than almost anything else. Selling a phone without a camera now would be laughable, whereas just 20 years ago we weren’t even sure a camera was all that important to have.

A key improvement that made phone photos so important was quality. That seems counter-intuitive, but let me explain: the cameras on smartphones became more important as they improved, because they got good enough. Sure, my indestructible Nokia candy bar from the ancient age could arguably capture a photo, but it was a grainy mess. For capturing a meaningful moment, I’d pack a ‘real camera’. This distinction between ‘real cameras’ and ‘phone cameras’ started to become far less important around the iPhone 4 or so, as apps that made it quick to share shots with a few adjustments or filters became popular, and the cameras themselves started making huge leaps in image quality.

While some of that image quality came from making the actual camera bigger, you’ll notice that cameras were not all that huge in that generation. In fact, the iPhone 4 and 4S, as well as the 5 and 5S, had no camera bumps to speak of. What they did pack in was the very first version of computational photography: a software process that improves the image from these tiny, tiny sensors and lenses by applying the processing power of the entire computer that is the phone in your pocket. It started with HDR, then Smart HDR and more technical-sounding technologies like Deep Fusion in modern iPhones.

Software, far more than hardware, is responsible for letting us take great photos that make us feel fine leaving big, bulky dedicated cameras at home when it comes to capturing life’s most important moments. If you propose to your love tomorrow and all you have is an iPhone 16 snap, you’re probably not terribly unhappy. The photo will be usable: you can frame it and enjoy it. The fact that this is true even by moonlight at night is a remarkable advancement on par with magic: there’s simply not a lot of light and sharpness you can get out of that small a camera, and things like Portrait and Night mode are genuinely superb.

In iPhone 16e, Apple promises to pair some of its best, smartest processing from the latest A-series chip with a somewhat smaller, older camera hardware component. The question that was immediately on my mind was whether or not the shots from the iPhone 16e would hold up, or if they would just look… kind of bad. 

What you get — and what you don’t

You can speculate what the ‘e’ in ‘16e’ stands for, but in my head it stands for ‘essential’. Some things that I consider particularly essential to the iPhone are all there: fantastic build quality, an OLED screen, iOS and all its apps, and Face ID. It even has satellite connectivity. Some other things I also consider essential are not here: MagSafe is very missed, for instance, but also multiple cameras. It would be reasonable to look at Apple’s Camera app, then, and see what comprises the ‘essential’ iPhone camera experience according to Apple. 

The good news here is that it packs in a lot. Once you dig into settings to enable things, here’s some of the features it packs in:

  • Single-camera Portrait mode with Depth Control
  • 48MP sensor with 24MP default capture and optional 48MP JPG/HEIC capture
  • 2× ‘optical’ zoom by cropping the 48MP sensor’s center area
  • Night mode (12MP, like all other iPhones)
  • Very good video in Dolby Vision HDR, up to 4K/60fps 
  • Slo-mo video up to HD/240fps
  • Spatial Audio capture in video 
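The ‘2× optical zoom by cropping’ in the list above is simple arithmetic: cropping the center of the frame to half its width and half its height doubles the effective focal length while keeping a quarter of the pixels. A quick sketch (the function name is mine, for illustration):

```python
def crop_zoom_megapixels(sensor_mp, zoom_factor):
    """Megapixels remaining after a center crop that simulates `zoom_factor` zoom.
    Both width and height shrink by zoom_factor, so pixel count drops by its square."""
    return sensor_mp / zoom_factor**2

print(crop_zoom_megapixels(48, 2))  # 12.0 MP left after a 2x center crop
```

That is why the 48MP sensor still yields a respectable 12MP image at 2×, with no lens change involved.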

That’s nothing to scoff at. Compared to last year’s flagship iPhones, there’s little you miss out on.

To me, the most notable things lacking here that I enjoy (but do not find surprising omissions) are:

  • ProRAW capture
  • Apple Log capture (it’s not a Pro phone, so this all makes sense) 
  • Night mode Portrait mode 
  • Cinematic mode in video
  • Action mode in video (this is just super nice, and missing out on it makes me sad)
  • Macro mode 

Some of these omissions are entirely predictable. For instance, macro mode has only ever existed on iPhones thanks to the extra-close focus distance of their ultrawide 0.5× cameras. Apple Log and ProRAW have only ever been available on iPhone Pro models. It’s possible Cinematic mode requires multiple cameras to work right.

Action mode, on the other hand, is a genuinely fantastic feature I’ll miss. I suspect that it is missing because the camera itself lacks one of the best camera hardware enhancements of the last years: sensor-shift camera stabilization.

But — one could argue the ‘essentials’ are all here. 

How does iPhone 16e compare? 

Remember that I said, however, that the camera itself is actually physically smaller. It stands to reason then that these features that *are* available might be worse if the software brains of the iPhone 16e fail to compensate for it. Is it comparable to the latest and greatest iPhones, or just a literal crap-shoot? 

I pitted it against the iPhone 16 Pro to get some shots, and here are my impressions. Let me preface that by saying that I think this is an unfair comparison: most people in the iPhone 16e’s target audience likely aren’t the type to buy the top-end iPhone. But I am sure they do care about getting great shots.

Let’s compare a capture from iPhone 16e to the iPhone 16 Pro.

Notice anything a little different? Aside from color (the iPhone 16e consistently took pinker, warmer shots) — the iPhone 16e gives you a slightly narrower field of view. Its camera isn’t quite as wide angle as the iPhone 16 Pro’s, by about 2mm of equivalent focal length (26mm vs. 24mm). In practice, that felt nice to me — you lack a separate zoom lens, so a little bit closer of a zoom is a nice trade-off.
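Those 2mm translate to a few degrees of horizontal field of view. A quick sketch using standard lens geometry, assuming the 36mm frame width that 35mm-equivalent focal lengths reference:

```python
import math

def horizontal_fov_deg(focal_mm, sensor_width_mm=36.0):
    """Horizontal field of view for a 35mm-equivalent focal length."""
    return math.degrees(2 * math.atan(sensor_width_mm / (2 * focal_mm)))

print(f"24mm (iPhone 16 Pro): {horizontal_fov_deg(24):.1f} deg")  # ~73.7
print(f"26mm (iPhone 16e):    {horizontal_fov_deg(26):.1f} deg")  # ~69.4
```

Roughly four degrees narrower: small on paper, but noticeable when framing tight interiors or wide landscapes.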

Cropping in close at 48 MP HEIC

Crop things in close and you see that the smaller sensor on iPhone 16e does have less detail and a bit more ’smudging’. This gets more aggressive in lower light. It’s really not a shabby shot detail-wise, though — this camera packs a punch.

In all, the iPhone 16e has times where it shoots great shots — not quite on par with iPhone 16, but certainly on par with previous non-Pro iPhones or even the iPhone 14 Pro, a previous favorite of mine.

Recall that lack of sensor-shift stabilization, though? I found that that — not its smaller sensor — is what limits the iPhone 16e most of all. Shots that a Pro-iPhone user would get steady and sharp will be a bit blurry, and that happens a lot more often when the sun begins to set or you are in darker indoor settings.

Another good point of comparison is the ‘2×’ lens that is included. Since its sensor is somewhat smaller and lower quality, the resulting image should be a bit less great too. I found it to be a bit ‘crunchy’ in its processing and appearance.

The 2× camera + smart ‘zoom’ processing can sometimes result in less than pleasing images.

Software rules this camera 

OK, yes: it’s worse than the flagship of iPhone cameras. But the story really doesn’t end there. Remember when I said that the iPhone 16e offers the essentials of an iPhone? That includes iOS and its apps. What really makes the iPhone 16e shine, then, is how thoroughly enjoyable it is as a single-lens camera with apps unlocking some of those extra ‘skills’.

Neural Macro mode in Halide enabled some of the following shots, which, while not nearly as microscopic as those from the Pro series and iPhone 16, are still a nice closeup:

You can capture RAW shots with Halide or other third party apps too, which (surprise!) were what I used for several of the shots in this blog post, with either minor edits or Halide’s minimal ‘Process Zero’ processing.

It’s what led me to discover that the iPhone 16e’s sensor has a unique quality and rendering to it that I enjoy; it’s a bit grainy, somewhat different in color, but entirely moody.

In fact, if shot at night, an edited RAW file gave me nice low light results that I preferred over the regular HEIC shots out of camera without Night mode.

As the kids would say today, it’s a vibe. Well done, Apple. 

If you really miss the 3× (and wow, did I miss a 3× / 5× while using this phone) you can even get that via some ML enhancement — though not nearly as sharp as a dedicated camera, obviously:

Some signature crunchiness on this 3× neural zoom from Halide, but it’s fairly serviceable. There’s no substitute for a real physical 3× / 5× lens.

And the apps are plentiful: you can set a separate camera on your Lock Screen to make up for the Camera Control — which I found myself going for and missing a lot — or add widgets from great apps like Sunlitt to catch the light just right. 

Overall, the iPhone 16e is a very interesting way to take in what is the essential experience of iPhone photography: and it makes sense that it’s a wealth of apps and options in iOS that truly make it sing.

If you focus on the essentials and love shooting RAW, this one-camera wonder will be a delight to chase the light with — just take it slow, and handle it with care.

]]>
<![CDATA[The Road to Halide Mark III]]>Today we're excited to announce our plans around Halide 3.0, or as we like to call it, Mark III. We're also inviting you to participate in its development through our new community Discord, where you'll be able to share feedback, early builds, and

]]>
https://www.lux.camera/the-road-to-halide-mark-3/67636afd69132a0001998fcdMon, 23 Dec 2024 19:17:41 GMT

Today we're excited to announce our plans around Halide 3.0, or as we like to call it, Mark III. We're also inviting you to participate in its development through our new community Discord, where you'll be able to share feedback, early builds, and more!

This approach is a break from how we've built things in the past, when we'd play coy whenever users asked us about new features. It's fun to surprise-drop a new app or major update. Why change?

The Road to Halide Mark III
Hey look, it’s the entire team! Photo courtesy of Apple

We're a small team of one developer and one designer, so it can take months to launch big features. It's risky to invest months of our time on features without any feedback from users. Even worse, months of radio-silence can leave customers (that is you!) feeling ignored.

We experimented with more transparency a little over a year ago, when we announced we were building Kino. The reception to our announcement was overwhelmingly positive, and made us feel comfortable investing six months into a brand new app.

Our openness around product plans also made it easier to solicit ideas and feedback. Kino arrived to enormous success, with Apple just awarding it the incredible App Store Award for 2024 iPhone App of the Year. I guess a little transparency doesn't hurt anyone!

The Road to Halide Mark III

So, what’s next? We'd like to dig into three features coming to Mark III, and how Halide subscribers can try them early.

Feature 1: Color Grades

In the beginning, Halide was about offering advanced control, while leaving the actual image processing algorithms up to the system. Over the years, we felt ourselves pulled toward our own image processing.

For example, with our big Macro Photography update, we added super-resolution algorithms to get sharper macro shots. Still, these sharper images were ultimately based on the default iPhone "look."

Process Zero was our experiment in developing a signature look for Halide. Little did we know it would completely change how our customers used the app.

The Road to Halide Mark III

And now the top request we've gotten is support for one-tap color grades, like what we launched this past year in Kino.

The Road to Halide Mark III

We are way ahead of you. We envision an app that not only bundles gorgeous film looks, but also lets you import looks built by other people. In fact, we built Kino as an experiment in the future of Halide!

There's just one feature that didn't make it into Kino, due to time constraints and technologies that weren't ready for prime time a year ago…

Feature 2: Our Take on High Dynamic Range Photos

If you've scrolled through social media in the last few years, perhaps you've noticed images "pop" a bit more. That's because current iPhones can reach a higher peak brightness, which lets images contain more detail, even in high-contrast scenes. This is called HDR.

Now, if you take photos with Apple's default algorithms, HDR is included. Halide's bespoke process, Process Zero, does not support HDR — for a few good reasons. From a product perspective, there's an issue of taste: a lot of cameras produce obnoxious HDR images. We want to take our time and come up with a thoughtful, nuanced HDR look.

Another issue was compatibility. Most websites, including our own blog hosted by Ghost, cannot display HDR images. I embedded that Instagram example because it seems major social networks are finally on board. We think compatibility will improve thanks to iOS 18, which introduces "Adaptive HDR," a technology that allows older file formats such as JPEG to support HDR. This should make it easier for every platform to hop on the HDR bandwagon.
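
Since gain maps are central to how Adaptive HDR keeps old formats compatible, here's a rough sketch of the idea in Python. The function name, the exact formula, and the headroom value are all illustrative assumptions, not Apple's actual pipeline:

```python
# Hedged sketch: how a gain-map approach (the idea behind "Adaptive HDR")
# lets a plain SDR image carry HDR information. An SDR-only viewer just
# shows the base image and ignores the gain map entirely.

def reconstruct_hdr(sdr_pixel, gain, headroom=4.0):
    """Boost an SDR pixel (0.0-1.0) using a per-pixel gain (0.0-1.0).

    gain = 0 keeps the SDR value; gain = 1 applies the display's full
    extra headroom (here, a hypothetical 4x peak brightness).
    """
    return sdr_pixel * headroom ** gain

# A highlight tagged for a full boost becomes brighter than SDR white:
bright = reconstruct_hdr(0.9, gain=1.0)   # 0.9 * 4.0 = 3.6
# The same pixel on an SDR-only display simply renders as 0.9.
```

Because the base image is a plain SDR file, anything that can't read the gain map still displays it correctly — which is the compatibility story.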

We think HDR is about to hit a tipping point, which is why we want to sort out support as we invest in our color grading machinery.

Feature 3: The Redesign

Next year, Halide Mark II turns 5. iPhone photography has changed a lot in five years, and we've learned a lot more about our users and our products. It's a perfect time to make some big changes.

Halide is getting a redesign. We think Halide's refreshed camera design will follow HDR and Grade support, because form should follow function. We can’t quite reveal it — but we can tell you it will be gorgeous in both form and function. For example, if Halide's version of Instant Grade goes as smoothly as we think it will, we'll make grade-picking central to the UI, just like in Kino.

When will Mark III ship?

While it's comforting to list out features and say, "That's all there is to Mark III," it's notoriously difficult to estimate large software projects. We've never worked on an app without coming across surprises like iOS bugs and edge cases. That's why they're called surprises.

The Road to Halide Mark III
We’re fond of John Carmack’s quote: “It’s done when it’s done"

We also have to budget for our fall release schedule and any surprises there. Whenever Apple launches new hardware alongside an iOS update, we drop everything to integrate the new hotness in our apps. Sometimes it takes days, sometimes it takes months.

And for those wondering: Yes, if you are a current subscriber of Halide Mark II, you get Mark III as well. The same goes for Halide Mark II one time purchases.

Finally, while we've focused on Halide in this post, it goes without saying that we have lots of exciting plans around Kino. We'll be working hard to fit those in between Halide updates.

We want to ship Mark III as soon as we can. We'd love to ship it in 2025. That sounds doable at this moment, but we'll keep you posted as things progress.

How do you try it early?

Just like in the past, when we were getting ready to launch major updates, we are opening up "Early Access" to Halide subscribers. It's really helpful for bringing features across the finish line, but we'd like to try gathering feedback even earlier.

We've noticed a lot of indie game devs do this through community Discords, and so we're stealing that idea (hey, we’re indies, too!). We're launching an official Halide+Kino Discord!

The Road to Halide Mark III

You can join us here today!

We'll use this Discord to bounce ideas off our users, post sample photos, and even share early TestFlight builds.

We’ll go one step further, though. We thought, "Wouldn't it be cool to form a community about more than our products?"

If you're just a passionate photographer who wants to make friends and hone your craft, check out our new Challenge: the 2025 Halide and Kino 52-week challenges!

The Road to Halide Mark III

Every week you'll get a photography challenge on our Discord. We'll also include resources to help with the challenge — like app-specific tips. The challenge will be shared there and on our social media. Once you've got your shot, you can share it and see what the rest of the community came up with.

We'll come up with real rewards for community members who complete the entire year-long challenge, including some limited edition Halide and Kino swag!

You're also welcome to just join that Discord to connect with other Halide and Kino users, get tips from the best, share your shots and learn from each other.


That's it from us for 2024. We have never had such an incredible adventure in one year. From Ben having a baby, to shipping Kino, to being in a keynote and winning App of the Year — it was truly a wild ride.

The Road to Halide Mark III

It was an incredible year for us — made possible entirely by you supporting our tiny team to do what we love most: making nice apps to take great shots. We loved seeing all your photos and videos this year, thoroughly enjoyed all your kind messages, comments and feedback and we are so proud of what we have shipped.

2025 is proving to be another incredible year, and we’re very excited to bring you along for the ride. We’re wishing you very happy holidays and a great new year!

]]>
<![CDATA[Kino is the iPhone App of the Year]]>https://www.lux.camera/kino-is-iphone-app-of-the-year/675857241ca9d20001d9ce5cWed, 11 Dec 2024 14:01:21 GMT

Our new iPhone pro video camera app won big in this year’s App Store Awards.

We got a mysterious email from Apple last month. They invited us to a call, where Ben and I were told that Kino was a finalist for iPhone App of the Year!

A lot of anticipation and excitement and some time later, we arrived in New York this week for a meeting where we were told we were not a finalist, after all...

In fact, Kino has won the App Store Award for iPhone App of the Year!

Kino is the iPhone App of the Year

We started developing Kino about a year ago. Ben shared a video answering our top Halide feature request: no, we were never going to add video to Halide. We were going to make a new app: Kino, our pro video camera app.

The development was a whirlwind, with both of us being new dads and Ben’s baby son being born around the time we hoped to ship our first version — but six months ago we released what we considered an ambitious, fun and powerful first version.

We did things a little differently: we wanted to keep things simple but powerful, like we did with Halide, but also target even the most casual users, with features that let you download the app and shoot impressive, cinematic video with no editing or experience necessary. A camera with batteries included.

We got help, of course. We were supported endlessly by friends new and old in video who contributed advice, time, effort and even Instant Grade presets. I want to thank Stu Maschwitz, Adam Lisagor and the folks at Sandwich, Tyler Stalman, Evan Schneider, and Kevin Ong, and our longtime creative collaborator Jelmar Geertsma, as well as our friends at Apple for the tremendous support in making Kino.

It came out just six months ago, and we were absolutely thrilled and blown away by the response. Kino launched to the number one spot on the App Store charts for several days. We immediately set about mapping out our many plans for updates, and we've shipped several big releases since.

We are humbled and still somewhat in disbelief at this incredible award from Apple’s App Store team. It supports our dream as two friends to make apps for creatives because we love doing so, with a focus on craft and accessibility. We love how much this year’s awards are a mix of app makers both big and small, with products we love and use like Lumy and Adobe Lightroom. We’re incredibly proud to be taking this big App Store icon home!

A huge thanks to our families for supporting us through long nights, weekends and days. And the biggest thanks to you, our users and supporters, for making this possible.

Kino is the iPhone App of the Year

Kino — made with love, with friends.

]]>
<![CDATA[IndieSky]]>https://www.lux.camera/indiesky-on-bluesky/673f55a87549d500014434b8Tue, 26 Nov 2024 15:00:17 GMT

These past few weeks, a lot of our indie app making friends have been having fun on the new social platform Bluesky. It seems we are not alone, with the service seeing over a million new users every day. We think Bluesky is cultivating awesome communities full of fun and creative people, and we'd love to see more friends and fans join us, so we came up with a fun idea: a giveaway!

Between now (yes, now!) and the end of Thursday this week, a bunch of cool indie developers will be giving away our apps through our Bluesky accounts. Codes will be randomly posted during the week, messaged to random followers, and awarded to favorite replies. Each app may do things a bit differently, so you should check out each account for their plans.

Other developers might give out codes differently — check out their announcements!

You can find all of the apps in the IndieSky Starter Pack, but here's a rundown:

If you already have these awesome apps and just want to show support, please spread the word with friends. And if you’re an indie developer who wants to tag along, ping us and we’ll add you to the pack and list!

If you're already on Bluesky, you can just follow that Starter Pack and tag your posts #IndieSky.

It's Thanksgiving week in the USA, and this giveaway is the least we can do to thank everyone for supporting us over the years. We can't wait to connect. See you soon!

]]>
<![CDATA[The iPhone 16 Pro Camera Review: Control]]>https://www.lux.camera/the-iphone-16-pro-camera-review-control/66e9f1f5dfa4330001ad15e7Tue, 01 Oct 2024 14:59:00 GMT

Ben and I have an annual ritual. For the last half decade, around this time of year, we run to the store, hastily unbox the latest iPhone and get shooting. We do this because we're passionate about finding out everything there is to know about the new camera — not just to make sure things work well with Halide, but also because no other camera has as many changes year over year.

A byproduct of this ritual? A pretty thorough iPhone review.

If you've read our reviews before, you know we do things differently. They’re not a quick take or a broad look at the iPhone. As a photographer, I like to review the iPhone 16 Pro as if it were purely a camera. So I set off once more on a trip, taking tons of photos and videos, to see how it held up.

For the first “Desert Titanium” iPhone, I headed to the desert. Let’s dive in and see what’s new.

What’s New 

Design

As a designer from an era when windows sported brushed metal surfaces, it comes as no surprise that I love the finish of this year's model. Where the titanium on the iPhone 15 Pro was brushed on the side rails, this year features a more radiant, unbrushed finish that comes from a different process.

It is particularly nice on the Desert Titanium, which could also be described more like "Sequoia Forest Bronze":

The iPhone 16 Pro Camera Review: Control
Think bronze, not brass or gold, when it comes to the shade of Desert Titanium’s metal

The front features the now-standard Dynamic Island and slimmer bezels. The rear packs the familiar Pro camera array introduced way back in iPhone 11 Pro.

The iPhone 16 Pro Camera Review: Control

Its less professional sibling, iPhone 16, features a colored glass process unique to Apple. This year's vibrant colors feel like a reaction to last year's muted tones. I haven't seen this process copied anywhere else, and it's beginning to earn its rank as the signature style of the iPhone. The ultramarine (read: "blue") iPhone 16 is gorgeous, and needs to be seen in real life. I went with the color Apple calls "teal," but I would describe it more as "vivid agave."

The iPhone 16 Pro Camera Review: Control

The sensor array on the 16 non-Pro has returned to the stacked design of the iPhone X. The motivation behind the change may be technical— better support for Spatial video— but from an aesthetic perspective, I also simply prefer the vertical arrangement.

The iPhone 16 Pro Camera Review: Control

While beautiful to look at, that’s about all I will say about iPhone 16. Though less colorful, the iPhone Pro line has always been Apple's camera flagship, so that's the one we'll dive into.

Inside iPhone 16 Pro

A New 48 Megapixel Ultra Wide

The most upgraded camera is the ultra-wide camera, now 48 megapixels, a 4x resolution improvement from last year. The ultra-wide shows impressive sharpness, even at this higher resolution.

The iPhone 16 Pro Camera Review: Control

At 13mm, the ultra-wide remains an apt name. It's so wide that you have to be careful to stay out of frame. However, it does allow for some incredible perspectives:

The iPhone 16 Pro Camera Review: Control
The iPhone 16 Pro Camera Review: Control

At the same time, temper your expectations. When the iPhone 14 Pro introduced a 48 MP sensor for its main camera, Apple almost doubled the physical size of the sensor compared to the iPhone 13 Pro. This year, the ultra-wide is the same physical size, but with more photosites crammed in. In ideal lighting, you can tell the difference. In low light, the expected noise reduction will result in some of the same smudgier images you'd get from the 15 Pro.

One very compelling bonus of the 48 MP upgrade is that you get more than just high-resolution shots. It does wonders for macro photography.

Since the iPhone 13 Pro, the ultra-wide camera has had the shortest minimum focus distance of any iPhone camera. This lets you get ridiculously close to subjects.

The iPhone 16 Pro Camera Review: Control
Shot on iPhone 13 Pro

The problem was that… it was an ultra-wide lens. The shot above is a tight crop of a very wide frame. If you wanted a close up shot like that, you ended up with a lot of extra stuff in your shot which you'd ultimately crop-out.

The iPhone 16 Pro Camera Review: Control

In the past, that meant a center crop of your 12 MP ultra-wide image left you with just a 3 MP image. In Halide, we worked around this with the help of machine learning, intelligently upscaling the image.

With a 48 MP image, however, a center crop delivers a true 12 MP image. It makes for macro shots that are on another level.
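
The arithmetic behind those numbers is simple but worth spelling out. A quick sketch (Python, purely illustrative):

```python
# Why resolution matters for macro: a center crop keeps the sensor's
# pixel density, so the megapixels that survive scale with crop area.

def cropped_megapixels(sensor_mp, crop_factor):
    """Megapixels remaining after cropping to 1/crop_factor of the width.

    Area shrinks with the square of the linear crop, so a 2x crop
    keeps only a quarter of the pixels.
    """
    return sensor_mp / crop_factor ** 2

print(cropped_megapixels(12, 2))  # 3.0  -> the old 12 MP ultra-wide
print(cropped_megapixels(48, 2))  # 12.0 -> the new 48 MP sensor
```

The 2× linear crop here is an assumed figure for illustration; the point is the quadratic relationship between crop and remaining resolution.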

Fusion Energy

Here’s the main meat: the camera most people shoot almost all their shots on, iPhone 16 Pro’s 48 megapixel main camera sensor.

iPhone 16 Pro packs a new 24mm main camera, now dubbed the Fusion camera. It uses a new sensor, the ’second generation’ of the 48 MP shooter introduced with iPhone 14 Pro. iPhone 16 is also listed as having a ‘Fusion’ camera — but they are, in fact, very different cameras, with the iPhone 16 Pro getting a much larger, higher quality sensor.

The iPhone 16 Pro Camera Review: Control

‘Fusion’ refers to the myriad ways Apple implements computational magic to produce high quality shots. If you were to zoom in on the microscopic structure of the sensor, you would see that every pixel is made up of four ‘photosites’ — tiny sensor areas that each collect green, red, or blue light.

When iPhone 14 Pro quadrupled its resolution, Apple opted for a ‘Quad Bayer’ arrangement, dividing each photosite into four, rather than a denser ‘regular’ arrangement. There’s a huge benefit to this arrangement: the sensor can combine those adjacent sites to act like single, larger pixels — so you can shoot higher-quality 12 MP shots. This was already employed in video and Night mode.

The iPhone 16 Pro Camera Review: Control

The ‘Fusion’ workflow essentially combines the full 48 megapixels of data with the binned 12 megapixel capture into great 24 megapixel shots. I think this is perfect. I firmly believe most people do not benefit from giant 48 megapixel photos for everyday snaps, and it seems Apple agrees. It's a very Apple decision: use more megapixels, but intelligently combine them for a better outcome for the average user.
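
To make the binning idea concrete, here's a minimal sketch of 2×2 pixel binning in Python. This is pure illustration of the quad-Bayer concept, not Apple's actual Fusion pipeline, which also handles demosaicing and noise modeling:

```python
# Minimal 2x2 pixel binning: four adjacent photosites are averaged into
# one larger "pixel", trading resolution for cleaner low-light output.

def bin_2x2(image):
    """Average each 2x2 block of a grid (a list of equal-length rows)."""
    binned = []
    for y in range(0, len(image), 2):
        row = []
        for x in range(0, len(image[0]), 2):
            total = (image[y][x] + image[y][x + 1] +
                     image[y + 1][x] + image[y + 1][x + 1])
            row.append(total / 4)
        binned.append(row)
    return binned

# A 4x4 "48 MP-style" grid becomes a 2x2 "12 MP-style" grid:
quad = [[10, 10, 20, 20],
        [10, 10, 20, 20],
        [30, 30, 40, 40],
        [30, 30, 40, 40]]
print(bin_2x2(quad))  # [[10.0, 20.0], [30.0, 40.0]]
```

Averaging four readings also averages out their noise, which is why the binned 12 MP capture looks cleaner in low light than a straight 48 MP readout.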

Is processing very different from last year? No, not really. It was great, and it’s still great. While there’s slightly more processing happening, I found it difficult to spot a difference between iPhone 15 Pro and iPhone 16 Pro captures. The sensor is the same physical size as last year’s iPhone 15 Pro / Pro Max, and still has delightful amounts of depth of field as a result.

The iPhone 16 Pro Camera Review: Control

The larger the sensor, the nicer this is, and it really renders beautifully — especially in its secondary telephoto lens mode.

Telephoto: 5× and Fusion at Work

The telephoto camera is a defining characteristic of the Pro line of iPhones. Last year only the 15 Pro Max featured the 5× 'tetraprism' lens. This year it's standard across the Pro line, and I'm happy I have the option of going smaller this year.

The iPhone 16 Pro Camera Review: Control

That said, I'm a huge fan of the outgoing 3× lens. It was dang near perfect for me. Now, every focal length between 1× and 5× is bridged by the 48 MP main camera, and it's a bit controversial. Because of its quad-Bayer configuration, there's been a question as to whether the main sensor's 48 megapixels are really 48 MP, since it needs to do a bit more guesswork to recover details.

Well, comparing a 12 MP crop from the sensor to a "real" 12 MP image shot on iPhone 12 Pro, I preferred the ‘virtual’ output of the 16 Pro.

I'll admit that years ago I was a skeptic. I like my lenses optical and tangible, and it feels wrong to crop in. Well, this past year, I've been sporting the iPhone 15 Pro Max with its 5× zoom, so I found myself using the imaginary 2× lens much more to bridge the gap between focal lengths.

Thanks to the wider aperture on the Fusion camera, the virtual 2× produces better results than the physical 2× of the past. I really like it. I no longer want Apple to bring back the physical 2×. Give me an even larger, better Fusion camera.

As for the 5×, after a year of real-world use on the 15 Pro, I don't want to lose that reach. It’s like having a set of binoculars, and amazing for wildlife, landscapes, or just inspecting things far away.

On a creative level, the 5× can be a tricky focal length to master. While the ultra-wide camera captures everything, giving you latitude to reframe shots in editing, the 5× forces you to frame your shot right there. Photographers sometimes say, "zoom with your feet," which means taking a few steps back from your subject to use these longer lenses. This requires a bit more work than just cropping in post, but the results are worth it.

At night, the telephoto camera suffers: it's the only remaining 12 MP sensor, and its narrower field of view lets in less light. I'd appreciate a larger or 48 MP sensor in the future, not for the added resolution, but to reduce noise through binning. What this camera needs more than anything is more light — it would be transformative, and I hope Apple takes things in this direction in the future.

The iPhone 16 Pro Camera Review: Control
A quick snap taken with Camera Control and the 5× lens

For portraits, which usually happen in a more controlled lighting environment, the 5× telephoto truly shines. It’s a great lens, and we’re all better for having it on all the iPhones Pro.

Night Photography

As the sun set, I noticed the latest display made a big difference. With a screen that dims down to one nit, shooting out in the dark was much more pleasant.

The iPhone 16 Pro Camera Review: Control

Within Night mode, HDR now allows a larger dynamic range to be captured. However, it was still a frustrating dance at times to get exactly what I wanted out of the exposure, with some exposures overdone and inconsistent exposure times. In fact, I enjoyed shooting on the iPhone 16 Pro outside of Night mode, as it gave me darker, contrastier shots.

The iPhone 16 Pro Camera Review: Control
The iPhone 16 Pro Camera Review: Control
The iPhone 16 Pro Camera Review: Control

Night Mode remains incredibly impressive, and its intelligence produces solid results without thinking — but at times it can still be frustrating to get exactly what I want. I wish there were an API for apps like Halide to dial in manual settings.

(If anyone at Apple reads this, we filed request FB11689438.)

Under the Hood

If you were to treat this as a review of the iPhone as a camera, there’s actually more to talk about than the cameras. This is a unique year, because the iPhone 16 Pro packs improvements that go beyond the cameras — touching on every part of the photography and videography workflow. In my testing, USB transfer speeds were faster than my iPhone 15 Pro's. On the wireless front, Wi-Fi 7 offers up to 46 Gbps, in theory.

The new modem in here has given me easily my best cellular download speeds — in more places. I pulled down a 450 MB offline map of the Mojave Desert in Joshua Tree in less than a minute.

The iPhone 16 Pro Camera Review: Control

On the wireless power front, I noticed much faster charge speeds with a new MagSafe cable, and also when plugged in. All those savings add up: minutes here and there become hours, even days, saved on the job.

Thermals are a make or break aspect of an iPhone, especially now that it shoots computationally intensive video like Apple Log with ProRes. I tested by shooting 4K at 120 fps for a bit, and found it considerably less hot than the 15 Pro under similar demand. In fact, I never got it to overheat!

Average users will appreciate these quality of life improvements, and Pros will appreciate how it lets them push these devices further than ever before.

The iPhone 16 Pro Camera Review: Control

Digging deeper into the camera subsystems, the new "Apple Camera Interface" internals allow for faster sensor readout times. This improves features like QuickTake (not that QuickTake), the feature that lets you quickly take videos by holding the camera button.

Previously, it wasn't possible to quickly reconfigure the camera system for high-quality video, so QuickTake footage seemed on par with your viewfinder's video feed, which isn't as high quality as recording from the camera's video mode. On iPhone 16 Pro, QuickTake gets far better processing — Dolby Vision HDR, 4K resolution, the works. It's noticeable.

Burst ProRAW 48 MP capture performance is also much faster. When absolutely mashing the shutter, the 48 MP ProRAW frame rate clocked in at 2× the iPhone 15 Pro’s speed. This is good news, but it doesn't solve the tradeoff that comes with ProRAW files — the lag. Apple talked about ‘Zero Shutter Lag’ in the keynote, and that’s exactly what this is about.

The iPhone 16 Pro Camera Review: Control

When an iPhone captures a ProRAW photo, there's a two-step process. First, the iPhone captures a burst of photos; then it merges those photos together with the help of sophisticated computational photography algorithms. The iPhone 16 Pro is faster at the first step: grabbing source photos. It still takes several seconds to process the resulting shot, but if you tap the shutter button, the camera will now take a photo practically instantaneously — where there was a very real delay before.

The improvement is huge in practice. In total, the iPhone 16 Pro beat the iPhone 15 Pro by anywhere from 400 to 900 milliseconds. Hundreds of milliseconds matter in the moment, and could mean the difference between getting the shot or missing it completely. It's a massive improvement and a huge achievement, technologically.
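
A common way to implement the zero-shutter-lag idea described above is a ring buffer of recent frames. Here's a hedged sketch of the general technique; the class, frame rate, and timings are made up for illustration and say nothing about Apple's internals:

```python
# Zero shutter lag, conceptually: keep a rolling buffer of the latest
# frames, then on shutter press pick the frame closest to the press
# time, instead of starting a capture after the press.

from collections import deque

class RingBufferCamera:
    def __init__(self, capacity=8):
        self.frames = deque(maxlen=capacity)  # (timestamp_ms, frame) pairs

    def on_new_frame(self, timestamp, frame):
        # The sensor streams frames continuously; old ones fall off.
        self.frames.append((timestamp, frame))

    def on_shutter(self, press_time):
        # Return the buffered frame nearest to the moment of the press.
        return min(self.frames, key=lambda f: abs(f[0] - press_time))[1]

cam = RingBufferCamera()
for t in range(5):                        # ~30 fps: frames at 0, 33, 66...
    cam.on_new_frame(t * 33, f"frame-{t}")
print(cam.on_shutter(press_time=70))      # frame-2 (captured at 66 ms)
```

Because the frame already exists when you press the shutter, the perceived lag collapses to near zero; the slow merge step can then happen in the background.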

Software

While the hardware was upgraded, the iPhone 16 line also comes with iOS 18 — a huge update that touches every single part of the photography experience. We won't touch on Apple Intelligence or Clean Up, which won't be ready until next month, but there's still plenty to talk about with iOS 18.0.

Capture overhaul

You can finally open camera apps from the Lock Screen, which is the single biggest feature request we've had from Halide users. In the past, we had to make do offering widgets you could load on your Lock Screen, but real Lock Screen support goes way beyond that, letting you capture photos without unlocking your device.

The iPhone 16 Pro Camera Review: Control

Aside from several changes in the camera app like being able to pause recordings and keep music playing while you record, there's an elephant in the room… Photos.

This year, Photos received its biggest overhaul since the first iPhone, and reactions are mixed. For me, it’s been challenging to adapt — but I do believe in the mission. Photos’ fundamental interface has not changed in 16 years, and I do think it has to evolve. For most people, it might really work better. Its added customizability is a step forward, and fits the theme of giving you greater control.

Shooting in Style(s)

Which brings us to Photographic Styles, which have also been overhauled. When they were introduced with the iPhone 13 Pro, they were simple filters. You could pick a warm look, a more contrasty look, maybe a cooler look, but the results were all pre-canned.

Now consider this salt flat. You might want to bring out the coolness of the sunrise and really make that a vivid blue in contrast to the sky above:

The iPhone 16 Pro Camera Review: Control

But a simple filter would apply the look to skin and sky equally, and blue skin doesn't work outside James Cameron movies. The new Photographic Styles can target undertones, the way the filter affects skin tones in your shots, making things feel more natural.

These filters are named after moods, such as "Dramatic" or "Quiet." You can fine tune them with a two dimensional pad. There's also a slider to adjust the color palette.

Maybe it’s just me, but I found the UI a bit bewildering at first, so I drew this legend to illustrate.

The iPhone 16 Pro Camera Review: Control

Your adjustments get wiped after a while, though, unless you configure them to persist.

The iPhone 16 Pro Camera Review: Control
Toggle this setting in your Camera settings to keep your adjustments from resetting.

In the past, I avoided Photographic Styles because they were destructive; if I went with a black and white style, I'd lose all the color information. The coolest change to Photographic Styles this year is that they're "perceptually non-destructive." You should be able to reverse the effects of a style later.

The iPhone 16 Pro Camera Review: Control

It passed my test — it worked great for me. This even survives an AirDrop to a friend: as long as the metadata remains intact, they can undo or edit the Style.

The added control in Photos also allows you to tune down the "HDR look," one of the more polarizing aesthetics of iPhone photos. However, Photos doesn't reduce sharpening, noise reduction, and subject-based adjustments. They still give your photos a "Shot on iPhone" look, whether or not that's your cup of tea. For deliberate photography, I'm sticking to RAW. For quick snapshots I'll be shooting in a custom Rose Gold Style.

Video and Audio

iPhone 16 Pro brings 4K 120fps video, including ProRes Log (!). It's a huge upgrade, and the improved controls in Photos to adjust playback speed are a welcome change too. 4K 120fps video can be recorded in HDR / SDR and with full processing (Slo-mo video uses a visibly lower quality process), whereas ProRes can only be captured using an external SSD. I love the ’new Apple’ that is shipping features like this clearly aimed at the professionals; I don’t see many people shooting iPhones with an SSD attached, but for those that do, this is a fantastic improvement on what has already proven to be the best phone to capture video on.

With Log and ACES, shots from iPhone simply fit into a nice workflow and can slip in undetected as deep depth of field B-roll no problem:

The iPhone 16 Pro Camera Review: Control

I am not a tremendously heavy user of iPhone mics, but both iPhones (iPhone 16 and iPhone 16 Pro) get an improved 4-mic array that supports a new Audio Mix feature. It lets you filter out background noise, or re-render your audio as if it were recorded in a studio or mastered more cinematically.

iPhone 16 Pro can capture Spatial Audio along with your Spatial Video, and does this computational audio processing a bit better than its amateur sibling. It’s very impressive, and can be a huge benefit if you find yourself without a mic — which for most people is probably most situations!

A minor improvement that would make this more useful to us: allow apps to run 4-mic audio capture sessions during recordings that use an external microphone. The peace of mind of having a usable backup recording with Audio Mix would be tremendous.

Camera Control

Okay, here’s the really big deal. Something entirely new on your iPhone.

Over the life of the iPhone, its buttons have either remained the same, evolved, or vanished. Here’s the original iPhone: home, power, volume down, volume up, and a ringer switch.

The iPhone 16 Pro Camera Review: Control

The first thing to change was the home button. It became a fingerprint sensor and no longer actually clicked down. With iPhone X, it was finally put out to pasture: a full screen phone didn’t need a home button. A system of gestures worked much better, and Face ID removed the need to scan your finger to unlock.

After that, things stayed the same until last year, when the ringer switch became the Action button. That’s evolution on par with the home button. Everything so far has been evolution or reduction.

The addition of a new control, then, is a huge deal. I feel like everyone is being fairly casual about this, when Apple is extraordinarily focused on reducing things down to the bare essentials. This showing up on the outside of your iPhone means Apple views it as essential. 

How is it in actual use? To me, the most important part about controls on a camera is that they become an extension of you. You can get them in your fingers, and use them blindly. You know what a press or swipe does. Camera Control delivers on this on some fronts, and not on others.

At the core of the Camera Control experience, there are two fundamental interactions: one is to open your camera — truly, yours; it can be any camera app — and the other is to interact with it.

The Button

The first was something I had truly underrated when I saw the announcement. What caught eyes and headlines about the Control is the way you can half-press and swipe on it; after all, we’ve had camera buttons on phones before. When I got my first iPhone, my then-girlfriend was deep into fancy Nokias — her Nokia N95 had a camera button (and a lot of other buttons, too). Nothing new here. Or is there?

I found myself grabbing my ‘old’ iPhone 15 Pro after just days of using the 16 Pro and pointlessly mashing the side of the phone instinctively when I went to take a shot. The Camera Control (don’t call it a button!) is flush with your iPhone; it does not detract from the regular, familiar iPhone hand feel. But it will change the way you interact with it all the same.

Take a beat right now if you are reading this on your phone. Imagine a sudden flash of light causes a gorgeous rainbow to appear in an instant outside your window. Close your eyes. What routine do you have to quickly open your camera?

I had one. We all have some kind of routine, and after years of iPhone use, it’s pretty hard wired. It might take you some time to override this on iPhone 16 Pro, but once you do, it’s much, much faster. You just press that button. Locked? Press the button. Reading in the News app? Press the button.

When I reflexively went to do it on my older iPhone, the phone felt broken — as if you’d pressed the side button and the screen didn’t light up. I think we’ll see this camera-opening button on many if not all Android phones very soon. It becomes routine so fast, and once it’s in your muscle memory, it’s extremely frustrating when it’s not there. You miss a shot. Because of that stupid button-less phone.

When Apple adds something like this, it tends to be a bit more thought out than a new button that takes a photo — not a shiny thing tacked on to entice buyers. Thoughtful details abound with the camera-triggering press: in your pocket, iPhone won’t open the camera if you press it by accident, as was so wonderfully tested in Faruk’s review:

The iPhone 16 Pro Camera Review: Control

John Gruber wrote an excellent part of his review going into more detail on what makes it behave the way it does. Myself, I found all this 'smart' behavior solid — I haven't ended up with any errant snaps.

Let’s talk about the rest of this control, though — what is beneath the sapphire surface.

The Adjustments

This button can be half-pressed, which is to say, not pressed fully. A light press on the button while Camera is open opens an Adjustment menu. Swiping on the control itself lets you dial the selected setting. The settings are (sequentially): Exposure, Depth, Zoom, Cameras, Style, and Tone.

By default, it behaves as a zoom dial. The dial ’snaps’ to the native focal lengths of each lens fairly aggressively, which is a good thing because a swipe on the Control has momentum if you swipe and let go. For precise adjustment, keeping your finger on the Control will allow pretty fine-grained dialing-in with minimal finger movements. I am impressed with its precision.


Regardless, if you are like me and consider zooming nothing more than cropping your shot, Apple has a ‘Cameras’ Adjustment that is by far my favorite way to use it. The Adjustment has all four ‘lenses’ in a row — from 0.5× at one end, through 1× and 2×, to the 5× telephoto at the other. The result is a quick, pleasing way to cycle through your framing options with a satisfying level of precision — and it delivers an interaction iPhone cameras have never had.

The Cameras adjustment can be used... blindly. This may sound bizarre on the face of it — why would you want to operate a smartphone camera without seeing it? Well — recall that reflexive habit I described of opening your camera without looking at it by pressing the Control? The same applies here. Not only can I open the camera, I can swipe, feel the haptic click of landing on the ultra-wide or telephoto, and raise my camera to take the shot.

You'll see photographers look at a shot and have a hand on their lens, snapping to a setting and then raising it to their eyes to shoot. It's essential. With this, I can hold my phone in an awkward position with little visibility and shoot through one of the lenses without seeing the screen. I ended up using this a lot. It’s really hard to put into words, but it becomes something in your fingers; a really tactile camera experience that is more of an extension of you. It’s so nice. It’s just like using a camera lens.

That brings me to the not so good part: the Cameras adjustment experience is so nice, integrated and good that it makes the rest of the adjustments feel less great.

Apple has successfully kept a lot of its Camera app paradigms rooted in traditional concepts of photography. Portrait mode features f-stops for its depth effect; lenses are described in full-frame equivalent focal lengths. This stuff matters: it exposes users, even casual ones, to the fundamentals of photography. Through the most popular camera, everyone continues to be educated about these things and can learn what they mean in a very hands-on manner.

Camera Control offers a lot of options, and in doing so it somewhat breaks from your traditional expectation of what a ‘dial’ on a camera does. Dials do one thing. This does many, and in doing so it departs from a camera convention whose simplicity is appreciated by amateurs and professionals alike.

In my ideal setup, Camera Control simply has one, potentially mode-dependent, adjustment. Ideally, it has a logical and predictable start and end (‘opening up’ an aperture can be done without looking at the lens — a similar thing goes for the zoom range). Simplicity can be its flexibility: ideally, it is so predictable and applicable to the camera task at hand that it works even if you cannot see an on-screen interface. Having a "double light press" and navigating a sort of mini-meta-menu system just ends up feeling kind of clunky and odd.

It ends up packing a lot of on-screen interface, and that can also get in the way: if I launch into the Camera, swipe quickly to get to the ultra-wide, then hold my finger on it to be ready to shoot, the Camera Control overlay keeps hovering over my frame.

In all, I think the relative plethora of Adjustments makes it feel clumsier and less sleek and snappy than it could be. Given its soft haptic feedback and many options, it can seem a bit overwhelming even to more photographically savvy users. Those more conspiratorially minded might assume Apple added more features here to compensate for the iPhone having fewer at launch; I myself think it’s just a commendable first attempt to do something new.

Focus

For us as developers, it is an interesting new thing. It seems, for now, uniquely catered to us: not only can you set the Camera Control to open any (camera) app like Halide, you can also create your own Adjustments. The API allows us to pick a system icon and use a basic picker — no custom interface — to tie into features of our own app.

It was tempting to just rush into this and have something on day one, but we really wanted to live with the Camera Control and the devices for a while to see how it would fit into our way of doing things. We like to do things a certain, opinionated, focused way. And that’s exactly what we did: Camera Control in Halide offers two Adjustments: EV, to adjust exposure, and Focus, at the end of the scale.


Much like Cameras, a manual focus Adjustment allows you to quickly focus on something as close as possible without looking at the phone. The Adjustment for exposure lives in the middle, with a bit more latitude than the system’s (we go up to ±6 EV, vs. ±2) — and the top one is like your gearbox’s neutral: “Lock”. Leaving a simple locked adjustment at the top level means Halide does not suffer from accidental triggers of sensitive adjustments.

On The Nature Of Shutters

There’s an invisible aspect to Camera Control I want to touch on before we move on. I noticed it is also deeply integrated into a low-level improvement I mentioned before — and to understand that, you have to look at how cameras take photos.

Try pressing a shutter button on a regular camera. Film or digital — it will make a quick click. The moment the button reaches the bottom of its throw is when a camera takes a photo.

iPhone 16s do not do that. In fact, they cannot do that. What do I mean by ’that’? Taking a photo as soon as you press down. They take a photo when you release the button. This is something we worked hard to avoid in Halide: when you press the shutter, you want the smallest possible delay; a shutter should fire when it is triggered, not upon release.

But the Camera Control can be long-pressed to take a video. How, then, do you still capture what you see on your screen? Therein lies the smart part of this camera — using the aforementioned Zero Shutter Lag, it can offset the ’slowness’ of the button by grabbing a photo in its buffer. It’s remarkable, and works great for getting a steady shot despite your press, and despite any delay from raising your finger.
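The buffering idea is easy to sketch in code. The following is a toy model of a zero-shutter-lag buffer, not Apple's implementation, and all names are hypothetical: the camera continuously fills a small ring buffer of timestamped frames, and when the shutter event finally arrives, the frame nearest the press time is pulled from the buffer instead of exposing a new one.

```python
import collections

class ZeroShutterLagBuffer:
    """Toy model: keep the last few viewfinder frames so a capture can
    be resolved to the moment of the press, not of the release."""

    def __init__(self, capacity=8):
        # Oldest frames fall off automatically once capacity is reached.
        self.frames = collections.deque(maxlen=capacity)

    def on_frame(self, timestamp, frame):
        # Called for every frame while the viewfinder runs.
        self.frames.append((timestamp, frame))

    def capture(self, press_timestamp):
        # Instead of exposing a new frame now, pick the buffered frame
        # closest in time to when the button went down.
        if not self.frames:
            return None
        return min(self.frames, key=lambda tf: abs(tf[0] - press_timestamp))[1]

buf = ZeroShutterLagBuffer()
for t in range(10):
    buf.on_frame(float(t), f"frame-{t}")
# Button pressed at t=6.2 but handled later: we still get the press-time frame.
print(buf.capture(6.2))  # frame-6
```

The design choice worth noting: the buffer trades a little memory for the ability to make capture latency effectively negative, which is why a release-triggered button can still record what you saw at press time.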

The Long Game

I am obviously excited about what the Camera Control brings to the iPhone. It’s a huge change, but it’s easy to miss the long view here.

There’s a reason this isn’t just on Pro phones, like the telephoto is. Apple knows something about cameras: they will mean something very different in the years and decades to come.

As our devices become our intelligent companions, cameras are their most important sensors — their eyes to the world. Accessing and interacting with that is exactly what this control is about. While I feel tremendously catered to, I think the long view here isn’t to use this as an aperture ring or a focus dial — it’s a button and aperture for the intelligence inside your device.

Processing 

And that brings us to the intelligence that does live in this device and controls how every image comes out: Apple’s intelligent image processing.

Image processing has been a hot topic in these reviews for a while now, and this generation is no different. It's something a lot of iPhone 16 reviews have already talked about in varying ways.

Here's the thing that won’t change, review after review: an iPhone is just better at being a computer than a camera. That’s the reality of it. If you have a large camera with a big lens and a big sensor, it can gather a lot more light. That's just physics. If you have a small camera and a small sensor, you're going to have to make up for it somehow. The way the iPhone makes up for it is by being a better computer. All the computational magic it does, merging a dozen frames into one, gives it great dynamic range. It lets it take photos at night. It does magic — stuff a small camera shouldn’t be able to pull off.

It's honestly invisible and fantastic when it works. But when it doesn't, and it does something unexpected, it's not great. Is that different this year?

In brief: if you were a fan of the iPhone 15 Pro’s processing, you will enjoy what iPhone 16 Pro is offering up this year. And if you didn’t, there is now a genuinely useful and mostly-lossless way to get shots looking very different than the past years’ iPhones without editing them.

I think there are people at Apple who probably want the iPhone camera to have a more opinionated ‘look’ — but at this point, a billion people use it. It’s an eternal balance between being a tool for creatives and being the most popular tool in human hands to capture the world as it exists. Not an easy task.

That being said: I think Apple should put almost all of its effort into achieving the seemingly impossible: a noise reduction method that looks more natural than AI ‘making up’ details or watercolor smudging. If anyone can make grain a symbol of photographic craft and authenticity, even out of a digital camera, it’s Apple. I still get shots where iPhone goes to tremendous lengths to prevent me from having a noisy photo. I can see that extracting detail from the noise is difficult, but the resulting image just looks odd.

If there is one theme to iPhone’s approach to photography this year, it’s more control — that applies to the Camera Control and to Photographic Styles. But the image remains rather processed whether you like it or not. My advice?

Start to accept that highly processed images are here to stay.

As technology marches on, we are using cameras that help us achieve greater results than the physics would support — but in doing so, some level of creative control is lost. And while we have tools, like our Process Zero, to achieve what I would call ‘old-fashioned photography’, we are not sure that approach will survive long into the future.

As we strive for ever-thinner devices, folding phones, and the tech we see in science fiction, processing is the only thing that enables cameras to work within the ever-tighter constraints on power and size they have to fit into.

Even on your new iPhone, camera quality isn’t quantified only by the sharpness of a lens or the rendering of a single image anymore. Definitions of color and sharpness have given way to photography reborn as data science. Your camera gathers signal — and in that signal is noise. The more signal it can acquire, the better: it can handle the aberrations and the noise with extra processing, as long as it can maximize its light input. In native RAW captures, we see more color fringing than years ago; it’s just very well processed out of your shot. Lenses get ‘worse’ — but the photos get better.

That’s why I am here to tell you not to be optimistic about our cellphone cameras moving toward less processing. Cameras are being optimized for a future where photography relies increasingly on magic — and today’s processing will seem quaint. Things in a decade will be very different from today.

iPhone SE (Spatial Edition)

I’ve talked a lot about photography and video changing, but if you’ll humor me for just one more moment, I’ll talk about one change that excites me. Apple’s push into Spatial photo and video might not be for everyone, but its existence helps solve a chicken-and-egg problem in an emerging medium, one that has moved more people close to me to tears than I can recall.


Spatial media — that is, photos and videos shot in 3D for you to relive on a device like Apple Vision Pro — is still nascent.

There are various tools for capturing immersive and spatial video and audio, but if this is the first iPhone built from the ground up for AI, it’s equally fair to say it’s the first one built from the ground up for spatial capture.

That excites me, not because I am an avid lover or consumer of it, but because it’s a genuine new form of media arts that does not involve boiling a lake to generate an image of an astronaut riding a cat. I love that Apple’s working hard to make tools, regardless of demand. The only way we can experience amazing art is if we invent the tools to make it, first.

Verdict: A Camera That Adds Something

iPhone 16 Pro, along with iPhone 15 Pro and 14 Pro, is what I would call a ’seismic’ camera release for Pros: the kind with changes so significant that you would not consider it an incremental move, but one that makes it practically impossible to go back.

iPhone 14 Pro brought us a large, gorgeous 48MP main camera.
iPhone 15 Pro brought ProRes Log.
And now, iPhone 16 Pro brings Zero Shutter Lag and Camera Control.

If you want a quick verdict: the iPhone 16 Pro is a tremendous camera because between Camera Control, Zero Shutter Lag and its advanced Photographic Styles, it will capture more moments than any iPhone ever did by a huge margin — and that in itself makes me recommend it over any previous one.

That being said, there’s a larger feeling I am left with after reviewing this device in my hands.

As I feel myself getting older, I hold on to the idea of what I think a ‘camera’ or ‘photography’ is more and more. The same happened with cellphones. People used to ridicule that your telephone had a camera on it. No doubt there were purists that said, “well, in my day, this was a thing you took phone calls on. Not a computer in your pocket.”

Here I am: in my day, a camera was a thing you took photos on, not a computer brain’s eyes to the world. Perhaps I feel this is a big change because this is possibly close to the last of its kind, a link in the evolution: an iPhone that long since redefined what a phone is, and is now about to redefine what a camera is, and what photography means.

Recall the introduction of the iPhone as a phone, an internet communicator, and an iPod. Notably lacking? The camera. 

This iPhone is a camera. Maybe the first, if you define a camera as a device with a dedicated control for it.


It was in a place like this where one of Steve Jobs’ greatest inspirations once stood and imagined a revolution in photography that shocked the world. He imagined something simple: Instead of having to hire a photographer with a camera, who would bring film to a lab or go into a darkroom to present the shot days later, he imagined a small, elegant metal rectangle that fit into your pocket.

You could simply take it out, slide your finger on its surface to adjust your shot, and take the photo. The real magic? You’d take it and see it; no need to develop any film. Instant gratification.

That man was Edwin Land. He envisioned something most considered impossible: the Polaroid SX-70. It changed photography forever. It seems futuristic today. And guess what? The only controls on that camera were a button... and one slider, right here at the top.


Land didn’t create this because he was obsessed with technology. He wanted to strip away the complications of photography and make it accessible. To focus on the craft and the art, and less on know-how or technique. To truly bring it to its essence: empowering anyone to capture a moment. Surely, some lamented the loss of craft; the loss of essential parts of photography.

Perhaps it was the camera phone that was the next step that truly made photography even more accessible and instant. But many feel like something was lost. It’s telling, then, that where Land removed so many parts of the camera, Apple is adding one.

Apple adding a new control - a button, a dial - to iPhone isn’t a move it makes casually. It’s an admission of a fundamental change in iPhone's nature that happened over time: an admission that iPhones are far less phones today, and far more cameras.

But as a photographer, remember that ‘camera’ might really mean something entirely different than what we are used to — phones once made phone calls. Today, cameras take photos. In the future? Perhaps this is much more a lens to see and process the world with. A camera, as it is defined in the 21st century.

If I’m reviewing this the way it is, then, I’m really enjoying what I have in my hands. A device on the edge of the sands of time — rooted in the cameras I love, with just enough of the future of photography packed in here for me to manage.

]]>
<![CDATA[Process Zero: The Anti-Intelligent Camera]]>https://www.lux.camera/introducing-process-zero-for-iphone/6514766219ee0e0001415aebWed, 14 Aug 2024 17:00:10 GMT

Today, we are launching something unlike any tech product in 2024: a product that uses zero AI and zero computational photography to produce natural, film-like photos. We call it Process Zero. It lives in Halide, and it turns your iPhone into a classic camera.

Process Zero is a new mode in Halide that skips over the standard iPhone image processing system. It produces photos with more detail and allows the photographer greater control over lighting and exposure. This is not a photo filter — it really develops photos at the raw, sensor-data level.

Just like film, Process Zero photos come with (digital) negatives, affording incredible control to change exposure after the fact. Much like film, it has grain. It works best in daytime or mixed lighting, rather than nighttime shots. Thankfully, unlike film, you don't need any chemicals to develop these negatives. We give you one dial.

Process Zero: The Anti-Intelligent Camera

Best of all, Process Zero is available on every iPhone that runs Halide and iOS 17, not just the latest Pro iPhones.

Because Process Zero eschews magical algorithms, it has tradeoffs. This is why it’s a new choice in addition to the standard iPhone photo processing system in Halide. Read on to learn why we built this, what the tradeoffs are, and where we’re going.

In Search of a Classic Camera

Today's smartphones let anyone just press a button and get a nice picture. At the extreme, you have cameras that swap faces, insert stock photos of the moon, or use AI to generate completely new elements.


Thanks, I hate it.

By comparison, an iPhone is downright conservative, mostly a magic helping hand in difficult lighting situations. Consider the classic problem of capturing a window on a sunny day. If you've only captured photos on an iPhone, you might not even know this is a classic photography problem:

Classic cameras can either over-expose the outside or under-expose the room. But algorithms can combine multiple exposures and voilà!
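To make the idea concrete, here is a toy sketch of exposure fusion in plain Python, not the iPhone's actual algorithm: each pixel of each bracket is weighted by how close it sits to mid-gray, so the well-exposed window comes from the short exposure and the well-exposed room from the long one.

```python
def fuse_exposures(exposures, mid=0.5):
    """Toy exposure fusion: merge several brackets of the same scene
    (equal-length lists of pixel values in 0..1) by weighting each
    pixel by how close it is to mid-gray."""
    fused = []
    for pixels in zip(*exposures):
        # Pixels near mid-gray get weight ~1; crushed or blown pixels ~0.
        weights = [max(1e-6, 1.0 - abs(p - mid) / mid) for p in pixels]
        total = sum(weights)
        fused.append(sum(p * w for p, w in zip(pixels, weights)) / total)
    return fused

# Pixel 0 is the bright window, pixel 1 the dim room (values are made up).
short_exp = [0.50, 0.02]  # exposed for the window: the room is crushed
long_exp  = [0.98, 0.40]  # exposed for the room: the window is blown out

print(fuse_exposures([short_exp, long_exp]))
# Each pixel lands near its well-exposed version: roughly [0.52, 0.38]
```

Real pipelines add alignment, deghosting, and tone mapping on top, but the core is the same: let each exposure contribute where it is trustworthy.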

Process Zero: The Anti-Intelligent Camera
Via Wikipedia

This is great for aspiring photographers, who can now focus on learning high-level concepts instead of getting bogged down fiddling with knobs. Even experts can appreciate the convenience of just pressing a button and getting useful results.

In the years following Halide's launch in 2017, iPhone cameras have gotten much smarter, adding sophisticated algorithms like Smart HDR and Deep Fusion. We worried about what this could mean for our app. What's the point in manual control when a phone can do better if you stay out of the way?

It turns out we were wrong to worry. As cameras have gotten smarter, Halide has only thrived. Users want more than standard manual controls — they want control over algorithms. Tons of people tell us they love Halide for this one toggle:


To be clear, these algorithms are amazing, but leaving all decisions to a machine means sacrificing some choices as an artist. A machine can only make objective decisions, but many technical decisions are inherently subjective.

For example, a photographer manually editing their photos might ask themselves, "Do I want noise in my photo, or to eliminate it at the cost of detail?" The iPhone's image processing pipeline doesn't like noise at all, and that's fine. We wouldn't be surprised if most iPhone users prefer their photos that way.

But Sebastiaan and I like a bit of noise in our shots, and that toggle in Halide that reduces processing didn't help. Noise reduction is just one of those things that gives iPhone photos their look. Because Halide was built on top of the system processing, we had to come along for the ride.

So our love of noise sent us down the path of building our own process.

What's going on with the noise removal? We can't say for sure, because we didn't build the iPhone's algorithm. We do know that when you combine multiple photos (as in the window example earlier), you are no longer capturing a single moment in time, and when you average multiple photos together, noise goes away.

Unfortunately, photo merging algorithms have to guess how the photos line up. This is especially tricky with moving objects, and if the algorithm guesses wrong, you see ghosts. In the end, all of this intricate guesswork costs more than just noise; it costs fine detail, too.
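The noise-averaging half of this is easy to verify with a simulation. This sketch (my own illustration, not Apple's pipeline) models each capture as signal plus Gaussian noise, and shows that averaging nine perfectly aligned frames cuts the noise roughly threefold, a factor of about √9:

```python
import random
import statistics

def noisy_capture(true_value, sigma, rng):
    # One simulated sensor reading: the real signal plus Gaussian noise.
    return true_value + rng.gauss(0.0, sigma)

def merged_capture(true_value, sigma, n_frames, rng):
    # Average n perfectly aligned frames; noise shrinks ~ 1/sqrt(n).
    return statistics.fmean(
        noisy_capture(true_value, sigma, rng) for _ in range(n_frames)
    )

rng = random.Random(0)
single = [noisy_capture(0.5, 0.1, rng) for _ in range(2000)]
merged = [merged_capture(0.5, 0.1, 9, rng) for _ in range(2000)]

print(statistics.stdev(single))  # ≈ 0.10
print(statistics.stdev(merged))  # ≈ 0.033, roughly a third of the above
```

The catch the paragraph describes is exactly what this toy omits: real frames are never perfectly aligned, and the registration guesswork needed to line them up is where ghosts and smeared fine detail come from.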

In contrast, Process Zero is a single-shot process. We take one, and only one, photo. If parts of your image are not properly exposed, we don't have any algorithms to fix that. Sometimes this is a good thing! Consider this photo of Ethan from the day he came home from the hospital.


If an algorithm sees this image, it may think it needs to bring out details in the shadows and smooth away the noise. This can frustrate an experienced photographer who knows what they're doing.

Process Zero also lets you retain full control over your camera settings. For example, algorithm-based exposure logic can’t let you pick a specific shutter speed, because the whole job of the algorithm is to take nine photos at different shutter speeds and pick the best of the bunch.

As we've followed the iPhone's algorithms getting more sophisticated over the years, we've found that our single-shot approach is the only reliable way to give users total control over shutter speed and other exposure settings.

Consider these shots I took last year from a boat in the Galapagos. By shooting a single photo with a fast shutter speed, I outperformed the algorithm.

Just be careful what you wish for. Turning off the algorithms has tradeoffs.

I mentioned grain earlier, and just like film, Process Zero has an ideal ‘ISO’ range. In the dark, it will get noisy. Fortunately, newer iPhones with quad-Bayer sensors have incredible low-light performance compared to the past. Don't go in expecting Night mode, but I've been surprised by how useful the results can be.

Because Process Zero does not fuse multiple shots, you are limited by the dynamic range of the sensor. That means that if you're shooting something like that window from earlier, you need to choose which bits you want exposed.

Finally, some flagship features of the iPhone are deeply integrated with algorithms. If you want a full, 48-megapixel resolution rather than binning, that isn't possible. That also means that if you want a virtual 2× camera at a 12-megapixel resolution, that isn't possible. However, the situation might improve if enough people file feedback with Apple!


Then there's the issue of HDR output. If you've ever scrolled through photos and video in your camera roll, and your phone suddenly got bright, that's what we're talking about. Some people hate the look, but I think the results can be stunning when done with thought and care.

We'd love to show you examples, but browsers don't really support it, and that cuts to the heart of the problem. If you shoot in HDR today, your photos can look pretty different from screen to screen. Later this year, new standards will land that help with HDR compatibility, and we'll revisit it then.


To summarize: Process Zero gives you a single 12-megapixel shot. It will be less saturated, softer, grainier, and quite different than what you see from most phones. Each shot includes a true Bayer RAW file, if you want to use it in a full-fledged RAW editor, but we designed Halide so you don't need one.

If any of Process Zero's tradeoffs are dealbreakers, that's fine! If Process Zero just shows you how valuable smart-processing can be in difficult situations, that's cool. We find ourselves toggling between Process Zero and the system processing depending on the content of a scene and what we're hoping to accomplish. That's why we made it easy to switch modes with a tap.


As cameras make more and more creative choices on your behalf, we think the photographer should retain the agency to cast aside algorithms and do their own thing. Just as a photographer expresses themselves in their choice of lens, exposure settings, and film stock, Halide now lets you choose the process that works for you.

Image Lab

Back in the days of film photography, half of the art was in taking the photo, and the other half was developing your negative. Sometimes it was to correct mistakes, and sometimes it was for creative effect. When going analog, we love pushing and pulling film.

However, adjusting the exposure on a processed JPEG/HEIC is never as good as ‘re-developing’ a digital negative.

Process Zero: The Anti-Intelligent Camera
Above: adjusting exposure on an underexposed digital negative. Below: adjusting exposure on a JPEG.
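A toy model shows why the negative edits better. Assuming a simple gamma-2.2 encode and a sensor with highlight headroom above display white (all numbers here are hypothetical, not Halide's actual math), pulling exposure on linear raw data preserves distinct highlights, while the same pull on an already-encoded value cannot, because clipping has collapsed them:

```python
GAMMA = 2.2  # assumed display gamma; a toy stand-in for real tone curves

def encode(linear):
    # Gamma-encode a linear light value to 8 bits, clipping at white.
    return round(255 * min(1.0, linear) ** (1 / GAMMA))

def develop_raw(linear, ev):
    # 'Re-develop' the negative: exposure is applied to linear sensor
    # data, so highlight headroom above 1.0 survives a pull.
    return encode(linear * 2.0 ** ev)

def edit_jpeg(linear, ev):
    # Edit after encoding: the value was clipped to white first, so
    # everything brighter than 1.0 was already thrown away.
    clipped = min(1.0, linear)
    return encode(clipped * 2.0 ** ev)

# Two different bright highlights, both pulled down two stops:
print(develop_raw(3.2, -2), develop_raw(2.0, -2))  # distinct: detail kept
print(edit_jpeg(3.2, -2), edit_jpeg(2.0, -2))      # identical flat gray
```

In the raw path the two highlights pull down to two different tones; in the encoded path they were both stored as pure white, so no edit can tell them apart again.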

This is why Process Zero includes a digital negative. Rather than leave you to find an editor that supports them, we decided to go the extra mile and include Image Lab, a one-dial solution to developing your negative.

We think Image Lab gives Halide a great "batteries included" experience. Take a shot, tweak it if you have to, and share it — no other steps are required. We also built Image Lab because even if you're comfortable working with RAW editors, they can yield different results than Process Zero.


Image Lab is not a full-fledged editor. It does not contain any color or contrast adjustment knobs. You can't even crop! Think of Process Zero + Image Lab as "step zero" of your workflow.

But if you do take those DNGs we include with Process Zero into another app, they edit pretty well:

Process Zero is Step One

Today is also step zero in our journey to the next generation of Halide. We plan to make it bigger than our award-winning Mark II. We're going to call it… wait for it… Mark III.


Our Plan for Mark III

We released Halide Mark II after a long silent period, so we could release one big, splashy update. While that’s fun, it’s both risky and ultimately less useful to our users. We've decided to roll out some Mark III features early, rather than saving them up for one big launch at the end. As much as we love surprises, we think getting these features into your hands and gathering feedback will make Mark III much better than developing it in a vacuum.

Process Zero is step one: the first part of giving you even more control, and beautiful output out of the box. We’re going to go beyond that.

Halide members will get early access to even more Mark III features, along with exclusive icons and other goodies. You can get the app here — and try it for free for a week by starting a membership.

If subscriptions aren’t your thing, that's totally fine. We’re continuing to offer a one-time-purchase option too. While it’s technically a one-time purchase for Mark II, we’ve decided to include Mark III, too.

It has been a busy year. A few months ago, we shipped our new video app Kino. Today, we launched Process Zero, the first step on the road to Mark III. In a few weeks we'll enter the busiest time of year. We feel incredibly privileged to be able to work on what we love, and it wouldn't be possible without your support. Thank you.

We'd love to hear your feedback on Process Zero, and we can't wait to see what you shoot with it. Make sure to tag your photos #ProcessZero so we can see and share your shots!

]]>
<![CDATA[The Making of Kino: You Can (Not) Build a Pro Video Camera App in Four Months]]>https://www.lux.camera/kino-a-pro-video-camera-in-four-months/668fe5105f6be40001da372bThu, 01 Aug 2024 19:09:51 GMT

It’s been seven years since we launched a little side project, Halide. We built it for us: we didn’t set out to ‘disrupt’ the world of camera apps, or get rich. We wanted to build a beautiful, powerful, and delightful camera app that we would enjoy using.

The Making of Kino: You Can (Not) Build a Pro Video Camera App in Four Months
Working on finishing up Halide 1.0 — May 2017

It exceeded our wildest expectations and quickly grew to become an award-winning photography app and a legitimate business. Soon, we quit our jobs to work on apps full-time. Yes, that's plural, apps.

From what we learned building Halide full-time, we spun out one-off projects such as Orion and Spectre (that one went places — a little long exposure app that won 2019 App of the Year). While these were fun diversions, Halide remained the focus of our company. It was what we worked on most days of the week, for the last seven years.

There was just one thing we knew we'd never add to Halide: video. Ever since Halide 1.0, users asked for it, but we knew it wouldn't work. Photography and cinema are different mediums that call for different user experiences.

Instead, Sebastiaan and I talked about building a completely separate app, a "Halide for video." Talk never went past the "Wouldn't it be cool…" phase, because we weren't sure if we could juggle a second major app. We're a small team. Ridiculously small. One designer and one developer.

The team, circa 2019 - photographed by Apple for a feature exclusive to the Japanese App Store. ありがとう!

We knew expectations were sky high, and honestly, nothing about iPhone video excited us. It felt like photography was at the forefront of camera innovation.

Our attitude quickly changed in November 2023, and we launched our video app, Kino, six months later. This is the story of why we made the plunge, its whirlwind development, the results, and where we go from here.

Fall 2023: "How hard could it be?"

At the end of last summer, Sebastiaan and I launched Orion, a free app that helps turn your iPad into an HDMI monitor.


We normally take our time with new projects. Halide and Spectre each took a year to ship (though, in our defense, both started as side projects). Orion was a fun challenge to see if our two-man company could ship a brand-new app in 45 days, and it went really well. It reset expectations of what we could accomplish quickly, but it felt a bit exhausting toward the end.

We expected to spend the rest of the year slowing down and turning our attention to Halide. We were in the home stretch of a brand new feature that we were very excited about, and with another month or two of work, we could bring it across the finish line.

I could also use a little breather for the rest of the year, as I expected my first kid at the end of February.


Those plans changed moments after the Orion release, with Apple's unveiling of the iPhone 15 Pro. We watch every keynote, paying close attention to photography changes we can weave into our fall Halide update. This time, we were blown away by the announcement of "log video."


Log video is a very big deal. It contains much more information than conventional iPhone video, allowing ridiculous control over the final image. Apple called it Apple Log. Of course.

Apple Log video is fantastic for professional-looking video — but it requires some editing.

As Apple Log made waves in the filmmaking community, we got a sense of déjà vu. Our minds went back to the Summer of 2016, when Apple iPhones would soon capture RAW photos, allowing incredible editing superpowers. Sebastiaan and I felt that was the right time to build Halide. Now, with Apple Log, we felt iPhone video was about to have its RAW moment.

That said, there are only so many hours in the day. Could we add another major app to our portfolio while still supporting our breadwinner, Halide?

Well, Halide was overdue for a refresh. Its foundations were built seven years ago, and Apple's newer technologies would vastly improve the product. Orion wasn't just a fun side project; it was a test of whether Apple's new technologies were mature enough to start rolling into Halide, and if so, how much they would improve our productivity. The answers were "yes," and "a ton." We were able to accomplish some tasks for Orion in hours that would have taken days in Halide.

But Orion wasn't nearly as complex as Halide. Our video app would let us build a foundation for the future. While our photo and video apps would never share user interfaces, we could architect them to share underlying technologies. If we did this right, we could manage the workload.

The Deadline

Maybe the hardest part of our job is planning everything around Apple's schedule. We spend summers readying our apps for the Fall iOS release, which launches alongside new iPhones. Then we scramble to test our apps and support the new camera hardware. If we're lucky, we get six months a year to define our product direction.

Our desk is always a pile of iPhones of various generations - and the pile grows every fall.

That said, constraints help you focus. We decided that Orion had to launch when iOS 17 dropped, because we expected lots of similar apps to pop up over time, and we wanted to be there right out of the gate. Orion had to ship in 45 days, so we made it work.

I'm not advocating crunch time, where developers work long hours for the sake of unrealistic deadlines. Quite the opposite. Deadlines force us to accept that we won't get everything done in 1.0. We have the freedom to cut as many features as needed to keep things sustainable. The old adage goes, "Work expands so as to fill the time available for its completion," but can't the same be true for work shrinkage?

The question was whether we could ship a whole new camera app within four months, before my baby's due date. It might sound ridiculous, given Halide 1.0 took one year to ship, but Orion proved how much more productive we could be with the latest technology, and this time we were building a camera with seven years of experience in Apple's AV stack.

One reason that Halide and Spectre each took a year was that we handled 100% of each project ourselves. With Orion, we worked with Anton Heestand on its wonderful over-the-top onboarding, and Cabel Sasser wrote us an intro song! It turns out we can delegate work and collaborate with friends without losing any character.


The biggest risk with our four-month deadline had nothing to do with code. We were worried that we didn't yet understand what we were building. There were already plenty of free apps that let you record Apple Log. What could our new app bring to the table?

Defining the Product

After surveying the app landscape, we quickly realized that every app that supports Apple Log targets advanced users. These types of users look at a high-end camera rig and go, "Wow! Cool!"

If you look at this and you go ‘hell yeah’, yes, you are that kind of user.

Don’t get me wrong, we love this stuff, but we weren't excited to build a high-end tool exclusive to pros. But as we dug into the techniques of filmmaking, I had a flashback to the late 90s and early 2000s. I grew up on the cusp of digital filmmaking and had lots of fun making short movies with friends.


Tequila (2006)

In the mid-2000s, we used a ridiculous camera that had been modified to shoot 10-bit log footage. That hacked camera required a handful of portable hard drives that would overheat at the worst times, but these technical shenanigans piqued my curiosity, and led me down a path where I now build camera apps for a living.

Returning to the world of filmmaking excited me. I could build an app for the 99% of people just starting out, who wish they could record beautiful, cinematic videos but can't make heads or tails of "colorspaces" or "shutter angles." I had the chance to build the camera I wish I'd had decades ago.


If we have one guiding principle, it's the belief that "intuitive" and "powerful" do not have to be mutually exclusive. We thought our app could deliver 95% of the features demanded by high-end professionals without making the app too complex for novices. We'd start with an approachable 1.0, and carefully layer on more advanced features over time.

It turned out, Sebastiaan had been quietly designing concepts for a video app for quite a while. This is how early some of Kino’s most recognizable visual elements were born, like the recording tally light ring that follows the curvature of your iPhone’s screen, or the little segmented audio levels.

one of Sebastiaan’s early Sketch explorations of a video app, codenamed ‘Amalfi'

That said, the worst way to explore a product is with a pixel-perfect version. A beautiful UX takes extra time, and pretty images can distract you from fundamental problems. So in the interest of speed, I spent the next few weeks focusing on a functional prototype that resembled Sebastiaan's concepts. It could record Apple Log footage, connect to an external microphone, and let us quickly experiment with UI concepts.

We call this style ‘brutalist’

In December 2023, just as our prototype built momentum, news broke that Filmic Pro, the most popular filmmaking app on the App Store, was shutting down. This left a vacuum in the ecosystem of filmmaking apps and a material loss in our tiny community of camera apps.


Normally, we don't pre-announce products. Part of it might be Sebastiaan's ex-Apple penchant for secrecy: announce something finished, then surprise and delight. Pre-announcing also raises expectations and runs the risk of committing to features that might not work out. At the same time, the demise of Filmic was an invaluable opportunity to announce our new app and test demand. We decided to pre-announce.

First, our app needed a name, and we didn't want to repeat the pronunciation ambiguity of "Halide." (Note: Sebastiaan says Hey-lide, and I say Hal-ide, but we switch every other week.) Sebastiaan floated a name that both encapsulated ‘craft video’ and sounded friendly: Kino.

I spent 24 hours shooting an announcement video with our alpha build, and we launched a teaser page at shotwithkino.com.

The reception to our video was overwhelmingly positive, and it felt like we had a hit on our hands.


Now we just had to ship it. Wait… what were we shipping?

December: Instant Grade

Sometimes building products is like writing a story. Writers don't go, "Once upon a time," and finish the story in one pass. Many writers approach a blank page with ideas for characters, major plot points, and themes running through their heads. The hard part is forming loose ideas into a cohesive structure. It's called "cracking the story," and there's a similar process in building products.

We decided a good starting point was to shoot videos. When you're forced to eat your own dog food, you quickly figure out what works and what doesn't. In a meta turn of events, we made a video about making Kino.

We appreciated the natural look of Apple Log footage, as it didn't have the same post-processing you see in iPhone video. The hard part is giving log footage a nice treatment. Straight out of the camera, Apple Log looks… uh…

A frame from an Apple Log video. By design, log video is quite washed out.

Log footage is supposed to look that way. It just contains the ingredients that can make up gorgeous images, and it's your job to bake them. You're supposed to bring it into a high-end tool like DaVinci Resolve to "color grade" it. While these tools feel empowering to professionals, to a novice they feel as intuitive as the cockpit of a commercial airplane.


We knew Kino could be a game changer if it let everyone grade their footage right in the app with a tap, by using a handful of packaged presets.


But why stop there? We could let you import a preset from anywhere. Apple Log was only a month old, but pros were already selling great grade packs, ready to be imported into your favorite editing suite.

We saw that people like Evan Schneider and Tyler Stalman had made some beautiful presets.

And if you're a high-end user, you could even author your own looks in Resolve! Speaking of the high end…

Tackling the High End

On the other end of the spectrum, we had to figure out what professionals demand. We packed up and flew down to Los Angeles, where Sebastiaan booked us into the only NFT hotel in town (Sebastiaan Note: So sorry, I had no idea).

There, we interviewed Stu Maschwitz, Adam Lisagor, and others. We asked what they would want in an app, but it was ultimately up to us to decide which features would work with our casual user experience.

Pros wanted two things: adjustability, and consistency. Adjustability was straightforward. Like Halide, we just needed a manual mode that lets you adjust the shutter, ISO, and more.

Consistency was a bit more nuanced. For example, pros want to lock exposure settings at the start of a recording. This doesn't happen with the iPhone's first-party camera, where exposure settings can change mid-recording. This means that if someone walks into your video wearing a dark shirt, the image could change its overall brightness to compensate. This feels off, and you never see it in real movies.

However, locked exposure could confuse casual users. We could already imagine confused customer support emails like, "I started recording a movie inside my house, I walked outside, and everything was too bright!” They aren't wrong. Most people expect their cameras to just work.

Exposure locking warranted a toggle, which we'd leave off by default.


But how would a professional know this feature exists, to begin with? We settled on an extra screen in our first-launch experience that lets you dig into all the customizable options.


Giving users extensive options makes things harder to test and develop, but we couldn't take a one-size-fits-all approach. We think this struck a nice compromise, and everything felt on track for our February release.


January Battles

With Instant Grade now in our early alpha, something funny happened: we began using it all the time. It wasn't because we had to, but because we loved the results. This was a good omen, since we had two months left on our schedule, and this was the point that the app should be coming together. Soon we'd have to shift gears into polishing the interface and hunting for bugs.

According to Sebastiaan, there's a Dutch saying, "The last lead weighs the most." It stems from typesetters, who had to take little lead letters and set them in a clamp to write out text. Near the end of a sentence, things get heavy, and the final words are the hardest to set.

Once an app rises in quality and things feel "real," weaknesses stand out more, and as we were about to finish off our sentence, we realized we missed a word.

A few weeks into January, we knew something felt off. Stepping back, we realized it was onerous and silly to reach for manual exposure controls to create that signature "180-degree" look you see in films. If you aren't a camera nerd, 180 degrees means nothing, so bear with me.

Just like a photograph, every frame of a film or video is exposed to light for a split second. The longer you expose an image, the more blur you see when objects move. While photographers generally avoid motion blur, in filmmaking, motion blur is a subtle detail that gives Hollywood films a certain feeling.

To that end, most cinematographers expose each frame for half of its duration: most movies are shot at 24 frames per second, with each frame exposed for 1/48 of a second. The term "180-degree shutter" dates back to analog cameras, which had spinning wheels in front of their film gates.


All of this stuff overwhelms beginners. If we set out to make cinematic video accessible to everyone, we had to do more than make manual settings friendlier. Kino should just handle these settings for you. You tap the record button, and an algorithm handles the 180-degree shutter. It automated cinematic motion, so we branded it... "AutoMotion."

I knocked out a simple version of AutoMotion in a few hours. It only worked when the user enabled that exposure-lock feature I talked about earlier, because it was easiest to calculate these settings once, at the start of the recording, rather than updating them continuously. It didn't take long to realize the feature was awesome. Combined with Instant Grade, it produced videos that really did look like they came from a cinema camera.
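The 180-degree rule boils down to simple arithmetic: exposure time equals the shutter angle's fraction of a full circle times the frame duration. Here's a minimal Swift sketch of that calculation (the function name is illustrative, not Kino's actual AutoMotion code):

```swift
import Foundation

// The 180-degree shutter rule: expose each frame for half of its
// duration. Illustrative sketch, not Kino's AutoMotion implementation.
// exposure time = (shutter angle / 360°) × (1 / frame rate)
func shutterSpeed(frameRate: Double, shutterAngle: Double = 180) -> Double {
    (shutterAngle / 360.0) / frameRate
}

// 24 fps with a 180° shutter: each frame is exposed for 1/48 s.
let film = shutterSpeed(frameRate: 24)
print("1/\(Int((1.0 / film).rounded())) s")  // prints "1/48 s"
```

At 30 FPS the same rule gives 1/60 s. Because the frame rate dictates the shutter speed, ISO is the only free variable left for exposure, which is exactly why AutoMotion runs out of headroom in bright daylight without an ND filter.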

There were just two issues. First, AutoMotion can't work in bright daylight without attaching an ND filter in front of the lens. It's just a limitation of light and physics. Hey, if Kino is a smash hit, maybe Apple could address this with some sort of integrated ND filter in the iPhone 18? Until then, users will want to know if it's active or not, so they have a chance to fiddle with lighting. We solved that by turning the "auto” button green when you're good to go.

Green means go.

The second issue came from feedback from our early testers. They expected exposure to continue to adjust during recording. And, ugh, they were right. If we were to serve beginners, "AutoMotion" had to work out of the box without toggling any advanced features. So we made the difficult decision to derail the schedule a bit to get this working with continuous auto-exposure.

Onboarding Reset

We love to make the onboarding in our apps fun. We started with a little book in Halide that works as a manual, recalling vintage camera manuals. In Orion, you unbox your 'appliance'. We had big plans for Kino, but we were short on time, and our plans to use a subcontractor had fallen through, which meant we had to build it ourselves.

Scaling back our ambitions, we married our two past approaches: Kino would open up to a similar "manual" concept that we previously used in Halide and Orion. If we had time, we'd add a little unboxing.


It was also time to nail our app's overall styling. We wanted something less ‘campy’ than Orion's over-the-top 80s-electronics theme. There would be no custom VCR display typefaces, but we did take inspiration from Sony’s vintage camcorders.


With Halide and Kino being siblings, we wanted stylistic consistency, but this time with more color. Sebastiaan initially conceived of our built-in presets having small, emoji-like icons or frame previews, though he settled on more film-like packaging.


Working with longtime collaborator and designer Jelmar Geertsma, we even created a set of retro feature graphics for the app. We only ended up using these on the box you open to start the app, but that's OK. They're still really cool.

Jelmar’s work is also in other areas of the app — like in our very own family of typefaces we call Ambrotype, including a monospaced version that shines everywhere from the format settings to the timecode.


Death by a Thousand Configurations

Halide introduced us to the reality that everyone shoots differently, and video brings even more choices. For instance, Hollywood filmmakers usually shoot at 24 frames per second, while respected video creators like Marques Brownlee prefer 30 FPS. Rather than present every possible option, we started with a drop-down menu that let users pick from common presets:


This was too limiting for pros, so if you needed more than that, we have a nice screen to build your own custom configuration:


This felt perfect from a product perspective, but once you give users infinite control to customize settings, they are guaranteed to come across every weird bug in iOS.

For example, we discovered that on the iPhone 15 Pro it isn't possible to shoot:

1) Apple Log video;

2) at 60 FPS;

3) with stabilization enabled;

4) when shooting from the telephoto camera.

We can't explain why that one particular camera has the issue, but if the user configured things that way, the video stream shut down. We called these "dead viewfinder" problems. We haven't done the math on how many settings permutations exist across all iPhones, but suffice it to say there are a lot.
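One common way to handle this class of device-specific failure is a deny-list of configurations found to be broken through on-device testing, checked before committing a capture session. A hypothetical sketch (the type, field names, and the listed combination are illustrative, not Kino's real code):

```swift
import Foundation

// Hypothetical guard against "dead viewfinder" configurations.
// The struct, its fields, and the deny-list entry are illustrative;
// a real app would also key off capture formats discovered at runtime.
struct CaptureConfig: Hashable {
    var appleLog: Bool
    var fps: Int
    var stabilization: Bool
    var camera: String
}

// Combinations observed (via on-device testing) to kill the video
// stream, e.g. Apple Log + 60 FPS + stabilization on the telephoto.
let deadViewfinderConfigs: Set<CaptureConfig> = [
    CaptureConfig(appleLog: true, fps: 60, stabilization: true, camera: "telephoto"),
]

func isKnownGood(_ config: CaptureConfig) -> Bool {
    !deadViewfinderConfigs.contains(config)
}
```

The UI can then gray out or silently adjust the offending option instead of presenting the user with a black viewfinder.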

Despite how much we thought we knew about the iPhone's AV system, we spent a good chunk of January discovering weird edge cases and searching for workarounds.

February: The Schedule Changes

January's setbacks cost weeks of our schedule. Entering February, we knew exactly what to ship with Kino 1.0, but hitting our deadline would mean sacrificing quality. The app wouldn't be polished, and we felt uneasy about how few devices we had tested.

We considered limiting Kino only to the latest devices at launch, but as it turns out the App Store doesn't have a way to limit device support at a granular level. Apple wants iPhones to have a long life, so it makes sense they make it hard for developers to opt out of supporting older devices.

It was a tough pill to swallow, but we accepted that we weren't going to hit the 1.0 deadline that we announced. We kept our chin up and acknowledged that it happens to the best of teams. "A delayed game is eventually good, but a rushed game is forever bad."

Rather than throw out the deadline entirely, we moved the goalposts. We'd release a feature-complete beta version to a large group of testers. Over my paternity leave, the team would collect bug reports and other feedback, so I could hit the ground running when I returned. We'd target the new launch date for the end of May, coinciding with the seven-year anniversary of Halide 1.0.

With the new schedule in place, we began compiling a beta tester list, preparing release notes, writing up known bugs, and… and… well, then life had other plans.


On February 7, 2024, our prenatal check-up discovered my wife had run out of amniotic fluid. That's bad. We ended up shipping our son, Ethan Marc Sandofsky, that day.


#ShotWithKino. There were no second-takes.

I'm off for a bit.

Spring 2024: The Final Stretch

I’ll take over here. Hey, it’s Sebastiaan! I am the other half of the team building things. While I can’t code (much), we luckily entered Ben’s paternity leave with Kino solid enough for a beta release without his involvement.

A slipped deadline is never a good situation, but it was also an opportunity to seize the extra time and build out our product more. While we gathered feedback, indexed our weak spots, and found bugs, I was going to work hard on the most important part of the app: helping people get great-looking shots.

I like to say that there are two ways to help people get a better picture. You can create a shiny button — whether that's a preset, filter, effect, algorithm, or something else — to make an image look beautiful. This helps a lot: it can make otherwise unimpressive shots look great, and it's approachable and simple. The downside is that everyone can end up with similar results.

The second way, which is far more difficult, isn't to make the image nicer looking. Instead of offering a button, offer teachings: make the user better at photography or videography. It takes more work, but it's more satisfying for users, and it allows anyone to discover their own style.

Ideally, Kino would cover both of these. But it meant a lot of extra work — extra work I now had some time for...

In Grade Company

Video is far more complicated than photography, and Apple Log, paired with the color-grade presets I made for Kino, allowed for beautiful results that were far from homogeneous. That being said, I mostly edit photos; video color work wasn't my expertise.


I decided to reach out to the people I admired most in the field. Some were longtime friends, like Adam Lisagor, an incredibly kind man with an inimitable studio, Sandwich Video; he was eager to help — bless him. Other, newer friends, like Tyler Stalman, whose presets and videos I absolutely loved, also got on board with enthusiasm.

Stu Maschwitz was a new acquaintance, introduced to me when we reached out to pros in our network: a fantastic, gentle, patient, and hyper-talented guy who helped me a lot along the way, to the point of sitting down with me in his studio to talk about color, workflow, and pipelines. I returned the favor by personally implementing his suggestions.

Then there were people I wasn’t that close with. I had never chatted with Kevin Ong or Evan Schneider, but after I explained our vision for Kino they too were interested in working with us. Evan ended up providing me with tons of helpful feedback — he is a fantastically skilled colorist and creative, and was generous with his time to help me become wiser about the craft and to help Kino be as great as it could be.

Then there were the teachings.

Teachings, to me, don’t mean dropping a textbook or floating tips in the app.


The best software interfaces are obvious, and empowering in their intuitiveness.

Not all concepts can be intuitive from the start, but if you think about it, there's a name for a simple, pared-down interface or example of a complex problem: it's a lesson. Whether it's two trains meeting in math class or a simplified exposure interface that shows how the image changes as you swipe a slider, a lesson helps you absorb the underlying concepts by giving you only a little bit at a time, in an understandable way. If you remove complexity, what remains is clearer, and less overwhelming.


I worked tirelessly on redoing our onboarding, thinking of ways we could help people get started, and sketching out a set of online resources that users could immediately dig into: answers to frequently asked questions, but also a detailed manual for those who wanted to go deep.

For my last coup, I would exceed what we’d done before: I would film some tutorial videos to help people get started.


I am incredibly proud of how much of this we managed to get done, but my inexperience with video led me to one major realization: video is hard, and it takes a ton of time. I managed to get great color grades in, from myself and from dear friends new and old, and designed our best website yet, with the most resources we've ever had at launch, but tutorial videos didn't fit into the crunch time.

In early March, Ben's mother-in-law visited for a few weeks to help ease his workload at home, giving him a chance to check in and survey the major bugs. It helped ease the transition into…

April to May: the Home Stretch

Hey! It's Ben, and I'm back, (with a) baby!

As planned, I returned to work in April, hitting the ground running. Most bugs discovered in our wide beta test were fairly mundane, but it still felt like the right call to set the launch in May rather than pushing to make April.

As we fixed bugs, tested older devices, and tied up loose ends, Sebastiaan worked on the final version of ShotWithKino.com. Our friend Adam at Sandwich helped us with the final, authoritative test of Kino: using it on a real production.

The experience was wonderful, messy, and genuine. When the crew ran into issues, we quickly fixed them and turned around a new build. It was a great way to harden Kino for its 1.0 release. There was only one remaining item on the to-do list: charging money.

I doubt anyone who makes products enjoys the part where you ask for money, and it's even worse with App Stores, where people do not like paying for apps. It's no wonder that the two dominant business models are advertising and freemium.

When we launched Halide in 2017, folks found it a breath of fresh air that we just charged up-front for our product in an era when subscriptions and in-app purchases were becoming the norm.

When we launched Halide Mark II in 2020, conventional wisdom said that the pay-up-front business model was dead. We decided to change things up by offering a low-cost subscription in addition to a pay-once option. Subscriptions were a huge win: they let us offer free trials, Apple takes a smaller portion of sales, and predictable, recurring revenue made our business healthier and more sustainable.

We considered following Halide's business model, but building a great "paywall" takes time. And honestly, we just didn't feel up for a wave of negative reviews at launch.

Ever since Halide began offering subscriptions, we've received a steady trickle of negative reviews. People see that the app is free to download and get upset to find out we charge money; adding all-caps text at the top of our listing warning people that Halide costs money did not help. On top of this, some people leave negative reviews for any app that offers subscriptions at all, even though we continue to have a pay-once option. We have thick skin, so it isn't the end of the world, but this stuff gets exhausting.

If people claim they'll support non-subscription products, and they're happy to pay for products upfront, let's give it a go and see what happens. We decided to make Kino pay upfront at launch.

The Launch

On May 29, 2024, we released Kino into the world, to overwhelmingly positive reviews. It shot to the #1 top-paid app, where it remained for three days.


As for revenue, Kino made 25% more than Halide 1.0's launch, after adjusting for inflation. Sales significantly decayed after a week, but if there's anything I've learned since launching Halide 1.0, it's that the story never ends with the launch. It's the end of the story's first act.

The hard part comes next, as you're inundated with bug reports and feature requests, and you ask yourself "What did I get myself into?”

Despite the 100 beta testers leading up to the launch, the larger audience introduced a slew of new bug reports. Most trouble came from users on older phones, usually when recording at 60 frames per second. Other bugs were simply random, like the image stabilization system causing trouble with our AutoMotion algorithm.

Luckily, WWDC arrived a few weeks later, and I showed up to Cupertino with a long list of questions in hand for Apple's AV engineers.

It took a month to squash these major bugs, getting Kino in a stable post-launch state. Then we shifted our attention toward feature requests, which we launched last week.

Kino 1.1: The First Big Update

Following our goal of serving both casual and high-end user needs, we shipped features for each group.

For the Pros: Manual White Balance

Our high-end users asked for extensive control over white balance, much more than in Halide, where we simply let users pick from several presets, since RAW images allow for full white balance control during editing anyway.

When shooting video, you might want to dial in a specific white balance that's consistent across multiple cameras, or ensure that every shot in a series has the same settings. Whatever your reason, Kino supports both white balance presets and manual configuration through our beautiful, tactile picker.


Tap to Focus

When researching Kino's UI, we noticed people are ultra-careful not to touch their screens during recording. After all, one errant tap could cause a sudden change in focus, ruining a shot. We made the conscious decision to launch Kino without the "tap to expose" or "tap to focus" you see in Apple's camera app.

While professionals are comfortable fiddling with a focus dial, it turns out everyone else expects the camera on their phone to support tap to focus. Even professionals prefer to tap to focus at times. So we went ahead and added tap to focus, with an option to disable it.


Better Both Ways

Remember Sebastiaan’s talk about buttons and teachings?


Well, we worked on a lot more of both. We added three new Grade presets, and a short tutorial for beginners that pops up the first time you open the app.


Sebastiaan also made a comprehensive tutorial video. Check out our Quick Start guide! More to come.

Finally, we changed the default setup when you first launch the app. Previously, we configured things to mimic Apple's own camera app, for the sake of consistency. In retrospect, we should have configured Kino in a way that stands out from Apple's camera. To that end, Kino now defaults to shooting Apple Log with a beautiful grade applied.

Revisiting Price

Kino's revenue is off to a great start, but nowhere near the level of Halide. That's to be expected, as it took several major releases for Halide to hit its stride.

That being said, we're going to experiment with its price. Kino is normally $20, but we launched at $10. To celebrate our 1.1 launch, we're running one more 50% off sale. With those data points, we're going to play around with prices until we find the sweet spot. That might end up being $15, or it might be $60. After that, maybe we'll be ready to assess the state of pay-up-front apps in 2024.

Final Thoughts on Kino 1.0

We set out to build Kino in four months, and while we're proud we launched that beta, we did not hit that deadline. I do think we could have hit it if we hadn't struggled with quirks in Apple's AV platform, had delegated more work like onboarding, and hadn't shipped AutoMotion in our 1.0.

However, every project has surprises, and sometimes products are far better off when you let things gestate a bit longer. There will always be things out of your control, no matter how many resources you throw at a problem. As they say, “nine women can't make a baby in one month.”


#ShotWithKino

It's still the early days, but we're bullish on Kino's potential. There’s undoubtedly a lot of work ahead, but we're going to take a break from Kino to focus on… Halide!


We're ecstatic about Halide’s future thanks to the lessons we learned these last six months. Kino yielded new technology we can bring over to Halide, from our fresh image processing pipeline to our improved SwiftUI chops.

Just as valuable was the fresh perspective. As I mentioned at the start, we're now in the home stretch of a huge new feature in Halide, but we're picking up where we left off with a fresh set of eyes. Can the next stage of Halide strike a similar balance between novice and professional?

It's tempting to tell you more, but Kino reaffirmed our decision to err on the side of surprise. We'll talk plenty about the product when it's done. If there's anything we've learned in the last year, it's that you never know what’s ahead.

]]>
<![CDATA[Kino 1.1 — One big step]]>https://www.lux.camera/kino-pro-video-camera-update-1/669f3e38c82a6b0001fdd93bTue, 23 Jul 2024 16:01:22 GMT

It’s been about a month and a half since we launched Kino, the video companion to Halide. Today, we're excited to launch our first major update. Say hello to Kino 1.1, codenamed “Blade Runner”.

This update packs in tons of enhancements, polish and even a few secret things, but we’re especially happy with how much feedback has shaped our first large update. Thanks to your many emails and messages, we knew what to prioritize and what you needed to use and love Kino even more.

Keeping with our goal of serving both high-end and casual users, we're shipping big features requested by pros alongside others requested by casual users. Perfectly balanced — an elegant camera tool, for a more civilized age.

Kino 1.1 — One big step

Manual White Balance

This was a huge request from our users, and rightfully so: sometimes you need manual control over the white balance of an image.

We’ve gone ahead and done this right: Kino 1.1 brings a beautiful new set of white balance controls. The new “AWB” button in the Quick Bar lets you pick automatic white balance, choose from a series of white balance presets, or dial in a specific setting in Kelvin.

This can be helpful if you're shooting with multiple cameras, or if you want to ensure that every shot in a series has the same settings. Or maybe you just want to deviate from neutral settings, to give your image a warmer or cooler look.

Whatever your reason, Kino now lets you dial in a manual value or pick from a set of presets. Also nice: previously, you could only toggle whether Kino locks white balance when recording starts from our Settings; the new white balance menu lets you toggle it right from your viewfinder.


Tap to Focus

While professionals are comfortable adjusting a focus dial, casual users expect the tap-to-focus experience they get from Apple's camera. We launched Kino with the goal of making the camera feel stable and secure: no mistaken tap should ruin your shot or change your settings. Despite that, users sorely missed tap to focus, and we get it: it’s a brilliant way to set focus.

We've gone ahead and added support, along with an equally important new option in Kino’s settings to disable it. If you're setting up Kino fresh, the “Starter” setup has tap to focus on by default, while “Advanced” setups can pick a setting themselves.

Kino’s tap to focus works great in pro, manual-focus-heavy workflows, too: you can set a focus target, then drop into manual focus to refine your targeted depth of field. Tap the focus target again to remove it.


Up-Grades

Kino 1.1 is a major update, which means new Grades! Say hello to Tyler Stalman’s brand new Stalman Film 03. Built for Apple Log from scratch, it lets every iPhone 15 Pro Kino user get right to work with this beautifully balanced grade. You can check out more of Tyler’s Apple Log grades here — they import into Kino beautifully.

We also developed a cold, moody set of new grades for Apple Log and regular video alike: Tyrell and Wallace add a contrasty, cool appearance to your shots.


A Fresh Start

If you're reading this blog post, you're probably already familiar with Kino and its features, but we want to highlight how we're investing in making things more approachable as we roll out new features. To that end, we've made some huge improvements for new users launching Kino for the first time.

Previously, we configured Kino’s recording settings to mimic those of Apple's camera app, for the sake of consistency. In retrospect, we realize it’s even better to configure Kino in a way that stands out from Apple's camera. So now, if you pick the “Starter” option the first time you launch Kino, it defaults to shooting Apple Log with a beautiful grade applied. Great results from the get-go. On non-iPhone 15 Pro devices, we still default to the best settings available, of course.


To make things even smoother the first time you run Kino, the second change we made is a short walkthrough of Instant Grade. This quick tutorial walks you through picking a grade and shooting with it.

Finally, we end the Instant Grade walkthrough with a link to our quick but comprehensive video tutorial. In 10 minutes, we guide you through the interface of Kino and all its best features, so you can start making great videos right away. It’s our Quick Start guide:

With more content coming, it’s a great time to subscribe to us on YouTube or follow us on other social media as we crank out more polished resources packed with information, tips and tricks to help you get great shots.

Pricing

And finally, something everyone likes: a sale!


Normally, Kino costs $20. To celebrate our big launch, the app is 50% off for one week.

That’s all of Kino for $9.99, paid once — with enough presets included to arguably hit a $40+ value.

After this sale, we think we'll experiment with different prices a bit until we find a sweet spot. Maybe it's $15, maybe it's $60. If you're on the fence, we strongly recommend you get in on this sale!

The Future

It’s hard to imagine this, but one year ago, Kino wasn’t even on our radar yet. Development on Kino started at the end of last year.

It’s still the very early days for Kino, but so far we’re tremendously happy with the app, its reception and its future. This year will no doubt be eventful in terms of updates, but we’re also eager to tell the story of how we shipped Kino in just a few months’ time — with all its twists and turns. Stay tuned for our post on that very soon.


Past that, we're going to continue to listen to users and turn our attention toward… Halide!

One of the secret motivations for Kino was to test the waters around where to take Halide. Part of it was testing new technology, such as SwiftUI and building a fresh image processing pipeline. Part of it was testing the product, and whether we could build a tool that was loved by both novices and professionals alike.

We will be opening up signups in Halide for members to test some highly anticipated new features in the near future. Stay tuned.

Thank you!

We want to thank you all for providing so much great, actionable feedback for us to make this update with. We truly couldn’t have done it without you. We hope you love this new update, and we’re excited for all we have coming in the year ahead!

]]>