The backward compatibility problem

When Apple announced Liquid Glass at WWDC 2025, it felt like validation. Glassmorphism (translucent panels, frosted backgrounds, light-reactive borders) had gone from a design trend to Apple's official design language overnight. For developers starting fresh on macOS 26, the new .glassEffect() modifier makes glass trivial. One line of SwiftUI and you're done.

But here's the reality most indie developers face: you can't require macOS 26. Not yet. Your users are on Sonoma, maybe Sequoia. If you set your deployment target to macOS 26 only, you're cutting off a huge chunk of your audience. So if you want your app to look and feel like it belongs in the Liquid Glass era while still running on macOS 14, you have to build the glass yourself.

That's what I did with Klarity, a disk analyzer I built with treemap and icicle visualizations. Every card, every panel, every interactive element uses translucent surfaces with gradient borders and depth shadows. All of it runs on macOS 14+, without a single Liquid Glass API call.

This post covers the real challenges I hit building that system. I won't walk through my exact implementation (that's my competitive edge), but I'll share the problems, the tradeoffs, and the things I wish someone had written down before I started.

What you have to work with on macOS 14

Without .glassEffect(), your toolkit for building glass-like surfaces in SwiftUI comes down to a handful of primitives: the built-in Materials (.ultraThinMaterial and friends), semi-transparent shape fills, gradient strokes for borders, and the .shadow modifier.

The key realization is that convincing glass on macOS 14 isn't about blur. The built-in materials blur the desktop wallpaper behind your window, but within your app's content area, there's nothing to blur against. What you need instead is a layered composition of semi-transparent fills, gradient stroke borders, and carefully tuned shadows that create the illusion of a translucent material.
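To make that layering concrete, here's a minimal sketch of one such composition: a faint translucent fill, a gradient rim stroke, and a restrained shadow. The names and values are illustrative assumptions, not Klarity's actual implementation.

```swift
import SwiftUI

// A minimal sketch of the layered approach: semi-transparent fill,
// gradient rim stroke, and a tuned shadow. Values are illustrative.
struct GlassCard<Content: View>: View {
    var cornerRadius: CGFloat = 12
    @ViewBuilder var content: Content

    var body: some View {
        content
            .padding()
            .background {
                RoundedRectangle(cornerRadius: cornerRadius)
                    .fill(Color.white.opacity(0.06))   // faint translucent fill
            }
            .overlay {
                RoundedRectangle(cornerRadius: cornerRadius)
                    .strokeBorder(
                        LinearGradient(
                            colors: [.white.opacity(0.35), .white.opacity(0.05)],
                            startPoint: .topLeading,
                            endPoint: .bottomTrailing
                        ),
                        lineWidth: 1   // the edge carries most of the effect
                    )
            }
            .shadow(color: .black.opacity(0.25), radius: 6, y: 2)
    }
}
```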

Edges matter more than surfaces

This was the single most important thing I learned. When I started, I spent days trying to get the surface of my glass cards right, tweaking fill opacity, experimenting with background blur, layering materials. The cards still looked flat.

The breakthrough came when I shifted my attention to the edges. A carefully crafted border that simulates how light catches the rim of a glass surface did more for the illusion than any amount of surface tweaking. Human eyes judge material properties by how light interacts at boundaries, not across surfaces. Once I started thinking about borders as the primary carrier of the glass effect, everything clicked.

Apple's Liquid Glass does this for you. It simulates light refraction and specular highlights using real-time rendering. On macOS 14, you fake it with SwiftUI's drawing primitives. The result isn't physically accurate, but it's visually convincing at UI scale.

If your glass cards look flat, stop tweaking the fill. Spend that time on the border instead. A thin gradient stroke that transitions from a brighter edge to a subtler one will do more than any background opacity change.
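One way to sketch that edge-first approach: a rim stroke whose gradient is brightest at the top, where overhead light would catch a real pane of glass. The specific stops and opacities here are assumptions for illustration.

```swift
import SwiftUI

// Illustrative only: a rim stroke brighter at the top edge,
// simulating overhead light catching the glass.
struct GlassRim: ViewModifier {
    var cornerRadius: CGFloat = 12

    func body(content: Content) -> some View {
        content.overlay {
            RoundedRectangle(cornerRadius: cornerRadius)
                .strokeBorder(
                    LinearGradient(
                        stops: [
                            .init(color: .white.opacity(0.50), location: 0.0),  // lit top edge
                            .init(color: .white.opacity(0.12), location: 0.4),
                            .init(color: .white.opacity(0.04), location: 1.0),  // shaded bottom
                        ],
                        startPoint: .top,
                        endPoint: .bottom
                    ),
                    lineWidth: 1
                )
        }
    }
}
```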

The dark mode trap (and the App Store rejection)

My glass design system was built dark-first. The translucent panels, the subtle borders, the ambient glow effects: they all look their best against a dark background. I had a light mode that technically worked, but it required a completely different set of opacity values, border colors, and shadow intensities for every single glass component.

When I submitted Klarity to the Mac App Store, it was rejected. The reason: light mode inconsistencies. My light mode wasn't broken, but it was inconsistent enough in certain edge cases that the reviewer flagged it.

I had a decision to make: spend two or three more weeks perfecting every glass component in light mode, or force dark mode app-wide and ship. I chose to ship.

In SwiftUI, you'd think .preferredColorScheme(.dark) on your root view would handle this. It does, for the window content. But if your app includes a MenuBarExtra (the menu bar popover), it has its own scene and completely ignores the parent window's color scheme modifier. You need to apply the modifier at the scene level, not the view level, separately on both the WindowGroup and the MenuBarExtra.

Gotcha

In a macOS app with a MenuBarExtra, .preferredColorScheme(.dark) does not cascade between scenes. You must set it independently on every scene: the main window and the menu bar popover. Otherwise, your menu bar widget shows up in light mode even while your main window is locked to dark.
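A sketch of what that looks like in practice, with hypothetical view names; the point is that the modifier appears once per scene, on each scene's root view:

```swift
import SwiftUI

// Dark mode must be forced on BOTH scenes. Applying the modifier only
// inside the main window's view hierarchy leaves the menu bar popover
// following the system appearance.
@main
struct GlassApp: App {   // hypothetical app
    var body: some Scene {
        WindowGroup {
            ContentView()
                .preferredColorScheme(.dark)   // main window scene
        }

        MenuBarExtra("Monitor", systemImage: "gauge") {
            MenuBarContentView()
                .preferredColorScheme(.dark)   // must be set again here
        }
        .menuBarExtraStyle(.window)
    }
}

// Stub views for illustration.
struct ContentView: View {
    var body: some View { Text("Main window") }
}

struct MenuBarContentView: View {
    var body: some View { Text("Menu bar popover") }
}
```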

The practical takeaway for indie developers: if your glass UI is designed dark-first and you're working alone, shipping dark-only for v1 is completely reasonable. Perfecting dual-mode glass means designing two separate visual systems. Apple's Liquid Glass handles light/dark adaptation on its own because it's a system material backed by a full engineering team. Your hand-rolled glass doesn't get that for free.

Shadows that don't turn your UI into mud

Depth is what separates glass that feels like glass from glass that feels like a vaguely transparent box. And depth on macOS comes down to shadows.

A mistake I made early on (and I think most developers make) was using shadows that were too large and too dark. A glass card with a 20-pixel blur and 0.4 opacity shadow looks fine by itself. Put ten of them in a grid and your entire interface drowns in grey.

What works is restraint. Smaller blur radii than you think. Much lower opacity than looks right on a single element. A slight Y-offset to simulate overhead light. And shadow values need to differ between color schemes: in dark mode, shadows should be darker and tighter (because soft shadows vanish against a dark background), while in light mode they can be softer and more diffuse.

The other thing that helped was using more than one visual cue for depth. Relying on a single shadow to communicate "this element is floating" never quite worked: the result was always either too heavy or invisible. Combining several subtle cues (not all of them shadows) compounds into a much more convincing sense of layering.
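A sketch of both ideas together, with assumed values: a shadow tuned per color scheme, plus a hairline top highlight as a second, cheap depth cue.

```swift
import SwiftUI

// Sketch of scheme-aware depth: tighter, darker shadow in dark mode,
// softer and more diffuse in light mode, plus a faint top highlight
// as a second depth cue. All numbers are illustrative.
struct DepthCues: ViewModifier {
    @Environment(\.colorScheme) private var scheme

    private var isDark: Bool { scheme == .dark }

    func body(content: Content) -> some View {
        content
            .shadow(color: .black.opacity(isDark ? 0.45 : 0.15),
                    radius: isDark ? 4 : 10,
                    y: isDark ? 2 : 4)
            .overlay(alignment: .top) {
                // Second, cheap cue: a hairline highlight along the top edge.
                Rectangle()
                    .fill(Color.white.opacity(isDark ? 0.18 : 0.50))
                    .frame(height: 1)
            }
    }
}
```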

Performance with animated glass

Every glass component in Klarity has some animation: hover glow, selection highlight, drill-down transitions. When your entire UI is composed of these animated glass elements, you're asking SwiftUI to re-render a lot of layers every frame. On a MacBook Air, this matters.

The change that made the biggest performance difference was restraint in what gets animated: animate cheap properties like opacity and scale, and keep expensive effects like shadow geometry static.
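Here's an illustrative hover effect built on that principle: only opacity and scale animate, while the shadow stays fixed. The specific values are assumptions, not Klarity's code.

```swift
import SwiftUI

// Hover effect that animates only cheap properties (opacity, scale)
// and leaves the shadow untouched.
struct HoverGlow: ViewModifier {
    @State private var hovering = false

    func body(content: Content) -> some View {
        content
            .overlay {
                RoundedRectangle(cornerRadius: 12)
                    .stroke(Color.white.opacity(hovering ? 0.4 : 0.1),
                            lineWidth: 1)
            }
            .scaleEffect(hovering ? 1.02 : 1.0)
            // The shadow stays constant: animating its radius or offset
            // would force expensive re-rendering every frame.
            .shadow(color: .black.opacity(0.25), radius: 6, y: 2)
            .animation(.easeOut(duration: 0.15), value: hovering)
            .onHover { hovering = $0 }
    }
}
```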

Swift concurrency and the MainActor

This isn't specific to glass, but it will bite you if you're building a Mac app with a data-heavy backend (file scanning, image processing, anything CPU-intensive) alongside a rich animated frontend.

The issue: anything that drives a SwiftUI view must update on the MainActor. Your heavy computation needs to run off the main thread. The moment you mix @MainActor-isolated classes with detached tasks that touch the same state, Xcode throws concurrency warnings, or worse, it compiles fine and crashes at runtime.

My approach was to make shared app state explicitly @MainActor-isolated and have all background work communicate through async methods that hop back to the main actor before writing. It sounds obvious in retrospect, but Swift's concurrency documentation doesn't spell out this pattern clearly for macOS apps, especially apps that were started before strict concurrency checking became the default.
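A minimal sketch of that pattern, with hypothetical names (`ScanState`, `scanDirectory`): state lives on the MainActor, the heavy function is nonisolated so the await hops off the main actor, and execution resumes back on it before any @Published property is written.

```swift
import SwiftUI

// Sketch: MainActor-isolated app state plus a nonisolated async worker.
// Names are hypothetical.
@MainActor
final class ScanState: ObservableObject {
    @Published var results: [String] = []
    @Published var isScanning = false

    func startScan(at url: URL) async {
        isScanning = true
        // `scanDirectory` is nonisolated, so this await moves the heavy
        // work off the main actor...
        let found = await Self.scanDirectory(url)
        // ...and execution resumes here on the MainActor, making it
        // safe to write @Published state.
        results = found
        isScanning = false
    }

    nonisolated static func scanDirectory(_ url: URL) async -> [String] {
        // Placeholder for real CPU-heavy scanning work.
        (try? FileManager.default.contentsOfDirectory(atPath: url.path)) ?? []
    }
}
```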

Will I adopt Liquid Glass APIs?

Eventually, yes. The plan is to adopt .glassEffect() for users running macOS 26 while keeping the custom glass fallback for macOS 14 and 15 users. This means a conditional check at runtime: if the Liquid Glass APIs are available, use them; otherwise, fall back to the hand-rolled system.
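The runtime split can be sketched with an availability check; `GlassCardFallback` here is a hypothetical stand-in for the hand-rolled system, and I'm assuming `.glassEffect()`'s no-argument form as documented for macOS 26.

```swift
import SwiftUI

// Sketch of the planned split: system Liquid Glass where available,
// custom glass everywhere else.
struct AdaptiveGlass: ViewModifier {
    func body(content: Content) -> some View {
        if #available(macOS 26.0, *) {
            content.glassEffect()               // system Liquid Glass
        } else {
            content.modifier(GlassCardFallback())  // hand-rolled fallback
        }
    }
}

// Stub standing in for the custom glass system.
struct GlassCardFallback: ViewModifier {
    func body(content: Content) -> some View {
        content.background(.ultraThinMaterial,
                           in: RoundedRectangle(cornerRadius: 12))
    }
}
```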

There are reasons to keep the custom system even after Tahoe becomes widespread: the fallback path has to stay maintained for as long as macOS 14 and 15 are supported anyway, and a hand-rolled system gives full control over how the glass looks on every OS version.

What I'd tell a developer starting today

If you're about to build a glass-style macOS app and you need to support anything older than Tahoe, here's what I wish someone had told me:

Start dark-only. Light mode glass is twice the work. Ship dark first, add light mode when you have demand and bandwidth. Force it at the scene level if you have a MenuBarExtra.

Focus on borders, not surfaces. A gradient stroke that simulates light catching the edge creates a more convincing glass effect than any amount of background blur or fill tweaking.

Layer your depth cues. A single shadow isn't enough. Combine multiple subtle visual cues for depth. One cue alone always looks flat.

Keep expensive effects static. Animate cheap properties, not shadow geometry. Your GPU will thank you.

Test with Reduce Transparency enabled. Go to System Settings → Accessibility → Display and turn on "Reduce transparency." If your app becomes unusable, your glass is carrying too much of the visual weight. The underlying layout needs to work without translucency.
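You can also respect the setting directly via SwiftUI's environment. A sketch, with assumed values, that swaps the translucent fill for an opaque one when the setting is on:

```swift
import SwiftUI

// Sketch: honoring Reduce Transparency by replacing the translucent
// glass fill with an opaque background.
struct GlassBackground: ViewModifier {
    @Environment(\.accessibilityReduceTransparency)
    private var reduceTransparency

    func body(content: Content) -> some View {
        content.background {
            RoundedRectangle(cornerRadius: 12)
                .fill(reduceTransparency
                      ? AnyShapeStyle(Color(nsColor: .windowBackgroundColor)) // opaque fallback
                      : AnyShapeStyle(Color.white.opacity(0.06)))             // translucent glass
        }
    }
}
```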

Budget time for App Store review. Reviewers test light mode even if your app is dark-only. Either force dark mode explicitly and correctly (scene level, not just view level), or make sure your light mode is complete.

Building a glass UI on macOS without Apple's APIs was one of the most rewarding design challenges I've worked on. It forced me to understand how the illusion of translucency works at the pixel level: which edges catch light, which shadows create depth, which animation timings feel physical versus mechanical. That knowledge doesn't go away just because .glassEffect() exists. If anything, it makes you a better judge of when to use the system material and when to go custom.

Try Klarity

Klarity is the Mac disk analyzer I built with this glass design system. Treemap and icicle visualizations, a live RAM monitor, and theme packs. $6.99 one-time on the Mac App Store. Runs on macOS 14+.

If you're working through similar challenges in your own macOS app, I'd love to hear about it. And if you try Klarity, a quick review on the App Store makes a huge difference for a solo developer.

— Mukul