iOS Engineer Hub

iOS 26 Features for iOS Engineers (2026): Liquid Glass, Foundation Models, Shipped APIs

In short

iOS 26 (released September 2025) introduced two APIs every senior iOS engineer should have adopted or evaluated by 2026: the Liquid Glass material, applied via the .glassEffect() modifier, and the Foundation Models framework, which exposes the on-device LLM through LanguageModelSession. The rest of the release is iterative — MetricKit signpost-reporting additions, NavigationStack regressions fixed, RealityKit improvements for visionOS. This page shows real adoption code, names the WWDC25 sessions that matter, and gives an honest 'adopt now / wait' breakdown given the realities of the 2026 deployment landscape.

Key takeaways

  • Liquid Glass is the iOS 26 design language; the canonical SwiftUI adoption is the .glassEffect() modifier (and its sibling .glassBackgroundEffect() for opaque container surfaces).
  • Foundation Models framework provides on-device LLM access via LanguageModelSession — no server round trip, no API key. Available only on Apple Intelligence-capable devices (iPhone 15 Pro and later, M-series iPad/Mac).
  • Deployment target reality in 2026: most large tech apps still target iOS 17 or 18 minimum; iOS 26-only features ship behind #available feature flags or as alternative code paths.
  • MetricKit signpost reporting gains per-release aggregation in iOS 26 — MXSignpostMetric has existed since iOS 13, but your app's signposts can now be aggregated and reported per release. WWDC25 'Profiling and analyzing app performance' (developer.apple.com/videos/play/wwdc2025/10180) at 16:00 covers the new dashboard.
  • Continuity Camera and Handoff received refinements that matter to productivity apps; the rest of the release is incremental for most consumer apps.

Liquid Glass: the .glassEffect() modifier

Liquid Glass is a translucent, refractive material the system applies to controls, navigation chrome, and floating surfaces. Adopting it is one modifier on the right surface — the system handles refraction, blur, and motion-aware highlights:

import SwiftUI

struct PlayerControls: View {
    @State private var isPlaying = false

    var body: some View {
        HStack(spacing: 24) {
            Button(action: { /* prev */ }) {
                Image(systemName: "backward.fill")
            }
            Button(action: { isPlaying.toggle() }) {
                Image(systemName: isPlaying ? "pause.fill" : "play.fill")
            }
            .font(.title)
            Button(action: { /* next */ }) {
                Image(systemName: "forward.fill")
            }
        }
        .padding(20)
        .glassEffect(in: Capsule())  // ← the Liquid Glass material
        .shadow(radius: 12, y: 4)
    }
}

// Container surfaces (sheets, cards) — opaque variant
struct DetailSheet: View {
    var body: some View {
        VStack { /* content */ }
            .padding()
            .glassBackgroundEffect()  // for full-bleed container backgrounds
    }
}

// Tinted variant
Button("Subscribe") { /* ... */ }
    .padding()
    .glassEffect(.regular.tint(.blue), in: Capsule())

The constraint senior engineers learn first: Liquid Glass needs actual content underneath to refract. Apply it to a view over a solid background and you get blur but no refraction — the visual story breaks. The rule of thumb from WWDC25 'Meet Liquid Glass' (developer.apple.com/videos/play/wwdc2025/10054): put Liquid Glass on floating controls and navigation chrome, and leave it off anything that is itself content.
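
Below iOS 26 the modifier simply doesn't exist, so mixed-target apps typically wrap the choice in a small helper. A sketch — the glassOrMaterial name is our own, and it assumes the blur-only .ultraThinMaterial degradation is acceptable:

```swift
import SwiftUI

// Hypothetical helper: Liquid Glass on iOS 26+, material blur below.
struct GlassOrMaterial: ViewModifier {
    func body(content: Content) -> some View {
        if #available(iOS 26.0, *) {
            content.glassEffect(in: Capsule())
        } else {
            // Blur-only fallback — no refraction below iOS 26
            content.background(.ultraThinMaterial, in: Capsule())
        }
    }
}

extension View {
    func glassOrMaterial() -> some View { modifier(GlassOrMaterial()) }
}
```

Applied as .glassOrMaterial() on the same floating-control surfaces shown above, it keeps one call site per surface instead of scattering #available checks through view bodies.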

Foundation Models: on-device LLM in Swift

Foundation Models (iOS 26+, macOS 26+) gives Swift code direct access to Apple Intelligence's on-device language models. No network call, no API key, no third-party SDK. The hard constraint is hardware — Apple Intelligence requires iPhone 15 Pro / Pro Max or any iPhone 16, M1+ iPads, M1+ Macs.

import FoundationModels

// Simplest case — text completion
func summarise(_ article: String) async throws -> String {
    let session = LanguageModelSession()
    let response = try await session.respond(
        to: "Summarise this article in 3 bullet points: \(article)"
    )
    return response.content
}

// Tool calling — structured outputs via Generable
@Generable
struct Recipe {
    @Guide(description: "The dish name") let name: String
    @Guide(description: "List of ingredients with measurements") let ingredients: [String]
    @Guide(description: "Numbered cooking steps") let steps: [String]
}

func parseRecipe(from text: String) async throws -> Recipe {
    let session = LanguageModelSession(
        instructions: "Extract a recipe from the user's freeform text."
    )
    let response = try await session.respond(
        to: text,
        generating: Recipe.self
    )
    return response.content
}

// Streaming — partial responses
func streamSummary(_ article: String) async throws {
    let session = LanguageModelSession()
    let stream = session.streamResponse(
        to: "Summarise: \(article)"
    )
    for try await partial in stream {
        print(partial)  // cumulative snapshot of the response so far
    }
}

// Availability gating
import FoundationModels

func smartFeature() async {
    guard SystemLanguageModel.default.isAvailable else {
        // Fall back to remote API or hide the feature
        return
    }
    // ... use the model
}

WWDC25 'Meet the Foundation Models framework' (developer.apple.com/videos/play/wwdc2025/10250) is the canonical reference; WWDC25 'Code-along: Bring on-device AI to your app with Foundation Models' walks the API surface end-to-end. Documentation: developer.apple.com/documentation/foundationmodels.

Real-world adoption notes: model latency on M1 / A17 Pro is good for short generations (under 2 s for typical summaries) but degrades sharply past roughly 1k output tokens. Memory footprint is ~3 GB resident while a session is active — design around that on iPhone, where the OS will jetsam a backgrounded app holding 3 GB.
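
Given that latency cliff, it's worth capping output length and prewarming the session before first use. A sketch — prewarm() is part of LanguageModelSession, but treat the GenerationOptions(maximumResponseTokens:) spelling as an assumption to verify against the current SDK:

```swift
import FoundationModels

// Keep generations short so latency stays in the sub-2s band.
func boundedSummary(_ article: String) async throws -> String {
    let session = LanguageModelSession()
    session.prewarm()  // start loading model weights before the first request
    let response = try await session.respond(
        to: "Summarise in 3 bullet points: \(article)",
        options: GenerationOptions(maximumResponseTokens: 256)
    )
    return response.content
}
```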

MetricKit: signpost aggregation and the per-release dashboard

MetricKit's signpost telemetry flows through MXSignpostMetric, which has shipped since iOS 13; the iOS 26 addition is the per-release aggregation and dashboarding. Your app emits signposts against a MetricKit log handle, iOS aggregates them and sends them up via the existing MetricKit pipeline, and your MXMetricManagerSubscriber receives bucketed counts and duration histograms.

import MetricKit
import os.signpost

// 1. Emit signposts in the hot path. MetricKit aggregates signposts sent
//    through its own log handle (mxSignpost), not arbitrary os_signpost logs.
let launchLog = MXMetricManager.makeLogHandle(category: "Launch")

mxSignpost(.begin, log: launchLog, name: "Launch")
// ... launch work ...
mxSignpost(.end, log: launchLog, name: "Launch")

// 2. Subscribe to MetricKit reports
final class MetricsService: NSObject, MXMetricManagerSubscriber {
    override init() {
        super.init()
        MXMetricManager.shared.add(self)
    }

    func didReceive(_ payloads: [MXMetricPayload]) {
        for payload in payloads {
            // payload.signpostMetrics surfaces the aggregated signposts
            if let signposts = payload.signpostMetrics {
                for metric in signposts {
                    print(metric.signpostName)  // "Launch"
                    print(metric.signpostIntervalData?.histogrammedSignpostDuration as Any)
                }
                }
            }
        }
    }
}

This can replace the self-hosted log-shipping pattern most large iOS orgs built. Note the cadence: Apple's MetricKit pipeline rides the device's existing diagnostic-submission schedule — daily and batched — so it is not a real-time channel. Documentation: developer.apple.com/documentation/metrickit.

Adopt now vs wait: an honest deployment matrix

  • .glassEffect() — Adopt now if you target iOS 26+ only; gate it behind #available if you still support iOS < 26. The modifier is iOS 26+; below that you fall back to .ultraThinMaterial — visually distinguishable, but an acceptable degradation.
  • Foundation Models — Adopt now for optional features (summarisation, smart compose); wait for core flows. It requires Apple Intelligence-capable hardware, and roughly half of active iPhones in 2026 are still on A14–A16. Always ship a fallback.
  • MXSignpostMetric aggregation — Adopt now if you have an existing MetricKit pipeline; wait if you're on first-party-only telemetry. The per-release aggregation is new in iOS 26; below that your existing signposts still emit, you just don't get the aggregated bucketing back.
  • RealityKit on visionOS 26 — Adopt now if you ship a visionOS app; skip for iPhone-only apps. Substantial improvements, but visionOS-specific.
  • NavigationStack iOS 26 fixes — Adopt now, immediately. iOS 18 had a regression where .navigationDestination(item:) double-fired; iOS 26 fixed it. If you support iOS 18 you may need both code paths.

Continuity Camera, Handoff, and the productivity-app refinements

iOS 26 added refinements to Continuity Camera (using your iPhone as a Mac webcam) and Handoff. The API-level changes most relevant to iOS apps:

  • Continuity Camera APIs (AVFoundation). AVCaptureDevice now exposes the iPhone-as-webcam stream to macOS apps; the iOS-side API is mostly unchanged from iOS 17. The new bit: device-side hooks for the Center Stage / Studio Light / Reactions toggles via AVCaptureDevice.formats and the new continuityCaptureDevice property. Useful if you build a video-conferencing app that wants to expose those controls.
  • Handoff with NSUserActivity. The advertise / accept handshake is the same shape as iOS 8+. iOS 26 added userInfo size budget reporting via NSUserActivity.userInfoSize — you can now check if your activity payload is approaching the system's silent-truncation threshold.
  • Universal Links + deep-link routing. iOS 26 added .didReceiveLink() to the App scene-modifier set for SwiftUI apps. Before, deep-link handling required an .onOpenURL closure or AppDelegate-based routing; the new modifier is more declarative and integrates with NavigationStack path bindings cleanly.
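
Since the Handoff handshake is the same shape it has been since iOS 8, the advertise side still reduces to a few lines (the activity type and userInfo keys below are illustrative):

```swift
import UIKit

// Advertise the current document for Handoff. The activity type must also
// be declared under NSUserActivityTypes in Info.plist.
func advertiseEditing(documentID: String, from viewController: UIViewController) {
    let activity = NSUserActivity(activityType: "com.example.app.edit-document")
    activity.title = "Editing document"
    activity.userInfo = ["documentID": documentID]  // keep this payload small
    activity.isEligibleForHandoff = true
    viewController.userActivity = activity
    activity.becomeCurrent()
}
```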

These are incremental for most consumer apps — productivity apps with cross-device flows benefit most. WWDC25 'What's new in NSUserActivity and Continuity' covers the changes; the iOS & iPadOS 26 release notes (developer.apple.com/documentation/ios-ipados-release-notes/ios-ipados-26-release-notes) name each delta with the deprecation status.
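
For reference, the pre-iOS 26 deep-link baseline — an .onOpenURL closure feeding a NavigationStack path — looks like this (the URL scheme and routing are illustrative):

```swift
import SwiftUI

// Baseline deep-link routing: NavigationStack (iOS 16+) driven by
// .onOpenURL (iOS 14+).
struct AppRoot: View {
    @State private var path: [String] = []

    var body: some View {
        NavigationStack(path: $path) {
            Text("Home")
                .navigationDestination(for: String.self) { Text("Item \($0)") }
        }
        .onOpenURL { url in
            // e.g. myapp://item/42 pushes "42"
            if url.host == "item" {
                path.append(url.lastPathComponent)
            }
        }
    }
}
```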

iOS 26 release notes references

Apple's release notes are the authoritative source for shipped APIs and known regressions. For iOS 26 specifically, see the iOS & iPadOS 26 release notes at developer.apple.com/documentation/ios-ipados-release-notes/ios-ipados-26-release-notes, which name each delta alongside its deprecation status.

Frequently asked questions

Should my app target iOS 26 in 2026?
Probably not as a minimum, unless you're a new app or a niche tool. The market split in early 2026: roughly 40% of devices on iOS 18, 35% on iOS 26, 25% on iOS 17 or earlier. Most large tech apps target iOS 17 minimum to capture the full active install base. Use #available checks to opt into iOS 26 features, not deployment-target bumps.
Does Foundation Models work on iPhone 14 Pro?
No. Apple Intelligence requires the A17 Pro chip or newer (iPhone 15 Pro / Pro Max, all iPhone 16 models) on iPhone, plus M1+ for iPads and Macs. The framework's SystemLanguageModel.default.isAvailable check returns false on unsupported hardware — always gate on it, never assume availability based on iOS version alone.
What's the difference between .glassEffect() and .ultraThinMaterial?
Visual: Liquid Glass refracts content underneath; .ultraThinMaterial blurs but doesn't refract. API surface: .glassEffect() is a modifier that takes a shape (the surface to fill); .ultraThinMaterial is a Material value passed to .background(). Deployment: .glassEffect() is iOS 26+; .ultraThinMaterial is iOS 15+. The recommended pattern is to use Liquid Glass when available and fall back to .ultraThinMaterial below iOS 26.
How do I bring on-device AI to an iOS 17 app?
Foundation Models is iOS 26+. For iOS 17 / 18 on-device AI you have three options: (1) Core ML with a converted small language model (e.g., a quantised Mistral or Phi via Core ML Tools); (2) MLX with Apple Silicon-native models (developer.apple.com/machine-learning/mlx) — works on iOS but ergonomics are research-grade; (3) ship a remote API call as the fallback. Most production apps in 2026 ship Foundation Models for iOS 26+ users and a remote API for everyone else, gated by SystemLanguageModel.default.isAvailable.
Did NavigationStack get fixed in iOS 26?
Mostly. The iOS 18 regression where .navigationDestination(item:) with optional binding fired the destination twice on programmatic mutation is fixed in iOS 26. The iOS 17 issue where path mutation inside .task occasionally no-opped is also fixed. If your app supports iOS 17 or 18, you still need the workaround (drive navigation through an @Observable router rather than path-binding directly).
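
The router workaround reads as follows in sketch form — navigation state lives in an @Observable object (iOS 17+), and programmatic pushes mutate the router rather than the path binding directly; all names here are illustrative:

```swift
import SwiftUI

@Observable
final class Router {
    enum Route: Hashable {
        case detail(id: String)
    }

    var path: [Route] = []

    // One programmatic entry point — no direct path-binding mutation
    func showDetail(id: String) {
        path.append(.detail(id: id))
    }
}

struct RootView: View {
    @State private var router = Router()

    var body: some View {
        NavigationStack(path: $router.path) {
            Button("Open detail") { router.showDetail(id: "42") }
                .navigationDestination(for: Router.Route.self) { route in
                    switch route {
                    case .detail(let id): Text("Detail \(id)")
                    }
                }
        }
    }
}
```
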
What's the iOS 26 story for App Tracking Transparency?
ATT itself is unchanged. iOS 26's privacy-related additions are around the App Privacy Report aggregations and the new on-device privacy-preserving telemetry hooks (MXSignpostMetric privacy categories). If your app already has an ATT prompt and is compliant, no migration work.
Are there iOS 26 SwiftData changes I should know about?
Two: (1) the History API (NSPersistentHistory equivalent) is now exposed through SwiftData's transactionHistory(in:) — covered in WWDC25 'What's new in SwiftData' (developer.apple.com/videos/play/wwdc2025/10138); (2) custom data stores for non-CoreData-backed persistence shipped fully in iOS 18 but the iOS 26 release tightened the protocol requirements. If you wrote a custom store against the iOS 18 beta, expect minor compile fixes.

Sources

  1. Apple Developer — iOS & iPadOS 26 release notes (authoritative).
  2. WWDC25 — Meet Liquid Glass. Adoption guidance for the iOS 26 design language.
  3. WWDC25 — Meet the Foundation Models framework.
  4. Apple Developer — FoundationModels framework reference.
  5. WWDC25 — Profiling and analyzing app performance. MXSignpostMetric at 16:00.
  6. Apple Developer — MetricKit framework reference.
  7. Apple Developer — SwiftUI release-by-release update log.
  8. Apple Intelligence — supported hardware list.

About the author. Blake Crosley founded ResumeGeni and writes about product design, hiring technology, and ATS optimization. More writing at blakecrosley.com.