iOS Engineer Hub

iOS Engineer at Snap (2026): Camera Kit, Lens Studio, SnapOS, AR at Scale

In short

Snap is one of the largest consumer-mobile companies outside FAANG, and one of the most distinctive for iOS engineering — Snapchat's iOS app is camera-first, AR-heavy, and operates under memory and thermal constraints most consumer iOS apps never encounter. The publicly shipped iOS engineering surface includes the Camera Kit SDK (camera-kit.snap.com) and the Lens Studio AR-creation toolchain. Spectacles run their own OS (SnapOS), distinct from visionOS. Total comp at Senior aligns with FAANG-adjacent bands per levels.fyi 2026 Snap data.

Key takeaways

  • Snapchat iOS is camera-first — the camera viewfinder is the home screen, and the iOS engineering challenge is keeping a 60fps real-time camera + ML pipeline inside iOS's thermal and memory budget.
  • Camera Kit (camera-kit.snap.com) is Snap's public iOS SDK that gives third-party apps access to Snap's AR Lens engine. The SDK is published with full documentation, a quick-start app, and is the canonical example of Snap's iOS-public-API shape.
  • Lens Studio (ar.snap.com/lens-studio) is the Mac authoring tool for AR Lenses. iOS engineers at Snap interface with the Lens runtime; on AR teams, understanding how Lens Studio output runs on the iOS device is part of the role.
  • Spectacles (the AR glasses) run SnapOS — Snap's own operating system, NOT visionOS. Engineers working on Spectacles iOS work touch a different runtime stack from Snapchat-app engineers.
  • Engineering blog at eng.snap.com publishes regularly on camera, AR, ML-on-device, and iOS-specific performance topics. The 2024 'Building the Camera at Scale' post and 2023 'iOS App Launch Time Reductions' post are canonical reads for understanding Snap's iOS engineering culture.

What Snap publishes about the role

Snap's iOS roles are posted at careers.snap.com/jobs as iOS Software Engineer, AR Engineer (with iOS specialty), or Camera Engineer. The product and engineering surfaces:

  • Snapchat iOS: The consumer Snapchat app. Camera, Stories, Spotlight, Map, Chat, Discover, Memories. The largest single iOS surface area at Snap.
  • Snap AR / Lens platform: The Lens engine that runs on iOS, plus Camera Kit (the public SDK that exposes it to third-party apps). The AR engineering team has its own iOS engineers.
  • Spectacles iOS: The companion-app side of the Spectacles AR glasses experience. A different code base, with different runtime constraints from Snapchat-app work.
  • Snap Map iOS: Geo-aware experiences, Bitmoji integration, friend location.
  • Camera Kit SDK: Public-facing iOS SDK at camera-kit.snap.com. Used by third-party apps to integrate Snap's AR Lens engine.

Snap's engineering blog at eng.snap.com publishes regularly on camera, AR, and iOS-specific performance topics. Key reads:

  • 'Building the Camera at Scale' (Snap Engineering, 2024) — describes the camera-pipeline architecture and the iOS-thermal / memory work that makes 60fps + AR + recording sustainable.
  • 'iOS App Launch Time Reductions' (Snap Engineering, 2023) — performance work on Snapchat's cold-launch path. Real numbers, real techniques.
  • Camera Kit documentation at camera-kit.snap.com — the public iOS SDK reference.
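The launch-time territory that the 2023 post covers can be explored with Apple's public MetricKit API. A minimal sketch, assuming only Apple's documented framework surface (this is not Snap's internal tooling, just the standard way an iOS team collects launch metrics from the field):

```swift
// Sketch: collecting cold-launch metrics with Apple's MetricKit.
// Not Snap's internal instrumentation — the public API any iOS team
// would use to see time-to-first-draw distributions from real devices.
import MetricKit

final class LaunchMetricsSubscriber: NSObject, MXMetricManagerSubscriber {
    func startCollecting() {
        // Payloads arrive roughly once per day per device.
        MXMetricManager.shared.add(self)
    }

    func didReceive(_ payloads: [MXMetricPayload]) {
        for payload in payloads {
            // Histogram of time-to-first-draw across this device's launches.
            if let launch = payload.applicationLaunchMetrics {
                let histogram = launch.histogrammedTimeToFirstDraw
                print("launch histogram buckets: \(histogram.totalBucketCount)")
            }
        }
    }
}
```

MetricKit reports aggregate field data; for reproducing a regression locally, Instruments' App Launch template is the complementary tool.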

The interview: consumer-mobile system design + iOS performance

Snap's iOS interview shape:

  1. Recruiter screen. Career timeline + level calibration.
  2. Coding screen. 45 minutes. Algorithmic problem in CoderPad. Pure Swift is fine.
  3. On-site (4–5 rounds):
    • One coding round — algorithmic, harder than the screen.
    • One iOS architecture round — UIKit / SwiftUI / Combine / structured concurrency. Often grounded in a Snap-shaped problem (build a media-heavy view, design a Stories carousel, handle camera lifecycle).
    • One system design round at senior+ — consumer-mobile-specific. Notifications, camera-heavy workloads, real-time messaging, AR rendering at 60fps. Distinct from FAANG distributed-systems system design.
    • One iOS performance / debugging round at senior+ — Snap takes performance seriously and asks specific questions about Instruments, hang reports, jetsam event reports, and the iOS launch sequence.
    • One behavioral / culture round.

The Snap-shaped specifics: camera and performance. Snapchat's iOS engineering challenges are unlike most consumer iOS apps — sustaining 60fps camera with AR Lenses applied while recording, all inside iOS's thermal envelope, while keeping memory under the OS's jetsam threshold. Senior+ candidates are expected to articulate how they would profile and fix a thermal regression, what an iOS hang report looks like, and how AVFoundation / Metal interop affects camera-pipeline architecture.
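The "thermal envelope" and "jetsam threshold" vocabulary above maps onto concrete public iOS APIs. A minimal sketch of the two signals a camera app would watch, using only Apple's documented surface (the degradation strategy in the comments is illustrative, not Snap's actual pipeline code):

```swift
// Sketch: the public iOS signals behind thermal and jetsam talk.
// Illustrative only — shows the observation APIs, not Snap's pipeline.
import AVFoundation
import Foundation

final class ThermalAwareCamera {
    private let session = AVCaptureSession()
    private let memoryPressure = DispatchSource.makeMemoryPressureSource(
        eventMask: [.warning, .critical], queue: .main)

    func start() {
        // Thermal state: shed work (drop AR to 30fps, lower the preset)
        // before the OS throttles the whole device.
        NotificationCenter.default.addObserver(
            forName: ProcessInfo.thermalStateDidChangeNotification,
            object: nil, queue: .main
        ) { [weak self] _ in
            switch ProcessInfo.processInfo.thermalState {
            case .serious, .critical:
                self?.session.sessionPreset = .hd1280x720  // cheaper frames
            default:
                break
            }
        }

        // Memory pressure: free caches before jetsam kills the process.
        memoryPressure.setEventHandler {
            // e.g. drop texture caches, flush in-memory media buffers
        }
        memoryPressure.resume()
    }
}
```

Being able to narrate this loop — observe, degrade, recover — is roughly what the performance round is probing for at senior+.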

Camera Kit and the public iOS SDK shape

Camera Kit is Snap's public iOS SDK that exposes the Lens engine to third-party apps. Reading the SDK is the easiest way to understand Snap's iOS engineering style outside the Snapchat app itself.

// Camera Kit — typical integration shape in a third-party iOS app.
// Simplified from the reference UI; full reference: camera-kit.snap.com
import AVFoundation
import UIKit
import SCSDKCameraKit
import SCSDKCameraKitReferenceUI

final class CameraViewController: UIViewController {
    private let session = CameraKit.shared.session
    private let cameraView = PreviewView()

    override func viewDidLoad() {
        super.viewDidLoad()
        cameraView.frame = view.bounds
        view.addSubview(cameraView)

        // 1. Configure the input — AVFoundation under the hood
        let captureInput = AVSessionInput(session: AVCaptureSession())
        try? captureInput.startRunning()

        // 2. Connect to Camera Kit's session
        session.start(input: captureInput)
        cameraView.previewLayer?.session = captureInput.session as? AVCaptureSession

        // 3. Observe a Lens group — applying a Lens happens in the
        //    observer callback below; the AR runtime then takes over
        //    the camera pipeline.
        let lensRepo = session.lenses.repository
        lensRepo.addObserver(self, groupID: "YOUR_GROUP_ID")
    }
}

extension CameraViewController: LensRepositoryGroupObserver {
    func repository(
        _ repository: LensRepository,
        didUpdateLenses lenses: [Lens],
        forGroupID groupID: String
    ) {
        if let firstLens = lenses.first {
            session.lenses.processor?.apply(
                lens: firstLens,
                launchData: nil
            ) { _ in /* applied */ }
        }
    }
}

What this surfaces about Snap iOS engineering: the SDK wraps AVFoundation (Apple's camera framework) but injects Snap's own real-time AR processor between camera input and screen output. Inside the Snapchat app, this layer is more deeply integrated — but the public SDK gives a clean window into how the camera-pipeline shape works. Documentation: camera-kit.snap.com; the iOS reference UI lives at github.com/Snapchat/camera-kit-reference-ios.
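For contrast, this is the bare AVFoundation capture path that every iOS camera app starts from, with no AR layer. A minimal sketch using only Apple's public API (not Snap code); the comment marks the point in the pipeline where a real-time processor like the Lens engine slots in:

```swift
// Bare AVFoundation capture baseline — the layer Camera Kit wraps.
// Apple's public API only; the delegate callback is where an AR
// processor would transform frames before display.
import AVFoundation

final class RawCameraPipeline: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
    private let session = AVCaptureSession()
    private let output = AVCaptureVideoDataOutput()
    private let queue = DispatchQueue(label: "camera.frames")

    func configure() {
        session.beginConfiguration()
        guard
            let device = AVCaptureDevice.default(for: .video),
            let input = try? AVCaptureDeviceInput(device: device),
            session.canAddInput(input), session.canAddOutput(output)
        else { session.commitConfiguration(); return }
        session.addInput(input)
        output.setSampleBufferDelegate(self, queue: queue)
        session.addOutput(output)
        session.commitConfiguration()
        session.startRunning()
    }

    // Every camera frame lands here, up to 60 times a second. This is
    // the interception point where a Lens engine or Metal shader chain
    // would process the pixel buffer before it reaches the screen.
    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        _ = CMSampleBufferGetImageBuffer(sampleBuffer)  // CVPixelBuffer in
    }
}
```

The 60fps budget means the delegate body has roughly 16ms per frame for all AR work, which is why the interview probes AVFoundation / Metal interop.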

Spectacles, SnapOS, and what's NOT visionOS

A correction to a common misunderstanding: Spectacles run their own OS (SnapOS), not Apple's visionOS. SnapOS is Snap's purpose-built lightweight operating system for the AR glasses form factor — different from visionOS in scope, capability, and developer model. Engineers working on Spectacles experiences typically work in:

  • Lens Studio for Spectacles — authoring AR experiences that run on SnapOS.
  • The Spectacles companion app on iOS — connects to the glasses, manages content sync, lens deployment.
  • SnapOS-side native code — for engineers on the Spectacles platform team itself.

Implications for an iOS engineering candidate at Snap: if you're interviewing for the Snapchat iOS app, Spectacles work is adjacent but separate. If you're interviewing specifically for Spectacles, expect questions about constrained-runtime engineering, AR rendering on low-power hardware, and the differences between iPhone-camera-engineering and dedicated-AR-hardware-engineering. Mixing up SnapOS with visionOS is a tell at interview that the candidate hasn't done their homework.

Compensation: California pay transparency, FAANG-adjacent

Snap is California-headquartered (Santa Monica) and subject to SB 1162 pay-transparency rules. Per levels.fyi/companies/snap/salaries/software-engineer (2026 data):

Level   Title                       Total comp (US, 2026)
L3      Software Engineer (mid)     $210k – $300k
L4      Senior Software Engineer    $290k – $430k
L5      Senior Engineer / Staff     $430k – $620k
L6      Principal                   $620k – $880k

Snap's pay is FAANG-adjacent — not at Apple / Google / Meta ceilings, but solidly above growth-stage tech for senior+ roles. Stock vesting is RSU-based with a four-year schedule. Santa Monica HQ is the primary iOS engineering site; some hubs in Seattle, NYC, London, and Tel Aviv.

Frequently asked questions

What's distinctive about Snap iOS engineering?
Three things peer companies don't share at the same intensity: (1) camera-first iOS — the home screen IS the camera viewfinder, and the iOS engineering challenge is sustaining 60fps real-time AR while recording without thermal throttle or jetsam; (2) ML-on-device at scale — Lens Studio outputs models that run on the iOS GPU via Metal, requiring iOS engineers to think about model size, GPU throughput, and battery; (3) AR specialisation — Snap is one of the few consumer iOS shops where AR engineering is a primary competency, not a peripheral feature.
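On point (2), the public knobs for on-device model execution are worth knowing even though Snap's Lens runtime is custom. A minimal sketch using Apple's Core ML configuration API (an assumption about tooling shape, not a description of Snap's stack):

```swift
// Sketch: the public Core ML knobs behind "model size, GPU throughput,
// and battery" trade-offs. Snap's Lens runtime is custom — this only
// shows the standard Apple-side configuration surface.
import CoreML

func loadConfiguredModel(at url: URL) throws -> MLModel {
    let config = MLModelConfiguration()
    // .all lets Core ML schedule layers across CPU, GPU, and the
    // Neural Engine; restricting to .cpuAndGPU trades battery for
    // more predictable latency on camera-adjacent workloads.
    config.computeUnits = .all
    return try MLModel(contentsOf: url, configuration: config)
}
```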
Does Snap use SwiftUI?
Mixed. Modern Snap iOS uses SwiftUI for some surfaces (newer settings flows, Discover sections); UIKit-heavy on the camera and messaging paths where view-tree control matters. Production fluency in both is expected. The camera pipeline itself is AVFoundation + Metal — neither UIKit nor SwiftUI primarily. The Snap Engineering blog post 'Building the Camera at Scale' (2024) covers the camera-side architecture; UIKit-vs-SwiftUI choices are below that layer.
Is Snap remote-friendly for iOS engineers?
Limited. Snap's hybrid policy puts most iOS roles in Santa Monica HQ (Snap is California-headquartered), with hub variants in Seattle, NYC, London, and Tel Aviv. Some hybrid; few fully-remote. The Spectacles team is concentrated in Santa Monica given the hardware-engineering proximity. California pay-transparency law applies — check posting for location and remote eligibility.
What's Camera Kit and should I learn it before interviewing?
Camera Kit (camera-kit.snap.com) is Snap's public iOS SDK that exposes the Lens engine — the AR runtime that powers Snapchat — to third-party apps. Reading the SDK is the cheapest way to understand Snap's iOS-public-API shape and the camera-pipeline architecture. Not required reading for an interview, but a strong signal if you reference it during the architecture round. The reference UI on GitHub (github.com/Snapchat/camera-kit-reference-ios) is the cleanest example of how Snap structures iOS sample code.
How does an iOS engineer at Snap differ from one at Meta?
Specialty depth and product surface. Meta's iOS work spans many large apps with deep but generalist iOS engineering. Snap's iOS work is concentrated in one app (Snapchat) with deeper specialisation — camera, AR, real-time messaging at scale, on-device ML — than peer consumer-mobile companies. Engineers who want to specialise in camera / AR engineering tend toward Snap. Engineers who want broad iOS engineering across multiple product surfaces tend toward Meta.
Are Spectacles based on visionOS?
No. Spectacles run SnapOS, Snap's purpose-built lightweight OS for AR glasses. SnapOS is distinct from Apple's visionOS in scope, capability, runtime, and developer model. Spectacles engineering work doesn't transfer directly to Apple Vision Pro engineering and vice versa. Mixing up the two is a tell at interview that a candidate hasn't done their homework on Snap's hardware platform.
What public Snap iOS engineering content should I read?
Three: (1) 'Building the Camera at Scale' (eng.snap.com, 2024) — camera-pipeline architecture; (2) 'iOS App Launch Time Reductions' (eng.snap.com, 2023) — performance work on Snapchat's cold-launch path; (3) Camera Kit iOS SDK documentation at camera-kit.snap.com — the cleanest example of Snap's iOS-API shape. Reading any one of these is differentiating; reading all three signals serious preparation.

Sources

  1. Snap Careers — iOS engineer search.
  2. levels.fyi — Snap Software Engineer compensation by level.
  3. Snap Engineering blog — camera, AR, iOS-specific performance posts.
  4. Snap Camera Kit — public iOS SDK for the Lens AR engine.
  5. GitHub — Camera Kit iOS reference UI (canonical SDK example).
  6. Snap Lens Studio — AR creation toolchain (runtime targets iOS).
  7. Snap Spectacles — AR glasses platform (runs SnapOS).

About the author. Blake Crosley founded ResumeGeni and writes about product design, hiring technology, and ATS optimization. More writing at blakecrosley.com.